So when the implementation mandates that all returned memory be zeroed... it should be. In other words, if the compiler allocates memory via a routine that does not zero it, à la malloc(), the compiler should wrap the allocator and zero the data itself, or use calloc(). Otherwise, the result can be a non-deterministic executable.
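A minimal sketch of that wrapping idea, assuming a hypothetical helper named zalloc() (the name and wrapper are my own illustration, not from any particular compiler):

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical wrapper: take an allocator that, like malloc(),
   returns uninitialized bytes, and zero the block before handing
   it out. For a single block this is equivalent in effect to
   calloc(1, size). */
void *zalloc(size_t size)
{
    void *p = malloc(size);
    if (p != NULL)
        memset(p, 0, size);  /* guarantee deterministic contents */
    return p;
}
```

With a wrapper like this, the contents of freshly allocated memory no longer depend on whatever happened to be in the heap before, which is exactly the source of non-determinism being discussed.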
I wonder about this, because Bell's theorem suggests that even if we account for all local hidden variables, we still cannot explain away the non-determinism seen in quantum mechanics. Heisenberg's Uncertainty Principle, then, seems to say that, despite what makes sense to me, the universe really does have non-deterministic properties. So I suppose we could bubble this up to the compiler world and memory management (this reduction might be grossly wrong) and suggest that non-determinism really does exist if a memory-management routine such as malloc() can induce non-deterministic behavior in an executable.