This article is divided into two parts. Skip to the second part below if you are simply looking for “social proof” that wavefunction collapse is not necessary in QM: quotes from some of the greatest geniuses of the last century.
Quantum mechanics is conventionally formulated (Copenhagen interpretation) with a “collapse postulate”: measurement of the state of a system causes it to collapse to a specific eigenstate of the measurement operator. We can trace this all the way back to von Neumann. But he was only searching for a mathematical formulation of QM that could be used for practical purposes. He did not claim that wavefunction collapse was absolutely necessary for QM to reproduce the phenomena we observe. In that era (~100 years ago!) there was no thought that the observer and measurement apparatus could be described, together with the system under observation, as a single quantum state in a very large Hilbert space.
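For concreteness, here is the collapse postulate in standard notation (a textbook summary, not a quotation from von Neumann):

```latex
% Measuring an observable A = \sum_i a_i \, |i\rangle\langle i|
% on a state |\psi\rangle yields outcome a_i with probability
p_i = |\langle i | \psi \rangle|^2 ,
% after which the state is projected ("collapses") onto the eigenstate:
|\psi\rangle \;\longrightarrow\; |i\rangle .
% Between measurements, evolution is unitary:
|\psi(t)\rangle = e^{-iHt/\hbar} \, |\psi(0)\rangle .
```

The tension is that the first two lines are stochastic and non-unitary, while the last line is deterministic and unitary.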
Almost a century later, we know that quantum measurements (including the experience of the observer) can be described in terms of decoherence of distinct outcome states. See this elementary lecture (suitable for students):
Decoherence and Quantum Measurement: The Missing Lecture
arXiv:2212.02391 [quant-ph]
We give an elementary account of quantum measurement and related topics from the modern perspective of decoherence. The discussion should be comprehensible to students who have completed a basic course in quantum mechanics with exposure to concepts such as Hilbert space, density matrices, and von Neumann projection (“wavefunction collapse”).
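Schematically, the decoherence account of measurement goes as follows (standard notation; this is my summary, not an excerpt from the lecture):

```latex
% A measurement interaction unitarily entangles system and environment:
\Big( \sum_i c_i \, |i\rangle \Big) \otimes |E_0\rangle
\;\longrightarrow\;
|\Psi\rangle = \sum_i c_i \, |i\rangle \otimes |E_i\rangle .
% Tracing out the environment gives the system's reduced density matrix:
\rho_S = \mathrm{Tr}_E \, |\Psi\rangle\langle\Psi|
       = \sum_{i,j} c_i c_j^* \, \langle E_j | E_i \rangle \, |i\rangle\langle j| .
% As the environment records the outcome, \langle E_j | E_i \rangle \to \delta_{ij}, so
\rho_S \;\longrightarrow\; \sum_i |c_i|^2 \, |i\rangle\langle i| ,
% which is the same mixture that von Neumann projection would have produced,
% obtained here from purely unitary dynamics.
```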
For a historical perspective, see this discussion with Hugh Everett’s biographer. I believe Everett, Dieter Zeh (who coined the term decoherence), and some others deserve credit for this mature version of QM.
Discovering the Multiverse: Quantum Mechanics and Hugh Everett III, with Peter Byrne — #22
Related links:
Ten Years of Quantum Coherence and Decoherence
Macroscopic Superposition States in Isolated Quantum Systems
Macroscopic Superpositions in Isolated Systems (talk video + slides)
Part 2
Now to the social proof. Sadly, even in the physics community few individuals can think for themselves from first principles. Hence, a collective update of our understanding of QM depends on sociological forces.
Q: Does pure state evolution in a closed system (e.g., the universe) reproduce the conventional von Neumann phenomenology for observers in the system?
This is a question about dynamical evolution: of the system as a whole, and of various interacting subsystems. It's not a philosophical question and, in my opinion, it is what theorists should focus on first. Although complicated, it is still reasonably well-posed from a mathematical perspective, at least as far as foundational physics questions go.
I believe the evidence is strong that the answer to this question is Yes, although the issue of the Born rule lingers (too complicated to discuss here, but see various papers I have written on the topic, as well as work by Deutsch, Zurek, and others). It is clear that Weinberg and I agree that the answer is Yes, modulo the Born rule.
Einstein's Mistakes
Steve Weinberg, Physics Today, November 2005
Bohr's version of quantum mechanics was deeply flawed, but not for the reason Einstein thought. The Copenhagen interpretation describes what happens when an observer makes a measurement, but the observer and the act of measurement are themselves treated classically. This is surely wrong: Physicists and their apparatus must be governed by the same quantum mechanical rules that govern everything else in the universe. But these rules are expressed in terms of a wavefunction (or, more precisely, a state vector) that evolves in a perfectly deterministic way. So where do the probabilistic rules of the Copenhagen interpretation come from?
Define this position to be
Y* := "Yes, possibly modulo Born"
I believe (based on published remarks or on my personal interactions) that the following theorists have opinions that are Y* or stronger: Julian Schwinger, Bryce DeWitt, John Wheeler, David Deutsch, Stephen Hawking, Richard Feynman, Murray Gell-Mann, Dieter Zeh, Jim Hartle, Steven Weinberg, Wojciech Zurek, Alan Guth, John Preskill, Don Page, Leon Cooper (BCS), Sidney Coleman, Charles Misner, Nima Arkani-Hamed, etc.
But there is a generational issue, with many older (some now deceased!) theorists being reticent about expressing Y* even if they believe it. This is shifting over time and, for example, a poll of younger string theorists or quantum cosmologists would likely find a strong majority expressing Y*.
The excerpt below is from the excellent biography of Julian Schwinger: Climbing the Mountain by Mehra and Milton.
(p.369) Schwinger: "To me, the formalism of quantum mechanics is not just mathematics; rather it is a symbolic account of the realities of atomic measurements. That being so, no independent quantum theory of measurement is required -- it is part and parcel of the formalism.
[ ... recapitulates usual von Neumann formulation: unitary evolution of wavefunction under "normal" circumstances; non-unitary collapse due to measurement ... discusses paper hypothesizing stochastic (dynamical) wavefunction collapse ... ]
In my opinion, this is a desperate attempt to solve a non-existent problem, one that flows from a false premise, namely the vN dichotomization of quantum mechanics. Surely physicists can agree that a microscopic measurement is a physical process, to be described as would any physical process, that is distinguished only by the effective irreversibility produced by amplification to the macroscopic level. ..."
Similar views have been expressed by Feynman and Gell-Mann.
Feynman had read Everett’s thesis before it was published - they were both PhD students of John Wheeler, although Feynman was older. At a 1957 conference in Chapel Hill, NC, he discussed the implications of “no-collapse” QM (noting the existence of many branches of the wavefunction), but did NOT question whether a formulation without collapse would reproduce the experience of the observer in the conventional formulation with collapse. In other words, he accepted the dynamical aspects of decoherence.
Murray Gell-Mann wrote many papers on his “decoherent histories” formulation of QM, which he claimed to have developed without knowledge of Everett.
... although the so-called Copenhagen interpretation is perfectly correct for all laboratory physics, laboratory experiments and so on, it's too special otherwise to be fundamental and it sort of strains credulity. It's… it’s not a convincing fundamental presentation … and as far as quantum cosmology is concerned it's hopeless. We were just saying, we were just quoting that old saw: describe the universe and give three examples. Well, to apply the… the Copenhagen interpretation to quantum cosmology, you'd need a physicist outside the universe making repeated experiments, preferably on multiple copies of the universe and so on and so on. It's absurd. Clearly there is a definition to things happening independent of human observers. So I think that as this point of view is perfected it should be included in… in teaching fairly early, so that students aren't convinced that in order to understand quantum mechanics deeply they have to swallow some of this…very… some of these things that are very difficult to believe. But in the end of course, one can use the Copenhagen interpretations perfectly okay for experiments.
There are many more examples. I’ll leave you with a quote from Stephen Hawking:
“I regard [no-collapse quantum mechanics] as self-evidently correct.”
In fact he said something even more enlightening (but I can’t find the reference):
“Many Worlds follows trivially if we require that quantum mechanics applies to every object in the universe.”
APPENDIX
This is from the comments (discussion with P. Gerdes) and is only for experts!
1. I'm not sure what you mean by "observers" here. There MIGHT be "brains" that operate (process information) using entanglement (ours probably do not). Whether such observers exist in the Everettian multiverse, or how much probability measure they occupy, is unclear.
2. When the collapse postulate is introduced, it's usually in a context where the surrounding semiclassical world is assumed and attention is focused on, e.g., measurement of a spin state or other simple system. Then the question is very specific: Could a no-collapse version of QM reproduce the usual measurement phenomenology (for the already postulated semiclassical measuring device and observer)? I think the modern view among almost everyone who is familiar with decoherence is that the answer is yes, modulo the Born rule, which governs the relative probabilities of the possible outcomes. The fact that an apparent collapse (= decoherence) occurs is not controversial.
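A toy numerical illustration of this point (my own sketch; the model and parameters are illustrative, not from the discussion above): entangling a qubit with many environment qubits suppresses the off-diagonal coherence of its reduced density matrix, while the Born-rule diagonal is untouched.

```python
# Toy decoherence model: a qubit in the state c0|0> + c1|1> interacts
# with n_env environment qubits. Conditional on the system, each
# environment qubit ends in |0> (system in |0>) or in
# cos(theta)|0> + sin(theta)|1> (system in |1>), so the overlap of the
# two environment branch states is cos(theta)**n_env. That overlap
# multiplies the off-diagonal ("coherence") term of the system's
# reduced density matrix rho_S = Tr_E |Psi><Psi|.
import numpy as np

def reduced_density_matrix(c0, c1, theta, n_env):
    """System's reduced density matrix after entangling with n_env qubits."""
    overlap = np.cos(theta) ** n_env  # <E_0|E_1> for the product environment states
    return np.array([[abs(c0) ** 2,                c0 * np.conj(c1) * overlap],
                     [np.conj(c0) * c1 * overlap,  abs(c1) ** 2]])

c0, c1 = 1 / np.sqrt(3), np.sqrt(2 / 3)
rho_before = reduced_density_matrix(c0, c1, theta=0.1, n_env=0)     # fully coherent
rho_after  = reduced_density_matrix(c0, c1, theta=0.1, n_env=2000)  # decohered

print(np.round(rho_before, 3))  # off-diagonal ~0.47: superposition intact
print(np.round(rho_after, 3))   # off-diagonal ~1e-5: apparent "collapse",
                                # diagonal Born weights 1/3 and 2/3 unchanged
```

The point of the sketch is that nothing non-unitary was put in by hand: the coherence is not destroyed, it is merely delocalized into system-environment correlations that no local observer can access.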
3. It is much more ambitious to ask: What does the multiverse (Everett's universal wavefunction) look like once one removes collapse?
But even there we have the vN Quantum Ergodic Theorem and my results with Buniy that say that (assuming a typical Hamiltonian with local interactions, etc.) the wavefunction breaks up into a superposition of branches that look like semiclassical worlds. (See Schmidt decomposition expression in our paper...) So this approach goes a bit further in explaining why the measurement setup in #2 above can appear in the multiverse in the first place.
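For reference, the Schmidt form alluded to is (standard notation; see the paper for the precise statement):

```latex
% Schmidt decomposition across a system/environment bipartition:
|\Psi\rangle = \sum_n \sqrt{p_n} \, |s_n\rangle \otimes |E_n\rangle ,
\qquad
\langle s_m | s_n \rangle = \langle E_m | E_n \rangle = \delta_{mn} ,
% with the claim being that, for generic Hamiltonians with local
% interactions, the states |s_n\rangle that emerge dynamically look
% like distinct semiclassical worlds, each carrying weight p_n.
```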
4. Now, suppose you accept that the usual vN collapse + Born rule version of QM (Copenhagen) results in semiclassical worlds like the ones we observe around us. This means you think that the net result of all the probabilistic collapses, imposing the Born rule each time, leads to quantum outcomes that look like our universe. If you DON'T believe this you should reject traditional QM already as a candidate theory of physics!
5. IF you accept #4 above, THEN a suitable measure applied to the universal wavefunction (i.e., something that is roughly like the Born measure, or that excludes "maverick" branches which are too improbable under the Born measure; see my work on discrete Hilbert space with Buniy and Zee) says that "almost all" branches look (in the sense of being semiclassical, etc.) like ours.
I confined the OP to #2 because #3-5 are too complicated to explain, except to experts. But you seem to keep wanting to go there!
Those talks are interesting and make some valuable points, but I don't think they really address the elephant in the room. For a no-collapse interpretation of QM to be correct, the importance of macroscopic superposition states to empirical validity must be derived, not assumed, and the theory on its own can't explain why these states should play any special role in predictions.
And it's not enough merely to identify some special mathematical property these states have with respect to the dynamics: unless the theory adds that as an extra physical principle (undermining the claimed simplicity of no-collapse views), you need to explain why that property is the one that should be relevant to making predictions.
To sharpen the problem, note that given some complete wavefunction in some Hilbert space describing the universe, I can decompose the wavefunction into an infinitary sum in a whole lot of ways. Indeed, it's relatively trivial to decompose a function under an L^2 (and I suspect L^p for any p \geq 1) metric into a sum of functions which intuitively implement whatever Turing machine I want*. And it's not a priori obvious that the decomposition into something like semiclassical histories is the one you should use to figure out what observations to predict, rather than some jury-rigged decomposition into Turing machines implementing whatever observer you feel like.
---
*: Basically, we use translations, negations, and scalings of a function which intuitively describes the operation of the Turing machine so that they sum to the desired function. That this is possible is essentially the same fact that lets us approximate Lebesgue integrals with square functions. Just make sure that the level you use to encode a 1 decreases as both t and the square of the tape it's on increase, so it's like integrating using square approximations. And with linear dynamics, if we converge to the overall solution in norm, then the infinite sum is a solution and we can regard the partial sums as approximate solutions, with everything behaving nicely.
I'm a physics experimentalist and have never worried (thought much) about QM interpretations. Use whichever model makes it easier for you to do a calculation (whatever works for you). As for measurement and wavefunction collapse: the simplest process for me to think about starts with a photon and an electron, and ends with no photon and the electron in a higher energy state.* And as far as I know we have no model for that process... how the heck do you cram a big photon into an electron state? Or am I somehow misunderstanding wavefunction collapse?
*Isn't there also a wave function creation problem? Where there is first no photon and then an electron in a lower energy state and one more photon. A new wavefunction.