Can we please, for the love of god, stop pretending it is just “laypeople” who push quantum mysticism? “Consciousness causes collapse” literally originated in academia. I know you put physicists on a pedestal so they can do no wrong and it’s only the dumb laypeople, but most of the time these crackpot quantum-mystical claims are traceable directly back to a physicist holding a PhD. Quantum mysticism is rampant in academia as well, and people need to stop denying this fact.
I’m going to reiterate my original claim because much of your comment misses the point. In the comment above I argued that quantum theory has interesting philosophical implications.
You didn’t read my original comment, then, since the whole point of my reply was to demonstrate that QM does not change the situation at all when it comes to metaphysics, i.e. it does not have philosophical implications which classical mechanics did not have.
So when you assert materialism this is intellectual honesty, but when someone argues for an anti-materialist stance, based on observable evidence as strange as quantum entanglement (which you are quick to explain away), this is just personal metaphysics?
I don’t know if your reading comprehension really is that poor or you are just intentionally misinterpreting what I stated.
No, I did not claim that materialism is “intellectually honest” here. I claimed that the ones being intellectually honest are the ones who do not pretend quantum mechanics supports their metaphysics any more than classical physics did, and that includes materialists.
Occam’s razor doesn’t allow us to flippantly dismiss positions we deem unintuitive.
Sure, but Sagan’s razor does, if you present your mystical claims without a shred of evidence.
Again, you’re familiar with the physics side but are incapable of considering alternate philosophical points of view.
You are incapable of being intellectually honest and desperately want to pretend that quantum physics proves idealism. I at least have the intellectual honesty not to pretend quantum mechanics is relevant to such questions of metaphysics.
Gatekeeping? If you want to believe in crackpot mysticism, be my guest. Just don’t expect me to believe it or not to criticize you for it if you attempt to spread those crank views on a public forum.
Furthermore, Bell-type experiments, which are a part of the broader quantum theory, display quantum entanglement such that measuring one half of the experiment decides the outcome of the other.
That is just non-locality. It also doesn’t “decide the outcome” of the other; it is more complicated than that. Bell’s theorem is about a locally stochastic theory having to obey Reichenbachian factorization, the idea that a joint probability distribution between two objects should be factorizable if you condition on a common cause in their backwards light cones where they locally interacted. If you assume this, it places certain statistical bounds on what results you can expect, and those bounds are broken in practice.
If you interpret quantum mechanics as a stochastic theory without altering its mathematics, then the outcomes are just random, so nothing determines them by definition, but what one observer does in their lab does affect the kind of statistical correlations they would expect to find with another person’s lab if they later compare results. In a deterministic model that does add something to the mathematics, like Bohmian mechanics, the model is also contextual, so the deterministic trajectories depend upon the full experimental context. The particle’s trajectory is still ultimately determined by its initial state, but the observer changing the configuration of the measurement devices while the particle is mid-flight does alter the physical context of the experiment and thus can alter those trajectories.
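To make the statistical bound concrete, here is a quick numpy sketch (my own toy illustration, with one standard choice of measurement angles): any locally factorizable model obeys |S| ≤ 2 for the CHSH combination below, while the singlet-state correlations E(a,b) = −cos(a−b) predicted by quantum mechanics reach 2√2.

```python
import numpy as np

# Singlet-state correlation for measurements along angles a and b.
# Any Reichenbach-factorizable (locally causal) model obeys |S| <= 2.
def E(a, b):
    return -np.cos(a - b)

a1, a2 = 0.0, np.pi / 2            # Alice's two settings
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two settings

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ≈ 2.828 > 2: the local bound is broken
```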
To be clear, Bernard does not promote skepticism about reality or its objectivity. But he argues convincingly that the evidence is inconsistent with materialism.
If you presented him accurately then he undeniably does. You cannot claim X then turn around saying you’re not claiming X. If there are no facts about things until you look at them then there is no objectivity. That is literally solipsism.
Whether you agree with Bernard is immaterial (pun intended). The larger point here is that reasonable people can disagree with materialism given the probabilistic, relational, and epistemologically problematic nature of subatomic particles.
I don’t see what is non-materialistic about statistics. One of the most famous and influential materialists in history, Friedrich Engels, heavily criticized causality in his writings, viewing cause-and-effect as an abstraction such that the same system could be described in a different context where what is considered the cause and what is considered the effect swap places. The physicist Dmitry Blokhintsev, the man who invented the concept of the graviton, was personally inspired by Engels’ writings and even cited them in a paper criticizing the Copenhagenists for thinking that a lack of what he called “Laplacian determinism” implies a contradiction with materialism, pointing out that materialists of his school had already rejected Laplacian determinism back in the 1800s.
Again, the arguments you’re making have nothing to do with quantum mechanics at all. If they have literally no relevance to quantum mechanics, then it makes no sense to try and use quantum mechanics as an argument in your favor. One can also imagine existing in a universe where the laws of physics are classical, without quantum mechanics at all, but where systems still undergo fundamentally random perturbations. These are classical perturbations which cannot violate Bell inequalities, but they would still disallow you from tracking the definite states of particles, which could then only be tracked with a vector in configuration space that is a linear combination of basis states.
If one wants to argue that randomness somehow contradicts materialism, then the same argument could be made in that universe, and so the argument must have nothing to do with quantum mechanics.
These insights obviously conflict with our understanding of materialism! We cannot simply presume the truth of materialism because we find it more intuitive. At best, scientists can justify their assumption of materialism on practical grounds.
Sagan’s razor. “Extraordinary claims require extraordinary evidence.” “Intuitive” refers to things which are blatantly obvious and self-evident and supported by all of our observations. To deny them thus requires a much greater burden of evidence. If you want to claim everything we perceive is a lie, that we all live inside a grand illusion and reality actually works fundamentally differently from what we perceive, then this is indeed quite an extraordinary claim, and I am simply going to dismiss it unless you can provide extraordinary evidence for it.
Yet, no extraordinary evidence is ever presented. Only vague, loose philosophical arguments. That is just not convincing to me. The reality is that we already know you can fit the predictions of special relativity and quantum mechanics to simple theories of point particles moving deterministically in 3D space, with well-defined values at all times, evolving in an absolute space and time. The point is, again, not that we should necessarily believe such a model, but the fact that we know such models can be constructed disproves any claim that we cannot interpret quantum mechanics as a realist theory. If you don’t add anything to it, you have to interpret it as a stochastic theory, but I have no issue with statistics. My issue only arises when people claim a system described by a statistical distribution has “no fact” about it in the real world.
That is just mysticism not backed by anything.
I take a very “conservative” approach to philosophy. If you are going to introduce some brand new world-shattering “paradigm shift” metaphysics, then I am going to be your biggest skeptic. I will want you to demonstrate that this is a necessity, either a logical or empirical necessity, such that all more trivial ways to conceive of the world have been exhausted.
Our belief in objective reality and object permanence isn’t just something we farted out one day for fun because we have an “unreasonable bias.” People believe these things because they fit our day-to-day self-evident empirical observations and do a great job of making sense of things. If you are going to throw them out, you had better have a damned good reason, rather than just complaining that we’re being “biased” based on our “intuition.”
That’s just a cop-out.
2/2
You may have good arguments for one camp within this discussion (e.g., sophisticated materialism) but to dismiss the philosophical implications outright prima facie indicates either a lack of familiarity with the philosophy of physics or perhaps a dismissal of metaphysics as a fruitful enterprise.
No, it reflects something called intellectual honesty. It is always possible for two different groups of people, given the same predictive body of mathematics, to draw different metaphysical conclusions from it. The idea that the mathematics necessitates someone’s particular metaphysics is just intellectual dishonesty pushed by people with bizarre views who can’t defend them on any grounds other than dishonestly pretending that the mathematics somehow proves them.
Call this “strong objectivity”. In contrast, Bernard d’Espagnat, theoretical physicist and philosopher of science, argues against materialism on the grounds that standard quantum mechanics is only “weakly objective”. (See his book, “On Physics and Philosophy”.) Although our observations are intersubjectively valid, quantum mechanics is predictive rather than descriptive: it does not describe the world as consisting of mind-independent entities that have determinate properties before they are observed/measured.
This is blatantly obviously his personal metaphysical interpretation, which is in no way necessitated by the mathematics. I can look at the exact same body of mathematics and interpret it as describing an objective but stochastic world. Even in a purely classical world, but one which evolves through random perturbations, we would find that we cannot track the definite states of objects at a given time. We could thus only track an evolving probability distribution. But it is typically understood, when it comes to probability, that there is an underlying configuration of the system in the real world; we just do not know which one it is.
To deny this is to deny object permanence. These properties are not invisible, they are directly observable. We just happen to not be observing them in the moment, but they still possess observable properties and thus are observable under a counterfactually conceived circumstance. This is the basis of object permanence, that we don’t reject the existence of observable things just because we are not observing them in the precise moment, as long as they can be observed under a counterfactual.
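To make the classical analogy above concrete, here is a minimal sketch (toy numbers of my own choosing) of a classical particle hopping randomly between three sites: the model only carries a probability distribution evolved by a stochastic matrix, yet at every moment the particle occupies exactly one definite site.

```python
import numpy as np

# Column-stochastic hopping matrix: each column sums to 1.
Gamma = np.array([[0.8, 0.1, 0.0],
                  [0.2, 0.8, 0.2],
                  [0.0, 0.1, 0.8]])

p = np.array([1.0, 0.0, 0.0])  # we start knowing the particle is at site 0
for _ in range(5):
    p = Gamma @ p              # our knowledge spreads out...

print(np.round(p, 3))          # ...but the particle itself is always at
                               # exactly one site; we just don't know which
```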
There is no fact of the matter concerning the state of the system before we measure it.
This is to devolve into crackpot solipsism. Humans are made out of particles. If particles have no fact of the matter about them until you look, then other humans also have no fact of the matter about them before you look. This was Schrödinger’s point with his “cat” thought experiment: he was trying to point out that your beliefs about fundamental particles cannot be confined to fundamental particles; they necessarily imply things about macroscopic objects as well, like cats, or other people.
There is, again, literally nothing in the theory that forces you to accept this premise. The delusion goes back to John von Neumann, who was a brilliant mathematician but also a crackpot who originated the “consciousness causes collapse” interpretation of quantum mechanics and was a major advocate for starting a WW3 nuclear holocaust. In one of his books on the mathematics of quantum mechanics, he tries to offer a mathematical “proof” that objective reality doesn’t exist, by arguing that if quantum mechanics were just a stochastic theory it would have to obey certain statistical laws, and then showing that it violates those laws.
However, John Bell would later debunk von Neumann’s “proof” in his own response paper, published around the same time as his famous theorem. Since von Neumann was a brilliant mathematician, there were no mathematical flaws in his “proof,” and so it had a major impact and caused many physicists to start agreeing with von Neumann’s mysticism. But Bell pointed out that the issue is not in the mathematics but in the premises. von Neumann’s assumptions are not just rules of pure statistics; they include physical assumptions as well. Specifically, he adopted an additivity assumption which only makes sense if the underlying physics is classical. If the underlying physics is not classical, then there is no reason for such an assumption to hold.
All von Neumann really proved was that the underlying statistical dynamics cannot be governed by classical physics. This is why Bell also published, in the same year, his famous paper responding to the EPR paper, showing that Einstein, Podolsky, and Rosen’s belief that the underlying physics can be reduced to a classical stochastic theory is false. These physicists with crackpot beliefs love to present a false dichotomy where the only two possibilities are (1) quantum mechanics is a classically stochastic theory or (2) objective reality doesn’t exist. What Bell was trying to argue is that quantum mechanics is a non-classically stochastic theory.
What is “non-classical” about it is debatable, but the most trivial answer, and the one Bell identified, is that it is simply not a local theory. In the modern literature, this non-locality is sometimes more accurately referred to as contextuality: the stochastic dynamics simply depend upon the full experimental context. For example, consider the Elitzur-Vaidman experiment: https://arxiv.org/abs/hep-th/9305002
This experiment proves that the mere presence or absence of a barrier alters the statistical behavior of a photon which never interacts with the barrier, because the photon’s stochastic evolution depends upon the entire experimental context, not just what it directly interacts with in the moment. This is why von Neumann’s additivity assumption does not hold. It assumes that if we consider the photon passing through path A while B is blocked, and the photon passing through path B while A is blocked, then the statistics when neither path is blocked should just be Pr(A)+Pr(B). But, as the Elitzur-Vaidman setup shows, this is not the case: even a photon that takes path A is influenced by the presence or absence of a barrier on path B that it never interacts with, so its statistical behavior differs depending on whether the barrier is there. You therefore cannot meaningfully add together Pr(A_barrier)+Pr(B_barrier) and expect it to yield Pr(A_nobarrier)+Pr(B_nobarrier). They are not the same.
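Here is a minimal numpy sketch of the Mach-Zehnder version of the setup (my own toy model, using the standard symmetric beam-splitter matrix): with no barrier the dark port never fires, while a barrier on one path makes the dark port fire a quarter of the time, even for photons that never touched the barrier.

```python
import numpy as np

# Symmetric 50/50 beam splitter acting on the two path amplitudes.
BS = np.array([[1, 1j], [1j, 1]], dtype=complex) / np.sqrt(2)
psi_in = np.array([1, 0], dtype=complex)  # photon enters on path 0

# No barrier: two beam splitters route everything to the bright port.
print(np.abs(BS @ BS @ psi_in) ** 2)      # [0. 1.]: dark port never fires

# Barrier on path 1: it absorbs that amplitude after the first splitter
# (probability 1/2); otherwise only the path-0 amplitude continues.
after_bs1 = BS @ psi_in
p_absorbed = np.abs(after_bs1[1]) ** 2    # 0.5
survivor = np.array([after_bs1[0], 0.0])  # unnormalized leftover amplitude
print(p_absorbed, np.abs(BS @ survivor) ** 2)  # dark port fires with prob 1/4
```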
But despite von Neumann’s proof being debunked by Bell, these same crackpots in physics academia took Bell’s theorem and started running around claiming Bell’s new theorem is proof that objective reality doesn’t exist, even though Bell never claimed that. Bell was literally a major proponent of realist models, publishing a paper trying to develop Bohm’s pilot wave theory, as well as a stochastic model that could reproduce quantum field theory. Non-locality isn’t the only option. It’s just the simplest and most intuitive one, where all the supposed “paradoxes” disappear in a puff of smoke once you accept that it’s just a contextual stochastic theory. However, there have been arguments for dropping other assumptions, like temporality rather than locality, based on the Two-State Vector Formalism. I am not a fan of non-temporality, but I still respect such a position far more than denying that objective reality even exists.
1/2
It really does not. Physics academia is just filled with crackpot mystics. I like to call them the metaphysical-physicists, the physicists who do not just immerse their mind in practical work but start talking metaphysics.
In 1964, the physicist John Bell proved that if you assume (1) that objective reality exists, (2) quantum mechanics is correct, and (3) special relativity is correct, then you run into a contradiction, and so one of the assumptions must be wrong. Deranged physicists in academia concluded #1 is wrong and started to promote the crackpot mystical view that objective reality doesn’t actually exist. Like 90% of the quantum mysticism you see these days does not originate from non-physicists like Deepak Chopra but from actual PhD physicists.
This is, at least, the story the mystics like to tell, that Bell’s theorem “proved” there is no objective reality. But this is a historical falsification, because if you actually check the historical record, you find that physicists in academia started to come to the “consensus” that objective reality isn’t real back at the 1927 Solvay conference, decades before John Bell ever published his theorem, and many more decades before it was ever confirmed in experiment. Albert Einstein was pretty much the last major holdout criticizing this turn of events, once asking Abraham Pais, “do you really believe that the moon doesn’t exist when you’re not looking at it?”
They already decided it doesn’t exist before they had any theorem or any empirical evidence that the theorem was correct. Bell’s theorem genuinely has nothing to do with this turn of events.
What is even more absurd is that we have known since the day special relativity was introduced in 1905 that it is not even necessary for making the right predictions. Lorentz had proposed a theory in 1904 which is mathematically equivalent to special relativity while not itself being relativistic, and hence we know you can drop #3 without actually dropping the empirical predictions of #3. There is zero empirical necessity for premise #3.
Metaphysical-physicists love historical falsification. They make up this complete baloney narrative that we should accept the truth of special relativity because “it is the most tested theory in the history of physics,” but the statement is nonsensical, because it is mathematically equivalent to Lorentz’s theory. Hence, every “test” of special relativity is also a test of Lorentz’s theory.
You see this dishonest line of argumentation pushed a lot by the metaphysical-physicist crowd. They will push the most absurd metaphysics you can imagine that is entirely incoherent and when you say you don’t agree with that, they accuse you of denying the science because it is “well-tested.” But none of their crackpot metaphysics has been tested at all. There is no experiment you can conduct that proves a particle doesn’t have a definite value when you are not looking at it. This is just a delusion.
pcalau12i@lemmygrad.ml to Science@mander.xyz • Are the Mysteries of Quantum Mechanics Beginning To Dissolve? | Quanta Magazine (27 days ago)

There isn’t a mystery; there are just physicists being mystical for no justified reason.
There is no evidence quantum systems exist in two states until you look. Physicists just endlessly gaslight each other into believing that a clearly statistical theory, one describing a stochastic process and thus only giving you probability distributions over the results, somehow has nothing to do with probability at all and only “collapses” into probabilities when you look. It’s all a grand delusion.
The probabilities are always there from the get-go. If you separate the quantum state into its real and imaginary parts and then translate to polar form, you will see that it contains two degrees of freedom: one is a vector of real-valued probabilities for the current configuration of the system, and the other is a real-valued vector of relational phases between the objects in the system.
The latter evolves deterministically whereas the former evolves stochastically, and the stochastic evolution of the former can be influenced by the deterministic state of the latter.
The update rule for classical probabilistic information is:
- p⃗′ = Γp⃗
Where p⃗ is your probability distribution and Γ is a stochastic matrix.
The update rule for a quantum computer can literally just be expressed as:
- p⃗′ = Γp⃗ + c⃗
Where the additional c⃗ is a coherence term derived from a function of φ⃗, where φ⃗ is the deterministically evolving vector of relational phases. Quantum computers are not magic; they are just bits that evolve stochastically according to a modified stochastic rule, one that deviates from classical stochastic processes by the non-linear coherence term with its dependence upon the deterministic evolution of φ⃗, requiring you to keep track of both φ⃗ and p⃗. (Both are real-valued! Imaginary numbers aren’t magic either!)
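Here is a minimal numpy sketch of that decomposition (my own illustration, using a Hadamard as the unitary and an arbitrary qubit state): p⃗ = |ψ|² evolves by the matrix Γᵢⱼ = |Uᵢⱼ|², which comes out doubly stochastic for a unitary, plus a coherence term c⃗ built from the cross terms, which is where the relational phases enter.

```python
import numpy as np

U = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
psi = np.array([np.sqrt(0.3), np.sqrt(0.7) * np.exp(1j * 0.8)])

p = np.abs(psi) ** 2         # probability degree of freedom
phi = np.angle(psi)          # phase degree of freedom

Gamma = np.abs(U) ** 2       # (doubly) stochastic matrix |U_ij|^2
p_next = np.abs(U @ psi) ** 2

# Coherence term: the phase-dependent cross terms.
c = np.zeros(2)
for i in range(2):
    for j in range(2):
        for k in range(2):
            if j != k:
                c[i] += (U[i, j] * np.conj(U[i, k])
                         * psi[j] * np.conj(psi[k])).real

print(np.allclose(p_next, Gamma @ p + c))  # True: p' = Γp + c
```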
There is no physical “collapse” of anything when you make a measurement. Since p⃗ is a probability distribution, you can perform a Bayesian knowledge update on it when you make a measurement, using Bayes’ theorem. Nothing is mysterious about that. That’s literally all it is. The quantum state ψ is complex-valued, meaning it represents two degrees of freedom in the system: one of those degrees of freedom is p⃗=|ψ|², and the other is arg(ψ)=φ⃗. When you perform a measurement, you ONLY have to update the degree of freedom associated with p⃗. You don’t have to touch the other, and this fact is guaranteed by U(1) gauge symmetry.
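As a toy illustration of the conditioning view (my own minimal example, not a full measurement model): observing an outcome is just Bayes’ theorem applied to p⃗, with a likelihood that singles out what was seen.

```python
import numpy as np

p = np.array([0.3, 0.7])   # pre-measurement distribution over outcomes

# Observing outcome k: likelihood is 1 for the seen outcome, 0 otherwise,
# so the posterior is just the conditioned distribution.
k = 1
likelihood = np.eye(2)[k]
p_post = likelihood * p / (likelihood @ p)
print(p_post)               # [0. 1.]: a knowledge update, nothing physical
```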
There is only a mystery if you delude yourself into believing that quantum mechanics is not just a non-classical probabilistic theory. If you accept that from the get-go, then the only difficult question is how classical stochastic dynamics arise from quantum stochastic dynamics on macroscopic scales, but this is already solved via decoherence. You can prove that the “+ c⃗” becomes less relevant on macroscopic scales.
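A minimal sketch of that point (a single “environment” qubit standing in for a macroscopic device, my own toy example): entangle the system with even one other qubit via a CNOT and then look at the system alone, and the off-diagonal coherences that feed the “+ c⃗” term are gone.

```python
import numpy as np

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
print(np.round(np.outer(plus, plus.conj()), 3))  # off-diagonals 0.5: coherent

# Entangle with one "environment" qubit via a CNOT.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
joint = CNOT @ np.kron(plus, np.array([1, 0], dtype=complex))
rho_joint = np.outer(joint, joint.conj())

# Partial trace over the environment: the system alone is now diagonal.
rho_sys = rho_joint.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
print(np.round(rho_sys, 3))  # off-diagonals 0: classical mixture
```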
But if you delude yourself into believing that quantum mechanics is not a stochastic theory at all, when it clearly is, then decoherence doesn’t suffice, because you make the mistake of interpreting the entirety of ψ, both of its degrees of freedom, as physical, meaning you interpret the probability distribution p⃗ as physical. If you interpret a probability distribution as physical, then you end up interpreting the branching paths in the probability tree as physically branching paths, even though that’s clearly not what we observe, as we only ever observe a single outcome, and decoherence does not get you to a single outcome, only to a classical distribution of outcomes.
This transformation of p⃗ into a physical object is then “resolved” either by proposing that p⃗ “collapses” down into a definite outcome when you look, weighted by the values of p⃗, or by claiming that the observer themselves physically branches as well into a multiverse. You end up with two absurdities, because p⃗ is not a physical object. It’s a probability distribution over the system’s configurations.
The mass delusion that quantum mechanics has nothing to do with statistics or probability theory has some of its origins in Bell’s theorem, where Bell proved that it is impossible for the system to have an ontic state when you’re not looking at it that is compatible with special relativity; only if you marginalize out everything you aren’t looking at, isolating the measurement readouts themselves, are the statistics compatible with special relativity.
This led physicists to then argue for dropping anything from the model that is not the measurement readouts, so the only ontic states are the measurement readouts and ψ. They thus have to claim ψ is the physical state of the system when you are not looking or else their theory no longer has objective reality in it at all.
But this argument fails for a simple reason. When special relativity was first introduced by Einstein in 1905, it was mathematically equivalent to a theory without relativity proposed by Hendrik Lorentz in 1904. Hence, we know for a fact that a theory does not need to be relativistic to make all the same empirical predictions as relativity. Thus, you can get around Bell’s theorem and build a model with ontic states in a similar way and make all the same predictions as relativistic quantum mechanics, such as Hrvoje Nikolic’s model.
However, I don’t actually advocate for such models. The fact that they exist simply shows that a universe where particles have ontic states when you’re not looking at them is perfectly compatible with a universe that is relativistic when you marginalize out those ontic states and only look at measurement readouts. If the dynamics are stochastic, then this also explains why we do not include the ontic states in the model: not because they don’t exist, but because the stochastic dynamics prevent you from tracking them.
Hence, you don’t need the ontic states in the model to admit that they still exist in physical reality. The particles do have real values in the real world when you’re not looking. The moon does exist when you’re not looking at it. The cat is either dead or alive, not both at the same time, before you open the box. It is just that the theory is a stochastic theory, a probabilistic theory, which does not track those states, only probability distributions for those states.
We’ve known this for ages, but people are obsessed with using quantum mechanics as their own springboard for mysticism, and so they want to pretend it is “mysterious” to justify their beliefs in multiverses, some special role for “consciousness,” or what-have-you. If you just accept the bloody obvious reality that the theory gives you a statistical distribution because it is a statistical theory, at least in part (φ⃗ evolves deterministically), then decoherence is the end of the story. A mystery only seems to remain after decoherence if you rejected that the theory was statistical to begin with.
Although, the physicist Jacob Barandes has pointed out that the theory can be interpreted as a purely statistical theory if you drop the Markov assumption. The deterministic property of the system, φ⃗, then becomes what he calls a “hidden Markov memory,” not a real physical property. The Markov assumption is the idea that the dynamics of a system depend solely on its present state. A non-Markovian system is one where there is also dependence upon its past states. A “hidden Markov model” is one where you include the past state as a hidden memory in the present to make the model appear Markovian. Barandes proves that you can interpret this deterministic property as just one of these hidden Markov memories, and you can drop it by fitting the system’s statistical dynamics to non-Markovian stochastic laws.
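As a generic toy illustration of that distinction (my own example of hidden Markov memories in general, not Barandes’s actual construction): a process whose next value depends on its previous two values is non-Markovian in the observable alone, but becomes Markovian once the past value is carried along as a hidden memory in an enlarged state.

```python
import numpy as np

rng = np.random.default_rng(42)

# The next bit depends on the last TWO bits: non-Markovian if you
# track only the current bit.
def step(x_now, x_prev):
    p_flip = 0.9 if x_now != x_prev else 0.1
    return 1 - x_now if rng.random() < p_flip else x_now

# Enlarge the state to (x_now, x_prev): the update now depends only on
# the present enlarged state, i.e. it is Markovian with a hidden memory.
state = (1, 0)
for _ in range(10):
    state = (step(*state), state[0])
print(state[0])  # the observable; state[1] is the hidden Markov memory
```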
pcalau12i@lemmygrad.ml to Programmer Humor@programming.dev • disliking tech bros ≠ disliking tech (2 months ago)

here’s even evidence for “warm” quantum processing happening within each neuron in the microtubules
No.
pcalau12i@lemmygrad.ml to Science Memes@mander.xyz • it’s a long distance relationship (2 months ago)

Quantum mechanics is weirder than that. It’s not accurate to say things can be in two states at once, like a cat that is both dead and alive at the same time, or a qubit that is both 0 and 1 at the same time. If that were true, then the qubit’s mathematical description when in a superposition of states would be |0>+|1>, but it is not; it is a|0>+b|1>, where the coefficients (a and b) are neither 0 nor 1, and the coefficients cannot just be ignored when giving a physical interpretation, as they are necessary for the system’s dynamics.
You talk about it being “half” a cat, so you might think the coefficients should be interpreted as proportions, but proportions are such that 0≤x≤1 and ∑x=1. But in quantum mechanics, the coefficients can be negative and even imaginary, and do not have to sum to 1. You can have 1/√2|0>-i/√2|1> as a valid superposition of states for a qubit. It does not make sense to interpret -i/√2 as a “half,” so you cannot meaningfully interpret the coefficients as a proportion.
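A quick numpy sketch of why the coefficients can’t be proportions (my own example): two states with identical squared magnitudes, i.e. identical “proportions,” but opposite signs are perfectly distinguishable after a Hadamard interaction, so the sign carries physics that a proportion cannot.

```python
import numpy as np

# -i/sqrt(2) is no "proportion," yet its squared magnitude is a probability.
a, b = 1 / np.sqrt(2), -1j / np.sqrt(2)
print(abs(a) ** 2, abs(b) ** 2)      # 0.5 0.5

# Same magnitudes everywhere, different sign, different physics:
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi_plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
psi_minus = np.array([1, -1], dtype=complex) / np.sqrt(2)
print(np.abs(H @ psi_plus) ** 2)     # [1. 0.]
print(np.abs(H @ psi_minus) ** 2)    # [0. 1.]
```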
Trying to actually interpret these quantum states ontologically is a nightmare and personally I recommend against even trying, as you will just confuse yourself, and any time you think you come up with something that makes sense, you will later find that it is wrong.
pcalau12i@lemmygrad.ml to Science Memes@mander.xyz • it’s a long distance relationship (2 months ago)

The point Bell made in his “Against ‘Measurement’” article is that when you say “we start including atomic scale things we might as well just include everything up to and including the cat,” you have to place the line somewhere, sometimes called the “Heisenberg cut,” and where you place the line has empirically different implications, so wherever you choose to draw the line must necessarily constitute a different theory.
Deutsch also published a paper “Quantum theory as a universal physical theory” where he proves that drawing a line at all must constitute a different theory from quantum mechanics because it will necessarily make different empirical predictions than orthodox quantum theory.
A simple analogy: let’s say I claim the vial counts as an observer. The vial is simple enough that I might be able to fully model it in quantum mechanics. A complete quantum mechanical model would consist of a quantum state in Hilbert space that can only evolve through physical interactions, all described by unitary operators, and all unitary operators are reversible. So there is no possible interaction between the atom and the vial that could lead to a non-reversible “collapse.”
Hence, if I genuinely had a complete model of the vial and could isolate it, I could subject it to an interaction with the cesium atom, and orthodox quantum mechanics would describe this using reversible unitary operators. If you claim it is an observer that causes a collapse, then the interaction would not be reversible. So I could then follow it up with an interaction corresponding to the Hermitian transpose of the operator describing the first interaction, which should reverse it.
Orthodox quantum theory would predict that the reversal should succeed, while your theory with observer-vials would not, and so it would ultimately predict a different statistical distribution if I tried to measure the system after that interaction. Wherever you choose to draw the Heisenberg cut, the theory must necessarily make different predictions around that cut.
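Here is a minimal numpy sketch of the reversal argument (a random 4×4 unitary standing in for the atom-vial interaction, my own toy setup): the Hermitian transpose undoes a unitary interaction exactly, whereas a projective “collapse” inserted in between cannot be undone that way.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random unitary modeling the atom-vial interaction (QR of a random matrix).
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(M)

psi = np.zeros(4, dtype=complex)
psi[0] = 1.0

# Unitary interaction followed by its Hermitian transpose: fully reversed.
print(np.allclose(U.conj().T @ (U @ psi), psi))  # True

# Insert a projective "collapse" after the interaction: not reversible.
mid = U @ psi
k = np.argmax(np.abs(mid) ** 2)       # suppose this outcome occurred
collapsed = np.zeros(4, dtype=complex)
collapsed[k] = 1.0
print(np.allclose(U.conj().T @ collapsed, psi))  # False in general
```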
This is why there is so much debate over interpretation of quantum mechanics, because drawing a line feels necessary, but drawing one at all breaks the symmetry of the theory. So, either the theory is wrong, or how we think about nature is wrong.
pcalau12i@lemmygrad.ml to World News@lemmy.ml • Video - Is Cuba a safe haven for Hamas and Hezbollah? (2 months ago)

Answer is probably no, as they have no nukes and no allies.
My main issue with Many Worlds is that it is always superfluous.
We know that the exponential complexity of the quantum state cannot be explained by saying every outcome simply occurs in another branch. That would make it mathematically equivalent to an ensemble, and ensembles can be decomposed into large collections of simple deterministic systems with only linear complexity. If that were how reality worked, quantum mechanics would be unnecessary. The theory could be reduced to classical statistical mechanics.
A quantum superposition, such as an electron being spin up and spin down, is not an electron doing both in some proportions. If it were, it would again be equivalent to an ensemble and fully describable using classical probability theory. If the quantum state has any ontology at all, it cannot merely represent particles doing multiple things at once. It must be something else, a distinct beable that either influences particles, as in pilot wave theories, or gives rise to them, as in collapse models.
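A minimal numpy sketch of that point (my own example): the superposition |+⟩ and a 50/50 ensemble of |0⟩ and |1⟩ assign identical probabilities in the computational basis, but a measurement in the Hadamard-rotated basis tells them apart, so the superposition cannot be an ensemble of particles doing both things.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_super = np.outer(plus, plus.conj())        # superposition |+><+|
rho_ensemble = 0.5 * np.eye(2, dtype=complex)  # 50/50 mixture of |0>, |1>

# Both give 50/50 in the computational basis, but differ in a rotated one.
for rho in (rho_super, rho_ensemble):
    rotated = H @ rho @ H.conj().T
    print(np.round(np.real(np.diag(rotated)), 3))
# superposition: [1. 0.]; ensemble: [0.5 0.5]
```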
Some Many Worlds advocates eventually concede this, but then argue that particles never really existed and are only subjective illusions, while the quantum state alone is real. Calling something a subjective illusion does not remove the need for explanation. Hallucinations are still physical processes with physical causes. You can explain them by analyzing the brain and its interactions.
Likewise, you still need a physical explanation for how the illusion of particles arises. Any such explanation ends up equivalent to explaining how real particles arise, and once you do that, Many Worlds becomes unnecessary. You can always replace the multiverse with a single universe by making the process stochastic instead of deterministic.
The crucial point is that we know a particle in a superposition of states cannot be a particle in multiple states at the same time. That is mathematically impossible, and if that were what it was, then it could be reduced to a classical description! Any interpretation which relies on thinking the quantum state represents an ensemble, i.e. that it represents things “taking all possible paths” or being “in multiple states at once,” is just confused about the mathematics, as this is not what the mathematics says.
I go into this in more detail here: https://medium.com/p/f67aacb622d5
A lot of the confusion around quantum mechanics comes from misleading cartoons about the double-slit experiment which don’t occur in reality. They usually depict it as if the particle produces a wave-like interference pattern when you’re not looking, and two separate blobs like you’d expect from particles when you look. But, again, you have never seen that, I have never seen that, no physicist has ever seen that. It only exists in cartoons.
In fact, it cannot occur, because it would violate the uncertainty principle. The reason you get a spread-out pattern at all is that the narrow slits constrain the particle’s position, so its momentum spreads out, making its trajectory less predictable. There is simply no way you can possibly have the particles both pass through narrow slits and form two neat blobs with predictable trajectories, because then you would know both their position and momentum simultaneously.
What actually happens if you run the calculation is that, in the case where you measure the which-way information of the particle, the particle still forms a wave-like pattern on the screen, but it is more akin to a wave-like single-slit diffraction pattern than an interference pattern. That is to say, it still gives you a wave-like pattern.
It is just not true that particles have two sets of behavior, “particle” and “wave” depending upon whether or not you’re looking at them. They have one set of equations that describes their stochastic motion which is always wave-like. All that measuring does is entangle your measurement device with the particle, and it is trivial to show that such entanglement prevents the particle from interfering with itself when considered in isolation from what it is entangled with.
That is all decoherence is. If you replace the measuring device with a single second particle and have it interact such that it becomes entangled with the first, that will also make the interference pattern disappear. Entanglement spreads the interference effects across multiple systems, and if you then consider only subsystems of that entangled system in isolation, you will not observe interference effects.
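Here is a toy calculation of that claim (a far-field two-slit model with made-up units and a Gaussian single-slit envelope, my own sketch): without which-way entanglement the cross term produces fringes; with the path entangled to an orthogonal marker state, tracing the marker out leaves only the sum of the two path intensities, which is smooth but still wave-like.

```python
import numpy as np

x = np.linspace(-4, 4, 9)      # screen positions (toy units)
d, L, k = 1.0, 100.0, 60.0     # slit separation, screen distance, wavenumber

env = np.exp(-x**2 / 8.0)      # single-slit diffraction envelope (toy)
r1 = np.sqrt(L**2 + (x - d / 2) ** 2)   # path lengths from each slit
r2 = np.sqrt(L**2 + (x + d / 2) ** 2)
a1 = env * np.exp(1j * k * r1)
a2 = env * np.exp(1j * k * r2)

# No which-way entanglement: the cross term gives interference fringes.
print(np.round(np.abs(a1 + a2) ** 2, 2))

# Path entangled with an orthogonal marker: the cross term is gone,
# leaving a smooth single-slit-like (still wave-like) pattern.
print(np.round(np.abs(a1) ** 2 + np.abs(a2) ** 2, 2))
```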

It doesn’t tell us that at all. This is just bizarre metaphysics pulled out of someone’s ass one day that became popular among academics, despite having no empirical basis and not even being logically consistent if you take it seriously for more than five seconds.
Quantum mechanics is just a statistical theory. You literally superimpose states in classical statistical mechanics as well. The only difference is that quantum mechanics has an extra degree of freedom in the state description of the system, the phases, and those phases evolve deterministically and influence the stochastic dynamics of the system. This gives a kind of “memory” effect whereby the same operator can have different behavior if the history is different: for example, a photon has a 50%/50% chance of being reflected/transmitted by a beam splitter, unless its immediately previous interaction was also with a beam splitter, in which case it is 100%/0%, because the state of the phases is different.
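A quick numpy sketch of that memory effect (standard symmetric beam-splitter matrix, my own illustration): composing the classical 50/50 transition matrix with itself stays 50/50, but composing the quantum operator with itself is deterministic, because the phases set up by the first beam splitter steer the second.

```python
import numpy as np

BS = np.array([[1, 1j], [1j, 1]], dtype=complex) / np.sqrt(2)
psi = np.array([1, 0], dtype=complex)

# One beam splitter: 50/50, just like the classical stochastic matrix.
print(np.abs(BS @ psi) ** 2)                 # [0.5 0.5]

# Classical composition of the 50/50 matrix stays 50/50...
Gamma = np.abs(BS) ** 2
print(Gamma @ Gamma @ np.array([1.0, 0.0]))  # [0.5 0.5]

# ...but the quantum composition is 100/0: the phase memory matters.
print(np.abs(BS @ BS @ psi) ** 2)            # [0. 1.]
```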
No, Sean Carroll is just wrong, and he presents nothing to justify his position. The cat doesn’t stop existing when you’re not looking, nor is there a multiverse, nor do things spread out as infinite-dimensional vectors in configuration space when you aren’t looking. You just do not know the cat’s state, because quantum mechanics is a statistical theory. Multiverse believers love to put their idea side-by-side with another idea which is even more absurd in order to make it look more viable, but they never bother to defend their idea on its own merits, without a comparison. Any time you encounter a multiverse believer, they will constantly bring up Copenhagen even if you never mention it.
Carroll responds to a variant of Copenhagen that believes in a “spreading out” axiom, whereby things diverge into a multiverse of every possibility, represented by a vector in configuration space, when you aren’t looking, but then suddenly “collapse” back down into a definite configuration when you look. He then attacks the “collapse” as silly, concluding that we should instead believe things spread out as a multiverse forever. But nowhere does he give any convincing justification for the “spreading out” axiom to begin with. That axiom is not grounded in any empirical evidence or in the mathematics at all, and so multiverse believers can only make their position look coherent by putting it beside another silly belief which also presupposes that axiom, thereby making an axiom they never justify appear reasonable.
Just look at the awful slide at 24:35. Someone could make this same argument in a perfectly classical universe. If we could not track the definite states of particles because they behaved randomly, but in a classical sense which did not violate Bell inequalities, we would also only be able to track the states of systems as vectors evolved by matrices. Someone could then come along and claim that particles do not have real states when you are not looking at them because those states are not there in the mathematics, and that they are being the “reasonable” one for believing that the universe just evolves as a big deterministic vector.
We would all look at them as if they were silly. Yet this is stated unironically among multiverse believers as if it were somehow made less silly by quantum mechanics, when absolutely nothing in the theory makes it a less silly position.