At atomic and subatomic scales, objects behave in ways that challenge the classical worldview based on everyday interactions with macroscopic reality. A familiar example is the discovery that electrons can behave both like particles and like waves, depending on the experimental context in which they are observed. To explain this phenomenon and others, which seem contrary to the laws of physics inherited from previous centuries, self-consistent models with contradictory interpretations have been proposed by scientists such as Louis de Broglie (1892-1987), Niels Bohr (1885-1962), Erwin Schrödinger (1887-1961) and David Bohm (1917-1992), among others.

However, the great debates that accompanied the formulation of quantum theory, notably between Einstein and Bohr, did not lead to conclusive results. Most physicists of the next generation simply adopted the equations of these conflicting theoretical frameworks without much concern for the underlying philosophical concepts. The equations “worked”, and that was apparently enough. Many technologies we now take for granted are based on practical applications of quantum theory.

It’s human nature to question everything, and a key question that arose later was why the strange, even counterintuitive, behavior seen in quantum experiments does not show up in the macroscopic world. To answer this question, or circumvent it, the Polish physicist Wojciech Zurek (born in 1951) developed the concept of “quantum Darwinism”.

Simply put, the assumption is that the interaction between a physical system and its environment selects certain behaviors and excludes others, and that the behaviors retained by this “natural selection” are precisely those that fit the classical description.
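This selection mechanism can be illustrated with a toy decoherence model (a minimal sketch for intuition, not taken from the research itself): a qubit prepared in a superposition loses its off-diagonal “interference” terms when the environment monitors it in a preferred (pointer) basis, and only the classical probabilities on the diagonal survive.

```python
import numpy as np

# Density matrix of a qubit in the equal superposition |0> + |1>:
# diagonal entries are classical probabilities, off-diagonal entries
# are the quantum coherences responsible for interference.
rho = 0.5 * np.array([[1, 1],
                      [1, 1]], dtype=complex)

def decohere(rho, gamma, t):
    """Toy model of environmental monitoring in the pointer (diagonal)
    basis: coherences decay exponentially at rate gamma, while the
    populations on the diagonal are untouched."""
    out = rho.copy()
    out[0, 1] *= np.exp(-gamma * t)
    out[1, 0] *= np.exp(-gamma * t)
    return out

after = decohere(rho, gamma=1.0, t=10.0)
# The surviving diagonal entries are the classical probabilities the
# environment "selects"; the interference terms have effectively vanished.
```

The exponential decay rate `gamma` is an illustrative parameter; in real systems it depends on how strongly the system couples to its environment.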

For example, when someone reads this text, their eyes receive photons that interact with the screen of their computer or smartphone. Another person, looking from a different point of view, receives different photons. Although the particles in the screen behave strangely, and could in principle produce completely different images for each observer, the interaction with the environment selects only one type of behavior and excludes the rest, so both readers end up accessing the same text.

This line of theoretical research was continued, with an even higher degree of abstraction and generalization, in a paper by Brazilian physicist Roberto Baldijão published in Quantum, a peer-reviewed open-access journal for quantum science and related fields.

The article reports results from Baldijão’s doctoral research, supervised by Marcelo Terra Cunha, a professor at the Institute of Mathematics, Statistics and Scientific Computing of the University of Campinas (IMECC-UNICAMP) in Brazil.

Co-authors of the paper include Markus Müller, who supervised Baldijão’s research internship at the Institute for Quantum Optics and Quantum Information (IQOQI) of the Austrian Academy of Sciences in Vienna.

“Quantum Darwinism has been proposed as a mechanism for obtaining the classical objectivity we are used to from inherently quantum systems. In our research, we investigated which physical principles could be behind the existence of such a mechanism,” Baldijão said.

In conducting his investigation, he adopted a formalism known as generalized probabilistic theories (GPTs). “This formalism makes it possible to produce mathematical descriptions of different physical theories, and therefore to compare them. It also helps to understand which theories obey certain physical principles. Quantum theory and classical theory are two examples of GPTs, but many others can also be described,” he said.

According to Baldijão, working with GPTs is convenient because it allows valid results to be obtained even if quantum theory has to be abandoned at some point. Moreover, the framework makes it possible to better understand the quantum formalism by comparing it to what it is not. For example, it can be used to derive quantum theory from simpler physical principles without assuming the theory from the outset. “Based on the formalism of GPTs, we can discover which principles allow the existence of ‘Darwinism’ without the need to resort to quantum theory,” he said.

The paradoxical result that Baldijão arrived at in his theoretical investigation was that classical theory emerges by “natural selection” from theories with certain non-classical characteristics only if they involve “entanglement”.

“Amazingly, the manifestation of classical behaviors via Darwinism depends on such a remarkably unclassical property as entanglement,” he said.

Entanglement, which is a key concept in quantum theory, occurs when particles are created or interact in such a way that the quantum state of each particle cannot be described independently of the others but depends on the whole.

The most famous example of entanglement is the thought experiment known as EPR, after Einstein, Podolsky and Rosen. In a simplified version of the experiment, devised by Bohm, two electrons interact and are then separated by an arbitrarily large distance, such as the distance between the Earth and the Moon. If the spin of one electron is measured, it can be found to be spin up or spin down, with equal probability. Electron spins always end up pointing up or down after a measurement – never at some angle in between. However, because of the way they interacted, the electrons’ spins must be paired: whenever one is measured spin up, the other is measured spin down, regardless of the direction of measurement. Which of the two will be up and which down is unknown in advance, but the results are always opposite because of the entanglement.
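The perfect anticorrelation described above can be mimicked in a few lines of Python. This is a classical simulation of the measurement statistics only (it reproduces the correlations, not the underlying quantum mechanics, and the function name is illustrative):

```python
import random

def measure_singlet_pair():
    """Simulate measuring both electrons of a spin-singlet pair along
    the same axis. Each individual outcome (up or down) is equally
    likely, but the two results are always opposite -- the perfect
    anticorrelation of the EPR pair."""
    a = random.choice(["up", "down"])   # first electron: 50/50
    b = "down" if a == "up" else "up"   # second electron: always opposite
    return a, b

for _ in range(5):
    a, b = measure_singlet_pair()
    assert a != b  # individually random, yet perfectly anticorrelated
```

Note that this classical mimicry only works because both electrons are measured along the same axis; as Bell later showed, no such local recipe can reproduce the quantum correlations for all pairs of measurement directions.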

The experiment was supposed to show that the formalism of quantum theory was incomplete, because entanglement seemed to require information to travel between the two particles at infinite speed, which is impossible according to the theory of relativity. How could distant particles “know” which way to point in order to produce opposite results? The idea was that hidden variables acted locally behind the quantum stage, and that the classical worldview would be vindicated once these variables were taken into account by a more complete theory.

Albert Einstein died in 1955. Nearly a decade later, his argument was essentially refuted by John Bell (1928-1990), whose theorem showed that quantum theory is incompatible with the joint assumptions that a particle has definite values independent of the process of observation and that no influence can travel instantaneously between distant locations. In other words, the nonlocality that characterizes entanglement is not a defect but a key feature of quantum theory.
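Bell’s argument can be made quantitative through the CHSH inequality: any local hidden-variable model must satisfy |S| ≤ 2 for a certain combination S of measured correlations, whereas quantum theory predicts correlations E(α, β) = −cos(α − β) for singlet spin measurements along directions α and β, reaching |S| = 2√2 ≈ 2.83. A short check, using standard textbook angles (not taken from the article):

```python
import math

def E(alpha, beta):
    """Quantum-mechanical correlation of spin measurements on a singlet
    pair, along directions at angles alpha and beta: E = -cos(alpha - beta)."""
    return -math.cos(alpha - beta)

# Measurement angles that maximize the CHSH expression
a, a2 = 0.0, math.pi / 2          # Alice's two settings
b, b2 = math.pi / 4, -math.pi / 4 # Bob's two settings

# CHSH combination of the four correlations
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)

print(abs(S))      # ~2.828, i.e. 2*sqrt(2), exceeding the classical bound of 2
assert abs(S) > 2  # no local hidden-variable model can reach this value
```

Experiments measuring S have repeatedly found values above 2, confirming the quantum prediction and ruling out local hidden variables.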

Whatever its theoretical interpretation, the empirical existence of entanglement has been demonstrated in several experiments conducted since then. Preserving entanglement is now a central challenge in the development of quantum computing, because quantum systems tend to lose coherence quickly when they interact with the environment. This brings us back to quantum Darwinism.

“In our study, we showed that if a GPT displays decoherence, it’s because there is a transformation in the theory capable of implementing the idealized process of Darwinism that we envisioned,” Baldijão said. “Similarly, if a theory has sufficient structure to allow a reversible computation – a computation that can be undone – then there is also a transformation capable of implementing Darwinism. This is very interesting, given the computing applications of GPTs.”

As a complementary result of the study, the authors offer an example of “non-quantum Darwinism” in the form of extensions of the Spekkens toy model, a theory proposed in 2004 by Canadian physicist Robert Spekkens, currently a senior researcher at the Perimeter Institute for Theoretical Physics in Waterloo, Ontario. This model is important for the in-depth study of the foundations of quantum physics because it reproduces many forms of quantum behavior on the basis of classical concepts.

“The model does not exhibit any kind of nonlocality and is unable to violate Bell’s inequalities,” Baldijão said. “We demonstrated that it can exhibit Darwinism, and this example also shows that the conditions we found to ensure the presence of Darwinism – decoherence or reversible computation – are sufficient but not necessary for this process to occur in GPTs.”

As the principal investigator of the FAPESP-funded project, Cunha had this to say: “Quantum theory can be seen as a generalization of probability theory, but it is far from the only possible one. The great challenges of our field of research include understanding the properties that distinguish classical theory from quantum theory in this ocean of possible theories. Baldijão’s doctoral thesis aimed to explain how quantum Darwinism could eliminate one of the most clearly unclassical features of quantum theory: contextuality, which encompasses the concept of entanglement.

“During his research internship with Markus Müller’s group in Vienna, Baldijão worked on something even more general: the process of Darwinism in generalized probabilistic theories. His findings help us better understand the dynamics of certain types of theories, showing that Darwinism – which preserves only the ‘fittest’ behaviors and thereby creates a classical world – is not an exclusively quantum process.”