I argue that scientific determinism is not supported by facts, but results from the elegance of the mathematical language physicists use, in particular from the so-called real numbers with their infinite series of digits. Classical physics can thus be interpreted in either a deterministic or an indeterministic way. In quantum physics, however, some experiments prove that nature is able to continually produce new information, thereby supporting indeterminism in physics.
What is fundamentally quantum? We argue that most of the features, problems, and paradoxes -- such as the measurement problem, the Wigner's friend paradox and its proposed solutions, single-particle nonlocality, and no-cloning -- allegedly attributed to quantum physics have a clear classical analogue if one interprets classical physics as fundamentally indeterministic. What really characterizes quantum physics boils down only to phenomena that involve $\hbar$, i.e., incompatible observables.
Despite their importance in quantum theory, joint quantum measurements remain poorly understood. An intriguing conceptual and practical question is whether joint quantum measurements on separated systems can be performed without bringing them together. Remarkably, by using shared entanglement, this can be achieved perfectly when disregarding the post-measurement state. However, existing localization protocols typically require unbounded entanglement. In this work, we address the fundamental question: "Which joint measurements can be localized with a finite amount of entanglement?" We develop finite-resource hierarchies of teleportation-based schemes and analytically classify all two-qubit measurements that can be localized at the first levels of these hierarchies. These include several measurements with exceptional properties and symmetries, such as the Bell state measurement and the elegant joint measurement. This leads us to propose a systematic classification of joint measurements based on entanglement cost, which we argue directly connects to the complexity of implementing those measurements. We illustrate how to numerically explore higher levels and construct generalizations to higher dimensions and multipartite settings.
Characterizing the set of distributions that can be realized in the triangle network is a notoriously difficult problem. In this work, we investigate inner approximations of the set of local (classical) distributions of the triangle network. A quantum distribution that appears to be nonlocal is the Elegant Joint Measurement (EJM) [Entropy 21, 325 (2019)], which motivates us to study distributions having the same symmetries as the EJM. We compare analytical and neural-network-based inner approximations and find a remarkable agreement between the two methods. Using neural network tools, we also conjecture network Bell inequalities that give a trade-off between the levels of correlation and symmetry that a local distribution may feature. Our results considerably strengthen the conjecture that the EJM is nonlocal.
We propose a distinction between two different concepts of time that play a role in physics: geometric time and creative time. The former is the time of deterministic physics and merely parametrizes a given evolution. The latter is instead characterized by real change, i.e. novel information that gets created when a non-necessary event becomes determined in a fundamentally indeterministic physics. This allows us to give a naturalistic characterization of the present as the moment that separates the potential future from the determined past. We discuss how these two concepts find natural applications in classical and intuitionistic mathematics, respectively, and in classical and multivalued tensed logic, as well as how they relate to the well-known A- and B-theories in the philosophy of time.
Ning-Ning Wang, Chao Zhang, Huan Cao, Kai Xu, Bi-Heng Liu, Yun-Feng Huang, Chuan-Feng Li, Guang-Can Guo, Nicolas Gisin, Tamás Kriváchy, Marc-Olivier Renou

In the last decade, it was understood that quantum networks, in which several independent sources of entanglement are distributed to and measured by several parties, allow for completely novel forms of nonclassical quantum correlations when entangled measurements are performed. Here, we experimentally obtain quantum correlations in a triangle network structure and provide solid evidence of their nonlocality. Specifically, we first obtain the elegant distribution proposed in [Entropy 21, 325 (2019)] by performing a six-photon experiment. Then, we justify its nonlocality based on machine learning tools to estimate the distance of the experimentally obtained correlation to the local set, and through the violation of a family of conjectured inequalities tailored for the triangle network.
Naive attempts to put together relativity and quantum measurements lead to signaling between space-like separated regions. In QFT, these are known as impossible measurements. We show that the same problem arises in non-relativistic quantum physics, where joint nonlocal measurements (i.e., between systems kept spatially separated) in general lead to signaling, while one would expect no-signaling (based for instance on the principle of no-nonphysical communication). This raises the question: Which nonlocal quantum measurements are physically possible? We review and develop further a non-relativistic quantum information approach developed independently of the impossible measurements in QFT, and show that these two have been addressing virtually the same problem. The non-relativistic solution shows that all nonlocal measurements are localizable (i.e., they can be carried out at a distance without violating no-signaling) but they (i) may require arbitrarily large entangled resources and (ii) cannot in general be ideal, i.e., are not immediately reproducible. These considerations could help guide the development of a complete theory of measurement in QFT.
In the late 1960s, a young physicist was sailing along the coast of California towards Berkeley, where he got a post-doc position in astronomy. But his real goal was not astronomy, at least not immediately. First, John Clauser eagerly wanted to test some predictions of quantum theory that were at odds with a then recent and mostly ignored result by the Irish physicist John Stewart Bell, who was working at CERN, the celebrated laboratory near Geneva.
While entanglement between distant parties has been extensively studied, entangled measurements have received relatively little attention despite their significance in understanding non-locality and their central role in quantum computation and networks. We present a systematic study of entangled measurements, providing a complete classification of all equivalence classes of iso-entangled bases for projective joint measurements on 2 qubits. The application of this classification to the triangular network reveals that the Elegant Joint Measurement, along with white noise, is the only measurement resulting in output permutation invariant probability distributions when the nodes are connected by Werner states. The paper concludes with a discussion of partial results in higher dimensions.
Network nonlocality allows one to demonstrate nonclassicality in networks with fixed joint measurements, that is, without random measurement settings. The simplest loop network, the triangle, with 4 outputs per party, is especially intriguing. The "elegant distribution" [N. Gisin, Entropy 21, 325 (2019)] still resists analytic proofs, despite its many symmetries. In particular, this distribution is invariant under any output permutation. The Finner inequality, which holds for all local and quantum distributions, has been conjectured to also hold for all no-signalling distributions with independent sources (NSI distributions). Here we provide evidence that this conjecture is false by constructing a 4-output network box that violates the Finner inequality, and we prove that it satisfies all NSI inflations up to the enneagon. As a first step toward a proof of the nonlocality of the elegant distribution, we prove the nonlocality of the distributions that saturate the Finner inequality by using geometrical arguments.
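For concreteness, the Finner inequality for the triangle reads $p(a,b,c) \le \sqrt{p_A(a)\,p_B(b)\,p_C(c)}$, where $p_A, p_B, p_C$ are the single-party marginals. Checking it for a candidate distribution is straightforward; the following is a minimal sketch of our own (not taken from the paper), contrasting the uniform 4-output distribution with a perfectly correlated one:

```python
import numpy as np

def finner_ok(p, tol=1e-12):
    """Check p(a,b,c) <= sqrt(pA(a) * pB(b) * pC(c)) for all outcome triples."""
    pA = p.sum(axis=(1, 2))
    pB = p.sum(axis=(0, 2))
    pC = p.sum(axis=(0, 1))
    bound = np.sqrt(pA[:, None, None] * pB[None, :, None] * pC[None, None, :])
    return bool(np.all(p <= bound + tol))

n = 4
uniform = np.full((n, n, n), 1 / n**3)  # uniform outputs satisfy Finner
correlated = np.zeros((n, n, n))
correlated[np.arange(n), np.arange(n), np.arange(n)] = 1 / n  # force a = b = c

print(finner_ok(uniform))     # True
print(finner_ok(correlated))  # False: perfect 3-way correlation violates it
```

The perfectly correlated example makes the content of the inequality tangible: $p(a,a,a) = 1/4$ exceeds the bound $\sqrt{(1/4)^3} = 1/8$, which is why such distributions cannot arise from independent sources.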
We investigate network nonlocality in the triangle scenario when all three parties have no input and binary outputs. Through an explicit example, we prove that this minimal scenario supports nonlocal correlations compatible with no-signaling and independence of the three sources, but not with realisations based on independent quantum or classical sources. This nonlocality is robust to noise. Moreover, we identify the equivalent of a Popescu-Rohrlich box in the minimal triangle scenario.
We propose an interpretation of physics named potentiality realism. This view, which can be applied to classical as well as to quantum physics, regards potentialities (i.e., intrinsic, objective propensities for individual events to obtain) as elements of reality, thereby complementing the actual properties taken by physical variables. This allows one to naturally reconcile realism and fundamental indeterminism in any theoretical framework. We discuss our specific interpretation of propensities, which requires them to depart from probabilities at the formal level, while still allowing for statistics and the law of large numbers. This view helps reconcile classical and quantum physics by showing that most of the conceptual problems that are customarily taken to be unique to the latter -- such as the measurement problem -- are actually common to all indeterministic physical theories.
Networks composed of independent sources of entangled particles that connect distant users are a rapidly developing quantum technology and an increasingly promising test-bed for fundamental physics. Here we address the certification of their post-classical properties through demonstrations of full network nonlocality. Full network nonlocality goes beyond standard nonlocality in networks by falsifying any model in which at least one source is classical, even if all the other sources are limited only by the no-signaling principle. We report on the observation of full network nonlocality in a star-shaped network featuring three independent sources of photonic qubits and joint three-qubit entanglement-swapping measurements. Our results constitute the first experimental demonstration of full network nonlocality beyond the bilocal network.
I argue against the many-worlds interpretation (MWI) of quantum theory by emphasizing that when everything is entangled with everything else, in one big monstrous piece, there is no room left for creativity. Since the MWI was itself invented, an act of creativity, it proves itself wrong (this text first appeared in French in [N. Gisin, L'épidémie du multivers, in Le plus grand des hasards, p. 184, eds J.F. Dars et A. Papillault, Belin 2010]).
Discussions on indeterminism in physics focus on the possibility of an open future, i.e. the possibility of having potential alternative future events, the realisation of one of which is not fully determined by the present state of affairs. Yet, can indeterminism also affect the past, making it open as well? We show that upholding principles of finiteness of information entails such a possibility. We provide a toy model that shows how the past could be fundamentally indeterminate, while also explaining the intuitive (and observed) asymmetry between the past -- which can be remembered, at least partially -- and the future -- which is impossible to fully predict.
It is expected that quantum computers will enable solving various problems that are beyond the capabilities of the most powerful current supercomputers, which are based on classical technologies. In the last three decades, advances in quantum computing have stimulated significant interest in this field from industry, investors, the media, executives, and the general public. However, the understanding of this technology, of its current capabilities, and of its potential impact is still lacking in these communities. Closing this gap requires a complete picture of how to assess quantum computing devices' performance and estimate their potential, a task made harder by the variety of quantum computing models and physical platforms. Here we review the state of the art in quantum computing, promising computational models, and the most developed physical platforms. We also discuss potential applications, the requirements posed by these applications, and technological pathways towards addressing these requirements. Finally, we summarize and analyze the arguments for the quantum computing market's further exponential growth. The review is written in simple language without equations, and should be accessible to readers with no advanced background in mathematics and physics.
The study of nonlocality in scenarios that depart from the bipartite Einstein-Podolsky-Rosen setup is uncovering many fundamental features of quantum mechanics. Recently, an approach to building network-local models based on machine learning led to the conjecture that the family of quantum triangle distributions of [arXiv:1905.04902] does not admit triangle-local models in a parameter range larger than that covered by the original proof. We prove part of this conjecture in the affirmative. Our approach consists in reducing the original family of four-outcome distributions to families of binary-outcome ones, and then using the inflation technique to prove that these families of binary-outcome distributions do not admit triangle-local models. This constitutes the first successful use of inflation in a proof of quantum nonlocality in networks for distributions whose nonlocality could not be proved with alternative methods. Moreover, we provide a method to extend proofs of network nonlocality from concrete distributions of a parametrized family to continuous ranges of the parameter. In the process, we produce a large collection of network Bell inequalities for the triangle scenario with binary outcomes, which are of independent interest.
Cen-Xiao Huang, Xiao-Min Hu, Yu Guo, Chao Zhang, Bi-Heng Liu, Yun-Feng Huang, Chuan-Feng Li, Guang-Can Guo, Nicolas Gisin, Cyril Branciard, Armin Tavakoli

We use hyper-entanglement to experimentally realize deterministic entanglement swapping based on quantum Elegant Joint Measurements. These are joint projections of two qubits onto highly symmetric, iso-entangled bases. We report measurement fidelities no smaller than $97.4\%$. We showcase the applications of these measurements by using the entanglement swapping procedure to demonstrate quantum correlations in the form of proof-of-principle violations of both bilocal Bell inequalities and more stringent correlation criteria corresponding to full network nonlocality. Our results are a foray into entangled measurements and nonlocality beyond the paradigmatic Bell state measurement, and they show the relevance of more general measurements in entanglement swapping scenarios.
It has recently been discovered that the nonlocality of an entangled qubit pair can be recycled for several Bell experiments. Here, we go beyond standard Bell scenarios and investigate the recycling of nonlocal resources in a quantum network. We realise a photonic quantum 3-branch star network in which three sources of entangled pairs independently connect three outer parties with a central node. After measuring, each outer party respectively relays their system to an independent secondary measuring party. We experimentally demonstrate that the outer parties can perform unsharp measurements that are strong enough to violate a network Bell inequality with the central party, but weak enough to maintain sufficient entanglement in the network to allow the three secondary parties to do the same. Moreover, the violations are strong enough to exclude any model based on standard projective measurements on the EPR pairs emitted in the network. Our experiment brings together the research program of recycling quantum resources with that of Bell nonlocality in networks.
Zheng-Da Li, Ya-Li Mao, Mirjam Weilenmann, Armin Tavakoli, Hu Chen, Lixin Feng, Sheng-Jun Yang, Marc-Olivier Renou, David Trillo, Thinh P. Le, Nicolas Gisin, Antonio Acín, Miguel Navascués, Zizhu Wang, Jingyun Fan

Quantum theory is commonly formulated in complex Hilbert spaces. However, the question of whether complex numbers need to be given a fundamental role in the theory has been debated since its pioneering days. Recently it has been shown that tests in the spirit of a Bell inequality can reveal quantum predictions in entanglement swapping scenarios that cannot be modelled by the natural real-number analog of standard quantum theory. Here, we tailor such tests for implementation in state-of-the-art photonic systems. We experimentally demonstrate quantum correlations in a network of three parties and two independent EPR sources that violate the constraints of real quantum theory by over $4.5$ standard deviations, hence disproving real quantum theory as a universal physical theory.
Networks have advanced the study of nonlocality beyond Bell's theorem. Here, we introduce the concept of full network nonlocality, which describes correlations that require all links in a network to distribute nonlocal resources. Showcasing that this notion is stronger than standard network nonlocality, we prove that the most well-known network Bell test does not witness full network nonlocality. In contrast, we demonstrate that its generalisation to star networks is capable of detecting full network nonlocality in quantum theory. More generally, we point out that established methods for analysing local and theory-independent correlations in networks can be combined in order to systematically deduce sufficient conditions for full network nonlocality in any network and input/output scenario. We demonstrate the usefulness of these methods by constructing polynomial witnesses of full network nonlocality for the bilocal scenario. Then, we show that these inequalities can be violated via quantum Elegant Joint Measurements.
Nonlocal boxes are conceptual tools that capture the essence of the phenomenon of quantum non-locality, central to modern quantum theory and quantum technologies. We introduce network nonlocal boxes tailored for quantum networks under the natural assumption that these networks connect independent sources and do not allow signaling. Hence, these boxes satisfy the No-Signaling and Independence (NSI) principle. For the case of boxes without inputs, connecting pairs of bipartite sources and producing binary outputs, we prove that the sources and boxes producing local random outputs and maximal 2-box correlations, i.e. $E_2=\sqrt{2}-1$, $E_2^o=1$, are essentially unique.
While complex numbers are essential in mathematics, they are not needed to describe physical experiments, expressed in terms of probabilities, hence real numbers. Physics however aims to explain, rather than describe, experiments through theories. While most theories of physics are based on real numbers, quantum theory was the first to be formulated in terms of operators acting on complex Hilbert spaces. This has puzzled countless physicists, including the fathers of the theory, for whom a real version of quantum theory, in terms of real operators, seemed much more natural. In fact, previous works showed that such "real quantum theory" can reproduce the outcomes of any multipartite experiment, as long as the parts share arbitrary real quantum states. Thus, are complex numbers really needed in the quantum formalism? Here, we show this to be the case by proving that real and complex quantum theory make different predictions in network scenarios comprising independent states and measurements. This allows us to devise a Bell-like experiment whose successful realization would disprove real quantum theory, in the same way as standard Bell experiments disproved local physics.
A long-standing tradition, largely present in both the physical and the philosophical literature, regards the advent of (special) relativity -- with its block-universe picture -- as the failure of any indeterministic program in physics. On the contrary, in this paper, we note that upholding reasonable principles of finiteness of information hints at a picture of the physical world that should be both relativistic and indeterministic. We thus rebut the block-universe picture by assuming that fundamental indeterminacy itself should also be regarded as a relational property when considered in a relativistic scenario. We discuss the consequences that this view may have when correlated randomness is introduced, both in the classical case and in the quantum one.
Most physics theories are deterministic, with the notable exception of quantum mechanics which, however, comes plagued by the so-called measurement problem. This state of affairs might well be due to the inability of standard mathematics to "speak" of indeterminism, its inability to present us a worldview in which new information is created as time passes. In such a case, scientific determinism would only be an illusion due to the timeless mathematical language scientists use. To investigate this possibility it is necessary to develop an alternative mathematical language that is both powerful enough to allow scientists to compute predictions and compatible with indeterminism and the passage of time. We argue that intuitionistic mathematics provides such a language and we illustrate it in simple terms.
Increasingly sophisticated quantum computers motivate the exploration of their abilities in certifying genuine quantum phenomena. Here, we demonstrate the power of state-of-the-art IBM quantum computers in correlation experiments inspired by quantum networks. Our experiments feature up to 12 qubits and require the implementation of paradigmatic Bell-State Measurements for scalable entanglement-swapping. First, we demonstrate quantum correlations that defy classical models in up to nine-qubit systems while only assuming that the quantum computer operates on qubits. Harvesting these quantum advantages, we are able to certify 82 basis elements as entangled in a 512-outcome measurement. Then, we relax the qubit assumption and consider quantum nonlocality in a scenario with multiple independent entangled states arranged in a star configuration. We report quantum violations of source-independent Bell inequalities for up to ten qubits. Our results demonstrate the ability of quantum computers to outperform classical limitations and certify scalable entangled measurements.
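As a rough illustration of the Bell-State Measurement used for entanglement swapping (our own sketch, not the circuit of the experiment), a CNOT followed by a Hadamard on the control qubit maps the four Bell states onto the four computational basis states, so a standard computational-basis readout distinguishes them deterministically:

```python
import numpy as np

# Bell-state measurement compiled into a CNOT followed by a Hadamard on
# the first qubit, then a computational-basis readout.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
U = np.kron(H, I) @ CNOT  # circuit: CNOT first, then H on qubit 1

# The four Bell states in the |q1 q2> ordering |00>, |01>, |10>, |11>.
bell = {
    "phi+": np.array([1, 0, 0, 1]) / np.sqrt(2),
    "phi-": np.array([1, 0, 0, -1]) / np.sqrt(2),
    "psi+": np.array([0, 1, 1, 0]) / np.sqrt(2),
    "psi-": np.array([0, 1, -1, 0]) / np.sqrt(2),
}

for name, psi in bell.items():
    probs = np.abs(U @ psi) ** 2
    # Each Bell state lands with probability 1 on one basis state.
    print(name, probs.round(10))
```

Running this shows that the circuit assigns a distinct, deterministic readout outcome to each Bell state, which is exactly what makes the measurement usable for entanglement swapping.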
In this short note we reply to a comment by Callegaro et al. [1] (arXiv:2009.11709) that points out some weaknesses of the model of indeterministic physics that we proposed in Ref. [2] (Physical Review A, 100(6), p.062107), based on what we named "finite information quantities" (FIQs). While we acknowledge the merit of their criticism, we maintain that it applies only to a concrete example that we discussed in [2], whereas the main concept of a FIQ remains valid and suitable for describing indeterministic physical models. We hint at a more sophisticated way of defining FIQs which, taking inspiration from intuitionistic mathematics, would allow one to overcome the criticisms in [1].
Steering is a physical phenomenon that is not restricted to quantum theory; it is also present in more general no-signalling theories. Here, we study steering from the point of view of no-signalling theories. First, we show that quantum steering involves a collection of different aspects, which need to be separated when considering steering in no-signalling theories. By deconstructing quantum steering, we learn more about the nature of the steering phenomenon itself. Second, we introduce a new concept, "blind steering", which can be seen as the most basic form of steering, present both in quantum mechanics and in no-signalling theories.
Symmetric informationally complete measurements (SICs) are elegant, celebrated, and broadly useful discrete structures in Hilbert space. We introduce a more sophisticated discrete structure compounded from several SICs. A SIC-compound is defined to be a collection of $d^3$ vectors in $d$-dimensional Hilbert space that can be partitioned in two different ways: into $d$ SICs and into $d^2$ orthonormal bases. While a priori their existence may appear unlikely when $d>2$, we surprisingly answer this question in the positive through an explicit construction for $d=4$. Remarkably, this SIC-compound admits a close relation to mutually unbiased bases, as is revealed through quantum state discrimination. Going beyond fundamental considerations, we leverage these exotic properties to construct a protocol for quantum key distribution and analyze its security under general eavesdropping attacks. We show that SIC-compounds enable secure key generation in the presence of errors that are large enough to prevent the success of the generalisation of the six-state protocol.
Network Bell experiments give rise to a form of quantum nonlocality that conceptually goes beyond Bell's theorem. We investigate here the simplest network, known as the bilocality scenario. We depart from the typical use of the Bell State Measurement in the network central node and instead introduce a family of symmetric iso-entangled measurement bases that generalise the so-called Elegant Joint Measurement. This leads us to report noise-tolerant quantum correlations that elude bilocal variable models. Inspired by these quantum correlations, we introduce network Bell inequalities for the bilocality scenario and show that they admit noise-tolerant quantum violations. In contrast to many previous studies of network Bell inequalities, neither our inequalities nor their quantum violations are based on standard Bell inequalities and standard quantum nonlocality. Moreover, we pave the way for an experimental realisation by presenting a simple two-qubit quantum circuit for the implementation of the Elegant Joint Measurement and our generalisation.
Some properties of physical systems can be characterized from their correlations. In that framework, subsystems are viewed as abstract devices that receive measurement settings as inputs and produce measurement outcomes as outputs. The labeling convention used to describe these inputs and outputs does not affect the physics, and relabelings are easily implemented by rewiring the input and output ports of the devices. However, a more general class of operations can be achieved by using correlated preprocessing and postprocessing of the inputs and outputs. In contrast to relabelings, some of these operations irreversibly lose information about the underlying device. Other operations are reversible, but modify the cardinality of the inputs and/or outputs. In this work, we single out the set of deterministic local maps as the one satisfying two equivalent constructions: an operational definition from causality, and an axiomatic definition reminiscent of the definition of quantum completely positive trace-preserving maps. We then study the algebraic properties of that set. Surprisingly, the study of these fundamental properties has deep and practical applications. First, the invariant subspaces of these transformations directly decompose the space of correlations/Bell inequalities into nonsignaling, signaling, and normalization components. This impacts the classification of Bell and causal inequalities, and the construction of assemblages/witnesses in steering scenarios. Second, the left and right invertible deterministic local operations provide an operational generalization of the liftings introduced by Pironio [J. Math. Phys., 46(6):062112 (2005)]. Not only Bell-local but also causal inequalities can be lifted; liftings also apply to correlation boxes in a variety of scenarios.
Physics is formulated in terms of timeless classical mathematics. A formulation on the basis of intuitionist mathematics, built on time-evolving processes, would offer a perspective that is closer to our experience of physical reality.
The Platonic solids is the name traditionally given to the five regular convex polyhedra, namely the tetrahedron, the octahedron, the cube, the icosahedron, and the dodecahedron. Perhaps strongly boosted by the towering historical influence of their namesake, these beautiful solids have, in well over two millennia, transcended traditional boundaries and entered the stage in a range of disciplines. Examples include natural philosophy and mathematics from classical antiquity, scientific modeling during the days of the European scientific revolution, and visual arts ranging from the Renaissance to modernity. Motivated by mathematical beauty and a rich history, we consider the Platonic solids in the context of modern quantum mechanics. Specifically, we construct Bell inequalities whose maximal violations are achieved with measurements pointing to the vertices of the Platonic solids. These Platonic Bell inequalities are constructed only by inspecting the visible symmetries of the Platonic solids. We also construct Bell inequalities for more general polyhedra and find a Bell inequality that is more robust to noise than the celebrated Clauser-Horne-Shimony-Holt Bell inequality. Finally, we elaborate on the tension between mathematical beauty, which was our initial motivation, and experimental friendliness, which is necessary in all empirical sciences.
Quantum memories with long storage times are key elements in long-distance quantum networks. The atomic frequency comb (AFC) memory in particular has shown great promise to fulfill this role, having demonstrated multimode capacity and spin-photon quantum correlations. However, the memory storage times have so far been limited to about one millisecond, realized in a Eu${}^{3+}$ doped Y${}_2$SiO${}_5$ crystal at zero applied magnetic field. Motivated by studies showing increased spin coherence times under applied magnetic field, we developed an AFC spin-wave memory utilizing a weak 15 mT magnetic field in a specific direction that allows efficient optical and spin manipulation for AFC memory operations. With this field configuration the AFC spin-wave storage time increased to 40 ms using a simple spin-echo sequence. Furthermore, by applying dynamical decoupling techniques the spin-wave coherence time reaches 530 ms, a 300-fold increase with respect to previous AFC spin-wave storage experiments. This result paves the way towards long-duration storage of quantum information in solid-state ensemble memories.
Quantum communication leads to strong correlations that can outperform classical ones. Complementary to previous works in this area, we investigate correlations in prepare-and-measure scenarios assuming a bound on the information content of the quantum communication, rather than on its Hilbert-space dimension. Specifically, we explore the extent of classical and quantum correlations given an upper bound on the one-shot accessible information. We provide a characterisation of the set of classical correlations and show that quantum correlations are stronger than classical ones. We also show that limiting information rather than dimension leads to stronger quantum correlations. Moreover, we present device-independent tests for placing lower bounds on the information given observed correlations. Finally, we show that quantum communication carrying $\log d$ bits of information is at least as strong a resource as $d$-dimensional classical communication assisted by pre-shared entanglement.
Do scientific theories limit human knowledge? In other words, are there physical variables hidden by essence forever? We argue for negative answers and illustrate our point with chaotic classical dynamical systems. We emphasize parallels with quantum theory and conclude that the common real numbers are, de facto, the hidden variables of classical physics. Consequently, real numbers should not be considered as "physically real" and classical mechanics, like quantum physics, is indeterministic.
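A toy example of this argument (our own illustration, not from the paper): in the chaotic doubling map $x \mapsto 2x \bmod 1$, each iteration shifts the binary expansion of the initial condition by one digit, so the far-away digits of the real-number initial condition act precisely as hidden variables that pre-determine all future coarse-grained outcomes:

```python
# Toy illustration: the chaotic doubling map x -> 2x mod 1 shifts the
# binary expansion of x, so digit n+1 of the initial condition becomes
# the leading (observable) digit after n steps. The distant digits of
# the "real" initial condition thus play the role of hidden variables.
from fractions import Fraction

def doubling_map(x, steps):
    """Iterate x -> 2x mod 1 exactly, using rationals to avoid float error."""
    out = []
    for _ in range(steps):
        x = (2 * x) % 1
        out.append(0 if x < Fraction(1, 2) else 1)  # coarse-grained outcome
    return out

x0 = Fraction(1, 10)  # binary expansion 0.000110011001100...
print(doubling_map(x0, 12))  # → [0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1]
```

The printed sequence is nothing but the binary expansion of the initial condition read off digit by digit: whoever postulates an infinitely precise real $x_0$ has tacitly postulated the entire future of the system.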
Classical physics is generally regarded as deterministic, as opposed to quantum mechanics that is considered the first theory to have introduced genuine indeterminism into physics. We challenge this view by arguing that the alleged determinism of classical physics relies on the tacit, metaphysical assumption that there exists an actual value of every physical quantity, with its infinite predetermined digits (which we name \emph{principle of infinite precision}). Building on recent information-theoretic arguments showing that the principle of infinite precision (which translates into the attribution of a physical meaning to mathematical real numbers) leads to unphysical consequences, we consider possible alternative indeterministic interpretations of classical physics. We also link those to well-known interpretations of quantum mechanics. In particular, we propose a model of classical indeterminism based on \emph{finite information quantities} (FIQs). Moreover, we discuss the perspectives that an indeterministic physics could open (such as strong emergence), as well as some potential problematic issues. Finally, we make evident that any indeterministic interpretation of physics would have to deal with the problem of explaining how the indeterminate values become determinate, a problem known in the context of quantum mechanics as (part of) the ``quantum measurement problem''. We discuss some similarities between the classical and the quantum measurement problems, and propose ideas for possible solutions (e.g., ``collapse models'' and ``top-down causation'').
Characterizing quantum nonlocality in networks is a challenging, but important problem. Using quantum sources one can achieve distributions which are unattainable classically. A key point in investigations is to decide whether an observed probability distribution can be reproduced using only classical resources. This causal inference task is challenging even for simple networks, both analytically and using standard numerical techniques. We propose to use neural networks as numerical tools to overcome these challenges, by learning the classical strategies required to reproduce a distribution. As such, the neural network acts as an oracle, demonstrating that a behavior is classical if it can be learned. We apply our method to several examples in the triangle configuration. After demonstrating that the method is consistent with previously known results, we give solid evidence that the distribution presented in [N. Gisin, Entropy 21(3), 325 (2019)] is indeed nonlocal as conjectured. Finally we examine the genuinely nonlocal distribution presented in [M.-O. Renou et al., PRL 123, 140401 (2019)], and, guided by the findings of the neural network, conjecture nonlocality in a new range of parameters in these distributions. The method allows us to get an estimate on the noise robustness of all examined distributions.
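As a rough illustration of the kind of classical model such a method searches for, here is a minimal Monte-Carlo sketch of a trilocal (classical) strategy in the triangle configuration; the threshold response functions are hypothetical stand-ins for the trained neural networks described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def triangle_distribution(n_samples=100_000):
    """Monte-Carlo estimate of p(a,b,c) for a classical triangle model.

    Sources alpha, beta, gamma are i.i.d. uniform on [0,1]; each party sees
    the two adjacent sources and outputs a bit via a fixed (hypothetical)
    threshold response function, standing in for a trained neural network.
    """
    alpha, beta, gamma = rng.random((3, n_samples))
    a = (beta + gamma > 1.0).astype(int)   # Alice sees beta and gamma
    b = (gamma + alpha > 1.0).astype(int)  # Bob sees gamma and alpha
    c = (alpha + beta > 1.0).astype(int)   # Charlie sees alpha and beta
    p = np.zeros((2, 2, 2))
    np.add.at(p, (a, b, c), 1.0)
    return p / n_samples
```

In the actual method, the response functions carry trainable parameters and are optimized so that the sampled distribution matches the target; failure to converge is then taken as evidence of nonlocality.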
Generalising the concept of Bell nonlocality to networks leads to novel forms of correlations, the characterization of which is however challenging. Here we investigate constraints on correlations in networks under the two natural assumptions of no-signaling and independence of the sources. We consider the ``triangle network'', and derive strong constraints on correlations even though the parties receive no input, i.e. each party performs a fixed measurement. We show that some of these constraints are tight, by constructing explicit local models (i.e. where sources distribute classical variables) that can saturate them. However, we also observe that other constraints can apparently not be saturated by local models, which opens the possibility of having nonlocal (but non-signaling) correlations in the triangle network.
Quantum networks allow in principle for completely novel forms of quantum correlations. In particular, quantum nonlocality can be demonstrated here without the need for various input settings, only by considering the joint statistics of fixed local measurement outputs. However, previous examples of this intriguing phenomenon all appear to stem directly from the usual form of quantum nonlocality, namely via the violation of a standard Bell inequality. Here we present novel examples of 'quantum nonlocality without inputs', which we believe represent a new form of quantum nonlocality, genuine to networks. Our simplest examples, for the triangle network, involve both entangled states and joint entangled measurements. A generalization to any odd-cycle network is also presented. Finally, we conclude with some open questions.
We report a detailed study of the noise properties of a visible-to-telecom photon frequency converter based on difference frequency generation (DFG). The device converts 580 nm photons to 1541 nm using a strong pump laser at 930 nm, in a periodically poled lithium niobate ridge waveguide. The converter reaches a maximum device efficiency of 46 % (internal efficiency of 67 %) at a pump power of 250 mW. The noise produced by the pump laser is investigated in detail by recording the noise spectra both in the telecom and visible regimes, and measuring the power dependence of the noise rates. The noise spectrum in the telecom is very broadband, as expected from previous work on similar DFG converters. However, we also observe several narrow dips in the telecom spectrum, with corresponding peaks appearing in the 580 nm noise spectrum. These features are explained by sum frequency generation of the telecom noise at wavelengths given by the phase matching condition of different spatial modes in the waveguide. The proposed noise model is in good agreement with all the measured data, including the power-dependence of the noise rates, both in the visible and telecom regime. These results are applicable to the class of DFG converters where the pump laser wavelength is in between the input and target wavelength.
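The three wavelengths quoted above are tied together by energy conservation in difference frequency generation; a quick sketch checking this with the values from the abstract:

```python
def dfg_output_wavelength(lambda_in_nm, lambda_pump_nm):
    """Energy conservation in difference frequency generation:
    1/lambda_out = 1/lambda_in - 1/lambda_pump (wavelengths in nm)."""
    return 1.0 / (1.0 / lambda_in_nm - 1.0 / lambda_pump_nm)

# 580 nm input photons with a 930 nm pump yield telecom output near 1541 nm
lambda_out = dfg_output_wavelength(580.0, 930.0)
```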
A quantum network consists of independent sources distributing entangled states to distant nodes which can then perform entangled measurements, thus establishing correlations across the entire network. But how strong can these correlations be? Here we address this question, by deriving bounds on possible quantum correlations in a given network. These bounds are nonlinear inequalities that depend only on the topology of the network. We discuss in detail the notably challenging case of the triangle network. Moreover, we conjecture that our bounds hold in general no-signaling theories. In particular, we prove that our inequalities for the triangle network hold when the sources are arbitrary no-signaling boxes which can be wired together. Finally, we discuss an application of our results for the device-independent characterization of the topology of a quantum network.
We study Bell scenarios with binary outcomes supplemented by one bit of classical communication. We develop a method to find facet inequalities for such scenarios even when direct facet enumeration is not possible, or at least difficult. Using this method, we partially solve the scenario where Alice and Bob choose between three inputs, finding a total of 668 inequivalent facet inequalities (with respect to relabelings of inputs and outputs). We also show that some of these inequalities are constructed from the facet inequalities found in scenarios without communication, the well known Bell inequalities.
We give the complete list of 175 facet Bell inequalities for the case where Alice and Bob each choose their measurements from a set of four binary outcome measurements. For each inequality we compute the maximum quantum violation for qubits, the resistance to noise, and the minimal detection efficiency required for closing the detection loophole with maximally entangled qubit states, in the case where both detectors have the same efficiency (symmetric case).
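For a Bell inequality with qubit observables, the maximum quantum violation can be computed as the largest eigenvalue of the Bell operator. A minimal sketch for the familiar two-input CHSH case (not one of the 175 four-input facets above), with the standard optimal measurement angles:

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def obs(theta):
    """Binary-outcome qubit observable cos(theta) Z + sin(theta) X."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Standard optimal CHSH settings
A0, A1 = obs(0.0), obs(np.pi / 2)
B0, B1 = obs(np.pi / 4), obs(-np.pi / 4)

bell_op = (np.kron(A0, B0) + np.kron(A0, B1)
           + np.kron(A1, B0) - np.kron(A1, B1))

# Largest eigenvalue = maximum qubit violation (Tsirelson bound 2*sqrt(2))
max_violation = np.linalg.eigvalsh(bell_op)[-1]
```

For the four-input inequalities, the same eigenvalue computation applies once the measurement directions are optimized over.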
A sequential steering scenario is investigated, where multiple Bobs aim at demonstrating steering using successively the same half of an entangled quantum state. With isotropic entangled states of local dimension $d$, the number of Bobs that can steer Alice is found to be $N_\mathrm{Bob}\sim d/\log{d}$, thus leading to an arbitrarily large number of successive instances of steering with independently chosen and unbiased inputs. This scaling is achieved when considering a general class of measurements along orthonormal bases, as well as complete sets of mutually unbiased bases. Finally, we show that similar results can be obtained in an anonymous sequential scenario, where none of the Bobs know their position in the sequence.
The quantum Fisher information (QFI) of certain multipartite entangled quantum states is larger than what is reachable by separable states, providing a metrological advantage. Are these nonclassical correlations strong enough to potentially violate a Bell inequality? Here, we present evidence from two examples. First, we discuss a Bell inequality designed for spin-squeezed states which is violated only by quantum states with a large QFI. Second, we relax a well-known lower bound on the QFI to find the Mermin Bell inequality as a special case. However, a fully general link between QFI and Bell correlations is still open.
Twenty-five years after the invention of quantum teleportation, the concept of entanglement gained enormous popularity. This is especially nice to those who remember that entanglement was not even taught at universities until the 1990's. Today, entanglement is often presented as a resource, the resource of quantum information science and technology. However, entanglement is exploited twice in quantum teleportation. First, entanglement is the `quantum teleportation channel', i.e. entanglement between distant systems. Second, entanglement appears in the eigenvectors of the joint measurement that Alice, the sender, has to perform jointly on the quantum state to be teleported and her half of the `quantum teleportation channel', i.e. entanglement enabling entirely new kinds of quantum measurements. I emphasize how poorly this second kind of entanglement is understood. In particular, I use quantum networks in which each party connected to several nodes performs a joint measurement to illustrate that the quantumness of such joint measurements remains elusive, escaping today's available tools to detect and quantify it.
In standard quantum theory, time is not an observable. It enters as a parameter in the Schrödinger equation, but there is no measurement operator associated with it. Nevertheless, one may take an operational viewpoint and regard time as the information one can read from clocks. The Alternate Ticks Game, introduced in arXiv:1506.01373, is a completely operational means to quantify the accuracy of time scales generated by clocks. The idea is to count the number of ticks that two copies of a clock can produce until they run out of synchronisation. Here we investigate the performance of stochastic clocks in this game. These are clocks which are classical in the sense that they do not exploit quantum coherence. Our results support earlier conjectures that their accuracy grows linearly in the size of the clockwork, measured in terms of the dimension of the associated Hilbert space. In particular, we derive explicit bounds on the accuracy of a natural class of stochastic clocks, the stochastic ladder clocks.
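A toy simulation can make the linear scaling plausible: if each tick of a ladder clock fires after $D$ independent exponential jumps, tick times are Gamma($D$)-distributed, and the accuracy figure of merit $(\mathrm{mean}/\mathrm{std})^2$ equals $D$. This is a simplified stand-in for the stochastic ladder clocks analysed in the paper, not the exact model.

```python
import numpy as np

rng = np.random.default_rng(1)

def tick_accuracy(D, n=200_000):
    """Simulate n ticks of a D-step stochastic ladder clock.

    Each tick fires after D independent exponential jumps, so the tick times
    are Gamma(D)-distributed and (mean/std)^2 -- a common accuracy figure of
    merit -- equals D, i.e. it grows linearly with the clockwork size.
    """
    ticks = rng.gamma(shape=D, scale=1.0, size=n)
    return (ticks.mean() / ticks.std()) ** 2
```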
The semi-device-independent framework allows one to draw conclusions about properties of an unknown quantum system under weak assumptions. Here we present a semi-device-independent scheme for the characterisation of multipartite entanglement based around a game played by several isolated parties whose devices are uncharacterised beyond an assumption about the dimension of their Hilbert spaces. Our scheme can certify that an $n$-partite high-dimensional quantum state features genuine multipartite entanglement. Moreover, the scheme can certify that a joint measurement on $n$ subsystems is entangled, and provides a lower bound on the number of entangled measurement operators. These tests are strongly robust to noise, and even optimal for certain classes of states and measurements, as we demonstrate with illustrative examples. Notably, our scheme allows for the certification of many entangled states admitting a local model, which therefore cannot violate any Bell inequality.
It is usual to identify initial conditions of classical dynamical systems with mathematical real numbers. However, almost all real numbers contain an infinite amount of information. I argue that a finite volume of space can't contain more than a finite amount of information, hence that the mathematical real numbers are not physically relevant. Moreover, a better terminology for the so-called real numbers is ``random numbers'', as their series of bits are truly random. I propose an alternative classical mechanics, which is empirically equivalent to classical mechanics, but uses only finite-information numbers. This alternative classical mechanics is non-deterministic, despite the use of deterministic equations, in a way similar to quantum theory. Interestingly, both alternative classical mechanics and quantum theories can be supplemented by additional variables in such a way that the supplemented theory is deterministic. Most physicists straightforwardly supplement classical theory with real numbers to which they attribute physical existence, while most physicists reject Bohmian mechanics as supplemented quantum theory, arguing that Bohmian positions have no physical reality.
Quantum non-locality has been an extremely fruitful subject of research, leading the scientific revolution towards quantum information science, in particular to device-independent quantum information processing. We argue that the time is ripe to work on another basic problem in the foundations of quantum physics, the quantum measurement problem, which should produce good physics in theoretical, mathematical, experimental and applied physics alike. We briefly review how quantum non-locality contributed to physics (including some outstanding open problems) and suggest ways in which questions around Macroscopic Quantumness could equally contribute to all aspects of physics.
Do experiments based on superconducting loops segmented with Josephson junctions (e.g., flux qubits) show macroscopic quantum behavior in the sense of Schrödinger's cat example? Various arguments based on microscopic and phenomenological models were recently adduced in this debate. We approach this problem by adapting (to flux qubits) the framework of large-scale quantum coherence, which was already successfully applied to spin ensembles and photonic systems. We show that contemporary experiments might show quantum coherence more than 100 times larger than experiments in the classical regime. However, we argue that the often-used demonstration of an avoided crossing in the energy spectrum is not sufficient to make a conclusion about the presence of large-scale quantum coherence. Alternative, rigorous witnesses are proposed.
Solid-state electronic spins are extensively studied in quantum information science, as their large magnetic moments offer fast operations for computing and communication, and high sensitivity for sensing. However, electronic spins are also more sensitive to magnetic noise; engineering of their spectroscopic properties, e.g. using clock transitions and isotopic engineering, can nevertheless yield remarkable spin coherence times, as for electronic spins in GaAs, donors in silicon and vacancy centres in diamond. Here we demonstrate simultaneously induced clock transitions for both microwave and optical domains in an isotopically purified $^{171}$Yb$^{3+}$:Y$_2$SiO$_5$ crystal, reaching coherence times of above 100 $\mu$s and 1 ms in the optical and microwave domain, respectively. This effect is due to the highly anisotropic hyperfine interaction, which makes each electronic-nuclear state an entangled Bell state. Our results underline the potential of $^{171}$Yb$^{3+}$:Y$_2$SiO$_5$ for quantum processing applications relying on both optical and spin manipulation, such as optical quantum memories, microwave-to-optical quantum transducers, and single spin detection, while they should also be observable in a range of different materials with anisotropic hyperfine interaction.
Rare-earth ion doped crystals are promising systems for quantum communication and quantum information processing. In particular, paramagnetic rare-earth centres can be utilized to realize quantum coherent interfaces simultaneously for optical and microwave photons. In this article, we study hyperfine and magnetic properties of a Y$_2$SiO$_5$ crystal doped with $^{171}$Yb$^{3+}$ ions. This isotope is particularly interesting since it is the only rare-earth ion having electronic spin $S=\frac{1}{2}$ and nuclear spin $I=\frac{1}{2}$, which results in the simplest possible hyperfine level structure. In this work we determine the hyperfine tensors for the ground and excited states on the optical $^2$F$_{7/2}(0) \longleftrightarrow ^2$F$_{5/2}$(0) transition by combining spectral holeburning and optically detected magnetic resonance techniques. The resulting spin Hamiltonians correctly predict the magnetic-field dependence of all observed optical-hyperfine transitions, from zero applied field to the high-field regime where the Zeeman interaction is dominating. Using the optical absorption spectrum we can also determine the order of the hyperfine levels in both states. These results pave the way for realizing solid-state optical and microwave quantum memories based on a $^{171}$Yb$^{3+}$:Y$_2$SiO$_5$ crystal.
Antonio Acín, Immanuel Bloch, Harry Buhrman, Tommaso Calarco, Christopher Eichler, Jens Eisert, Daniel Esteve, Nicolas Gisin, Steffen J. Glaser, Fedor Jelezko, Stefan Kuhr, Maciej Lewenstein, Max F. Riedel, Piet O. Schmidt, Rob Thew, Andreas Wallraff, Ian Walmsley, Frank K. Wilhelm
Within the last two decades, Quantum Technologies (QT) have made tremendous progress, moving from Nobel Prize award-winning experiments on quantum physics into a cross-disciplinary field of applied research. Technologies are being developed now that explicitly address individual quantum states and make use of the 'strange' quantum properties, such as superposition and entanglement. The field comprises four domains: Quantum Communication, Quantum Simulation, Quantum Computation, and Quantum Sensing and Metrology. One success factor for the rapid advancement of QT is a well-aligned global research community with a common understanding of the challenges and goals. In Europe, this community has profited from several coordination projects, which have orchestrated the creation of a 150-page QT Roadmap. This article presents an updated summary of this roadmap. Besides sections on the four domains of QT, we have included sections on Quantum Theory and Software, and on Quantum Control, as both are important areas of research that cut across all four domains. Each section, after a short introduction to the domain, gives an overview on its current status and main challenges and then describes the advances in science and technology foreseen for the next ten years and beyond.
Efficient optical pumping is an important tool for state initialization in quantum technologies, such as optical quantum memories. In crystals doped with Kramers rare-earth ions, such as erbium and neodymium, efficient optical pumping is challenging due to the relatively short population lifetimes of the electronic Zeeman levels, of the order of 100 ms at around 4 K. In this article we show that optical pumping of the hyperfine levels in isotopically enriched $^{145}$Nd$^{3+}$:Y$_2$SiO$_5$ crystals is more efficient, owing to the longer population relaxation times of hyperfine levels. By optically cycling the population many times through the excited state a nuclear-spin flip can be forced in the ground-state hyperfine manifold, in which case the population is trapped for several seconds before relaxing back to the pumped hyperfine level. To demonstrate the effectiveness of this approach in applications we perform an atomic frequency comb memory experiment with 33% storage efficiency in $^{145}$Nd$^{3+}$:Y$_2$SiO$_5$, which is on a par with results obtained in non-Kramers ions, e.g. europium and praseodymium, where optical pumping is generally efficient due to the quenched electronic spin. Efficient optical pumping in neodymium-doped crystals is also of interest for spectral filtering in biomedical imaging, as neodymium has an absorption wavelength compatible with tissue imaging. In addition to these applications, our study is of interest for understanding spin dynamics in Kramers ions with nuclear spin.
Quantum measurements have intrinsic properties which seem incompatible with our everyday-life macroscopic measurements. Macroscopic Quantum Measurement (MQM) is a concept that aims at bridging the gap between well understood microscopic quantum measurements and macroscopic classical measurements. In this paper, we focus on the task of estimating the polarization direction of a system of $N$ spin-$1/2$ particles and investigate the model some of us proposed in Barnea et al., 2017. This model is based on a von Neumann pointer measurement, where each spin component of the system is coupled to one of the three spatial components of a pointer. It shows traits of a classical measurement for an intermediate coupling strength. We investigate relaxations of the assumptions on the initial knowledge about the state and on the control over the MQM. We show that the model is robust with regard to these relaxations. It performs well for thermal states and a lack of knowledge about the size of the system. Furthermore, a lack of control on the MQM can be compensated by repeated "ultra-weak" measurements.
We characterize the europium (Eu$^{3+}$) hyperfine interaction of the excited state ($^5$D$_0$) and determine its effective spin Hamiltonian parameters for the Zeeman and quadrupole tensors. An optical free induction decay method is used to measure all hyperfine splittings under a weak external magnetic field (up to 10 mT) for various field orientations. On the basis of the determined Hamiltonian we discuss the possibility to predict optical transition probabilities between hyperfine levels for the $^7$F$_{0} \longleftrightarrow ^5$D$_{0}$ transition. The obtained results provide the information necessary to realize an optical quantum memory scheme which utilizes the long spin coherence properties of the $^{151}$Eu$^{3+}$:Y$_2$SiO$_5$ material under external magnetic fields.
We consider a spin chain extending from Alice to Bob with next-neighbour interactions, initially in its ground state. Assuming that Bob measures the last spin of the chain, the energy of the spin chain has to increase, at least on average, due to the measurement disturbance. Presumably, the energy is provided by Bob's measurement apparatus. Assuming now that, simultaneously to Bob's measurement, Alice measures the first spin, we show that either energy is not conserved (implausible) or the projection postulate doesn't apply, and that there is signalling. An explicit measurement model shows that energy is conserved (as expected), but that the spin chain energy increase is not provided by the measurement apparatus(es), that the projection postulate is not always valid, illustrating the Wigner-Araki-Yanase (WAY) theorem, and that there is indeed signalling. The signalling is due to the non-local interaction Hamiltonian. This raises the question of a suitable quantum information inspired model of such non-local Hamiltonians.
We present an algebraic description of the sets of local correlations in arbitrary networks, when the parties have finite inputs and outputs. We consider networks generalizing the usual Bell scenarios by the presence of multiple uncorrelated sources. We prove a finite upper bound on the cardinality of the value sets of the local hidden variables. Consequently, we find that the sets of local correlations are connected, closed and semialgebraic, and bounded by tight polynomial Bell-like inequalities.
In order to study N-locality without inputs in long lines and in configurations with loops, e.g. the triangle, we introduce a natural joint measurement on two qubits different from the usual Bell state measurement. The resulting quantum probability $p(a_1,a_2,...,a_N)$ has interesting features. In particular, the probability that all results are equal is so large, while respecting full symmetry, that it seems highly implausible that one could reproduce it with any N-local model, though, unfortunately, I have been unable to prove it.
Large-scale quantum effects have always played an important role in the foundations of quantum theory. With recent experimental progress and the aspiration for quantum enhanced applications, the interest in macroscopic quantum effects has been reinforced. In this review, we critically analyze and discuss measures aiming to quantify various aspects of macroscopic quantumness. We survey recent results on the difficulties and prospects to create, maintain and detect macroscopic quantum states. The role of macroscopic quantum states in foundational questions as well as practical applications is outlined. Finally, we present past and on-going experimental advances aiming to generate and observe macroscopic quantum states.
The realization of quantum networks and quantum repeaters remains an outstanding challenge in quantum communication. These rely on entanglement of remote matter systems, which in turn requires creation of quantum correlations between a single photon and a matter system. A practical way to establish such correlations is via spontaneous Raman scattering in atomic ensembles, known as the DLCZ scheme. However, time multiplexing is inherently difficult using this method, which leads to low communication rates even in theory. Moreover, it is desirable to find solid-state ensembles where such matter-photon correlations could be generated. Here we demonstrate quantum correlations between a single photon and a spin excitation in up to 12 temporal modes, in a $^{151}$Eu$^{3+}$ doped Y$_2$SiO$_5$ crystal, using a novel DLCZ approach that is inherently multimode. After a storage time of 1 ms, the spin excitation is converted into a second photon. The quantum correlation of the generated photon pair is verified by violating a Cauchy-Schwarz inequality. Our results show that solid-state rare-earth crystals could be used to generate remote multi-mode entanglement, an important resource for future quantum networks.
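The Cauchy-Schwarz test compares the cross-correlation of the photon pair to the auto-correlations of the individual fields. A small sketch with hypothetical second-order correlation values (not the measured ones from this experiment):

```python
def cauchy_schwarz_R(g12, g11, g22):
    """R = g12^2 / (g11 * g22). Classical fields obey R <= 1, so R > 1
    witnesses nonclassical (quantum) photon-pair correlations."""
    return g12 ** 2 / (g11 * g22)

# Hypothetical cross- (g12) and auto-correlations (g11, g22), for illustration
R = cauchy_schwarz_R(g12=5.0, g11=2.0, g22=2.0)  # R = 6.25 > 1
```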
We consider the characterization of quantum superposition states beyond the pattern "dead and alive". We propose a measure that is applicable to superpositions of multiple macroscopically distinct states, superpositions with different weights as well as mixed states. The measure is based on the mutual information to characterize the distinguishability between multiple superposition states. This allows us to overcome limitations of previous proposals, and to bridge the gap between general measures for macroscopic quantumness and measures for Schrödinger-cat type superpositions. We discuss a number of relevant examples, provide an alternative definition using basis-dependent quantum discord and reveal connections to other proposals in the literature. Finally, we also show the connection between the size of quantum states as quantified by our measure and their vulnerability to noise.
Quantum theory predicts that entanglement can also persist in macroscopic physical systems, although demonstrating it experimentally remains difficult. Recently, significant progress has been achieved and genuine entanglement between up to 2900 atoms was reported. Here we demonstrate 16 million genuinely entangled atoms in a solid-state quantum memory prepared by the heralded absorption of a single photon. We develop an entanglement witness for quantifying the number of genuinely entangled particles based on the collective effect of directed emission combined with the nonclassical nature of the emitted light. The method is applicable to a wide range of physical systems and is effective even in situations with significant losses. Our results clarify the role of multipartite entanglement in ensemble-based quantum memories as a necessary prerequisite to achieve a high single-photon process fidelity crucial for future quantum networks. On a more fundamental level, our results reveal the robustness of certain classes of multipartite entangled states, contrary to, e.g., Schrödinger-cat states, and that the depth of entanglement can be experimentally certified at unprecedented scales.
The problem of characterizing classical and quantum correlations in networks is considered. Contrary to the usual Bell scenario, where distant observers share a physical system emitted by one common source, a network features several independent sources, each distributing a physical system to a subset of observers. In the quantum setting, the observers can perform joint measurements on initially independent systems, which may lead to strong correlations across the whole network. In this work, we introduce a technique to systematically map a Bell inequality to a family of Bell-type inequalities bounding classical correlations on networks in a star-configuration. Also, we show that whenever a given Bell inequality can be violated by some entangled state $\rho$, then all the corresponding network inequalities can be violated by considering many copies of $\rho$ distributed in the star network. The relevance of these ideas is illustrated by applying our method to a specific multi-setting Bell inequality. We derive the corresponding network inequalities, and study their quantum violations.
The nature of quantum correlations in networks featuring independent sources of entanglement remains poorly understood. Here, focusing on the simplest network of entanglement swapping, we start a systematic characterization of the set of quantum states leading to violation of the so-called "bilocality" inequality. First, we show that all possible pairs of entangled pure states can violate the inequality. Next, we derive a general criterion for violation for arbitrary pairs of mixed two-qubit states. Notably, this reveals a strong connection between the CHSH Bell inequality and the bilocality inequality, namely that any entangled state violating CHSH also violates the bilocality inequality. We conclude with a list of open questions.
We present the quantum measurement problem as a serious physics problem. Serious because without a resolution, quantum theory is not complete, as it does not tell how one should - in principle - perform measurements. It is physical in the sense that the solution will bring new physics, i.e. new testable predictions, hence it is not merely a matter of interpretation of a frozen formalism. I argue that the two popular ways around the measurement problem, many-worlds and Bohmian-like mechanics, do, de facto, introduce effective collapses when "I" interact with the quantum system. Hence, surprisingly, in many-worlds and Bohmian mechanics, the "I" plays a more active role than in alternative models, like e.g. collapse models. Finally, I argue that either there are several kinds of stuff out there, i.e. physical dualism: some stuff that respects the superposition principle and some that doesn't; or there are special configurations of atoms and photons for which the superposition principle breaks down. Or, and this I argue is the most promising, the dynamics has to be modified, i.e. in the form of a stochastic Schrödinger equation.
High-dimensional entanglement offers promising perspectives in quantum information science. In practice, however, the main challenge is to devise efficient methods to characterize high-dimensional entanglement, based on the available experimental data which is usually rather limited. Here we report the characterization and certification of high-dimensional entanglement in photon pairs, encoded in temporal modes. Building upon recently developed theoretical methods, we certify an entanglement of formation of 2.09(7) ebits in a time-bin implementation, and 4.1(1) ebits in an energy-time implementation. These results are based on very limited sets of local measurements, which illustrates the practical relevance of these methods.
Device-independent quantum key distribution (DI-QKD) represents one of the most fascinating challenges in quantum communication, exploiting concepts of fundamental physics, namely Bell tests of nonlocality, to ensure the security of a communication link. This requires the loophole-free violation of a Bell inequality, which is intrinsically difficult due to losses in fibre optic transmission channels. Heralded photon amplification is a teleportation-based protocol that has been proposed as a means to overcome transmission loss for DI-QKD. Here we demonstrate heralded photon amplification for path entangled states and characterise the entanglement before and after loss by exploiting a recently developed displacement-based detection scheme. We demonstrate that by exploiting heralded photon amplification we are able to reliably maintain high fidelity entangled states over loss-equivalent distances of more than 50~km.