Abstract
Do completely unpredictable events exist? Classical physics excludes fundamental randomness. Although quantum theory makes probabilistic predictions, this does not imply that nature is random, as randomness should be certified without relying on the complete structure of the theory being used. Bell tests approach the question from this perspective. However, they require prior perfect randomness, falling into circular reasoning. A Bell test that generates perfect random bits from bits possessing high, but less than perfect, randomness has recently been obtained. Yet, the main question remained open: does any initial randomness suffice to certify perfect randomness? Here we show that this is indeed the case. We provide a Bell test that uses arbitrarily imperfect random bits to produce bits that are, under the sole assumption of the nonsignalling principle, perfectly random. This provides the first protocol attaining full randomness amplification. Our results have strong implications for the debate over whether there exist events that are fully random.
Introduction
Understanding whether nature is deterministically predetermined or there are intrinsically random processes is a fundamental question that has attracted the interest of multiple thinkers, ranging from philosophers and mathematicians to physicists and neuroscientists. Nowadays, this question is also important from a practical perspective, as random bits constitute a valuable resource for applications such as cryptographic protocols, gambling, or the numerical simulation of physical and biological systems.
Classical physics is a deterministic theory. Perfect knowledge of the positions and velocities of a system of classical particles at a given time, as well as of their interactions, allows one to predict their future (and also past) behaviour with total certainty^{1}. Thus, any randomness observed in classical systems is not intrinsic to the theory but just a manifestation of our imperfect description of the system.
The advent of quantum physics called this deterministic viewpoint into question, as there exist experimental situations for which quantum theory gives predictions only in probabilistic terms, even if one has a perfect description of the preparation and interactions of the system. A possible solution to this classically counterintuitive fact was proposed in the early days of quantum physics: quantum mechanics had to be incomplete^{2}, and there should be a complete theory capable of providing deterministic predictions for all conceivable experiments. There would thus be no room for intrinsic randomness, and any apparent randomness would again be a consequence of our lack of control over hypothetical ‘hidden variables’ not contemplated by the quantum formalism.
Bell's no-go theorem^{3}, however, implies that local hidden-variable theories are inconsistent with quantum mechanics. Therefore, none of these theories could ever provide a deterministic completion of the quantum formalism. More precisely, all hidden-variable theories compatible with a local causal structure predict that any correlations among space-like separated events satisfy a series of inequalities, known as Bell inequalities. Bell inequalities, in turn, are violated by some correlations among quantum particles. This form of correlations defines the phenomenon of quantum nonlocality.
Now, it turns out that quantum nonlocality does not necessarily imply the existence of fully unpredictable processes in nature. The reasons behind this are subtle. First of all, unpredictable processes could be certified only if the no-signalling principle holds. This principle states that no instantaneous communication is possible, which in turn imposes a local causal structure on events, as in Einstein's special relativity. In fact, Bohm's theory is both deterministic and able to reproduce all quantum predictions^{4}, but it is incompatible with no-signalling at the level of the hidden variables. Thus, we assume throughout the validity of the no-signalling principle. Yet, even within the no-signalling framework, it is still not possible to infer the existence of fully random processes from the mere observation of nonlocal correlations. This is because Bell tests require measurement settings chosen at random, but the actual randomness in such choices can never be certified. The extreme example is that in which the settings are determined in advance: then, any Bell violation can easily be explained in terms of deterministic models. As a matter of fact, superdeterministic models, which postulate that all phenomena in the universe, including our own mental processes, are fully predetermined, are by definition impossible to rule out. These considerations imply that the strongest result on the existence of randomness one can hope for using quantum nonlocality is expressed by the following question: given a source that produces an arbitrarily small but non-zero amount of randomness, can one still certify the existence of completely random processes?
Here, we show that this is the case for a very general, and physically meaningful, set of randomness sources, which includes subsets of the well-known Santha–Vazirani sources^{5} as particular cases. Besides its philosophical and physics-foundational implications, our result provides a protocol for full randomness amplification using quantum nonlocality. Randomness amplification is an information-theoretic task whose goal is to use an input source of imperfectly random bits to produce perfect random bits. Santha and Vazirani^{5} proved that randomness amplification is impossible using classical resources. This is in a sense intuitive, in view of the absence of any intrinsic randomness in classical physics. In the quantum regime, randomness amplification has recently been studied by Colbeck and Renner^{6}. They showed how input bits with very high initial randomness can be mapped into arbitrarily pure random bits, and conjectured that randomness amplification should be possible for any initial randomness^{6}. Our results prove this conjecture, as we show that quantum nonlocality can be exploited to attain full randomness amplification.
Results
Previous work
Before presenting our results, it is worth commenting on previous works on randomness in connection with quantum nonlocality. In the study by Pironio et al.^{7}, it was shown how to bound the intrinsic randomness generated in a Bell test. These bounds can be used for device-independent randomness expansion, following a proposal by Colbeck^{8}, and allow a quadratic expansion of the amount of random bits (see refs 9, 10, 11, 12 for further works on device-independent randomness expansion). Note, however, that randomness expansion assumes from the very beginning the existence of an input seed of free random bits, and the main goal is to expand this into a longer sequence. The figure of merit is the ratio between the lengths of the final and initial strings of free random bits. Finally, other recent works have analysed how a lack of randomness in the measurement choices affects a Bell test^{13,14,15} and the randomness generated in it^{16}.
Definition of the scenario
From an information perspective, our goal is to construct a protocol for full randomness amplification based on quantum nonlocality. In randomness amplification, one aims at producing arbitrarily free random bits from many uses of an input source of imperfectly random bits.
A random bit b is said to be free if it is uncorrelated with any classical variable e generated outside the future light cone of b (of course, the bit b can be arbitrarily correlated with any event inside its future light cone). This requirement formalizes the intuition that the only systems that may share some correlation with b are the ones influenced by b. Note also that this definition of randomness is strictly stronger than the demand that b is uncorrelated with any classical variable generated in the past light cone of the process. This is crucial if the variables e and b are generated by measurements on a correlated quantum system. In this case, even if both systems interacted somewhere in the past light cone of b, the variable e is not produced until the measurement is performed, possibly outside both the past and future light cones. Furthermore, we say that a random bit is ε-free if any correlations with events outside its future light cone are bounded by ε, as explained in what follows.
The source 𝒮 produces a sequence of bits x_{1}, x_{2}, …, x_{j}, …, with x_{j}=0 or 1 for all j, see Fig. 1, which are ε-free. More precisely, each bit j contains some randomness, in the sense that the probability P(x_{j}|all other bits, e) that it takes a given value x_{j}, conditioned on the values of all the other bits produced by 𝒮, as well as on the variable e, is such that

ε ≤ P(x_{j}|all other bits, e) ≤ 1 − ε,   (1)

for all j, where 0 < ε ≤ 1/2. Given our previous definition of ε-free bits, the variable e represents events outside the future light cone of all the x_{j}'s. Free random bits correspond to ε = 1/2 and deterministic ones to ε = 0. More precisely, when ε = 0 the bound (1) is trivial and no randomness can be certified. We refer to 𝒮 as an ε-source, and to any bit satisfying (1) as an ε-free bit.
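To make condition (1) concrete, the following sketch (a toy adversary model of our own, illustrating the Santha–Vazirani subclass in which each bit is conditioned on its predecessors) simulates a source whose bits are strongly correlated with their history, yet each retains ε of randomness:

```python
import random

def sv_source(n, eps, adversary):
    """n bits from a Santha-Vazirani-type eps-source.

    'adversary' maps the history of bits to a probability of emitting 1;
    clamping that probability to [eps, 1-eps] enforces condition (1)
    bit by bit.
    """
    bits = []
    for _ in range(n):
        p = min(max(adversary(bits), eps), 1 - eps)
        bits.append(1 if random.random() < p else 0)
    return bits

# A hypothetical adversary that pushes each bit towards repeating the last.
def bias_up(hist):
    return 0.9 if hist and hist[-1] == 1 else 0.1

sample = sv_source(1000, eps=0.2, adversary=bias_up)
print(f"fraction of ones: {sum(sample) / len(sample):.2f}")
```

The bits are far from uniform, yet no bit is ever predictable with probability better than 1 − ε; this is exactly the kind of source the protocol must handle.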
The aim of randomness amplification is to generate, from arbitrarily many uses of 𝒮, a final source 𝒮_f of ε_f-free bits with ε_f arbitrarily close to 1/2. If this is possible, no cause e can be assigned to the bits produced by 𝒮_f, which are then fully unpredictable. Note that, in our case, we require the final bits to be fully uncorrelated with e. When studying randomness amplification, the first and most fundamental question is whether the process is possible at all. This is the question we consider and solve in this work. Thus, we are not concerned with efficiency issues, such as the rate of uses of 𝒮 required per final bit generated by 𝒮_f, and, without loss of generality, we restrict our analysis to the problem of generating a single final free random bit k. Our relevant figure of merit is just the quality, measured by ε_f, of the final bit. Of course, efficiency issues become relevant when considering applications of randomness amplification protocols to information tasks, but this is beyond the scope of this work.
The randomness amplification protocols we consider exploit quantum nonlocality. This idea was introduced in the study by Colbeck and Renner^{6}, where a protocol was presented in which the source 𝒮 is used to choose the measurement settings of two distant observers, Alice and Bob, in a Bell test^{17} involving two entangled quantum particles. The measurement outcome obtained by one of the observers, say Alice, in one of the experimental runs (also chosen with 𝒮) defines the output random bit. Colbeck and Renner showed how input bits with high randomness, 0.442 < ε ≤ 0.5, can be mapped into arbitrarily free random bits with ε_f → 1/2. In our case, the input ε-source is used to choose the measurement settings in a multipartite Bell test involving a number of observers that depends both on the input ε and the target ε_f. After verifying that the expected Bell violation is obtained, the measurement outcomes are combined to define the final bit k. For pedagogical reasons, we adopt a cryptographic perspective and assume the worst-case scenario in which all the devices we use may have been prepared by an adversary Eve equipped with arbitrary nonsignalling resources, possibly even supra-quantum ones. In the preparation, Eve may also have had access to 𝒮 and correlated the bits it produces with some physical system at her disposal, represented by a black box in Fig. 1. Without loss of generality, we can assume that Eve can reveal the value of e at any stage of the protocol by measuring this system. Full randomness amplification is then equivalent to proving that Eve's correlations with k can be made arbitrarily small.
An important comment is now in order, which applies to all further discussion as well as to the protocol presented below. For convenience, we represent 𝒮 (see Figs 1 and 2) as a single source generating all the inputs and delivering them among the separated boxes without violating the no-signalling principle. This, however, is not the scenario in practice. Operationally, each user generates his input from a local source in his lab. All these local sources can be arbitrarily correlated with each other, provided the bound on the correlations given by (1) is respected, and can thus be seen as a single ε-source 𝒮. With this understanding, we discuss a single effective source in the rest of the text.
Partial randomness from GHZ-type paradoxes
Bell tests for which quantum correlations achieve the maximal nonsignalling violation, also known as Greenberger–Horne–Zeilinger (GHZ)-type paradoxes^{18}, are necessary for full randomness amplification. This is due to the fact that, unless the maximal nonsignalling violation is attained, for sufficiently small ε, Eve may fake the observed correlations with classical deterministic resources. Nevertheless, GHZ-type paradoxes are not sufficient. In fact, given any function of the measurement outcomes, it is always possible to find nonsignalling correlations that (i) maximally violate the three-party GHZ paradox^{18} but (ii) assign a deterministic value to that function of the measurement outcomes. This observation can be checked for all unbiased functions mapping {0,1}^{3} to {0,1} (there are 70 of those) through a linear programme analogous to the one used in the proof of the Lemma below. As a simple example, consider the particular function defined by the outcome bit of the first user. This bit can be fixed by using a tripartite no-signalling probability distribution consisting of a deterministic distribution for the first party and a Popescu–Rohrlich box^{19} for the second and third parties.
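The construction just mentioned can be verified directly. In the sketch below, the concrete box (first output deterministically 0, parties 2 and 3 sharing a locally relabelled Popescu–Rohrlich box) and the sign convention for the GHZ-paradox conditions are our illustrative choices, not taken verbatim from the paper:

```python
from itertools import product

# Party 1 outputs 0 deterministically; parties 2 and 3 share a PR-type box
# with a2 XOR a3 = x2 OR x3 (a local relabelling of the PR box).
def P(a, x):
    if a[0] != 0:
        return 0.0
    return 0.5 if (a[1] ^ a[2]) == (x[1] | x[2]) else 0.0

# GHZ-paradox conditions (one common sign convention): output parity 1 on
# inputs with two 1s, parity 0 on (0,0,0), each satisfied with certainty.
for x, parity in [((0, 1, 1), 1), ((1, 0, 1), 1), ((1, 1, 0), 1), ((0, 0, 0), 0)]:
    assert all(sum(a) % 2 == parity
               for a in product((0, 1), repeat=3) if P(a, x) > 0)

# No-signalling: flipping one party's input never changes the marginal
# distribution of the other two parties' outputs.
for i in range(3):
    for x in product((0, 1), repeat=3):
        flipped = x[:i] + (1 - x[i],) + x[i + 1:]
        for rest in product((0, 1), repeat=2):
            def marginal(inp):
                return sum(P(rest[:i] + (ai,) + rest[i:], inp) for ai in (0, 1))
            assert abs(marginal(x) - marginal(flipped)) < 1e-12

print("a1 is deterministic, yet the three-party GHZ paradox is maximally violated")
```

All assertions pass: the paradox is maximally violated by a no-signalling box whose first outcome carries no randomness at all.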
For five parties, however, this is no longer the case. Consider any correlations attaining the maximal violation of the five-party Mermin inequality^{20}. In each run of this Bell test, measurements (inputs) x=(x_{1},…,x_{5}) on five distant black boxes generate five outcomes (outputs) a=(a_{1},…,a_{5}), distributed according to a nonsignalling conditional probability distribution P(a|x), see Supplementary Note 1. Both inputs and outputs are bits, x_{i}, a_{i} ∈ {0,1} with i=1,…,5.
The inequality can be written as

Σ_{x∈𝒳_0} P(a_{1}⊕a_{2}⊕a_{3}⊕a_{4}⊕a_{5}=1|x) + Σ_{x∈𝒳_1} P(a_{1}⊕a_{2}⊕a_{3}⊕a_{4}⊕a_{5}=0|x) ≥ 6,   (2)

where

𝒳_0 = {x : x_{1}+x_{2}+x_{3}+x_{4}+x_{5} ∈ {1,5}}

and

𝒳_1 = {x : x_{1}+x_{2}+x_{3}+x_{4}+x_{5} = 3}.

That is, only half of all possible combinations of inputs, namely those in 𝒳 = 𝒳_0 ∪ 𝒳_1, occur in the Bell inequality. This inequality may be thought of as a nonlocal game in which the parties are required to minimize the parity of their outputs when the sum of their inputs is 1 or 5, and to minimize the inverse parity of their outputs when their inputs sum to 3. The minimum achievable with classical strategies turns out to be 6.
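The classical bound can be checked by brute force over the 4^5 local deterministic strategies (each party fixes one output per input); the sketch below uses the parity conventions stated above:

```python
from itertools import product

# Inputs x in {0,1}^5 entering the inequality:
# X0 = {x : sum(x) in {1,5}}, X1 = {x : sum(x) == 3}.
X0 = [x for x in product((0, 1), repeat=5) if sum(x) in (1, 5)]
X1 = [x for x in product((0, 1), repeat=5) if sum(x) == 3]

def lhs(strategy):
    """Left-hand side of inequality (2) for a local deterministic strategy.

    strategy[i] = (output on input 0, output on input 1) for party i.
    It counts the inputs on which the 'wrong' output parity occurs.
    """
    total = 0
    for x in X0:   # wrong event: odd output parity
        total += sum(strategy[i][x[i]] for i in range(5)) % 2 == 1
    for x in X1:   # wrong event: even output parity
        total += sum(strategy[i][x[i]] for i in range(5)) % 2 == 0
    return total

# Minimize over all 4^5 = 1024 local deterministic strategies.
best = min(lhs(s) for s in product(product((0, 1), repeat=2), repeat=5))
print(best)  # 6
```

The minimum over deterministic strategies also bounds all local randomized strategies, since those are mixtures of deterministic ones.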
The maximal nonsignalling violation of the inequality, which coincides with its algebraic maximum, corresponds to the situation in which the left-hand side of (2) is zero. The key property of inequality (2) is that its maximal violation can be attained by quantum correlations and that, at this point, one can construct a function of the outcomes that is not completely determined. Take the bit corresponding to the majority-vote function of the outcomes of any subset of three out of the five observers, say the first three. This function is equal to zero if at least two of the three bits are equal to zero, and equal to one otherwise. We show that Eve's predictability of this bit is at most 3/4. We state this result in the following Lemma:
Lemma: Let P(a|x) be a five-party nonsignalling conditional probability distribution in which inputs x=(x_{1},…,x_{5}) and outputs a=(a_{1},…,a_{5}) are bits. Consider the bit maj(a) ∈ {0,1} defined by the majority-vote function of any subset consisting of three of the five measurement outcomes, say the first three, a_{1}, a_{2} and a_{3}. Then, all nonsignalling correlations attaining the maximal violation of the five-party Mermin inequality are such that the probability that maj(a) takes a given value, say 0, is bounded by

1/4 ≤ P(maj(a)=0|x) ≤ 3/4 for all x ∈ 𝒳.   (6)
Proof: This result was obtained by solving a linear programme; the proof is therefore numerical but exact. Formally, let P(a|x) be a five-partite nonsignalling probability distribution. For x = x_{0} ∈ 𝒳, we performed the maximization

P_max = max_{P} P(maj(a)=0|x_{0}), subject to P being nonsignalling and attaining the maximal violation of inequality (2),   (7)

which yields the value P_max = 3/4. As the same result holds for P(maj(a)=1|x_{0}), we get the bound 1/4 ≤ P(maj(a)=0|x_{0}) ≤ 3/4.
As a further remark, note that a lower bound on P_max can easily be obtained by noticing that one can construct conditional probability distributions P(a|x) that maximally violate the five-partite Mermin inequality (2) and for which one of the output bits (say a_{1}) is deterministically fixed to either 0 or 1. If the other two output bits (a_{2}, a_{3}) are completely random, the majority-vote of the three, maj(a_{1}, a_{2}, a_{3}), can be guessed with probability 3/4. Our numerical results show that this strategy is in fact optimal.
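The linear programme behind the Lemma can be sketched as follows. This is an illustrative reimplementation using scipy, not the authors' original code; the variables are the 32×32 values P(a|x), constrained by normalization, no-signalling and maximal violation of (2):

```python
import numpy as np
from itertools import product
from scipy.optimize import linprog

bits5 = list(product((0, 1), repeat=5))
index = {(x, a): i for i, (x, a) in enumerate(product(bits5, bits5))}
n = len(index)                       # 32*32 variables P(a|x)
rows, rhs = [], []

def add_eq(pairs, value):
    r = np.zeros(n)
    for x, a, coef in pairs:
        r[index[(x, a)]] += coef
    rows.append(r)
    rhs.append(value)

# Normalization: sum_a P(a|x) = 1 for every input x.
for x in bits5:
    add_eq([(x, a, 1.0) for a in bits5], 1.0)

# No-signalling: the marginal of the other four parties does not depend
# on party i's input.
for i in range(5):
    for x in bits5:
        if x[i] == 1:
            continue
        x1 = x[:i] + (1,) + x[i + 1:]
        for rest in product((0, 1), repeat=4):
            pairs = []
            for ai in (0, 1):
                a = rest[:i] + (ai,) + rest[i:]
                pairs += [(x, a, 1.0), (x1, a, -1.0)]
            add_eq(pairs, 0.0)

# Maximal violation of (2): zero probability for every 'wrong' parity.
for x in bits5:
    if sum(x) % 2 == 0:
        continue
    bad = 1 if sum(x) in (1, 5) else 0
    add_eq([(x, a, 1.0) for a in bits5 if sum(a) % 2 == bad], 0.0)

# Objective: maximise P(maj(a1,a2,a3)=0 | x0) for a fixed x0 in X.
x0 = (1, 0, 0, 0, 0)
c = np.zeros(n)
for a in bits5:
    if a[0] + a[1] + a[2] <= 1:      # majority of the first three bits is 0
        c[index[(x0, a)]] = -1.0     # minus sign because linprog minimises

res = linprog(c, A_eq=np.array(rows), b_eq=np.array(rhs), bounds=(0, 1))
print(round(-res.fun, 6))            # P_max; the Lemma reports 3/4
```

Repeating the optimization for every x_{0} ∈ 𝒳 and for maj(a)=1 reproduces the full bound (6).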
The previous Lemma strongly suggests that, given an ε-source with any 0 < ε ≤ 1/2 and quantum five-party nonlocal resources, it should be possible to design a protocol to obtain an ε_i-source with ε_i = 1/4. We do not explore this possibility here but rather use the partial unpredictability in the five-party Mermin Bell test as the building block of our protocol for full randomness amplification. To complete it, we must equip it with two essential components: (i) an estimation procedure that verifies that the untrusted devices do yield the required Bell violation; and (ii) a distillation procedure that, from sufficiently many ε_i-bits generated in the five-party Bell experiment, distils a single final ε_f-source with ε_f → 1/2. Towards these ends, we consider a more complex Bell test involving N groups of five observers (quintuplets) each.
A protocol for full randomness amplification
Our protocol for randomness amplification uses as resources the ε-source 𝒮 and 5N quantum systems. Each quantum system is abstractly modelled by a black box with binary input x and output a. The protocol classically processes the bits generated by 𝒮 and by the quantum boxes. When not aborted, it produces a bit k. The protocol consists of the five steps described below (see also Fig. 2).
In step 1, 𝒮 is used to generate N quintuple-bits x_{1},…,x_{N}, which constitute the inputs for the 5N boxes and are distributed among them without violating no-signalling. The boxes then provide N output quintuple-bits a_{1},…,a_{N}.
In step 2, the quintuplets with x ∉ 𝒳 are discarded. The protocol is aborted if the number of remaining quintuplets is smaller than N/3. (The constant factor 1/3 is arbitrary; in fact, it is enough to demand that the number of remaining quintuplets is larger than N/c, with c > 1. See the Supplementary Note 2.)
In step 3, the quintuplets left after step 2 are organized in N_b blocks, each consisting of N_d quintuplets. The number N_b of blocks is chosen to be a power of 2. For the sake of simplicity, we relabel the indices of the remaining quintuplets, so that the inputs read x_{1},…,x_{N_b N_d} and the outputs a_{1},…,a_{N_b N_d}. The input and output of the jth block are defined as y_{j} = (x_{(j−1)N_d+1},…,x_{(j−1)N_d+N_d}) and b_{j} = (a_{(j−1)N_d+1},…,a_{(j−1)N_d+N_d}), respectively, with j ∈ {1,…,N_b}. The random variable l ∈ {1,…,N_b} is generated by using log_2 N_b further bits from 𝒮. The value of l specifies the block (b_{l}, y_{l}) chosen to generate k, that is, the distilling block. The other N_b − 1 blocks are used to check the Bell violation.
In step 4, the function r is computed, where

r(b, y) = 1 if every quintuplet (x_{i}, a_{i}) in the block satisfies the output-parity condition associated with the maximal violation of (2), namely a_{i}^{1}⊕…⊕a_{i}^{5} = 0 for x_{i} ∈ 𝒳_0 and 1 for x_{i} ∈ 𝒳_1, and r(b, y) = 0 otherwise.   (8)

This function tells whether block (b, y) features the right correlations (r = 1) or the wrong ones (r = 0), in the sense of being compatible with the maximal violation of inequality (2). It is computed for all blocks but the distilling one. The protocol is aborted unless all these blocks give the right correlations,

g = Π_{j≠l} r(b_{j}, y_{j}) = 1.   (9)
Note that the abort/no-abort decision is independent of whether the distilling block l is right or wrong.
In step 5, if the protocol is not aborted, k is assigned the bit generated from b_{l} = (a_{1},…,a_{N_d}) as

k = f(maj(a_{1}),…,maj(a_{N_d})).   (10)

Here f : {0,1}^{N_d} → {0,1} is a function whose existence is proven in the Supplementary Note 2, whereas maj(a_{i}) ∈ {0,1} is the majority-vote among the first three bits of the quintuple string a_{i}.
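Leaving aside the distillation function f, whose explicit construction appears only in the Supplementary Note 2, the classical processing of steps 1 to 5 can be sketched as follows. The ideal noise-free boxes, the i.i.d. biased source, and the use of the parity of the majority votes as a stand-in for f are all simulation assumptions of ours:

```python
import random

# Ideal quantum boxes: for x in the Mermin set (odd input sum), the GHZ
# strategy returns outputs uniform over the 16 strings of 'right' parity.
def ideal_quintuplet(x):
    target = 0 if sum(x) in (1, 5) else 1   # right output parity for x
    while True:
        a = tuple(random.randint(0, 1) for _ in range(5))
        if sum(a) % 2 == target:
            return a

def eps_bit(eps):
    """One bit of an eps-source; here an i.i.d. bias at the allowed limit."""
    return 1 if random.random() < 1 - eps else 0

def maj(a):
    """Majority vote of the first three bits of a quintuple."""
    return 1 if a[0] + a[1] + a[2] >= 2 else 0

def protocol(N=64, eps=0.01, N_b=4):
    xs = [tuple(eps_bit(eps) for _ in range(5)) for _ in range(N)]  # step 1
    kept = [x for x in xs if sum(x) % 2 == 1]                       # step 2
    if len(kept) < N // 3:
        return None                                                 # abort
    N_d = len(kept) // N_b
    kept = kept[:N_b * N_d]                                         # step 3
    outs = [ideal_quintuplet(x) for x in kept]  # boxes queried on kept x
    l = (2 * eps_bit(eps) + eps_bit(eps)) % N_b  # 2 source bits (N_b = 4)
    for j in range(N_b):                                            # step 4
        if j == l:
            continue
        for x, a in zip(kept[j*N_d:(j+1)*N_d], outs[j*N_d:(j+1)*N_d]):
            if sum(a) % 2 != (0 if sum(x) in (1, 5) else 1):
                return None                  # abort: wrong correlations
    block = outs[l*N_d:(l+1)*N_d]                                   # step 5
    # Placeholder for the function f of Supplementary Note 2:
    # here, simply the parity of the majority votes of the block.
    return sum(maj(a) for a in block) % 2

print(protocol() in (0, 1))  # True unless the (very unlikely) abort occurs
```

With ideal boxes, step 4 never fails and the protocol aborts only if too few inputs land in 𝒳, which for small ε is exponentially unlikely.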
At the end of the protocol, the bit k is potentially correlated with the settings y_{l} of the distilling block, the bit g defined in (9), and the information about the inputs and outputs of the remaining blocks. In addition, an eavesdropper Eve might have access to a physical system correlated with k, which she can measure at any stage of the protocol. This system need not be classical or quantum; the only assumption about it is that measuring it does not produce instantaneous signalling anywhere else. The measurements that Eve can perform on her system are labelled by z, and the corresponding outcomes by e. In summary, after performing the protocol, all the relevant information consists of k, y_{l}, g, the remaining blocks and e, with statistics described by an unknown conditional probability distribution given z. When the protocol is aborted (g=0) there is no value for k. Therefore, to have a well-defined distribution in all cases, we set k=0 when g=0, that is, the conditional distribution of k given g=0 is δ_{k,0}, where δ is the Kronecker delta.
To assess the quality of our protocol for full randomness amplification, we compare it with an ideal protocol having the same marginal for all the variables other than k, including the physical system described by e, z. That is, the global distribution of the ideal protocol is

P_ideal = P(all variables other than k, e|z) × P_ideal(k|g),   (11)

where the first factor is the marginal of the distribution generated by the real protocol and P_ideal(k|g=1) = 1/2. Note that, consistently, in the ideal distribution we also set k = 0 when g = 0.
Our goal is to show that the statistics of the real protocol, P, are indistinguishable from the ideal statistics P_ideal. We consider the optimal strategy to discriminate between P and P_ideal, which obviously involves having access to all possible information, including the physical system e, z. As shown in the study by Masanes^{21}, the optimal probability of correctly guessing between these two distributions is

P(guess) = 1/2 + (1/4) max_{z} Σ |P(…, e|z) − P_ideal(…, e|z)|,   (12)

where the sum runs over all the protocol variables and e. Note that the second term can be understood as (one half of) the variational distance between P and P_ideal, generalized to the case when the distributions are conditioned on an input z. The following theorem is proven in the Supplementary Note 2.
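For intuition, the relation (12) between the guessing probability and the variational distance can be illustrated with a toy pair of distributions (the numbers below are arbitrary, and the maximization over z is dropped):

```python
import numpy as np

# Optimal guessing probability between two known distributions equals
# 1/2 + (1/2) * total variation distance, as in (12).
P = np.array([0.5, 0.3, 0.2])
P_ideal = np.array([0.4, 0.4, 0.2])

tvd = 0.5 * np.abs(P - P_ideal).sum()
p_guess = 0.5 + 0.5 * tvd
print(p_guess)  # 0.55
```

When the two distributions coincide, the distance vanishes and guessing is no better than a coin toss, which is exactly the regime the protocol aims for.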
Theorem: Let P be the probability distribution of the variables generated during the protocol and the adversary's physical system e, z, and let P_ideal be the corresponding ideal distribution (11). Then the optimal probability of correctly guessing between the distributions P and P_ideal satisfies the bound (13) given in the Supplementary Note 2, in which the real numbers α, β fulfil 0 < α < 1 < β.
Now, the right-hand side of (13) can be made arbitrarily close to 1/2, for instance by a suitable choice of N_b and by increasing N_d subject to the condition N_d N_b > N/3. (Note that log_2(1−ε) < 0.) In the limit of large N_d, the probability P(guess) tends to 1/2, which implies that the optimal strategy is as good as tossing a coin. In this case, the performance of the protocol is indistinguishable from that of an ideal one. This is known as universally composable security and accounts for the strongest notion of cryptographic security (see refs 21, 22).
Let us discuss the implications and limitations of our result. Note first that step 2 in the protocol involves a possible abortion (a similar step can also be found in the study by Colbeck and Renner^{6}). Hence, only those ε-sources with a non-negligible probability of passing step 2 can be amplified by our protocol. The abortion step can be relaxed by choosing a larger value of the constant c used for rejection; yet, in principle, it could exclude some of the ε-sources defined in (1). Notice, however, that demanding that step 2 is passed with non-negligible probability is just a restriction on the statistics P(x_{1},…,x_{n}) seen by the honest parties and does not imply any restriction on the value of ε in (1), which can be arbitrarily small. Also, we identify at least two reasons why sources that fulfil step 2 with high probability are the most natural in the context of randomness amplification. First, from a cryptographic perspective, if the observed sequence x_{1},…,x_{n} does not fulfil step 2, the honest parties will abort any protocol, regardless of whether a condition similar to step 2 is included. The reason is that such a sequence would be extremely atypical for a fair source, P(x_{1},…,x_{n}) = 1/2^{n}, and the honest players would conclude that the source has been tampered with by a malicious party or is seriously damaged. Moreover, as discussed in the Supplementary Note 3, imposing that the source has unbiased statistics from the honest parties' point of view does not imply any restriction on Eve's predictability. Second, from a more fundamental viewpoint, the question of whether truly random events exist in nature is interesting precisely because the observable statistics of many physical processes look random, that is, they satisfy P(x_{1},…,x_{n}) = 1/2^{n}. If every process in nature were such that its observable statistics did not fulfil step 2, the problem of whether truly random processes exist would hardly have been considered relevant.
Finally, note that possible sources outside this subclass do not compromise the security of the protocol, only its probability of being successfully implemented.
Under the conditions demanded in step 2, our protocol actually applies to sources more general than those in (1). These are defined by the restriction

G(n,ε) ≤ P(x_{1},…,x_{n}|e) ≤ F(n,ε),   (14)

for any pair of functions G(n,ε), F(n,ε) defining the lower and upper bounds on Eve's control of the bias of the bits and fulfilling the conditions G(n,ε) > 0 and lim_{n→∞} F(n,ε) = 0. (The ε-sources of (1) are recovered with G(n,ε) = ε^{n} and F(n,ε) = (1−ε)^{n}.) In fact, this condition is sufficient for our amplification protocol to succeed, see also Supplementary Notes 2 and 3.
To complete the argument, we must mention that, according to quantum mechanics, given a source that passes step 2, we can in principle implement the protocol with success probability equal to one, P(g=1) = 1. It can be immediately verified that the qubit measurements X or Y on the five-qubit state (|00000⟩ + i|11111⟩)/√2, with |0⟩ and |1⟩ the eigenstates of Z, yield correlations that maximally violate the five-partite Mermin inequality in question. (In a realistic scenario the success probability P(g=1) might be lower than one, but our theorem guarantees that the protocol is still secure.)
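This maximal violation can be checked numerically. The sketch below assumes the state and measurement convention just described (X for input 0, Y for input 1) and evaluates the left-hand side of (2), which must vanish:

```python
import numpy as np
from itertools import product

# Pauli operators in the computational (Z) basis.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def tensor(ops):
    out = np.array([[1]], dtype=complex)
    for op in ops:
        out = np.kron(out, op)
    return out

# The state (|00000> + i|11111>)/sqrt(2).
psi = np.zeros(32, dtype=complex)
psi[0] = 1 / np.sqrt(2)
psi[-1] = 1j / np.sqrt(2)

# LHS of inequality (2): total probability of the 'wrong' output parity,
# measuring X for input 0 and Y for input 1. Maximal violation means 0.
lhs = 0.0
for x in product((0, 1), repeat=5):
    if sum(x) % 2 == 0:
        continue                      # inputs outside X do not contribute
    corr = np.real(psi.conj() @ tensor([Y if b else X for b in x]) @ psi)
    right = 1.0 if sum(x) in (1, 5) else -1.0  # right sign of the correlator
    lhs += (1 - right * corr) / 2     # probability of the wrong parity
print(round(abs(lhs), 10))            # 0.0
```

Each of the 16 correlators equals ±1 with the sign demanded by (2), so every term of the left-hand side vanishes identically.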
We can now state the main result of our work. Full randomness amplification: a perfect free random bit can be obtained from sources of arbitrarily weak randomness using nonlocal quantum correlations.
Discussion
We would like to conclude by explaining the main intuitions behind the proof of the previous theorem. As mentioned, the protocol builds on the five-party Mermin inequality because it is the simplest GHZ paradox allowing some randomness certification. The estimation part, given by step 4, is rather standard and inspired by estimation techniques introduced in the study by Barrett et al.^{23}, which were also used in the study by Colbeck and Renner^{6} in the context of randomness amplification. The most subtle part is the distillation of the final bit in step 5. Naively, and leaving aside estimation issues, one could argue that it is nothing but a classical processing, by means of the function f, of the imperfect random bits obtained via the N_d quintuplets. This seems to contradict the result by Santha and Vazirani^{5} proving that it is impossible to extract a perfect free random bit from imperfect ones by classical means. This intuition is, however, misleading. Indeed, the Bell certification allows applying techniques similar to those introduced in the study by Masanes^{21} in the context of privacy amplification against nonsignalling eavesdroppers. There, it was shown how to amplify the privacy, that is, the unpredictability, of one of the measurement outcomes of bipartite correlations violating a Bell inequality. The key point is that the amplification, or distillation, is attained in a deterministic manner. That is, contrary to standard approaches, the privacy amplification process described in the study by Masanes^{21} does not consume any randomness. Clearly, these deterministic techniques are extremely convenient for our randomness amplification scenario. In fact, the distillation part of our protocol can be seen as the translation of the privacy amplification techniques of Masanes^{21} to our more complex scenario, which involves five-party nonlocal correlations and a function of three of the measurement outcomes.
To summarize, we have presented a protocol that, using quantum nonlocal resources, attains full randomness amplification, a task known to be impossible classically. As our goal was to prove full randomness amplification, our analysis focuses on the noise-free case. In fact, the noisy case only makes sense if one does not aim at perfect random bits and instead bounds the amount of randomness in the final bit. It should then be possible to adapt our protocol to obtain a bound on the amount of noise it tolerates. Other open questions that our results pose include extending randomness amplification to other randomness sources, studying randomness amplification against quantum eavesdroppers, and searching for protocols in the bipartite scenario.
From a more fundamental perspective, our results imply that there exist experiments whose outcomes are fully unpredictable. The only two assumptions for this conclusion are the existence of events with an arbitrarily small but non-zero amount of randomness that pass step 2 of our protocol, and the validity of the no-signalling principle. Dropping the first assumption would lead to superdeterminism, or to accepting that the only source of randomness in nature is one that does not pass step 2 of our protocol and, in particular, does not look unbiased. Dropping the second assumption would imply abandoning a local causal structure for events in space-time, which is one of the most fundamental notions of special relativity.
Additional information
How to cite this article: Gallego, R. et al. Full randomness from arbitrarily deterministic events. Nat. Commun. 4:2654 doi: 10.1038/ncomms3654 (2013).
References
Laplace, P. S. A philosophical essay on probabilities (1840).
Einstein, A., Podolsky, B. & Rosen, N. Can quantum-mechanical description of physical reality be considered complete? Phys. Rev. 47, 777–780 (1935).
Bell, J. On the Einstein Podolsky Rosen Paradox. Physics 1, 195–200 (1964).
Bohm, D. A suggested interpretation of the quantum theory in terms of ‘hidden’ variables. I. Phys. Rev. 85, 166–179 (1952).
Santha, M. & Vazirani, U. V. Generating quasi-random sequences from semi-random sources. J. Comput. Syst. Sci. 33, 75–87 (1986).
Colbeck, R. & Renner, R. Free randomness can be amplified. Nat. Phys. 8, 450–454 (2012).
Pironio, S. et al. Random numbers certified by Bell's theorem. Nature 464, 1021–1024 (2010).
Colbeck, R. Quantum and Relativistic Protocols for Secure Multi-Party Computation. PhD thesis, University of Cambridge (2007).
Acín, A., Massar, S. & Pironio, S. Randomness versus nonlocality and entanglement. Phys. Rev. Lett. 108, 100402 (2012).
Pironio, S. & Massar, S. Security of practical private randomness generation. Phys. Rev. A 87, 012336 (2013).
Fehr, S., Gelles, R. & Schaffner, C. Security and composability of randomness expansion from Bell inequalities. Preprint at http://arxiv.org/abs/1111.6052 (2011).
Vazirani, U. & Vidick, T. Certifiable Quantum Dice: or, true random number generation secure against quantum adversaries. Proceedings of the ACM Symposium on the Theory of Computing (2012).
Kofler, J., Paterek, T. & Brukner, C. Experimenter's freedom in Bell's theorem and quantum cryptography. Phys. Rev. A 73, 022104 (2006).
Barrett, J. & Gisin, N. How much measurement independence is needed in order to demonstrate nonlocality? Phys. Rev. Lett. 106, 100406 (2011).
Hall, M. J. W. Local deterministic model of singlet state correlations based on relaxing measurement independence. Phys. Rev. Lett. 105, 250404 (2010).
Koh, D. E. et al. The effects of reduced ‘free will’ on Bellbased randomness expansion. Phys. Rev. Lett. 109, 160404 (2012).
Braunstein, S. L. & Caves, C. M. Wringing out better Bell inequalities. Ann. Phys. 202, 22 (1990).
Greenberger, D. M., Horne, M. A. & Zeilinger, A. in Bell's Theorem, Quantum Theory, and Conceptions of the Universe (Kluwer, 1989).
Popescu, S. & Rohrlich, D. Quantum nonlocality as an axiom. Found. Phys. 24, 379–385 (1994).
Mermin, N. D. Simple unified form for the major no-hidden-variables theorems. Phys. Rev. Lett. 65, 3373–3376 (1990).
Masanes, L. Universally composable privacy amplification from causality constraints. Phys. Rev. Lett. 102, 140501 (2009).
Canetti, R. Universally composable security: a new paradigm for cryptographic protocols. Proc. 42nd Ann. IEEE Symp. Found. Comp. Sci. (FOCS) 136–145 (2001).
Barrett, J., Hardy, L. & Kent, A. No signaling and quantum key distribution. Phys. Rev. Lett. 95, 010503 (2005).
Acknowledgements
We acknowledge support from the ERC Starting Grant PERCENT, the EU Projects QEssence and QCS, the Spanish FPI grant and projects FIS201014830, ExploraIntrinqra and CHISTERA DIQIP, an FI Grant of the Generalitat de Catalunya, Fundació Catalunya  La Pedrera and Fundació Privada Cellex, Barcelona. L.A. acknowledges support from the Spanish MICIIN through a Juan de la Cierva grant and the EU under Marie Curie IEF No. 299141.
Author information
Contributions
All authors contributed extensively to the work presented in this paper.
Ethics declarations
Competing interests
The authors declare no competing financial interests.
Supplementary information
Supplementary Information
Supplementary Notes 1–3 (PDF 120 kb)