Researchers uncover path to perfect photon entanglement
When harnessed, the remarkable phenomena of the quantum world stand to radically shift the landscape of modern technology. Among these phenomena is quantum entanglement: two particles can become so strongly linked that observing the state of one lets us predict the state of the other – even across continents.
Studying entanglement could advance developing technologies in quantum sensing, communication and cryptography. The ability to generate entangled photons in the ideal state needed for their application – not only sometimes, but 99.9% of the time – is important for the practical implementation of these technologies.
Current technology produces entangled photons by exciting quantum dots – small semiconductor particles sometimes referred to as ‘artificial atoms’ – but understanding how this cascade process can produce so-called ‘perfect’ entanglement has been a challenge for more than a decade.
Researchers have previously tried to increase the strength, or fidelity, of entanglement by changing the source material or by employing new methods of exciting the quantum dot with a laser. Unfortunately, these changes have not produced entanglement fidelity even approaching 90%. It was assumed that the quantum dot’s nuclear spins – the intrinsic angular momenta of the atomic nuclei inside the dot – were causing the photons to lose their entangled state in a process called dephasing, and that perfect fidelity was therefore unattainable.
A study led by Institute for Quantum Computing (IQC) faculty member Michael Reimer suggests otherwise: nuclear spins are not the cause of dephasing. In fact, under resonant excitation – one of the cleanest methods of exciting a quantum dot, designed to suppress outside interference and multi-photon emission – no dephasing occurs at all. Past studies failed to measure higher entanglement fidelities simply because their measuring devices were too slow to record them.
Current photodetectors can register entangled photons within billionths of a second, but there is still an uncertainty, called timing jitter, in exactly when each photon is recorded. This imprecision – together with false signals emitted by the detector, called dark counts – is responsible for the perceived lower fidelities.
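As a rough back-of-the-envelope sketch of the dark-count effect (an illustration, not the paper's analysis): false coincidences are random in polarization, and a completely random two-photon polarization state has fidelity 0.25 to any Bell state, so dark counts dilute the reported fidelity even for a perfect source. The rates below are made-up values.

```python
def measured_fidelity(true_fidelity, signal_rate, dark_rate):
    """Fidelity a detector reports when dark-count coincidences dilute
    the true entangled signal. Random coincidences contribute fidelity
    0.25 (the maximally mixed two-photon polarization state)."""
    total = signal_rate + dark_rate
    return (signal_rate * true_fidelity + dark_rate * 0.25) / total

# A perfect source (F = 1.0) seen through a detector whose dark counts
# make up about 5% of recorded coincidences (hypothetical numbers):
print(measured_fidelity(1.0, signal_rate=1000.0, dark_rate=50.0))  # ~0.964
```

Even in this simple mixing picture, a flawless source looks measurably imperfect once the detector contributes noise of its own.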
“We’ve found in literature that people have already generated perfect entanglement,” said Reimer, also an assistant professor in the Department of Electrical and Computer Engineering. “They just don't know it yet.”
To demonstrate this, Reimer’s group first conducted an entanglement experiment using resonant excitation, and then constructed a computer model to simulate the experiment and others like it. This model is significant because it assumes that no dephasing is occurring. It instead considers only the biexciton-exciton cascade: a process in which the excited quantum dot traps two electrons and two holes, forming a biexciton. One electron-hole pair recombines and emits a photon, leaving behind an exciton; when that remaining pair recombines, it emits a second photon, and the two photons emerge entangled.
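Concretely, the ideal two-photon output of such a cascade is a maximally entangled Bell state of the photons' polarizations, and "fidelity" measures how close a real state comes to it. A minimal numpy sketch of that ideal state (an illustration of the concept, not the authors' simulation code):

```python
import numpy as np

# Polarization basis states for a single photon
H = np.array([1.0, 0.0])
V = np.array([0.0, 1.0])

# Ideal cascade output: the Bell state |Phi+> = (|HH> + |VV>) / sqrt(2)
phi_plus = (np.kron(H, H) + np.kron(V, V)) / np.sqrt(2)
rho_ideal = np.outer(phi_plus, phi_plus.conj())

def fidelity_to_phi_plus(rho):
    """Fidelity F = <Phi+| rho |Phi+> of a two-photon density matrix."""
    return float(np.real(phi_plus.conj() @ rho @ phi_plus))

print(fidelity_to_phi_plus(rho_ideal))  # ~1.0 for a perfect source
```

A completely random (maximally mixed) polarization state scores 0.25 on this measure, while "perfect entanglement" corresponds to a fidelity of 1.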
Remarkably, the simulation’s results matched the physical experiment, even though the model did not account for dephasing. This indicates that nuclear spin dephasing has no significant effect on entanglement fidelity and is not responsible for entanglement breakdown.
In light of this revelation, achieving perfect entanglement is now only a technical problem. Reimer’s model projects that a detector with an ultra-low timing jitter of less than 30 picoseconds, used alongside a resonant excitation scheme, could achieve fidelities approaching 99.9%. Detectors released in the past year, which use superconducting nanowires rather than the single-photon avalanche photodiodes currently in use, have this capability.
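A toy model helps show why jitter is the bottleneck: during the cascade, the two-photon state's relative phase precesses at a rate set by the intermediate exciton's energy splitting, and Gaussian timing jitter smears that phase, capping the measurable Bell-state fidelity. The 1 µeV splitting and the Gaussian smearing formula below are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

HBAR_EVS = 6.582e-16      # reduced Planck constant in eV*s
S_EV = 1e-6               # assumed exciton fine-structure splitting: 1 ueV
                          # (illustrative order of magnitude, not from the paper)
omega = S_EV / HBAR_EVS   # phase-precession rate of the two-photon state

def smeared_fidelity(jitter_s):
    """Toy model: Gaussian timing jitter of width jitter_s averages the
    oscillating phase, shrinking the coherence by exp(-(omega*sigma)^2/2)
    and the Bell-state fidelity to (1 + coherence)/2."""
    coherence = np.exp(-(omega * jitter_s) ** 2 / 2)
    return (1 + coherence) / 2

# Fidelity rises toward 1 as the detector's jitter shrinks:
for jitter_ps in (500, 100, 30):
    print(f"{jitter_ps:4d} ps jitter -> fidelity {smeared_fidelity(jitter_ps * 1e-12):.4f}")
```

Under these assumed numbers, sub-30-picosecond jitter is roughly where the smearing penalty becomes negligible, which is consistent in spirit with the article's projection for superconducting-nanowire detectors.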
“It’s now actually possible to get to the level of fidelity you need for applications, which is really exciting,” said Reimer. In the future, his team plans to use electric fields to radically increase single-photon generation rates, making their entanglement more “applicable, compact, and good for use in different types of technology.”
Dephasing Free Photon Entanglement with a Quantum Dot
A. Fognini, A. Ahmadi, M. Zeeshan, J. T. Fokkens, S. J. Gibson, N. Sherlekar, S. J. Daley, D. Dalacu, P. J. Poole, K. D. Jöns, V. Zwiller, M. E. Reimer
ACS Photonics, May 30, 2019
Assistant Professor, Institute for Quantum Computing (IQC), University of Waterloo