Does this have anything resembling details? The press release is here:
https://www.nec.com/en/press/202411/global_20241118_01.html
And it has goodies like:
> Token: A digital certificate indicating certain rights and values, such as digital assets, user information, and access rights.
That is not much detail.
> Quantum key distribution (QKD) systems use quantum mechanics to share random secret keys between two communicating parties in order to guarantee secure communication, and then encrypt and decrypt information based on those keys. (Patented (as of November 18, 2024))
This sounds like rather old technology. What exactly is novel here?
In any case, the article’s drawing makes it look like the customer’s “token” is some classical information. This cannot work.
Is there any projected practical use for QKD apart from being a jobs program for researchers?
(Which is fine with me: research is research, and it doesn't necessarily need a near-term practical outcome. But why is it "sold" to the public as though some useful capability is coming just around the corner?)
Who would use dedicated fiber to get secrets between point A and point B? Am I just insufficiently imaginative?
Whenever I read these headlines I am reminded of how much biological research needs to have a "could one day cure cancer" to give funders and journalists a hook.
Large companies and governments go to some lengths to protect their internal communications between their sites.
Cloud providers also have some dedicated fiber between their data centers.
Right but what are they going to do with the keys being exchanged? Load them into networked traditional computers?
If the computers are secure they can presumably do authenticated key agreement perfectly well and if they are not then I don't see how the QKD helps.
Security is nuanced and thinking in binaries is often a mistake - but I don't see how QKD meaningfully changes anyone's threat model in any plausible deployment scenario.
QKD will generate a session key, just like Diffie-Hellman or some of the post-quantum DH alternatives. If your threat model includes the risk that someone captures and stores ciphertext and subsequently gets access to a quantum computer and the ability to break whatever post-quantum scheme you’ve augmented with, then maybe QKD is useful. I agree that this is a bit of a stretch.
(Of course, one can also augment DH with symmetric crypto for the datacenter use case, with someone trustworthy literally carrying the key to the other end of the link, and I see no realistic usage of QKD that will outperform that unless one is worried about post-compromise recovery of a symmetric key stored in a piece of hardware. Plus, QKD has its own issues: security of QKD is subject to catastrophic failures if the single-photon source isn’t actually a single-photon source and possibly also if a malicious light source injected into the fiber causes the transmitter to stop being a single-photon source or the receiver to behave in a manner inconsistent with any possible single received photon. Think of these as side channel and fault attacks that are rather difficult to manage.)
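The "augment DH with a couriered symmetric key" idea above can be sketched concretely. This is a minimal, hypothetical illustration (function names and parameters are my own, not from any cited system): an HKDF-style extract-then-expand derivation that mixes a (post-quantum) DH shared secret with a pre-shared key, so the session key stays secret as long as either input does.

```python
import hashlib
import hmac
import os

def hybrid_session_key(dh_secret: bytes, preshared_key: bytes,
                       salt: bytes, info: bytes = b"session-v1") -> bytes:
    """Derive a session key that stays secret as long as EITHER input does.

    HKDF-style extract-then-expand (the RFC 5869 pattern): an attacker who
    later breaks the DH exchange still needs the couriered pre-shared key.
    """
    # Extract: bind both secrets into one pseudorandom key.
    prk = hmac.new(salt, dh_secret + preshared_key, hashlib.sha256).digest()
    # Expand: derive the actual session key, bound to a context label.
    return hmac.new(prk, info + b"\x01", hashlib.sha256).digest()

# Hypothetical usage: dh_secret from a (PQ-)DH exchange, preshared_key
# literally carried to the other end of the link by a trusted courier.
dh_secret = os.urandom(32)
preshared_key = os.urandom(32)
key = hybrid_session_key(dh_secret, preshared_key, salt=os.urandom(16))
assert len(key) == 32
```

The point of the construction is the "either input" property: recovering the session key requires compromising both the key exchange and the couriered secret, which is the same guarantee usually claimed as QKD's advantage.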
I am worried about the future of quantum tokens...
Whilst theoretically they are secure, I worry about potential huge side-channels allowing leaking of the key...
All it takes is a few extra photons emitted at some harmonic frequency for the key to be leaked...
I would much prefer dumb hardware and clever digital software, because at least software is much easier to secure against side channels, and much easier to audit.
In principle quantum communication has no side channels because side channels act like measurements, and measurements make it not a functioning quantum channel in the first place. So you need to have already solved side channel issues for basic function.
That said, wherever you convert the quantum data into classical data there will be potential side channels. For example, there have been attacks based on using a laser down the communication line to track the orientation of the measurement device at the receiver.
In general, the more you can do while the data stays quantum the better. For example, if you transduce the photon into a qubit inside a quantum computer, then the measurement can be hidden away inside the computer, instead of exposed to the communication line. And the measurement basis can be chosen after transmission arrival, instead of before.
The larger issue for most quantum key exchange setups is the transition from classical to quantum: you want not to accidentally generate two unentangled photons in the same secret polarization.
Isn't the entire security of Quantum Communication predicated on its complete lack of side-channels due to the fact that measuring quantum systems collapses their wave function?
Yes, in theory. In practice, photon generators won't behave perfectly. There are lots of possible attacks, like photon-number splitting [1].
[1] https://onlinelibrary.wiley.com/doi/full/10.1002/qute.202300...
Once you add error correction, don't you lose all the nice properties of the no-cloning theorem? If the protocol tolerates 30% errors, doesn't it tolerate 30% MITM? (60%??)
You don't need error correction for some crypto primitives. There are QKD networks deployed that don't have that kind of error correction, as far as I know.
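The error-tolerance question above has a concrete answer for BB84-style protocols: a full intercept-resend MITM doesn't hide inside the tolerated error budget, because it induces roughly 25% errors on the sifted bits, well above typical abort thresholds. A toy simulation (my own sketch, not any deployed protocol's code) shows this:

```python
import random

def bb84_qber(n_bits: int, eve_intercepts: bool, seed: int = 0) -> float:
    """Simulate BB84 basis sifting and return the qubit error rate (QBER).

    Each photon carries one bit in a random basis (rectilinear/diagonal).
    An intercept-resend eavesdropper measures in a random basis and resends,
    which randomizes ~half of her wrong-basis guesses into Bob's errors.
    """
    rng = random.Random(seed)
    errors = sifted = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        alice_basis = rng.randint(0, 1)
        state_bit, state_basis = bit, alice_basis
        if eve_intercepts:
            eve_basis = rng.randint(0, 1)
            if eve_basis != state_basis:
                state_bit = rng.randint(0, 1)  # wrong basis randomizes outcome
            state_basis = eve_basis  # Eve resends in her own basis
        bob_basis = rng.randint(0, 1)
        if bob_basis == alice_basis:  # kept after public basis comparison
            measured = state_bit if bob_basis == state_basis else rng.randint(0, 1)
            sifted += 1
            errors += (measured != bit)
    return errors / sifted

print(bb84_qber(100_000, eve_intercepts=False))  # 0.0: noiseless, no Eve
print(bb84_qber(100_000, eve_intercepts=True))   # ~0.25: Eve is loudly visible
```

So tolerating 30% channel noise would indeed be fatal here (full intercept-resend sits at 25%), which is exactly why real QKD systems abort at much lower QBER thresholds rather than tolerating arbitrary error rates.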
How can QKD repeaters store and forward or just forward without collapsing phase state?
How does photonic phase-state collapse due to a fiber MITM compare to a heartbeat on a classical fiber?
There is quantum counterfactual communication without entanglement FWIU? And there's a difference between QND "Quantum Non-Demolition" and "Interaction-free measurement"
From https://news.ycombinator.com/item?id=41480957#41533965 :
>> IIRC I read on Wikipedia one day that Bell's actually says there's like a 60% error rate?(!)
> That was probably the "Bell test" article, which - IIUC - does indeed indicate that if you can read 62% of the photons you are likely to find a loophole-free violation
> [ "Violation of Bell inequality by photon scattering on a two-level emitter", ]
Bell test > Detection loophole: https://en.wikipedia.org/wiki/Bell_test#Detection_loophole :
> when using a maximally entangled state and the CHSH inequality an efficiency of η > 2√2 − 2 ≈ 0.83 is required for a loophole-free violation. [51] Later Philippe H. Eberhard showed that when using a partially entangled state a loophole-free violation is possible for η > 2/3 ≈ 0.67, [52] which is the optimal bound for the CHSH inequality. [53] Other Bell inequalities allow for even lower bounds. For example, there exists a four-setting inequality which is violated for η > (√5 − 1)/2 ≈ 0.62. [54]
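As a quick sanity check on the quoted thresholds (this just evaluates the three expressions from the Wikipedia passage, nothing more):

```python
import math

# Detection-efficiency thresholds for a loophole-free Bell violation,
# as quoted from the Wikipedia "Bell test" article above.
eta_chsh_maximal = 2 * math.sqrt(2) - 2      # maximally entangled, CHSH
eta_chsh_eberhard = 2 / 3                    # partially entangled (Eberhard)
eta_four_setting = (math.sqrt(5) - 1) / 2    # four-setting inequality

print(f"{eta_chsh_maximal:.3f} {eta_chsh_eberhard:.3f} {eta_four_setting:.3f}")
# → 0.828 0.667 0.618
```

The last value is where the "62% of the photons" figure in the earlier comment comes from.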
Isn't modern error detection and classical PQ sufficient to work with those odds?
> Historically, only experiments with non-optical systems have been able to reach high enough efficiencies to close this loophole, such as trapped ions, [55] superconducting qubits, [56] and nitrogen-vacancy centers. [57] These experiments were not able to close the locality loophole, which is easy to do with photons. More recently, however, optical setups have managed to reach sufficiently high detection efficiencies by using superconducting photodetectors, [30][31] and hybrid setups have managed to combine the high detection efficiency typical of matter systems with the ease of distributing entanglement at a distance typical of photonic systems. [10]
No-cloning theorem applies to logical qubits too! That "30% of errors" doesn't allow you to read out the logical state. Information is physical.
Security is never about absolutes. It’s about relative costs vs the attacker. It seems like this system adds a strong enough layer of security over the transport that the attacker would switch to going after the endpoints instead.
With quantum tokens, law enforcement has to crack your physical devices, so at the very least they have to good-old-fashioned bug your hardware. With classical schemes, they can intercept in transit.
I wouldn't say that current side channels, which are mostly enabled by hardware rather than software, are easier to audit.
I don't think that's true. If you're paranoid you can build a very simple and easy to audit device that lets packets through exactly every x microseconds, with a short buffer to prevent timing via dropouts.
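The fixed-interval forwarder described above is simple enough to sketch. This is a toy discrete-time simulation (my own illustration, not a real device): packets leave only on a fixed tick grid, so egress timing reveals nothing about ingress timing beyond coarse load.

```python
from collections import deque

def constant_rate_release(arrivals, interval, buffer_size):
    """Simulate a device that releases at most one packet every `interval`
    ticks, regardless of when packets arrive. A short buffer absorbs bursts
    so dropouts don't leak timing either (overflow would be dropped).

    `arrivals` maps arrival tick -> packet; returns (release tick, packet).
    """
    buf = deque()
    out = []
    horizon = max(arrivals) + buffer_size * interval + interval
    for tick in range(horizon):
        if tick in arrivals and len(buf) < buffer_size:
            buf.append(arrivals[tick])
        if tick % interval == 0 and buf:
            out.append((tick, buf.popleft()))
    return out

# A burst of three packets in consecutive ticks still leaves the device
# strictly on multiples of the release interval:
schedule = constant_rate_release({0: "a", 1: "b", 2: "c"},
                                 interval=10, buffer_size=8)
print(schedule)  # → [(0, 'a'), (10, 'b'), (20, 'c')]
```

Which supports the point: the auditable part is a few lines of logic, and its security argument is that output timing is a function of the clock alone.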
Works fine for digital, doesn't work for quantum stuff.
“lawful intercept” can be mandated to be built into anything
Yes, but it's much easier to see it in hardware than in software.
Hybrids?
How does a quantum state travel through fiber? Does it simply maintain state naturally during the journey?
Light is remarkably good at keeping its polarization state intact for long distances through single-mode fiber. At least historically, the main issues with doing quantum computation with light are that it's hard to store light and hard to get one photon to interact with another one in a controlled manner.
(Polarization of a photon is a two-state quantum system, otherwise known as a qubit.)
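For the polarization-as-qubit point, the measurement statistics are just Malus's law, which for a single photon is the Born rule applied to this two-state system. A tiny sketch (standard physics, my own function name):

```python
import math

def measure_prob(state_angle: float, analyzer_angle: float) -> float:
    """Probability that a photon polarized at `state_angle` passes an
    analyzer at `analyzer_angle`: cos^2 of the angle between them
    (Malus's law / Born rule for this two-state system)."""
    return math.cos(state_angle - analyzer_angle) ** 2

print(measure_prob(0.0, 0.0))          # 1.0: aligned analyzer always passes
print(measure_prob(0.0, math.pi / 2))  # ~0: crossed polarizers block
print(measure_prob(0.0, math.pi / 4))  # ~0.5: the diagonal basis is 50/50
```

That 50/50 diagonal case is exactly what BB84-style protocols exploit: measuring in the wrong basis yields a coin flip and destroys the original state.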
Does that mean that an individual and unique photon travels the full distance from transmitter to receiver without interacting with anything?
That seems... really difficult. I'd always assumed fiber operates by continually absorbing and reemitting photons at low loss. Maybe I've fundamentally misunderstood optics.
I keep thinking the headline says "unforgivable tokens"