A chat with an IBM researcher on quantum (Q.) computing rocks my view of reality.
“Encode classical data points into a Q. state using unitary functions that we can apply to these data points, then measure it. It’s basically feature mapping.”
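He didn’t show me any code, but here’s a minimal sketch of what that encoding might look like in plain numpy, assuming the simplest case: one classical value mapped to a rotation angle (so-called angle encoding), the rotation applied as a unitary to a single qubit, then measured.

```python
import numpy as np

def ry(theta):
    # Single-qubit RY rotation: a unitary parameterised by the data point.
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]])

def encode_and_measure(x):
    # "Feature map": encode the classical value x as a rotation angle,
    # apply the unitary to the |0> state, then read out the
    # measurement probabilities for outcomes 0 and 1.
    state = ry(x) @ np.array([1.0, 0.0])
    return np.abs(state) ** 2

p0, p1 = encode_and_measure(np.pi / 2)
# Encoding x = pi/2 puts the qubit in an equal superposition:
# both outcomes come out at probability 0.5.
```

Real quantum feature maps entangle many qubits, but the shape of the idea is the same: data in, unitary applied, measurement out.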
“We don’t know if it’s useful for now. We just have the machines and we’re trying things out. In less than five years we’ll have some decent results.”
Like with all good science, they’re essentially playing.
I don’t get it. Do you get it? He tells me the collision of QM and ML basically solves the computational load problem. Classification at a fraction of the cost.
“Classification results that take a week today – with QC it’ll take a couple of hours. A neural net with weights you can parameterise goes much faster because it’s based on atoms that move faster than light.”
Imagine entangled communication.
“The problem with a non-binary conduit between input and output is noise – you can’t trust it because there’s so much noise. SOTA is all about noise reduction.”
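To see why noise dominates the conversation, here’s a toy sketch (my own, not his) of the textbook depolarizing-noise model: with some probability the qubit’s state is scrambled into a coin flip, so even a perfectly prepared state stops giving trustworthy answers.

```python
import numpy as np

def depolarize(rho, p):
    # Depolarizing noise: with probability p the qubit state rho is
    # replaced by the maximally mixed state I/2 (pure randomness);
    # with probability 1 - p it survives intact.
    return (1 - p) * rho + p * np.eye(2) / 2

rho = np.array([[1.0, 0.0],
                [0.0, 0.0]])   # a perfectly prepared |0> state
noisy = depolarize(rho, 0.2)   # 20% chance of scrambling
# The probability of correctly reading back 0 falls from 1.0 to 0.9.
```

Noise reduction, in this picture, is everything that pushes p back toward zero.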
“The Chinese did it. They don’t admit it, but they sent a qubit to a satellite.”
We laugh that it could be a honeypot; if we even looked, they’d know.
“We’ve looked at reality through a classical computer as a lens. We’ve seen a deterministic reality.”
We’re beginning to look through a quantum lens.