Quantum and classical computers handle time differently. What does that mean for AI?
In fact, scientists have shown that time’s arrow, a bedrock concept in the classical view of time, doesn’t really apply on quantum computers. Classical physics is subject to a property called causal asymmetry. Basically, if you throw a handful of confetti in the air and take a picture when each piece is at its apex, it’s easier for a classical computer to determine what happens next (where the confetti is going) than what happened before (the path each piece would trace running backwards through time).
Quantum computers can perform both calculations with equal ease, which indicates they do not suffer from causal asymmetry. Time’s arrow is only relevant to classical systems, which the human mind appears to be, even though our brains are almost certainly built from quantum components.
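To make the classical side of that asymmetry concrete, here is a minimal toy sketch in Python. It uses a hypothetical confetti model, not anything drawn from the study itself: predicting the next frame from the apex snapshot takes one pass through the model, while reconstructing how the confetti got there amounts to searching over many candidate launch conditions and replaying each one forward.

```python
import numpy as np

# Toy sketch only: a made-up confetti model, not the researchers' formal result.
# It illustrates why forward prediction is cheap while reconstructing the past
# requires a search over candidate histories.

rng = np.random.default_rng(0)
DT, G, STEPS, N = 0.05, 9.81, 20, 100  # time step, gravity, replay length, pieces

def step_forward(pos, vel):
    """Advance every confetti piece one time step under gravity plus a small random gust."""
    gust = rng.normal(0.0, 0.1, size=vel.shape)
    return pos + vel * DT, vel + (np.array([0.0, -G]) + gust) * DT

# Snapshot at the apex: known positions, velocities roughly zero.
apex_pos = rng.uniform(-1.0, 1.0, size=(N, 2))
apex_vel = np.zeros((N, 2))

# Forward ("what happens next"): a single pass through the model.
next_pos, next_vel = step_forward(apex_pos, apex_vel)

# Backward ("what happened before"): guess launch velocities, replay each guess
# forward, and keep whichever best reproduces the observed snapshot.
def replay(launch_vel):
    pos, vel = np.zeros_like(launch_vel), launch_vel.copy()
    for _ in range(STEPS):
        pos, vel = step_forward(pos, vel)
    return pos

candidates = [rng.uniform(-3.0, 3.0, size=(N, 2)) + np.array([0.0, G * STEPS * DT])
              for _ in range(200)]
errors = [np.mean((replay(c) - apex_pos) ** 2) for c in candidates]
best_guess = candidates[int(np.argmin(errors))]

print("forward prediction: 1 model evaluation")
print(f"backward inference: {len(candidates)} replays; "
      f"best-guess mean launch speed {np.linalg.norm(best_guess, axis=1).mean():.2f} m/s")
```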
But experts such as Gary Marcus and Ernest Davis believe an understanding of time is essential to the future of AI, especially as it relates to “human-level” artificial general intelligence (AGI). The duo penned an op-ed for The New York Times in which they stated:
In particular, we need to stop building computer systems that merely get better and better at detecting statistical patterns in data sets — often using an approach known as deep learning — and start building computer systems that from the moment of their assembly innately grasp three basic concepts: time, space and causality.
Quantum physics tells us that, at the very least, our understanding of time is likely different from what might be the ultimate universal reality.
See the full story here: https://thenextweb.com/neural/2020/09/17/quantum-and-classical-computers-handle-time-differently-what-does-that-mean-for-ai/