Feature
Scalable Quantum Computing with Linear Optics and Quantum Memories
Quantum computing is a powerful application of the laws of quantum physics. It may one day be far more efficient than conventional computation at factoring large integers, searching databases and simulating quantum systems.
Computing techniques have been around since antiquity. One of the earliest effective tools—the abacus—appeared around 2400 B.C.E. The first calculator didn’t follow until several thousand years later, in the 17th century. By the dawn of the 20th century, machines built for a single task, such as punch clocks and desk calculators, had become commonplace. The term “computer” was coined in the middle of the 20th century, as machines grew larger and increasingly capable of performing complex functions.