Commercial Quantum Computer?

Quantum computers could revolutionize the way we tackle problems that stump even the best classical computers.
The recently introduced single-atom transistor has been seen as a tool that could lead the way to building a quantum computer. For a general introduction to how quantum computers work, read the article A tale of two qubits: how quantum computers work.

The article D-Wave Announces Commercially Available Quantum Computer reports that computing company D-Wave has announced that it is selling a quantum computing system commercially, calling it the D-Wave One. The D-Wave system comes equipped with a 128-qubit processor designed to perform discrete optimization, and the processor uses quantum annealing to perform these operations.

D-Wave is advertising a number of different applications for its quantum computing system, primarily in the field of artificial intelligence. According to the company, its system can handle virtually any AI application that can be translated to a Markov random field.
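
To make that claim concrete: MAP inference in a pairwise Markov random field over binary variables is an energy-minimization problem. In a standard textbook form (the notation is mine, not D-Wave's), the task is to find

\hat{x} = \arg\min_{x \in \{0,1\}^n} E(x), \qquad E(x) = \sum_i a_i x_i + \sum_{i<j} b_{ij} x_i x_j

which, up to an additive constant, is exactly a QUBO (quadratic unconstrained binary optimization) instance, the native form of discrete optimization problem a quantum annealer is built to minimize.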

[Image: dwave]

The blog article Learning to program the D-Wave One tells that the processor in the D-Wave One – codenamed Rainier – is designed to perform a single mathematical operation called discrete optimization. It is a special-purpose processor: when writing applications, the D-Wave One is used only for the steps in your task that involve solving optimization problems, while all the other parts of your code still run on your conventional systems of choice. Rainier solves optimization problems using quantum annealing (QA), a class of problem-solving approaches that use quantum effects to help get better solutions, faster. Learning to program the D-Wave One is the first in a series of blog posts describing the algorithms D-Wave has run on its quantum computers, and how to use these to build interesting applications.
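
As a rough illustration of the kind of step that would be offloaded (the sketch below is mine, not D-Wave's API): a discrete optimization problem in QUBO form asks for the 0/1 assignment that minimizes a quadratic energy, which for a handful of variables can simply be enumerated on a conventional machine.

# A tiny QUBO (quadratic unconstrained binary optimization) instance, the kind
# of discrete optimization problem a quantum annealer is built to minimize.
# With three variables we can enumerate all 2^3 assignments classically.
from itertools import product

a = {0: -1.0, 1: 2.0, 2: -1.5}                 # linear coefficients
b = {(0, 1): 1.0, (1, 2): -2.0, (0, 2): 0.5}   # quadratic couplings

def energy(x):
    return sum(a[i] * x[i] for i in a) + sum(b[i, j] * x[i] * x[j] for (i, j) in b)

best = min(product((0, 1), repeat=3), key=energy)
print("lowest-energy assignment:", best, "energy:", energy(best))

On real hardware only this minimization step would be handed to the annealer; all the surrounding program logic stays on a conventional computer, as the D-Wave blog post describes.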

But is this the start of the quantum computer era? Maybe not. The comments on the D-Wave Announces Commercially Available Quantum Computer article tell a story that this might not be the quantum computer you have been waiting for. The name “quantum computer” seems a bit misleading for this product: there are serious controversies around the workings and “quantumness” of the machine, and D-Wave has been heavily criticized by some scientists in the quantum computing field. The First sale for quantum computing article tells that uncertainty persists around how the impressive black monolith known as the D-Wave One actually works. Computer scientists have long questioned whether D-Wave’s products truly exploit quantum physics.

The comments on the Slashdot article D-Wave Announces Commercially Available Quantum Computer make the same central point: D-Wave’s computers haven’t demonstrated that their commercial bits are entangled, so there is no way to really distinguish what they are doing from essentially classical simulated annealing. Recommended skeptical reading on D-Wave’s claims is much of what Scott Aaronson has written about them, for example http://www.scottaaronson.com/blog/?p=639 and http://www.scottaaronson.com/blog/?p=198, although interestingly, after he visited D-Wave’s labs in person, his views shifted slightly and became somewhat more sympathetic to the company: http://www.scottaaronson.com/blog/?p=954.

So it is hard to say whether the “128 qubits” part is snake oil or for real. If the 128 “qubits” aren’t entangled at all, the machine is useless for any of the quantum algorithms one generally thinks of. It seems that this device simply has 128 separate “qubits” that are queried individually, which makes it essentially an augmented classical computer: it gains a few minor advantages in some very specific algorithms (i.e. quantum annealing) from this qubit querying, but is otherwise indistinguishable from a really expensive classical computer for any other purpose. This has the same central problem as before: D-Wave’s computers haven’t demonstrated that their commercial bits are entangled.

Rather than constantly adding more qubits and issuing more hard-to-evaluate announcements, while leaving the scientific characterization of its devices in a state of limbo, why doesn’t D-Wave just focus all its efforts on demonstrating entanglement, or otherwise getting stronger evidence for a quantum role in the apparent speedup? There’s a reason why academic quantum computing groups focus on pushing down decoherence and demonstrating entanglement in 2, 3, or 4 qubits: because that way, at least you know that the qubits are qubits! Suppose D-Wave were marketing a classical, special-purpose, $10-million computer designed to perform simulated annealing, for 90-bit Ising spin glass problems with a certain fixed topology, somewhat better than an off-the-shelf computing cluster. Would there be even 5% of the public interest that there is now?
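
For reference, here is what that classical baseline looks like in miniature: a plain simulated-annealing solver for a small random Ising spin glass. This is an illustrative sketch of my own, with an arbitrary problem size, coupling graph, and cooling schedule, not a benchmark of anything.

# Classical simulated annealing on a random Ising spin glass:
# minimize E(s) = -sum_{(i,j)} J_ij * s_i * s_j over spins s_i in {-1, +1}.
import math, random

random.seed(1)
N = 90                                      # number of spins, echoing the 90-bit example above
J = {(i, j): random.choice((-1.0, 1.0))     # random couplings on a sparse random graph
     for i in range(N) for j in range(i + 1, N) if random.random() < 0.1}

def energy(s):
    return -sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

s = [random.choice((-1, 1)) for _ in range(N)]
E = energy(s)
steps = 20000
for step in range(steps):
    T = 2.0 * (1.0 - step / steps) + 0.01   # simple linear cooling schedule
    i = random.randrange(N)
    s[i] = -s[i]                            # propose flipping one spin
    E_new = energy(s)
    if E_new <= E or random.random() < math.exp((E - E_new) / T):
        E = E_new                           # accept the move (Metropolis rule)
    else:
        s[i] = -s[i]                        # reject: undo the flip
print("final energy:", E)

Whether the D-Wave hardware does something that a tuned-up version of this kind of solver cannot is exactly the question the critics raise.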

225 Comments

  1. Tomi Engdahl says:

    D-Wave Launches Free Quantum Cloud Service
    https://spectrum.ieee.org/tech-talk/computing/hardware/dwave-launches-free-quantum-cloud-service

    Even if you could afford the US $15 million to buy a D-Wave 2000Q quantum annealer [PDF], for example, you would need experts to maintain the ultracold operating conditions its processor requires.

    Until today, that is, when Canadian startup D-Wave Systems Inc. launched a real-time online quantum computing environment called Leap. Leap is the latest addition to the quantum cloud—services that virtualize quantum computing for almost anyone with a computer and a broadband connection to use.

    Leap allows anyone to sign up, giving them one minute of time on a cloud-connected 2000Q each month.

    “We want to enable hundreds of thousands or millions of developers to gain access to quantum computing technology, to understand it, and to develop applications,” says Murray Thom, D-Wave’s director of software and cloud services.

    As well as granting access to a 2000Q computer housed at D-Wave’s headquarters in the Vancouver suburbs, Leap provides documentation, videos, training materials, and a community for the majority of developers who have never worked with a quantum computing device.
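
    For developers, access through Leap goes from code to the cloud-connected QPU via D-Wave’s Ocean SDK. A minimal sketch of submitting a toy Ising problem might look roughly like this (assuming the dwave-ocean-sdk package and a configured Leap API token; the exact API details may differ):

    # Rough sketch only: assumes dwave-ocean-sdk is installed and a Leap API
    # token has been configured on this machine.
    from dwave.system import DWaveSampler, EmbeddingComposite

    h = {0: -1.0, 1: 1.0}      # linear biases of a two-spin toy Ising problem
    J = {(0, 1): -0.5}         # a single coupling between the two spins

    sampler = EmbeddingComposite(DWaveSampler())   # maps the problem onto the QPU graph
    sampleset = sampler.sample_ising(h, J, num_reads=100)
    print(sampleset.first.sample, sampleset.first.energy)

    The free tier’s one minute of monthly QPU time would be consumed by sampling calls like the sample_ising() request above.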

  2. Tomi Engdahl says:

    D-Wave offers the first public access to a quantum computer
    From Python to parallel universes

    https://techcrunch.com/2018/10/05/d-wave-offers-the-first-public-access-to-a-quantum-computer/

  3. Tomi Engdahl says:

    Graduate Student Solves Quantum Verification Problem
    https://www.quantamagazine.org/graduate-student-solves-quantum-verification-problem-20181008/

    Urmila Mahadev spent eight years in graduate school solving one of the most basic questions in quantum computation: How do you know whether a quantum computer has done anything quantum at all?

  4. Tomi Engdahl says:

    Lawrence Krauss: Quantum Computing Explained
    https://www.youtube.com/watch?v=UUpqnBzBMEE

    Lawrence Krauss describes quantum computing and the technical obstacles we need to overcome to realize this Holy Grail of processing.

    Lawrence Krauss: Let me briefly describe the difference between a quantum computer and a regular computer, at some level.

    But in the quantum world, it turns out that particles like electrons are actually spinning in all directions at the same time, one of the weird aspects of quantum mechanics. When we do a measurement of an electron, we may find it’s spinning this way. But before we did the measurement, it was spinning this way and this way and that way and that way, all at the same time. Sounds crazy, but true.
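
    In standard notation (this formula is not part of the quoted talk), that “spinning every which way at once” is a superposition: the qubit’s state is

    |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,

    and a measurement returns 0 with probability |\alpha|^2 and 1 with probability |\beta|^2; only at that point does the state collapse to a definite value.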

  5. Tomi Engdahl says:

    Researchers Finally Proved Quantum Computers are More Powerful Than Classical Computers
    https://motherboard.vice.com/en_us/article/evw93z/researchers-finally-proved-quantum-computers-are-more-powerful-than-classical-computers

    Until this week, there was no conclusive proof that quantum computers have an advantage over classical computers.

    For the first time, an international team of researchers has proven that quantum computers offer a computational advantage over classical computers.

    As detailed in a paper published Thursday in Science, the researchers designed a quantum circuit that was able to solve a math problem that would be impossible for a classical computer to solve when subject to the same constraints.

    “Our work shows that quantum circuits are computationally more powerful than classical ones of the same structure,” Robert König, a complexity theorist at the Technical University of Munich and lead author of the paper, told me in an email. “We are not saying that the problem cannot be solved classically. It can, though this requires more resources.”

    The team was able to achieve quantum advantage due to “nonlocality,” a feature of spatially isolated quantum systems that allows them to be considered a single system: a change in one system instantaneously results in a change in another.

    Quantum advantage with shallow circuits
    http://science.sciencemag.org/content/362/6412/308

  6. Tomi Engdahl says:

    EU grants billion-euro funding for quantum technology – Aalto and VTT involved
    https://www.uusiteknologia.fi/2018/10/29/kvanttiteknologiaan-miljardirahoitus-eulta-aalto-ja-vtt-mukana/

    The European Union is funding the large-scale Quantum Flagship programme with one billion euros over ten years, and more than 5,000 researchers are taking part in it. From Finland, groups at Aalto University are researching and developing quantum communication technology, highly sensitive magnetic sensors based on quantum-optical phenomena, and quantum chips that transmit photons. VTT is also taking part in the quantum programme from Finland.

  7. Tomi Engdahl says:

    Graduate Student Solves Quantum Verification Problem
    By
    ERICA KLARREICH
    October 8, 2018
    https://www.quantamagazine.org/graduate-student-solves-quantum-verification-problem-20181008/

    Urmila Mahadev spent eight years in graduate school solving one of the most basic questions in quantum computation: How do you know whether a quantum computer has done anything quantum at all?

  8. Tomi Engdahl says:

    Quantum Physicists Found a New, Safer Way to Navigate
    https://www.wired.com/story/quantum-physicists-found-a-new-safer-way-to-navigate/

    In 2015, the U.S. Naval Academy decided that its graduates needed to return to the past and learn how to navigate using the stars. Nine years prior, it had dropped celestial navigation from its requirements because GPS was so accurate and simple to use.

  9. Tomi Engdahl says:

    Tests show integrated quantum chip operations possible
    https://electroiq.com/2018/10/tests-show-integrated-quantum-chip-operations-possible/

    Quantum computers that are capable of solving complex problems, like drug design or machine learning, will require millions of quantum bits – or qubits – connected in an integrated way and designed to correct errors that inevitably occur in fragile quantum systems.

    Now, an Australian research team has experimentally realised a crucial combination of these capabilities on a silicon chip, bringing the dream of a universal quantum computer closer to reality.

    They have demonstrated an integrated silicon qubit platform that combines both single-spin addressability – the ability to ‘write’ information on a single spin qubit without disturbing its neighbours – and a qubit ‘read-out’ process that will be vital for quantum error correction.

  10. Tomi Engdahl says:

    Novel Quantum Emitter Provides Key Building Block for a Quantum Internet
    Moving away from a decade of using quantum dots, this on-chip design dramatically boosts the efficiency of quantum emitters

    https://spectrum.ieee.org/nanoclast/semiconductors/materials/novel-onchip-quantum-emitter-provides-key-building-block-for-a-quantum-internet

  11. Tomi Engdahl says:

    Quantum memories

    https://semiengineering.com/manufacturing-bits-nov-13/

    The University of Alberta has developed a new method for making quantum memories, paving the way for a next-generation quantum Internet.

    Quantum memory is targeted for quantum networks and computers. In classical computing, the information is stored in bits, which can be either a “0” or “1”. In quantum computing, information is stored in quantum bits, or qubits, which can exist as a “0” or “1” or a combination of both. The superposition state enables a quantum computer to perform millions of calculations at once.

    Quantum communications or networks follow the same basic idea. A quantum fiber link connects one location to another, which is supposedly impossible to hack. Traditional communication networks use public key cryptography. In contrast, quantum key distribution (QKD) uses quantum superposition states for unconditional security.

    Quantum networks have been demonstrated over short distances. But long distance QKDs are limited, due to losses in the optical fibers, according to the Centre of Excellence for Quantum Computation and Communication Technology in Australia.
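
    As a toy illustration of the QKD idea (my own classical simulation sketch of BB84-style key sifting, not something from the quoted article): the sender encodes random bits in randomly chosen bases, the receiver measures in randomly chosen bases, and the two keep only the positions where their bases happened to match.

    # Toy classical simulation of BB84-style key sifting. There are no real
    # quantum states or eavesdroppers here; it only shows how a shared key is
    # distilled from the basis-matched positions.
    import random

    n = 32
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("+x") for _ in range(n)]   # '+' rectilinear, 'x' diagonal
    bob_bases   = [random.choice("+x") for _ in range(n)]

    bob_results = []
    for bit, basis_a, basis_b in zip(alice_bits, alice_bases, bob_bases):
        if basis_a == basis_b:
            bob_results.append(bit)                   # same basis: Bob reads the bit correctly
        else:
            bob_results.append(random.randint(0, 1))  # wrong basis: outcome is random

    # Publicly compare bases (never the bits) and keep only the matching positions
    key_alice = [alice_bits[i]  for i in range(n) if alice_bases[i] == bob_bases[i]]
    key_bob   = [bob_results[i] for i in range(n) if alice_bases[i] == bob_bases[i]]
    assert key_alice == key_bob    # no noise and no eavesdropper, so the keys agree
    print("sifted key length:", len(key_alice), "of", n, "raw bits")

    An eavesdropper who measures in the wrong basis disturbs the transmitted states, which shows up as errors when the two parties compare a sample of their key; that disturbance is what the security argument rests on.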

    Still in R&D, quantum memories have been in the works for years. Today’s quantum memories are based on various coherent light–matter interaction schemes, but they are limited due to a host of technical challenges, according to the University of Alberta.

    U of A physicists develop new technique to create quantum memory
    https://www.folio.ca/u-of-a-physicists-develop-new-technique-to-create-quantum-memory/

    Discovery offers a simpler way to build a crucial element for more secure data communication and storage.

  12. Tomi Engdahl says:

    The Case Against Quantum Computing
    https://spectrum.ieee.org/computing/hardware/the-case-against-quantum-computing

    The proposed strategy relies on manipulating with high precision an unimaginably huge number of variables

    Quantum computing is all the rage. It seems like hardly a day goes by without some news outlet describing the extraordinary things this technology promises. Most commentators forget, or just gloss over, the fact that people have been working on quantum computing for decades—and without any practical results to show for it.

    We’ve been told that quantum computers could “provide breakthroughs in many disciplines, including materials and drug discovery, the optimization of complex manmade systems, and artificial intelligence.” We’ve been assured that quantum computers will “forever alter our economic, industrial, academic, and societal landscape.” We’ve even been told that “the encryption that protects the world’s most sensitive data may soon be broken” by quantum computers. It has gotten to the point where many researchers in various fields of physics feel obliged to justify whatever work they are doing by claiming that it has some relevance to quantum computing.

    Meanwhile, government research agencies, academic departments (many of them funded by government agencies), and corporate laboratories are spending billions of dollars a year developing quantum computers.

    It’s become something of a self-perpetuating arms race, with many organizations seemingly staying in the race if only to avoid being left behind. Some of the world’s top technical talent, at places like Google, IBM, and Microsoft, are working hard

    In light of all this, it’s natural to wonder: When will useful quantum computers be constructed? The most optimistic experts estimate it will take 5 to 10 years. More cautious ones predict 20 to 30 years. (Similar predictions have been voiced, by the way, for the last 20 years.)

    I belong to a tiny minority that answers, “Not in the foreseeable future.” Having spent decades conducting research in quantum and condensed-matter physics, I’ve developed my very pessimistic view.

    The idea of quantum computing first appeared nearly 40 years ago, in 1980, when the Russian-born mathematician Yuri Manin

    Realizing that computer simulations of quantum systems become impossible to carry out when the system under scrutiny gets too complicated, Feynman advanced the idea that the computer itself should operate in the quantum mode

    The subject did not attract much attention, though, until 1994, when mathematician Peter Shor (then at Bell Laboratories and now at MIT) proposed an algorithm for an ideal quantum computer that would allow very large numbers to be factored much faster than could be done on a conventional computer. This outstanding theoretical result triggered an explosion of interest in quantum computing.

    The basic idea of quantum computing is to store and process information in a way that is very different from what is done in conventional computers

    In quantum computing, the classical two-state circuit element (the transistor) is replaced by a quantum element called a quantum bit, or qubit.

    the simplest thing to use is the electron’s internal angular momentum, or spin
    Whatever axis is chosen, you can denote the two basic quantum states of the electron’s spin

    Here’s where things get weird. With the quantum bit, those two states aren’t the only ones possible.

    In contrast to a classical bit, which can only be in one of its two basic states, a qubit can be in any of a continuum of possible states

    Yes, quantum mechanics often defies intuition. But this concept shouldn’t be couched in such perplexing language.

    Experts estimate that the number of qubits needed for a useful quantum computer, one that could compete with your laptop in solving certain kinds of interesting problems, is between 1,000 and 100,000.

    To repeat: A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe.
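
    The arithmetic behind that statement (my own gloss, not from the article): the joint quantum state of N qubits is specified by about 2^N continuous amplitudes, and already at the low end of the estimate above,

    2^{1000} \approx 1.1 \times 10^{301},

    far more than the roughly 10^80 elementary particles in the observable universe, and the source of the article’s figure of more than 10^300 continuous parameters.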

    In contrast, it’s absolutely unimaginable how to keep errors under control for the 10^300 continuous parameters that must be processed by a useful quantum computer. Yet quantum-computing theorists have succeeded in convincing the general public that this is feasible.

    they argue, you can handle errors by forming logical qubits using multiple physical qubits.

    How many physical qubits would be required for each logical qubit? No one really knows, but estimates typically range from about 1,000 to 100,000. So the upshot is that a useful quantum computer now needs a million or more qubits.
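
    Spelled out (my own arithmetic, not the article’s): taking roughly 1,000 logical qubits, each built from roughly 1,000 physical qubits, already gives on the order of 1,000 × 1,000 = 10^6 physical qubits, which is where the “million or more” figure comes from.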

    The huge amount of scholarly literature that’s been generated about quantum computing is notably light on experimental studies describing actual hardware. The relatively few experiments that have been reported were extremely difficult to conduct, though, and must command respect and admiration.

    By contrast, the theory of quantum computing does not appear to meet any substantial difficulties in dealing with millions of qubits.

    Here, the key words are “under certain assumptions.”

    I argue that they can’t. In the physical world, continuous quantities (be they voltages or the parameters defining quantum-mechanical wave functions) can be neither measured nor manipulated exactly.

    While various strategies for building quantum computers are now being explored, an approach that many people consider the most promising, initially undertaken by the Canadian company D-Wave Systems and now being pursued by IBM, Google, Microsoft, and others, is based on using quantum systems of interconnected Josephson junctions cooled to very low temperatures (down to about 10 millikelvins).

    On the hardware front, advanced research is under way, with a 49-qubit chip (Intel), a 50-qubit chip (IBM), and a 72-qubit chip (Google) having recently been fabricated and studied. The eventual outcome of this activity is not entirely clear, especially because these companies have not revealed the details of their work.

    While I believe that such experimental research is beneficial and may lead to a better understanding of complicated quantum systems, I’m skeptical that these efforts will ever result in a practical quantum computer.

    Could we ever learn to control the more than 10^300 continuously variable parameters defining the quantum state of such a system?

    My answer is simple. No, never.

    I believe that, appearances to the contrary, the quantum computing fervor is nearing its end. That’s because a few decades is the maximum lifetime of any big bubble in technology or science. After a certain period, too many unfulfilled promises have been made, and anyone who has been following the topic starts to get annoyed by further announcements of impending breakthroughs.

    There is a tremendous gap between the rudimentary but very hard experiments that have been carried out with a few qubits and the extremely developed quantum-computing theory, which relies on manipulating thousands to millions of qubits to calculate anything useful. That gap is not likely to be closed anytime soon.

  13. Tomi Engdahl says:

    Quantum Computing: Atomic Clocks Make for Longer-Lasting Qubits
    https://spectrum.ieee.org/computing/hardware/quantum-computing-atomic-clocks-make-for-longerlasting-qubits

    IBM now has a 50-qubit machine, Intel is at 49 qubits, and Google has developed a 72-qubit device. And in September, Pennsylvania State University researchers announced they’d built the framework for a 125-qubit compute engine.

    However, unlike the more mature devices from IBM, Intel, and Google, the foundational element for the proof-of-concept Penn State system is not the computer chip but rather the atomic clock.

    The neutral-atom quantum computer, proposed by the Penn State group and other researchers around the world, uses a cesium atom in a laser trap (the gold standard of precision timekeeping) as the quantum bit on which the compute engine is based.

    “There’s no quantum-mechanical system we understand better than an atom,”

