Saturday, May 15, 2021

Quantum Computing: Top Players 2021

[This is a transcript of the video embedded below.]

Quantum computing is currently one of the most exciting emergent technologies, and it’s almost certainly a topic that will continue to make headlines in the coming years. But there are now so many companies working on quantum computing that it’s become really confusing. Who is working on what? What are the benefits and disadvantages of each technology? And who are the newcomers to watch out for? That’s what we will talk about today.

Quantum computers use units that are called “quantum-bits” or qubits for short. In contrast to normal bits, which can take on two values, like 0 and 1, a qubit can take on an arbitrary combination of two values. The magic of quantum computing happens when you entangle qubits.
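
To make that "arbitrary combination" a bit more concrete: a qubit's state can be written as a pair of complex amplitudes whose squared magnitudes sum to one. A minimal sketch in plain Python (the function name is ours, purely for illustration):

```python
import math

# A qubit state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; a measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
def qubit(alpha: complex, beta: complex) -> tuple:
    norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
    return (alpha / norm, beta / norm)

# An equal superposition of 0 and 1:
plus = qubit(1, 1)
prob_zero = abs(plus[0]) ** 2   # 0.5
```

So unlike a classical bit, which is pinned to 0 or 1, the qubit carries a continuum of possible amplitude pairs.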

Entanglement is a type of correlation, so it ties qubits together, but it’s a correlation that has no equivalent in the non-quantum world. There are a huge number of ways qubits can be entangled and that creates a computational advantage - if you want to solve certain mathematical problems.
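
Here is one way to see, in five lines of plain Python, what "a correlation with no non-quantum equivalent" means: a two-qubit state that is *not* entangled always factors into two independent qubits, and that factorability can be checked with a determinant. (The variable names are ours; this is an illustrative sketch, not a simulation of real hardware.)

```python
import math

# Amplitudes of a two-qubit state in the basis |00>, |01>, |10>, |11>.
# The Bell state (|00> + |11>)/sqrt(2) is the textbook entangled example.
bell = [1 / math.sqrt(2), 0, 0, 1 / math.sqrt(2)]

# A product state (a0|0> + a1|1>) x (b0|0> + b1|1>) has amplitudes
# [a0*b0, a0*b1, a1*b0, a1*b1], so its 2x2 amplitude matrix
# [[a0*b0, a0*b1], [a1*b0, a1*b1]] has determinant zero.
# A nonzero determinant means the state cannot be split into two
# independent qubits -- it is entangled.
det = bell[0] * bell[3] - bell[1] * bell[2]
entangled = abs(det) > 1e-12   # True for the Bell state
```

No assignment of separate states to the two qubits reproduces the Bell state, which is exactly the resource quantum algorithms exploit.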

Quantum computers can help, for example, to solve the Schrödinger equation for complicated molecules. One could use that to find out what properties a material has without having to synthesize it. Quantum computers can also solve certain logistics problems or optimize financial systems. So there is real potential for application.

But quantum computing does not help for all types of calculations; quantum computers are special-purpose machines. They also don’t operate all by themselves: the quantum parts have to be controlled and read out by a conventional computer. You could say that quantum computers are for problem solving what wormholes are for space travel. They might not bring you everywhere you want to go, but if they can bring you somewhere, you’ll get there really fast.

What makes quantum computing special is also what makes it challenging. To use quantum computers, you have to maintain the entanglement between the qubits long enough to actually do the calculation. And quantum effects are really, really sensitive to even the smallest disturbances. To be reliable, quantum computers therefore need to operate with several copies of the information, together with an error-correction protocol. And to do this error correction, you need more qubits. Estimates say that the number of qubits we need for a quantum computer to do reliable and useful calculations that a conventional computer can’t do is about a million.
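
The overhead argument can be illustrated with the simplest (classical) error-correction idea: store each bit three times and recover it by majority vote. Real quantum codes, such as the surface code, are far more involved, but the principle that redundancy plus a correction protocol suppresses errors at the cost of extra qubits is the same. A toy sketch, with all numbers purely illustrative:

```python
import random
from collections import Counter

def encode(bit):
    # store one logical bit in three physical copies
    return [bit, bit, bit]

def apply_noise(copies, p, rng):
    # flip each copy independently with probability p
    return [b ^ (rng.random() < p) for b in copies]

def decode(copies):
    # majority vote recovers the logical bit if at most one copy flipped
    return Counter(copies).most_common(1)[0][0]

rng = random.Random(0)
p = 0.05            # per-copy error rate (illustrative)
trials = 10_000
failures = sum(decode(apply_noise(encode(1), p, rng)) != 1
               for _ in range(trials))
# The logical error rate is roughly 3*p**2 (about 0.0075 here),
# well below the physical error rate p itself.
```

The catch, and the reason for the million-qubit estimate, is that the redundancy multiplies the number of physical qubits you need per logical qubit.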

The exact number depends on the type of problem you are trying to solve, the algorithm, and the quality of the qubits and so on, but as a rule of thumb, a million is a good benchmark to keep in mind. Below that, quantum computers are mainly of academic interest.

Having said that, let’s now look at what different types of qubits there are, and how far we are on the way to that million.

1. Superconducting Qubits

Superconducting qubits are by far the most widely used, and most advanced type of qubits. They are basically small currents on a chip. The two states of the qubit can be physically realized either by the distribution of the charge, or by the flux of the current.

The big advantage of superconducting qubits is that they can be produced with the same techniques that the electronics industry has used for the past five decades. These qubits are basically microchips, except, here it comes, they have to be cooled to extremely low temperatures, about 10-20 millikelvin. One needs these low temperatures to make the circuits superconducting; otherwise you can’t keep them in these neat two-level qubit states.

Despite the low temperatures, quantum effects in superconducting qubits disappear extremely quickly. This disappearance of quantum effects is measured by the “decoherence time”, which for superconducting qubits is currently a few tens of microseconds.

Superconducting qubits are the technology used by Google and IBM, and also by a number of smaller companies. In 2019, Google was the first to demonstrate “quantum supremacy”, which means they performed a task that a conventional computer could not have done in a reasonable amount of time. The processor they used for this had 53 qubits. I made a video about this topic specifically, so check that out for more. Google’s supremacy claim was later disputed by IBM. IBM argued that the calculation could actually have been performed within a reasonable time on a conventional supercomputer, so Google’s claim was somewhat premature. Maybe it was. Or maybe IBM was just annoyed they weren’t first.

IBM’s quantum computers also use superconducting qubits. Their biggest one currently has 65 qubits, and they recently put out a roadmap that projects 1000 qubits by 2023. IBM’s smaller quantum computers, the ones with 5 and 16 qubits, are free to access in the cloud.

The biggest problem for superconducting qubits is the cooling. Beyond a few thousand qubits or so, it’ll become difficult to fit them all into one cooling system, and that’s where it’ll become challenging.

2. Photonic quantum computing

In photonic quantum computing, the qubits are properties related to photons. That may be the presence of a photon itself, or the uncertainty in a particular state of the photon. This approach is pursued, for example, by the company Xanadu in Toronto. It is also the approach that was used a few months ago by a group of Chinese researchers who demonstrated quantum supremacy with photonic quantum computing.

The biggest advantage of using photons is that the devices can be operated at room temperature, and the quantum effects last much longer than for superconducting qubits: typically some milliseconds, but up to some hours in ideal cases. This makes photonic quantum computers much cheaper and easier to handle. The big disadvantage is that the systems become really large really quickly, because of the laser guides and optical components. For example, the photonic system of the Chinese group covers a whole tabletop, whereas superconducting circuits are just tiny chips.

The company PsiQuantum however claims they have solved the problem and have found an approach to photonic quantum computing that can be scaled up to a million qubits. Exactly how they want to do that, no one knows, but that’s definitely a development to have an eye on.

3. Ion traps

In ion traps, the qubits are atoms that are missing some electrons and therefore have a net positive charge. You can then trap these ions in electromagnetic fields, and use lasers to move them around and entangle them. Such ion traps are comparable in size to the qubit chips. They also need to be cooled but not quite as much, “only” to temperatures of a few Kelvin.

The biggest player in trapped-ion quantum computing is Honeywell, and the start-up IonQ uses the same approach. One advantage of trapped-ion computing is longer coherence times than superconducting qubits, up to a few minutes. The other advantage is that trapped ions can interact with more neighbors than superconducting qubits can.

But ion traps also have disadvantages. Notably, they are slower to react than superconducting qubits, and it’s more difficult to put many traps onto a single chip. However, they’ve kept up with superconducting qubits well.

Honeywell claims to have the best quantum computer in the world by quantum volume. What the heck is quantum volume? It’s a metric, originally introduced by IBM, that combines many different factors, like errors, crosstalk, and connectivity. Honeywell reports a quantum volume of 64, and according to their website, they too are moving to the cloud next year. IonQ’s latest model contains 32 trapped ions sitting in a chain. They also have a roadmap according to which they expect quantum supremacy by 2025, and expect to solve interesting problems by 2028.
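
For readers wondering how to interpret that number: quantum volume is conventionally reported as a power of two, 2^n, where n is the largest size at which the machine reliably runs a random "square" circuit (n qubits, n layers of gates). So Honeywell's quantum volume of 64 corresponds to 6-qubit, depth-6 circuits. A back-of-the-envelope sketch of the convention (the function name is ours; this is not Honeywell's actual benchmark code):

```python
import math

def square_circuit_size(quantum_volume: int) -> int:
    """Invert the convention QV = 2**n to get the square-circuit size n."""
    return round(math.log2(quantum_volume))

# A reported quantum volume of 64 means 6-qubit, depth-6 random circuits
# pass the benchmark:
size = square_circuit_size(64)   # 6
```

That is why a quantum-volume number, unlike a raw qubit count, grows only when error rates, crosstalk, and connectivity improve together.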

4. D-Wave

Now what about D-Wave? D-Wave is so far the only company that sells commercially available quantum computers, and they also use superconducting qubits. Their 2020 model has a stunning 5600 qubits.

However, the D-Wave computers can’t be compared to the approaches pursued by Google and IBM, because D-Wave uses a completely different computation strategy. D-Wave computers can be used for solving certain optimization problems that are defined by the design of the machine, whereas the technology developed by Google and IBM aims to create a programmable computer that can be applied to all kinds of different problems. Both are interesting, but comparing them is comparing apples and oranges.

5. Topological quantum computing

Topological quantum computing is the wild card. There isn’t currently any workable machine that uses the technique. But the idea is great: in topological quantum computers, information would be stored in conserved properties of “quasi-particles”, which are collective motions of particles. The great thing about this is that the information would be very robust to decoherence.

According to Microsoft, “the upside is enormous and there is practically no downside.” In 2018, their director of quantum computing business development told the BBC that Microsoft would have a “commercially relevant quantum computer within five years.” However, Microsoft had a big setback in February, when they had to retract a paper that claimed to demonstrate the existence of the quasi-particles they hoped to use. So much for “no downside”.

6. The far field

These were the biggest players, but there are two newcomers that are worth having an eye on.

The first is semiconductor qubits. They are very similar to superconducting qubits, but here the qubits are either the spin or the charge of single electrons. The advantage is that the temperature doesn’t need to be quite as low: instead of 10 millikelvin, one “only” has to reach a few Kelvin. This approach is presently pursued by researchers at TU Delft in the Netherlands, supported by Intel.

The second is nitrogen-vacancy systems, where the qubits are places in the structure of a carbon crystal where a carbon atom has been replaced with nitrogen. The great advantage of those is that they’re both small and can be operated at room temperature. This approach is pursued by the Hanson lab at QuTech, some people at MIT, and a start-up in Australia called Quantum Brilliance.

So far there hasn’t been any demonstration of quantum computation for these two approaches, but they could become very promising.

So, that’s the status of quantum computing in early 2021, and I hope this video will help you to make sense of the next quantum computing headlines, which are certain to come.

I want to thank Tanuj Kumar for help with this video.


  2. The D-wave computer is an annealing machine. This is a sort of quantum version of a neural network. It is an optimizing system.

    Quantum computers are based on linear algebra on a complex field. As such quantum computers only really solve linear algebra problems. The Shor algorithm is a Fourier transform method. A lot of mathematics, and by extension physics, involves linear algebra. Many mathematical theorems are solved by transforming the problem into linear algebra, where the methods are well known.

    The quantum computer will creep into the computing world slowly at first and in time will assume some level of importance. There are other architectures that will also assume more importance: artificial neural nets, spintronics, and others. As computing most probably must conform to the Church-Turing thesis, it is likely that these systems will be supplementary to a standard von Neumann computer, such as what we have.

    1. Hi Lawrence,
      Perhaps neural nets and other AI computing will be among the specialist applications that quantum computing will be used for, with hybrid systems made to increase efficiency.
      And WRT China, I think where they lack advantage is in collaboration and information exchange internationally; same with Russia. I personally think the greatest leaps will come with cross-pollination of methods and ideas.

  3. There is a challenge that involves China on this. It is interesting how China has become the bogie-man these days. Their government has been playing dishonestly for many years, but American companies were making profits anyway, so nobody paid attention. It took t'Rump to raise the alarm, but his way of doing this was hopelessly wrong. With all his stuff about Kung-Flu and China-virus we now have a sickening east-Asian hate thing going on.

    China has its sights on gaining a monopoly on as many technological areas as possible. They have a lot of world outside their sphere of influence to surpass. Whether they succeed and if the US and EU (UK if there is such a thing before long) rise to the challenge is to be seen.

    Russia is also a bit of a challenge, but their military developments are being built on a weak national and economic basis. From the white Tsars, to the red Tsars and now the blue-grey Tsars this pattern has happened repeatedly.

    China wants to master a global quantum internet. EPR and other entanglements W, GHZ etc will probably be implemented on fiber optic and U-verse. They may succeed, and the logo image of Huawei looks a bit like a sliced up apple.

    It is too bad in a way that this all feeds into power games and militarization.

  4. ion traps - shouldn't it be positive charge (instead of negative)?

    1. Yes, sorry, I have fixed that in the text. Can't fix it in the video. It's in the info.

  5. Interesting video. So how long do you think before Shor's Algorithm will be run for non-trivial cases?

    1. Is it possible for humans to change the earth's axial tilt? Can we cancel climate change by that?

  7. Is Shor's Algorithm the fastest algorithm, or is the mindset on what is logically fastest wrong? I think fixing that issue would be necessary for understanding whether quantum or regular computers will be the fastest. The only example I have that can produce evidence of upsetting the notion of run time is a simple algorithm I figured out years ago.

    Take N odd, and instead of trying to find PQ as a rectangle, like kids would do with unit blocks, you find a unit-block trapezoid it converges to, and then that shape can be broken into a rectangle. While it can be simplified in computer speak, the geometry of it is: first try to make a right triangle with the square blocks. If the blocks make a right triangle, then, if it's of even height, break the triangle in half and flip it around and you've got a rectangle, which gives two factors. If it's of odd height, break off the nose and flip it up and you've got a rectangle, which gives factors. But likely there are remainder blocks at the bottom of the triangle, so you take rows off the top and throw them on the bottom till there is no remainder; you converge to a block trapezoid.

    The run time for this is bizarre. If one takes N odd, trying to find p or q, but say takes 3N instead, it actually often converges faster; it can pop out p or q or 3p or 3q or pq. This is because of the geometry: it will tend to converge to the largest factor below the initial triangle height first. That is why increasing the size can make it converge faster. Shor's algorithm fits the nice notion of log N convergence, not this chaos algorithm's convergence time of unknown period. Its runtime is like rolling a die, and sometimes it will be instant, no matter how large the key size of something like RSA. That is why I think it's important, for the discussion of quantum computers versus regular computers, that the computer-science notion of run time itself be challenged with evidence like the chaos algorithm above has produced.

    If we live in an evidence-based system, and out of the blue there pops up evidence that run time itself may need core logic retuning, then that needs to be done. I really should publish the algorithm and the data charts for how it defies the notion of runtime. But I can say with evidence that, while Shor's algorithm is nice and pretty, these screwy unexplored functions can at times beat it. And if a set of these chaos algorithms can be put together and run at the same time, there is possibly a high density of constant-run-time convergence where the size of N is relatively unimportant. An entire class of algorithms is unexplored. The beauty of math has no boundaries.
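
[Editor's note: one concrete reading of the "trapezoid" procedure above is the classic fact that an odd N can be written as a sum of consecutive integers, and any such run exposes a divisor. A sketch under that interpretation only; the function name is ours, and this is just one way to read the description:]

```python
def trapezoid_factor(n):
    """Factor an odd n > 1 by sliding a window of consecutive integers.

    A run lo + (lo+1) + ... + hi summing to n satisfies
    n = k * (lo + hi) / 2 with run length k = hi - lo + 1, so either
    the odd run length k or the odd sum lo + hi divides n.
    For a prime n this only ever yields the trivial pair (n, 1).
    """
    lo, hi, total = 1, 2, 3
    while total != n:
        if total < n:
            hi += 1
            total += hi       # grow the trapezoid at the bottom
        else:
            total -= lo
            lo += 1           # take a row off the top
    k = hi - lo + 1
    if k % 2 == 1:
        return (k, n // k)
    return (lo + hi, n // (lo + hi))

# trapezoid_factor(15) -> (5, 3); trapezoid_factor(21) -> (7, 3)
```

Like trial division, this sliding window takes a number of steps that grows with the size of the smaller factor, so it does not threaten the run-time analysis behind Shor's algorithm.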

    1. FWIW, Wikipedia reports that there are multiple cryptographic systems that are not breakable by the known quantum computer algorithms, and that several of these date back to the last century*. It'll be a minor irritation to have to switch to a different system, but my understanding is that software that implements quantum-safe cryptography has already been written and is ready to use.

      *: At least two (from the 1970s) predated Shor's Algorithm, and a couple more were invented after that.

  8. I am surprised that when a quantum computer needs about a million qubits, Google already declared quantum supremacy with only 53 qubits. If this claim is correct, then with a million qubits the quantum computer will be really fantastic. Do you agree?

    1. FWIW, the "problem" that Google "solved" in achieving "quantum supremacy" was simulating quantum gates. It's not particularly surprising (to me, anyway) that quantum gates are good at simulating quantum gates, but, whatever. The main spokesperson for quantum computing has been very specific in stating that this "problem" and its "solution" are unrelated to anything anyone would ever want or need to do with a computer, but he insists that (a) it's true that quantum supremacy has been achieved, and (b) it's really good that someone found something to actually do with current quantum computers.

      As I said, "whatever".

      I will admit, though, that as a 1970s/1980s generation Comp. Sci. type, I'm surprised how few things other than Shor's algorithm have been found that can be done with quantum computers. According to a recent blog post at a Comp. Sci. blog, there's really only one other algorithm. And it's been 25 years.

  9. The algorithms which supposedly demonstrate "quantum speedup" tend to have caveats, for example the quantum Fourier transform part of Shor's algorithm would scale well but the exponentiation part which is required to load your number (and test integer) into the QFT is much heavier. There are algorithms which supposedly demonstrate that an oracle can be interrogated once in order to obtain all the information about it, but the oracle is necessarily part of your quantum circuit so you already knew how to program it. They tend to rely on a conditional-NOT gate kicking its phase back to the control qubit if your control and target qubits are not in pure |0> or |1>.

    (Generally, even the problem of how to make the most efficient transpilation of a quantum circuit onto real hardware isn't even solved.)

    But "entanglement" isn't always necessary, but rather superposition.

    (I make material for semiconductor qubits by the way, but I'm not affiliated with TU Delft).

    1. For oracle problems, if you put your oracle on one half of the computer chip and the algorithm circuit on the other half, I really don't see why this wouldn't demonstrate quantum speedup.

  10. While Penrose has written extensively on this and related topics, it's all philosophic speculation with not one whit of actual evidence behind it.

  12. Penrose's idea of humans performing quantum computations is not likely right. For one thing, bounded-error quantum polynomial time (BQP) is a subset of PSPACE, which is within the set of algorithms that satisfy the Church-Turing thesis --- well, modulo oracle inputs. This most likely means the human brain does not perform self-referential loop calculations that skirt the limits set by Gödel and Turing.

  13. Quantum computing is a scam to pump up stock; the base physics of it is flawed/wrong. They will use specialized hardware (e.g. CUDA cores) with AI to get certain calculations done and then will call it a quantum computer.
    Let's look at other things like light computing and see what goes on there.

    1. Lee, I agree entirely. The 'ideal' they still aim for remains an infinite distance away because the theory behind it is wrong. The Swiss banks had the good sense to turn down Anton Zeilinger's proposal for quantum cryptographic security because it was founded on Popper's 'mud'. Those throwing millions into trying to develop true quantum computers may also one day see through the hype. Uncertainty has a real fundamental physical cause (SpringerNature paper imminent) and I suggest it can't be overcome.

    2. Hi Lee and Peter,
      so you both think Dr. Hossenfelder is mistaken about the current developments, or what?

    3. C Thompson,

      Those are guys who don't understand how quantum computing works and who also haven't noticed, apparently, that as a matter of fact it does work, and that we know -- again, for a fact -- that the theory behind it is correct (in the parameter range tested, etc etc).

      The world's full of people who have strong opinions on things they know very little about.

    4. Dr. Hossenfelder,
      Indeed. I wondered what they thought made them better-informed about quantum computing that you supposedly missed with your well-researched and comprehensive summary, and why they saw fit to comment thusly on this blog, with no evidence to back up their claims, especially Lee's comment.

    5. C Thompson,

      Yes, you are asking very good questions...

  14. 1000000 qubits? Still a lot of questions.

    I think I can show that even this wouldn't live up to the hype with the following thought experiment:

    For the sake of discussion, let's imagine we want to break an encryption scheme which uses 100-digit prime numbers for keys. So you would have to factor a 200-digit number into a couple of 100-digit primes to break the code.

    Now this is going to be pretty tough. Consider how many 100-digit prime numbers there are, and that defines the space you have to search with your quantum code breaker in order to find your answer.

    Now remember that getting "close" with some kind of refinement process isn't going to hack it. The nature of the beast is such that you either find the answer or you don't. What is more, you are constrained in your search because you have to deal with the whole search space directly as a whole, because the information held by your superposition is only held holistically. You can't look at just a part of the space and hope to find your answer.

    So just how big is our search space? A reasonable approximation for discussion purposes is the set of all 100-digit integers. So let's get a handle on just how large this search space is on an intuitive level.

    Let's look for our needle in a haystack by getting an idea how large the haystack is. Assume 0.5 mm x 0.5 mm x 3 cm as our needle size. Now what is the size of the haystack?

    The volume of our needle would be 7.5 mm^3 and the volume of our haystack would be 7.5 x 10^100 mm^3

    Converting 1 light-year to millimeters we have:
    (1 lightyear) = 365.25x24x60x60x186000x5280x12x2.54x10 = 9.45x10^18 mm

    or (1 lightyear) / 9.45x10^18 = 1 mm

    Which means 1 cubic lightyear / (8.43x10^56) = 1 cubic millimeter

    So our haystack is (7.5x10^100 / 8.43x10^56) cubic lightyears

    or 8.897x10^43 cubic lightyears

    Assuming the Universe is a cube 93 billion lightyears on a side, then the volume of the Universe is about 8.04x10^32 cubic lightyears.

    So 8.897x10^43 / 8.04x10^32 = a haystack about 111 billion times as big as the whole of the known Universe.

    In other words we need to find and isolate 1 needle in a haystack over 100 billion times as big as our whole Universe.

    Somehow I doubt we will ever achieve such a feat even with a million qubit quantum computer because our error rate would have to be low enough to distinguish that one prime number "needle" from all other 100 digit prime numbers without error.
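
[Editor's note: the arithmetic in the comment above checks out; a quick verification of the quoted numbers, under the same assumptions (a 0.5 mm x 0.5 mm x 30 mm needle, ~10^100 candidates, and the Universe modeled as a cube 93 billion light-years on a side):]

```python
# One light-year in millimeters: days * hours * minutes * seconds
# * miles/second * feet/mile * inches/foot * cm/inch * mm/cm
LY_MM = 365.25 * 24 * 60 * 60 * 186000 * 5280 * 12 * 2.54 * 10  # ~9.45e18

needle_mm3 = 0.5 * 0.5 * 30           # 7.5 mm^3 per needle
haystack_mm3 = needle_mm3 * 1e100     # one needle per 100-digit candidate
haystack_ly3 = haystack_mm3 / LY_MM ** 3   # ~8.9e43 cubic light-years

universe_ly3 = (93e9) ** 3            # ~8.0e32 cubic light-years
ratio = haystack_ly3 / universe_ly3   # ~1.1e11 observable Universes
```

Whether that intuition about search space applies to Shor's algorithm, which exploits the period-finding structure of factoring rather than brute-force search, is a separate question.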

