As the technological progress encoded in Moore's Law slows down, computer scientists are turning to alternative computational methods, such as the superconducting qubits of quantum processors, to generate computational gains in the future.

Jeffrey Welser, vice president and laboratory director at IBM Research in Almaden, spoke about quantum computing at the 49th annual Semicon West chip manufacturing trade show in San Francisco last week. I caught up with him to get a layman's view of quantum computing.

IBM also showed part of its IBM Q system at the show, which gives an idea of how much cooling technology has to be built around a current quantum processor to ensure that its calculations are accurate.

Binary digits (ones and zeros) are the basic units of information in classical computers. Quantum bits, or qubits, are built on a much smaller scale, and a qubit can be in a state of 0, 1, or both at a given time. Quantum computers can therefore handle very complex calculations in parallel, but they require a great deal of manufacturing precision. IBM is working to improve this, and it may be years before the improvements consolidate and give quantum computing the chance to outperform classical computers, Welser said.

In a quantum processor, superconducting qubits, or quantum bits, process the quantum information and send the calculation results through the system via microwave signals. The whole contraption around the processor exists to cool it down as much as possible. The quantum processor also has to sit inside a shield that protects it from electromagnetic radiation.

Here is an edited transcript of our interview.

**VentureBeat: The usual question is, what the hell is quantum computing?**

**Jeff Welser:** Quantum computing is a form of computation that takes advantage of quantum effects that we believe can make certain types of algorithms much more efficient than classical ones. The basic unit for a quantum computer is something we call a quantum bit, a qubit. We are all familiar with regular bits, a one or a zero. That's what we use for normal computation. A qubit can also be a one or a zero, but since it is a quantum bit, it can be in a superposition of one and zero at the same time. It has some probability of being either of these.

In addition, you can entangle two qubits, or hundreds or thousands of qubits, and if you measure one of them, you determine the state of all of them instantly, due to entanglement. In a certain sense, that gives you the ability to do a massively parallel calculation. For algorithms or problems that map to it, you can do things exponentially faster or better than with a classical system.
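The "measure one, determine all" property can be illustrated with a minimal pure-Python sketch. This is only a classical toy simulation (not IBM's software), with the two-qubit Bell state hard-coded:

```python
import random

# State vector of two qubits: amplitudes for |00>, |01>, |10>, |11>.
# The Bell state (|00> + |11>)/sqrt(2) is maximally entangled.
bell = [2 ** -0.5, 0.0, 0.0, 2 ** -0.5]

def measure(state):
    """Sample a joint measurement; probabilities are |amplitude|^2."""
    probs = [abs(a) ** 2 for a in state]
    r = random.random()
    total = 0.0
    for outcome, p in enumerate(probs):
        total += p
        if r < total:
            return outcome >> 1, outcome & 1  # (first qubit, second qubit)
    return 1, 1

# The two qubits always agree: outcomes are only ever 00 or 11,
# so measuring the first qubit instantly fixes the second.
samples = [measure(bell) for _ in range(1000)]
assert all(a == b for a, b in samples)
```

Each qubit on its own looks like a fair coin, yet the pair is perfectly correlated, which is what distinguishes entanglement from two independent random bits.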

Examples of things this does well are chemistry and materials, which of course are governed by quantum mechanics. It's all quantum effects. You can simulate those atoms and their interactions with a quantum computer much more accurately and at much larger scales. The example in the keynote was the caffeine molecule. It is an important molecule for us every day. It has about 95 electrons, so it is not a particularly large molecule, but if you want to simulate it on a classical computer, you would need about 10^48 classical bits. For reference, there are about 10^50 atoms on planet Earth. Obviously, you will never do that.
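The figures quoted here line up with simple state-vector arithmetic: simulating n qubits classically takes on the order of 2^n amplitudes, and 2^160 is roughly 10^48, matching the caffeine estimate. A back-of-envelope check:

```python
import math

# An n-qubit quantum state has 2**n complex amplitudes, so simulating
# it exactly on a classical machine needs on the order of 2**n numbers.
def classical_amplitudes(n_qubits: int) -> int:
    return 2 ** n_qubits

# 160 fault-tolerant qubits correspond to roughly 10**48 amplitudes,
# the figure quoted for simulating caffeine classically.
print(math.log10(classical_amplitudes(160)))  # ≈ 48.2
```

This exponential blow-up is exactly why 160 good qubits can, in principle, stand in for a classical memory comparable to the atom count of the planet.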

With a quantum system, if it were a very robust and fault-tolerant quantum system, you could do it with only 160 qubits. The system here is a model of our 50-qubit system. We are not that far off today. If you go to the IBM Q website, you can access a 16-qubit system that you can play with for fun. In a sense, we still have some years to go before we really have something that has value over classical systems, but it is not as far off as we used to think.

**VentureBeat: What kind of physical space are we talking about?**

**Welser:** If you look at the system, the reason it is structured the way it is, really, is about isolating the chip. The chip is down in that lower section where all those wires come in. That is the actual quantum computing chip. If we were using it, there would be a canister and shielding around it, so you could not see it, but we have uncovered it here. When it is covered, this whole system goes down not only to low pressure but also to very low temperature, which is what really matters.

The upper part is at about 40 Kelvin, and then it steps down to four Kelvin, 100 milli-Kelvin, and so on. When you get to the bottom, it is at 15 milli-Kelvin, which is 15 thousandths of a degree above absolute zero. For reference, outer space is approximately two to three Kelvin. It's a couple of hundred times colder than outer space down there.

The reason you have it so cold is that you need to isolate it from any type of interference, including thermal interference. Any thermal energy will knock the qubits out of the superposition state that we want. Even with all this isolation, the qubits will only maintain their superposition for about 100 microseconds. That is actually very good. We are proud of that number. But it is still very little time, obviously. You must perform all of your calculations in that window before an error is generated.
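The 100-microsecond budget can be made concrete with a back-of-envelope sketch. The coherence time is from the interview; the per-gate duration and the simple exponential-decay model are my illustrative assumptions, not IBM figures:

```python
import math

T2 = 100e-6         # coherence time quoted in the interview: ~100 microseconds
gate_time = 100e-9  # assumed per-gate duration (~100 ns); illustrative only

def survival(t: float) -> float:
    """Toy exponential-decay model: odds the superposition survives time t."""
    return math.exp(-t / T2)

# Roughly how many sequential gates fit inside one coherence time:
max_gates = round(T2 / gate_time)
print(max_gates)        # about 1000 gates
print(survival(50e-6))  # ~0.61: decent odds of surviving half the window
```

Under these assumptions a circuit gets on the order of a thousand sequential operations before decoherence dominates, which is why error rates matter as much as qubit counts.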

**VentureBeat: Is this a demo unit now?**

**Welser:** This is a demonstration unit, yes. The components are all there. In theory, you could run it. But the vacuum systems and the equipment around them are missing. The ones that are running today are in the basement of our Yorktown Heights lab in New York. We have several systems in operation there. Those are the ones you can access in the cloud. We put the first one online in May 2016. It was a five-qubit system. As I said, we now have a 16-qubit system that you can use for free, and we have a 20-qubit system for people who join our network. We have a network of companies and universities, more than 70 at this point, that can also access the 20-qubit system.

We have also put together an open source software infrastructure called Qiskit. It gives people the tools they need to try programming these systems. One of the challenges is, as you can guess, that it is a different kind of programming than what we are used to. Qiskit has different ways to manipulate qubits, if you understand that part. Over time, we are introducing libraries, so that a chemist could use a library of quantum algorithms. They would understand what the high-level algorithm does, and that would be translated to run on the quantum computer.
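The "different kind of programming" can be sketched in a few lines of plain Python. This is not Qiskit itself, just an illustration of the gate-level style such toolkits expose: states are amplitude vectors and gates are matrices applied to them.

```python
# Sketch of gate-level quantum programming (not Qiskit's actual API):
# a single qubit is a 2-vector of amplitudes, a gate is a 2x2 matrix.
SQ2 = 2 ** -0.5
H = [[SQ2, SQ2],    # Hadamard gate: puts |0> into an equal superposition
     [SQ2, -SQ2]]

def apply(gate, state):
    """Matrix-vector product: the effect of one gate on one qubit."""
    return [sum(gate[r][c] * state[c] for c in range(2)) for r in range(2)]

zero = [1.0, 0.0]       # the |0> basis state
plus = apply(H, zero)   # equal superposition of |0> and |1>
probs = [a ** 2 for a in plus]
print(probs)            # roughly [0.5, 0.5]: 50/50 measurement odds

# Applying H twice returns to |0>: quantum gates are reversible.
back = apply(H, plus)
```

A classical programmer composes instructions over bits; here you compose reversible linear operations over amplitudes, which is the mental shift libraries like Qiskit try to smooth over.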

**VentureBeat: What do people find it useful for at this point?**

**Welser:** Most people who are looking at it fall into three main areas. One is chemistry and materials discovery. For example, JSR, a large producer of semiconductor polymers, is a member. Samsung is a member. They are very interested: they believe that when large enough systems exist, it will help them discover new materials with different properties for whatever application is needed. Materials feed into what happens in consumer products, in cars, in batteries, and so on. That is where we believe that, in three to five years, there will be systems large enough to see real benefit from them. Right now these are just experiments.

The next one is optimization. We have J.P. Morgan Chase and Barclays as members. They are looking at using really large quantum Monte Carlo simulations or other optimization problems for bond pricing or predicting the behavior of very complex financial systems. Today we do that with very large supercomputers, but it's one of those things where, just as with the caffeine problem, you can only simulate so much. It's more like five years before you have a large enough system.

The other one is AI and machine learning. There are some machine learning problems that map to quantum systems and that we believe will allow you to use parameter sets and feature spaces much larger than what you can handle on standard systems. We published a paper about that roughly six months ago. That, again, is three to five, maybe five years out.

The one I mentioned, the one most people think of, is factoring, or cryptography: the idea that quantum computers can potentially factor very large numbers and therefore could break the Internet, break the encryption we use. It is true that if I had a large enough system, I could factor very large numbers, and the current types of encryption that we use on the Internet would be vulnerable at that point. But to get there you need a system with probably thousands of qubits, or even millions, and qubits that are very free of errors, which we do not have today. We are at least 10 years, if not 15 or 20, from having a system big enough to do it. There is no immediate concern there.

In the meantime, we already know encryption methods that we could use today on classical systems and that do not map well to a quantum computer. Even when you get a very large system, they would not be vulnerable. A form of cryptography called lattice cryptography, for example. We have plenty of time to implement that kind of thing. In fact, one of the things we tell our clients, because our clients are big players in industry and government, is that it is too early to worry about anything breaking the Internet.
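To give a flavor of what "lattice cryptography" means, here is a toy single-bit encryption in the learning-with-errors (LWE) style that underlies many lattice schemes. The parameters below are far too small to be secure; this is only an illustration of the mechanism, not a usable cipher:

```python
import random

# Toy learning-with-errors (LWE) encryption of one bit.
# Security rests on the hardness of recovering `secret` from noisy
# inner products, a problem not known to map well to quantum computers.
q, n, m = 97, 4, 16  # modulus, secret length, number of public samples

random.seed(1)
secret = [random.randrange(q) for _ in range(n)]
# Public key: random rows A and noisy products b = A.s + e (mod q).
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
b = [(sum(a * s for a, s in zip(row, secret)) + random.choice([-1, 0, 1])) % q
     for row in A]

def encrypt(bit: int):
    rows = random.sample(range(m), 5)  # random subset of public samples
    c1 = [sum(A[i][j] for i in rows) % q for j in range(n)]
    c2 = (sum(b[i] for i in rows) + bit * (q // 2)) % q
    return c1, c2

def decrypt(c1, c2) -> int:
    d = (c2 - sum(c * s for c, s in zip(c1, secret))) % q
    # The accumulated noise stays near 0 (bit 0) or near q/2 (bit 1).
    return 1 if q // 4 < d < 3 * q // 4 else 0

assert all(decrypt(*encrypt(bit)) == bit for bit in [0, 1, 1, 0])
```

Decryption works because the small per-sample errors sum to at most 5 in magnitude, well clear of the q/2 offset that encodes the bit; real schemes pick parameters so this noise gap survives much larger dimensions.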

If you are archiving data, or have data that you want to keep secure and private for 10, 15, 20 years into the future (think of the tape archives of all the data you are putting away), it is not too early to think about encrypting it with something like lattice cryptography, which is very feasible. Fifteen years from now, once quantum computers appear, it will be too late to re-encrypt the data in your archive. It is not too early to think about it that way.

**VentureBeat: How big is this effort within IBM right now?**

**Welser:** It is a strong focus. We have a lot of work in our laboratory in Yorktown Heights, as well as some programs running in the Albany laboratory and in the Zurich laboratory. Part of the reason for creating this large network of universities and companies is to get many people working in many different spaces. Of course, we will continue on the hardware side, as well as the software and algorithm side, but we want many people creating applications, because that is how we are going to figure out how to use this.

**VentureBeat: How many years have you been working on it?**

**Welser:** You could say that we have been working on this since 1981. In 1981 there was a very famous meeting among a group of physicists, co-sponsored by MIT and IBM. That is where Richard Feynman, the very famous physicist, basically coined the idea of quantum computing. He said he thought it would make sense to use quantum effects to do computation. He also pointed out that it might be necessary to use them if you ever wanted to do chemistry simulations.

That is where the idea began to percolate. People started to pull together some of the ideas about what it would take to build one: David DiVincenzo, a physicist who worked for IBM in the 1990s, put together a set of five criteria needed to build a quantum computer. We built our first seven-qubit system in the late 1990s using trapped ions, a completely different technology, just to show that it was possible. It was not particularly useful, but it proved the concept.

In terms of the version seen here, we started working on it about six or seven years ago, to figure out how it could be built. It is based on superconducting transmons; that is what the actual device at the bottom is. We started building that six or seven years ago, and, as I said, in May 2016 we put the first one online.

The IBM Q System One, which is our first commercial version, will go online in the near future. That is aimed at the many people who want a more robust system. We hope this continues to extend the work to more companies that are very aware of quantum computing but are more generalist.

**VentureBeat: There were many skeptics about this from the beginning. What are some of the milestones you've found that help overcome some of that skepticism?**

**Welser:** We are seeing it progress steadily. Much of the skepticism was that, in theory, only two algorithms had been shown to be faster on a quantum computer. There is Shor's algorithm, which does factoring, and Grover's algorithm, which is a type of search algorithm. But everything else was more speculation as to whether it was really faster or not.

We are starting to see papers published where people say, "Hey, I just did this, and if you scale it to a certain number of qubits, it is more than you could do on a classical system." People are starting to run simulations and demonstrate that you can do this. Part of the skepticism is breaking down.

The other thing is that we have started our own roadmap of increasing what we call the quantum volume. That means keeping the error rate low while increasing the number of qubits. That lets you run increasingly deep circuits and increasingly complex algorithms. These things are starting to make people think, "Well, this looks a lot more real." Nobody knows where we will end up, but people are starting to see that if you take it and combine it with classical computing in certain ways, you can get something that looks feasible.

**VentureBeat: Is there some kind of benefit from Moore's Law that I get from this?**

**Welser:** Not directly. There is probably no direct analog. But one thing we are considering is that we would like to double the quantum volume each year, similar to the way Moore's Law doubled the number of components. It is a more complicated problem, though, because to double the quantum volume you need more than just an increase in the number of qubits. That part is quite easy, since qubits are large compared to what we usually make: they are in the 40 nm range, compared with Moore's Law features below 10 nm today. We can easily make more qubits. That is not a problem. But if we do not improve the error rate of the qubits, then having more qubits does not help. You need to lower the error rate.
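The quantum volume metric can be made concrete. The 2^k formula below reflects how IBM has published the metric (2^k for the largest "square" circuit of k qubits and depth k the machine runs reliably); the starting year and value are illustrative assumptions, not IBM data:

```python
# Quantum volume is 2**k, where k is the largest size for which the machine
# reliably runs "square" circuits of k qubits and depth k. Doubling quantum
# volume every year therefore means growing k by one each year.
def quantum_volume(k: int) -> int:
    return 2 ** k

start_year, start_k = 2019, 4  # illustrative starting point, not IBM data
for year in range(start_year, start_year + 4):
    k = start_k + (year - start_year)
    print(year, k, quantum_volume(k))  # volume doubles each successive year
```

Because widening and deepening circuits both amplify errors, adding one to k each year demands a steady drop in error rate, which is the hard part Welser describes.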

We hope to find ways to improve that error rate on a regular basis, so that the quantum volume can improve following a pattern similar to Moore's Law. But the physics involved is very different now.
