Getting Started With Quantum Computing is Surprisingly Easy

Editor’s note: Frank Zickert, PhD is a speaker for ODSC West 2022 this November 1st-3rd. Be sure to check out his talk, “Getting Started With Quantum Bayesian Networks in Python and Qiskit,” there!

Quantum computing promises to be the most disruptive technology of the 21st century. It’s now up to you to set the course for whether this becomes a threat to your career or a once-in-a-lifetime opportunity to gain a competitive advantage.


Of course, quantum computing is nothing you learn in a day. But neither is learning how to program a classical computer.

Of course, the field of quantum computing is still evolving. But if you wait until it becomes mainstream, you'll have spoiled the opportunity to turn your knowledge into a competitive advantage. Instead, you'll be scrambling to keep up so as not to be left behind. Then, it becomes a threat.

Do you remember 2014? That year, breakthrough advances in deep neural networks changed the way we think of machine learning. Those who understood and mastered this technology back then were in pole position for high-paying jobs. The striking parallel is that artificial neural networks were not new in 2014, either. The idea dates back to the 1940s.

But in 2014, three factors finally unleashed AI on the world.

  • Cheap parallel computing
  • Vast amounts of data
  • Algorithmic improvements

I learned about artificial neural networks in college in 2004. During my semester abroad at Victoria University of Wellington, New Zealand, I took a class on artificial intelligence. However, I lost track of it. Later, I dedicated myself to business informatics and writing my Ph.D. thesis on effort estimations.

So, even though I have always been fascinated by artificial intelligence, I missed being ready for the machine learning revolution. That’s not going to happen to me again.

So, in 2018, I started to dive deep into quantum machine learning. Scientific papers and a few academic books were all I could find, so I was happy about every little piece.

But these quantum computing publications left me scratching my head. Most of the papers are pretty heavy on math and assume you're familiar with a lot of physics jargon. I could not even find an appropriate starting point, let alone guidance on how to structure my learning efforts.

Frustrated with my failed attempts, I spent hours searching on Google, hunting for practical tutorials, only to come up empty-handed.

I could see the potential value of quantum computing, especially for machine learning. Yet, I couldn't see how all these parts of quantum computing fit together. Entry-level material was hard to find. And practical guides were simply nonexistent. I wanted to get started, but I had nothing to show for my effort except a stack of quantum computing papers on my desk that I didn't understand.

Finally, I gave up on learning the theory first and returned to relying on my core competence: software development. I am a programmer at heart. And what do we programmers do? We open our IDE and write code. Then, we turn the knobs until something works the way we expect.

And guess what. It worked. And with that practical experience at hand, I managed to make sense of the theory later, too.

So, why don’t we start with practical quantum computing right away?

The first and most important lesson concerns the quantum bit, or qubit for short. The theory says it is in a state of superposition, that is, a complex linear combination of its basis states. Is everything clear? Probably not, but that's absolutely fine.
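For the curious, the theory fits in one line. In standard Dirac notation (just for reference; you won't need it for the code), a qubit state reads:

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad \alpha, \beta \in \mathbb{C},
\qquad |\alpha|^2 + |\beta|^2 = 1
```

When we measure the qubit, we obtain 0 with probability |α|² and 1 with probability |β|². If that doesn't click yet, don't worry; the code will make it tangible.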

So, let’s ignore the theory for now and start coding!

First, we need a working environment. This post guides you through the setup. We'll be coding in Python and using Qiskit, IBM's quantum SDK for Python.

Now, here’s the first program. The “hello-world” of quantum computing, if you will.

from qiskit import QuantumCircuit, execute, Aer
from qiskit.visualization import plot_histogram

# Create a quantum circuit with one qubit
qc = QuantumCircuit(1)

# YOUR CODE GOES HERE
qc.h(0)

# Measure the qubit
qc.measure_all()

# Tell Qiskit how to simulate our circuit
backend = Aer.get_backend('qasm_simulator')

# Run the simulation, returning the result
result = execute(qc, backend, shots=1000).result()

# Get the probability distribution
counts = result.get_counts()

# Show the histogram
plot_histogram(counts)

First, we create a QuantumCircuit and pass the number of qubits we want it to have (line 5). The QuantumCircuit is the program that runs on a quantum computer.

Second, we add some instructions to the circuit (line 8). We will learn about the instructions in a minute. Right now, only remember that your individual code goes here.

Third, we measure all the qubits (line 11). The theory tells us that a qubit is in the state of superposition only as long as you do not look at it. But, first, that’s not entirely true. And second, who cares about the theory, right? So, let’s just say we collect the values from the qubit. We always do this at the very end of the quantum circuit.

Fourth, we boot our quantum computer. Well, not quite. Instead, we create a backend that simulates a quantum computer (line 14).

Fifth, we run the QuantumCircuit on that backend 1,000 times (the shots parameter) and obtain the result (line 17).

Finally, we take the counts from the result (line 20) and display them in a histogram (line 23).

The following figure depicts (more or less) what you see.

Out of the 1,000 times we executed the circuit, we observed the qubit as 0 in 505 cases and as 1 in 495 cases.

When you run the code, the exact values will differ. They differ because the qubit is a probabilistic system. Whether you measure it as a 0 or a 1 depends on chance and, even more importantly, on the qubit's state.

Let’s, therefore, return to step two where we add the instructions to the circuit because this is where we manipulate the qubit state. In our case, we used the Hadamard operator (qc.h) to put the qubit into a state where the outcomes 0 and 1 are equally likely. The small differences occurred due to the empirical nature of running the circuit 1,000 times.
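To connect this back to the theory: applying the Hadamard operator to a qubit that starts in the state |0⟩ produces an equal superposition of the two basis states:

```latex
H\,|0\rangle = \frac{1}{\sqrt{2}}\bigl(|0\rangle + |1\rangle\bigr)
```

Both amplitudes are 1/√2, so each outcome occurs with probability (1/√2)² = 0.5, which is exactly the roughly 50/50 split the histogram shows.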

Essentially, manipulating the qubit state is all we do in quantum computing. The challenge is to manipulate the qubit in such a way that we measure it as the value that represents the solution to our problem.

For this purpose, the quantum computer provides us with a set of technical operators. And this is where it gets complicated because there are many different operators and their behavior is not as intuitive as the behavior of their classical counterparts.
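To give you a first taste, here are three of the most common single-qubit operators in their standard matrix form (this is textbook notation, not specific to Qiskit):

```latex
X = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix},
\qquad
Z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix},
\qquad
H = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}
```

X is the quantum counterpart of the classical "not": it swaps |0⟩ and |1⟩. Z flips the sign of the |1⟩ amplitude, and H is the Hadamard operator we applied with qc.h. Only X has a direct classical analog, which hints at why building intuition takes practice.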

In classical computing, we work with only very few logical operators. Most commonly, we use “and”, “or”, “not”, and “xor”. And we can easily specify their behavior in a truth table.
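As a quick illustration (plain Python, nothing quantum here), the complete truth table for these operators fits in a few lines:

```python
# Truth tables for the classical operators "and", "or", "xor", and "not".
# On 0/1 values, Python's &, |, and ^ implement and, or, and xor.
rows = []
for a in (0, 1):
    for b in (0, 1):
        rows.append((a, b, a & b, a | b, a ^ b, 1 - a))

print("a b | and  or  xor  not(a)")
for a, b, and_, or_, xor_, not_ in rows:
    print(f"{a} {b} |  {and_}    {or_}    {xor_}     {not_}")
```

Four rows capture everything there is to know about each two-input operator, which is precisely why classical logic feels so manageable.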

The main reason why there are only a few operators is that the underlying value is pretty simple. Logically, it is a boolean value that is either true or false. Computationally, it is either 1 or 0. And technically it is either current on or off. In the end, it is one out of two possible values. And indeed, one needs only a few basic operators that can be put together to construct more complex behaviors.

The qubit state, however, is not that simple. It is not one out of two values but a 2^n-dimensional vector of complex numbers, with n being the number of qubits. So, of course, there are more basic operators. But what's worse is that their effects are not quite so intuitive. And to top it off, we can't really look at the qubit state, because if we look at a qubit, it instantly collapses to one of two values, 1 or 0. It collapses to an ordinary bit.
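To see where the 2^n comes from, here is a minimal sketch in plain Python (no Qiskit needed; the helper `tensor` is mine, purely for illustration). Multi-qubit states are built by combining single-qubit state vectors with the tensor (Kronecker) product, which doubles the vector length with every added qubit:

```python
import math

# A qubit state is a vector of amplitudes.
# |0> is [1, 0]; after a Hadamard, the state is [1/sqrt(2), 1/sqrt(2)].
ket0 = [1.0, 0.0]
plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]

def tensor(u, v):
    """Tensor (Kronecker) product of two state vectors."""
    return [a * b for a in u for b in v]

# Build a three-qubit state one qubit at a time.
state = ket0
for _ in range(2):  # combine with two more qubits
    state = tensor(state, plus)

# 3 qubits -> 2**3 = 8 amplitudes
print(len(state))
```

With 30 qubits, the same construction would already require over a billion amplitudes, which is why we reason about the state mathematically instead of writing it out.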

But that doesn’t mean we can’t reason about the quantum state. We can very well. Unfortunately, if we want to reason about it precisely, we will need to draw on mathematics. As a result, you see all these equations in scientific papers and articles on quantum computing.

But let me tell you something. Quantum operators are not as counter-intuitive as they look at first sight. In fact, with a little experience, you can reason about them intuitively. Of course, you won’t build up the experience in a day. But if you start today, you’ll be in time for the quantum revolution.

The best way to start your quantum machine learning journey is with my upcoming session at ODSC West, Getting Started With Quantum Bayesian Networks in Python and Qiskit. We will start with the basic quantum operators and learn how to connect them to create Quantum Bayesian Networks (QBN). Like their classical counterparts, QBNs are probabilistic models for representing knowledge and reasoning about an uncertain domain. They are intuitive and easy to understand. Yet, they use fundamental quantum computing concepts. Therefore, QBNs are great tools to get your hands on quantum machine learning.

About the author/ODSC West 2022 Speaker:

Frank Zickert is a quantum machine learning engineer and the author of Hands-On Quantum Machine Learning With Python. He teaches quantum machine learning in an accessible way to help those without a degree in math or physics get started in the field.

In his research, Frank strives to use quantum machine learning to advance the field of knowledge graph-based natural language processing. He is also the Chief Technology Officer of Ihr MPE B+C, where he supports medical physicists in providing radiation protection services for clinical customers. Previously, he worked at Aperto (an IBM company) and Deutsche Bank.

Frank earned his Ph.D. in Information Systems Development from Goethe University Frankfurt am Main, Germany.

ODSC Community

The Open Data Science community is passionate and diverse, and we always welcome contributions from data science professionals! All of the articles under this profile are from our community, with individual authors mentioned in the text itself.
