# Discovering: Quantum Computing

## Episode 1: The strange case of the Qubit

# Intro

The Discovering series is a new format I’m experimenting with. The idea came from the notion that some subjects require more written content, and by extension more time, to cover the basics than a typical standalone article allows.

For each Discovering subject, at least two content ‘episodes’ are planned. The episodes build on one another and are released sequentially, which is also the order in which they are meant to be read.

This episode also marks the first time I have to credit a co-author. Machine learning and AI have made significant strides in the past few years, and one particular tool introduced itself to the world at the end of last year (2022): **ChatGPT**. It would be dishonest of me not to mention that I used this tool while writing this episode. In particular, ChatGPT lent a helping hand in sorting and picking the right material from my previous notes, in order to put more emphasis on the terminology and questions that readers would likely find most interesting.

With the credits out of the way, we can start with the principal topic of this article.

The subject of quantum computing is something I’ve been attempting to tackle for quite some time; I’ve been trying to write about it for at least half a year now. One would be right in thinking that this isn’t a subject that is easy to explain, let alone one with obvious practical use cases in this relatively new industry. To say that what follows is a layman’s interpretation of a very complex subject is an understatement, but here goes nothing! Let’s dive into the amazing world of quantum computing.

# Basic concepts

To cover the subject of the qubit (quantum bit) and quantum computing as a whole, we should take a few moments and look into quantum mechanics in search of answers.

Physicists have always been fascinated by the behavior of matter and energy at the atomic and, even more importantly, subatomic level. That research gave us the very foundations of modern technology in the form of transistors and computer memory, not to mention a deeper understanding of the physical world around us. When gathering data at the subatomic level, however, measurement becomes significantly more difficult. Two key principles emerge when measuring quantum systems: one is the concept of superposition, and the other is the collapse of a quantum state.

Superposition comes from the notion that tiny particles such as electrons and protons can indeed exist in multiple states simultaneously! Crazy, right? This means that a particle can, in a sense, occupy two or more locations at the same time. Not only that: at this level, a particle can hold several different properties at once.

The other key principle concerns measurement. Observing (or measuring) a particle leads to what’s called the observer effect: the particle collapses into a single, more “stable” state. It’s easy to see how this complicates any attempt to obtain tangible and predictable data, if a particle changes its properties every time it’s measured.
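One way to build intuition for the collapse is a toy simulation: keep a state as a pair of amplitudes, sample a measurement outcome from their squared magnitudes, then replace the state with the basis state that was observed. Here is a plain NumPy sketch (no quantum library involved; the 80/20 split and the random seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

# An unequal superposition: 80% chance of measuring 0, 20% chance of 1
state = np.array([np.sqrt(0.8), np.sqrt(0.2)])

# Measurement picks one outcome according to the squared amplitudes...
outcome = rng.choice([0, 1], p=np.abs(state) ** 2)

# ...and the state 'collapses' to the basis state that was observed,
# so any repeat measurement gives the same answer
state = np.array([1.0, 0.0]) if outcome == 0 else np.array([0.0, 1.0])
print(outcome, state)
```

Running this repeatedly (with different seeds) would give mostly 0s and occasionally 1s, which is exactly the unpredictability described above.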

To complicate things even further, quantum mechanics also introduces the concept of quantum entanglement, in which two particles are connected in such a way that the properties of one directly depend on the properties of the other. This means that by measuring the state of one particle we can directly affect the state of another. What’s the fun part? The distance between the two particles doesn’t seem to matter! With that we are knocking on the door of quantum teleportation, but that is a subject to explore in the future (I’m pushing the limits of my mind enough as it is with this topic).
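The correlations that entanglement produces can be illustrated with a toy calculation in plain NumPy (no quantum hardware or library involved), using the so-called Bell state, a standard example of a maximally entangled pair:

```python
import numpy as np

# Single-qubit basis states |0> and |1> as vectors of amplitudes
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# Build the Bell state (|00> + |11>) / sqrt(2) via the tensor product
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Measurement probabilities for the four two-qubit outcomes 00, 01, 10, 11
probs = np.abs(bell) ** 2
print(probs)  # [0.5 0.  0.  0.5]

# The outcomes 01 and 10 never occur: if one qubit is measured as 0 (or 1),
# the other is guaranteed to be found in the same state.
```

The vanishing probability of the mixed outcomes is the correlation at the heart of entanglement: the two measurement results always agree, no matter how far apart the particles are.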

With all that said, quantum mechanics provides the very foundation of quantum computing. Quantum computing is nothing more than a new way of ‘computing’ things and solving problems previously perceived as unsolvable, using the principles of quantum mechanics. At its core, quantum computing uses quantum bits, or qubits for short, instead of the traditional binary (‘1’ and ‘0’) units of information in computer science. More on that in just a ‘bit’ (I’m not sorry for the bad pun).

It is important to note that, despite the unquestionable relationship between the two, quantum computing and quantum mechanics are two completely separate fields. Understanding the principles of quantum mechanics is essential for understanding and developing quantum computing algorithms, on top of the ‘general’ knowledge of computer science and mathematics.

# What is a qubit?

In 1935, Albert Einstein, Boris Podolsky and Nathan Rosen published a scientific paper describing the phenomenon of quantum entanglement mentioned before. Since then, the concepts of quantum computing and even quantum cryptography have branched out from quantum mechanics.

A qubit is the basic unit of information in a quantum computer and is directly comparable to the traditional ‘bit’ in a classic computer system. The main difference is that a qubit can be in many more states than just the two (1 and 0) of the classic bit; this is the superposition that quantum mechanics teaches us about. In theory, this should dramatically improve the computational speed for certain types of problems. In a quantum computer, each qubit can be both a 0 and a 1 at the same time, with a certain probability for each value. This, again in theory, enables quantum machines to execute certain computational tasks in parallel, which leads to reduced computation times.
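In the standard notation, a qubit’s state is written α|0⟩ + β|1⟩, where the complex amplitudes α and β encode the probabilities of the two outcomes and must satisfy |α|² + |β|² = 1. A minimal NumPy sketch of that bookkeeping:

```python
import numpy as np

# A qubit state is a pair of complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
state = np.array([alpha, beta])

# The squared magnitudes give the probability of measuring 0 or 1
p0, p1 = np.abs(state) ** 2
print(p0, p1)  # 0.5 0.5 — an equal superposition

# Normalization check: the probabilities must sum to 1
assert np.isclose(p0 + p1, 1.0)
```

Note that the amplitudes themselves are complex numbers, not probabilities; it is only their squared magnitudes that behave like probabilities, which is what allows quantum algorithms to make amplitudes interfere and cancel.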

The idea of using quantum mechanics to perform computational tasks was first pioneered by Richard Feynman in the early 1980s. An interesting figure in his own right, he introduced the concept of nanotechnology and, more importantly for this topic, the mathematical expressions describing the movements and interactions of subatomic particles known as Feynman diagrams. The first demonstrated examples of quantum computing followed in the 1990s, using nuclear magnetic resonance (NMR) spectroscopy.

While qubits are very difficult to work with by the very nature of quantum mechanics, and even maintaining their quantum states is challenging to say the least, research into their potential benefits remains a promising and ongoing effort.

At the time of this writing, engineers and physicists are working to improve the overall robustness and ‘stability’ of qubits for practical applications.

# Use cases

Quantum computing might seem like the clear winner against its classic rival when it comes to computational speed: some calculations that would take the ‘opponent’ longer than the age of the known universe could be executed by a quantum computer in a few seconds. But that’s not the whole story.

Not all calculations can be performed faster on a quantum computer. Notably, tasks we take for granted in our daily routines, such as media playback, browsing the Internet, or typing a new blog article, can on average take longer on a quantum computer. The technology is still in its early stages of development, with news about new funding and faster quantum computers (more qubits) coming out at regular intervals. Where the quantum computer struggles the most is in its practical implementation: since it’s an emerging technology, many use cases for it have yet to be thought of.

A few fields where quantum computing might have a significant impact on today’s technology are cryptography, machine learning, AI and medicine. Cryptography keeps coming up in conversations and articles about quantum computing as the likeliest and easiest place to find a use for it. Breaking encrypted data that was previously thought unbreakable seems like something that would interest potential investors in this technology.

Machine learning and AI could see new optimization techniques that cut down the time spent training new models. This would lead to shorter turnarounds for new models based on new and updated data sources.

Simulations and prediction algorithms could be brought into the spotlight with much greater accuracy, enabling the discovery of new, previously unthought-of solutions. But these are just the use cases most talked about in the scientific world; the most impactful ones are probably waiting for the further development and adoption of quantum computing as a whole.

# Getting started with quantum computing

Let’s get this right out of the way: one does not need a quantum computer to get into quantum computing. A quantum computer simulator will most likely satisfy most use cases developers might have.

To start off a journey into the field of quantum computing, developers must first cover the basic concepts of quantum mechanics such as quantum entanglement, superposition and quantum gates. While there are countless resources available on the internet, The Strange Story of the Quantum (hence this episode’s subtitle) might be a good starting point. The book is quite popular in the quantum theory field and should be easily obtainable in paper, e-book or audiobook formats.

Learning a quantum computing-specific programming language, or a framework that can be used to experiment on quantum computer simulators, should be the next step for anyone interested in this topic. There are a few programming languages specifically designed for quantum computing. The first one is Microsoft’s own Q#, the documentation for which is quite exhaustive. Familiarity with C# is also recommended if you decide to go the Q# route.

While the documentation for the Quipper programming language might seem a bit out of date, it can’t be denied that it is at least functional.

Like with any new programming language, experimentation is key. By experimenting and running tests on quantum simulators, one can form a better understanding of how abstract concepts apply in this new field. The process is best compared to switching between programming paradigms, such as object-oriented and functional programming: both can achieve almost the same results in the end, but to fully utilize them, one has to understand the way they are meant to be used and change one’s approach to problem solving accordingly.

Taking a dedicated course on the topic from one of the popular teaching platforms such as Udemy or Coursera can provide a nicely structured, step-by-step way to learn more and apply the gathered knowledge on various testing platforms or quizzes.

After grasping the basic concepts of quantum mechanics and quantum computing, learning about the various quantum computing algorithms should solidify any gathered knowledge about the topic.

Some of the most notable algorithms and building blocks in the world of quantum computing are:

- Quantum gates: The basic building blocks of quantum circuits, used to perform operations on qubits
- Shor’s Algorithm: An efficient algorithm for factoring large integers
- Grover’s Algorithm: Used for fast searching through unsorted databases
- QFT (quantum Fourier transform): A quantum version of the discrete Fourier transform
- QPE (quantum phase estimation): An algorithm used to estimate the eigenvalues (phases) of a unitary operator, a subroutine in many other quantum algorithms
- QML (quantum machine learning): A set of algorithms used for performing machine learning tasks such as supervised and unsupervised learning
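To give a concrete taste of one entry on that list: mathematically, the QFT on n qubits is just the discrete Fourier transform applied to the 2^n amplitudes of the state. Below is a plain NumPy sketch (not a circuit implementation) that builds the QFT matrix and verifies that it is unitary, as any valid quantum gate must be:

```python
import numpy as np

def qft_matrix(n_qubits):
    """Return the 2^n x 2^n unitary matrix of the quantum Fourier transform."""
    dim = 2 ** n_qubits
    omega = np.exp(2j * np.pi / dim)  # primitive dim-th root of unity
    j, k = np.meshgrid(np.arange(dim), np.arange(dim))
    # Entry (j, k) is omega^(j*k), normalized so columns have unit length
    return omega ** (j * k) / np.sqrt(dim)

qft = qft_matrix(2)

# Unitarity check: U†U must equal the identity matrix
print(np.allclose(qft.conj().T @ qft, np.eye(4)))  # True
```

On real hardware the same transform is decomposed into Hadamard and controlled-phase gates, which is what makes it efficient; the matrix form here is only meant to show what the operation computes.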

Qiskit, a popular framework for working with quantum computing, includes implementations of the aforementioned algorithms and many more.

Below is an example implementation of a quantum circuit that puts two qubits into an entangled superposition, using the Qiskit framework:

```python
from qiskit import QuantumCircuit, execute, Aer

# New circuit with two qubits
qc = QuantumCircuit(2)

# Apply a Hadamard gate to the first qubit to put it into a superposition
qc.h(0)

# Apply a controlled-NOT gate between the first and second qubits
qc.cx(0, 1)

# Execute the circuit on a Qiskit statevector simulator
backend = Aer.get_backend('statevector_simulator')
job = execute(qc, backend)
result = job.result()

# Get the final statevector
statevector = result.get_statevector()
print(statevector)
```

Note that this snippet targets the Qiskit versions current at the time of writing; in more recent releases the Aer simulator ships as the separate `qiskit-aer` package, so the import line may need adjusting.
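For readers who want to see what repeated measurements of such a circuit would produce without installing Qiskit, here is a NumPy-only sketch (the seed and shot count are arbitrary choices) that samples outcomes from the final Bell statevector (|00⟩ + |11⟩)/√2:

```python
import numpy as np

rng = np.random.default_rng(42)

# Final statevector of the circuit above: (|00> + |11>) / sqrt(2)
statevector = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(statevector) ** 2

# Sample 1000 'shots', mapping outcome indices to bitstrings
outcomes = ['00', '01', '10', '11']
shots = rng.choice(outcomes, size=1000, p=probs)
counts = {o: int((shots == o).sum()) for o in outcomes}

# Counts land roughly 50/50 between '00' and '11'; '01' and '10' never occur
print(counts)
```

This mirrors what a real measurement run returns: individual shots are random, but the statistics reveal the perfectly correlated structure of the entangled state.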

Those with significant programming experience might have an easier time picking up a new programming language designed for this topic, but nonetheless, the general path outlined in this section remains almost the same: the basics of quantum mechanics still need to be clear before moving forward with a career in quantum computing.

# Career opportunities

At the time of this writing, there are significantly fewer jobs on the market for those seeking a career in quantum computing than for more conventional computer science technologies. But like with any emerging technology, this can quickly change, and new job openings might be closer than we think.

There aren’t many places where quantum computing is being researched and utilized. The general perception is that only large corporations have room in their budgets for research and development in this field. This is not necessarily true: there are smaller companies seeking to get a foot in the door of what might be the “next big thing”.

For the few job postings that are out there (again, at the time of this writing), a couple of pieces of advice apply. Without learning the principles of quantum mechanics and having a strong foundation in mathematics, especially linear algebra, the chances of landing a position are quite slim.

Writing research papers on the subject and extensive networking are bonuses that can land a person a new career in the exciting new world of the qubit.

# Final notes

This concludes a short first episode in what should be a series of posts dedicated to the subject of quantum computing. Again, the turnaround on this article has been significantly shortened thanks to amazing tools such as ChatGPT.

It is important to note that the principles of quantum computing can be difficult to understand and are often counterintuitive compared to what we are used to in more traditional mechanics. The field is constantly growing, and those working hard on it are still trying to understand the full implications of quantum mechanics when applied to the world of computing.