What is Quantum Computing and How is it Useful in Artificial Intelligence?

Introduction

After decades of slow progress with no promise of success, quantum computing is suddenly buzzing! Nearly two years ago, IBM made a quantum computer available to the world: the 5-quantum-bit (qubit) resource they now call the IBM Q Experience. It was more like a toy for researchers than a way of getting any serious number crunching done, but 70,000 users worldwide have registered for it, and the qubit count in this resource has since quadrupled. With quantum computing promising so much and data science currently at the helm, what does quantum computing offer AI? Let us explore that in this blog!

 

What is Quantum Computing?

A traditional computer works on bits of data that are binary, or Boolean, with only two possible values: 0 or 1. In contrast, a quantum bit (qubit) can hold the value 1, 0, or a superposition of 1 and 0. Physically, qubits are realized in systems such as atoms and molecular structures, but many find it helpful to think of a qubit as a binary data unit with superposition.
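To make this concrete, here is a minimal sketch (assuming nothing beyond NumPy) of the linear algebra behind a qubit in superposition:

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)  # the basis state |0>
one = np.array([0, 1], dtype=complex)   # the basis state |1>

# An equal superposition of |0> and |1>: both amplitudes are 1/sqrt(2).
psi = (zero + one) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -> a 50/50 chance of reading out 0 or 1
```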

The use of qubits makes a practical quantum computer quite difficult to build. Traditional hardware has to be redesigned to read and manipulate values that are not simply 0 or 1. Another idea, known as entanglement, suggests that accurate values cannot be read out of qubits in the way that traditional computers read binary bits. It has also been suggested that a quantum computer is based on a non-deterministic model, where the computer has more than one possible outcome for any given case or situation. Each of these ideas provides a foundation for the theory of actual quantum computing, which is still problematic in today's tech world.

 

Use of Quantum Computing

Let us look at some of the use cases of quantum computing below. This will help you understand the current scale of its applications.

Use cases include:

1. Cryptography

The most common area people associate quantum computing with is advanced cryptography. The encryption we rely on today is built on the factorization of very large numbers (integers with 300+ digits), which is infeasible for ordinary computers to break. For quantum computers, this decryption could become trivial, so traditional encryption could be broken much faster; at the same time, quantum techniques promise much stronger protection of our digital lives and assets.
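To see the classical bottleneck concretely, here is a hypothetical sketch of brute-force factoring with toy numbers chosen for illustration; Shor's algorithm on a quantum computer would factor in polynomial time instead:

```python
# Naive trial division must test on the order of sqrt(n) candidates:
# feasible for toy semiprimes, hopeless for 300-digit RSA-style moduli.
def trial_division(n):
    """Return the smallest prime factor of n by brute force."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime

print(trial_division(104729 * 1299709))  # ~10^5 divisions -> 104729
# A 300-digit modulus would need roughly 10^150 divisions.
```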

2. Aviation

Quantum technology could enable much more complex computer modelling of aeronautical scenarios. Aiding the routing and scheduling of aircraft has enormous commercial benefits in time and cost. Large companies like Airbus and Lockheed Martin are actively researching and investing in the space to take advantage of the computing power and optimization potential of the technology.

3. Data Analytics

Quantum mechanics and quantum computing can help solve problems on a huge scale. A field of study called topological analysis, which describes how geometric shapes behave in specific ways, involves computations that are simply impossible for today's conventional computers because of the size of the data sets involved.

NASA is looking at using quantum computing to analyze the enormous amounts of data it collects about the universe, as well as to research better and safer methods of space travel.

4. Forecasting

Predicting and forecasting various scenarios relies on large and complex data sets. Traditional simulation of, for example, the weather is limited in the number of inputs that classical computing can handle: add too many factors, and the simulation takes longer than the actual weather takes to evolve.

5. Pattern Matching

Finding patterns in data and using them to predict future patterns is highly valuable. Volkswagen is currently looking into how it can use quantum computing to inform drivers of traffic conditions 45 minutes in advance. Matching traffic patterns and predicting the behaviour of a system as complex as modern-day traffic is so far not possible for today's computers, but this is going to change with quantum computers.

6. Medical Research

There are literally billions of possibilities for how a compound could react across the human body, and even more when you consider that a drug might be administered to billions of people, each with slight differences in their makeup.

Today, it takes pharmaceutical companies 10 or more years, and often billions of dollars, to discover a new drug and bring it to market. Improving the front end of the process with quantum computing could dramatically cut costs and time to market, make it easier to repurpose pre-approved drugs for new applications, and empower computational chemists to make new discoveries faster, discoveries that could lead to cures for a range of diseases.

7. Self-Driving Cars

Car companies like Tesla and tech companies like Apple and Google are actively developing driverless cars. Not only will these improve the standard of living for most people, they will also cut pollution, reduce congestion and bring a host of other benefits.

 

AI and Quantum Computing

Quantum computing is not a replacement for AI; see it more as an enhancement. AI is the larger problem we are trying to solve, and quantum computing helps us optimize its sub-tasks. Currently, quantum computing has a limited scope in AI because the technology is still new. But at a broad level, quantum computing affects the following tasks in AI:

1. Simulation
1. Simulation
Simulation modelling is the process of creating and analyzing a digital prototype of a physical model to predict its performance in the real world. It is used to help designers and engineers understand whether, under what conditions, and in which ways a part could fail, and what loads it can withstand. This modelling can also help predict fluid flow and heat transfer patterns. Simulation software approximates the real working conditions of the system under study.
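As a toy illustration of this kind of modelling, here is a minimal sketch (NumPy only, with made-up parameters) of a finite-difference simulation of heat spreading along a rod:

```python
import numpy as np

n_points, n_steps = 50, 500
alpha, dx, dt = 0.01, 1.0, 1.0      # made-up diffusivity, grid and time step
temp = np.zeros(n_points)
temp[0] = 100.0                     # one end of the rod is held hot

for _ in range(n_steps):
    # Each interior point relaxes toward the average of its neighbours.
    temp[1:-1] += alpha * dt / dx**2 * (temp[2:] - 2 * temp[1:-1] + temp[:-2])
    temp[0] = 100.0                 # re-apply the boundary condition

print(temp.round(1))                # approximate heat profile after 500 steps
```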

2. Optimisation
An optimization problem is the problem of finding the best solution from all feasible solutions. Optimization problems can be divided into two categories depending on whether the variables are continuous or discrete. An optimization problem with discrete variables is known as a discrete optimization problem; in it, we look for an object such as an integer, permutation or graph from a countable set. Problems with continuous variables include constrained problems and multimodal problems.
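A tiny, self-contained illustration of discrete optimization, using made-up distances and brute force over the countable set of all route permutations:

```python
from itertools import permutations

# Made-up pairwise distances between four cities.
dist = {('A', 'B'): 5, ('A', 'C'): 9, ('A', 'D'): 4,
        ('B', 'C'): 3, ('B', 'D'): 7, ('C', 'D'): 6}

def d(x, y):
    # Distances are symmetric, so look up the pair in either order.
    return dist.get((x, y)) or dist[(y, x)]

def route_length(route):
    # Total length of the closed tour visiting every city exactly once.
    return sum(d(route[i], route[(i + 1) % len(route)])
               for i in range(len(route)))

# Brute force: evaluate every permutation and keep the shortest tour.
best = min(permutations('ABCD'), key=route_length)
print(best, route_length(best))
```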

3. Sampling
Data sampling is a statistical analysis technique used to select, manipulate and analyze a representative subset of data points to identify patterns and trends in the larger data set being examined. It enables data scientists, predictive modelers and other data analysts to work with a small, manageable amount of data about a statistical population to build and run analytical models more quickly, while still producing accurate findings.
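A minimal sketch of the idea, using synthetic data: a 0.1% random sample recovers the population mean almost exactly.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
population = rng.normal(loc=50, scale=10, size=1_000_000)  # synthetic "big" data

sample = rng.choice(population, size=1_000, replace=False)  # a 0.1% random sample

print(f"population mean: {population.mean():.2f}")
print(f"sample mean:     {sample.mean():.2f}")  # close, at a fraction of the cost
```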

 

Benefits of Quantum Computing in AI

1. Less time in training
The big advantage of quantum computing is that it allows an exponential increase in the number of dimensions it can process. While a classical perceptron can process an input of N dimensions, a quantum perceptron can process 2^N dimensions.

2. Better Results
It turns out that a quantum perceptron can easily classify patterns in simple images. The quantum model of the perceptron can be used as an elementary nonlinear classifier of simple patterns.

3. Achieving parallelism
The earliest quantum algorithms were exponentially faster than any possible deterministic classical algorithm. Quantum computing can solve such problems because it is capable of simultaneously evaluating f(0) and f(1). This possibility stems from 'quantum parallelism'. Quantum parallelism allows computing 2^n entries for a state consisting of n qubits. That is, from linear growth in the number of qubits, we achieve exponential growth in computing space.
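A minimal NumPy sketch of where the 2^n comes from: applying a Hadamard gate to each of n qubits turns |00...0> into an equal superposition over all 2^n basis states.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # single-qubit Hadamard gate

n = 3
state = np.zeros(2 ** n)
state[0] = 1.0                                # start in |000>

# Build the n-qubit Hadamard as a tensor (Kronecker) product, then apply it.
H_n = H
for _ in range(n - 1):
    H_n = np.kron(H_n, H)
state = H_n @ state

print(len(state))   # 8 amplitudes from just 3 qubits (2**n growth)
print(state)        # every amplitude equals 1/sqrt(8): an equal superposition
```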

 

Challenges

  • Sensitivity to interaction with the environment 
    Quantum computers are extremely sensitive to interaction with their surroundings, since any interaction (or measurement) leads to a collapse of the state function. This phenomenon is called decoherence. It is extremely difficult to isolate a quantum system, especially one engineered for computation, without it getting entangled with the environment. The larger the number of qubits, the harder it is to maintain coherence.
  • Error-correction
    Quantum error correction (QEC) is used in quantum computing to protect quantum information from errors due to decoherence and other quantum noise. It is essential for fault-tolerant quantum computation that can deal not only with noise on stored quantum information, but also with faulty quantum gates, faulty state preparation, and faulty measurements. Copying quantum information is not possible due to the no-cloning theorem, which seems to present an obstacle to formulating a theory of quantum error correction; QEC schemes get around it by spreading the information of one logical qubit across several entangled physical qubits.
  • Constraints on state preparation
    State preparation is the essential first step before any quantum computation begins. In most schemes, the qubits need to be in a superposition state for the computation to proceed correctly. The nature of superposition and entanglement creates a variety of problems, and preparing states using local transformations is not realistic in a large system. Macrosystems that have been used as model quantum computing systems [14, 33, 34] appear to implement not pure states but mixtures; thus it appears that the NMR experiments do not validate the quantum algorithm.

 

Summary

Three decades after they were first proposed, quantum computers remain largely theoretical. Even so, there has been some encouraging progress toward realizing a quantum machine. There is no doubt that these are hugely important advances, and the signs are growing steadily more encouraging that quantum technology will eventually deliver a computing revolution. The potential of quantum computing in artificial intelligence will be evident soon, but we still do not know how to translate that potential into reality. Undoubtedly, time will put things in place.


Top Trends for Data Science in 2019

Trends to Watch Out For and Prepare Yourself

OK, so there's been a lot of coverage by various websites, data science gurus, and AI experts about what 2019 holds in store for us. Everywhere you look there are new fads and concepts for the new year. This article is going to be rather different: we are going to highlight the dark horses, the trends that no one has thought about but that will significantly disrupt the working IT environment (for good or bad, depending on which side of the disruption you are on). So, to give you a taste of what's coming up, let's go through the top four (plus one bonus, making five) trends of 2019 for data science:

  1. AutoML
  2. Interoperability (ONNX)
  3. Cyber Data Science Crime
  4. Cloud AI-as-a-Service
  5. (Bonus) Quantum Computation & Data Science

1. AutoML (& AutoKeras)

[Figure: Google AutoML architecture, from https://cloud.google.com/automl/]

This single innovation is going to change the way machine learning works in the real world. Earlier, deep learning and even advanced machine learning were the aristocratic property of PhD holders and research scientists. AutoML has changed that entire domain, especially now that AutoKeras is out.

AutoML automates machine learning. It chooses the best architecture by analyzing the data through a technology called Neural Architecture Search (NAS), tries out various models, and gives you the best possible hyperparameters for your scenario automatically! Until now this was priced at a ridiculous $76 USD per hour, but we now have a free, open-source competitor: AutoKeras.

[Figure: AutoKeras, the open-source alternative to AutoML, from https://www.pyimagesearch.com]

AutoKeras is a free, open-source alternative to AutoML developed at Texas A&M University's DATA Lab together with the open-source community. This project should make deep learning accessible to everyone on the planet who can code even a little. To give you an example, this is the kind of code used to train an arbitrary image classifier with deep learning (see the sketch below):

From: https://autokeras.com/
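A minimal sketch in the spirit of the AutoKeras examples follows; the exact API differs between AutoKeras versions, so treat the calls below as illustrative rather than definitive:

```python
import autokeras as ak
from tensorflow.keras.datasets import mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Search over a few candidate architectures, then train the best one.
clf = ak.ImageClassifier(max_trials=3)
clf.fit(x_train, y_train, epochs=10)

print(clf.evaluate(x_test, y_test))  # loss and accuracy on held-out data
```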

Folks, it really doesn’t get simpler than this!

Note: Of course, the entire training and testing process will take at least a day to complete, less if you have some high-throughput GPUs or Google's TPUs (Tensor Processing Units, custom hardware for data science computation) or plenty of money to spend on AutoML's cloud computation resources.

2. Interoperability (ONNX)

For those of you who are new to what interoperability means for neural networks: we now have several deep learning neural network software libraries competing with each other for market dominance. The most highly rated ones are:

  1. TensorFlow
  2. Caffe
  3. Theano
  4. Torch & PyTorch
  5. Keras
  6. MXNet
  7. Chainer
  8. CNTK

However, converting an artificial neural network written in CNTK (Microsoft Cognitive Toolkit) to Caffe is a laborious task. Why can't we simply have one single standard so that discoveries in AI can be shared with the public and the open-source community?

To solve this problem, the following standard has been proposed:

Open Neural Network Exchange Format

[Figure: one neural network standard over them all, from https://www.softwarelab.it]

ONNX is a standard in which deep learning networks can be represented as a directed acyclic computation graph compatible with (almost) every deep learning framework available today. Watch out for this release: if proper transparency is enforced, we could see decades' worth of research happen in this single year!
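As a small illustration (a toy model, not an official recipe), here is how a PyTorch network might be exported to ONNX so another framework can consume it:

```python
import torch
import torch.nn as nn

# A toy model standing in for a real trained network.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
model.eval()

# A dummy input fixes the tensor shapes recorded in the exported graph.
dummy_input = torch.randn(1, 10)
torch.onnx.export(model, dummy_input, "model.onnx")

# model.onnx is a serialized computation graph that ONNX-compatible
# runtimes and converters for other frameworks can consume.
```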

3. Cyber Data Science Crime

There are allegations that the most recent US elections were a socially engineered manipulation of the US democratic process using data science and data mining techniques. The most incriminated product was Facebook, on which fake news was spread by Russian agents and Russian intelligence services, leading to an external agency, rather than the US people themselves, deciding who the US president would be. And yes, one of the major tools used was data science! So this is not so much a new trend as an already existing phenomenon, one that needs to be recognized and dealt with effectively.

While this is a controversial trend or topic, it needs to be addressed. If the world’s most technologically advanced nation can be manipulated by its sworn enemies to elect a leader (Trump) that no one really wanted, then how much more easily can nations like India or the UK be manipulated as well?

This has already begun in a small way in India with BJP social media departments putting up pictures of clearly identifiable cities (there was one of Dubai) as cities in Gujarat on WhatsApp. This trend will not change any time soon. There needs to be a way to filter the truth from the lies. The threat comes not from information but from misinformation.

Are you interested in the elections? Then get involved in Facebook. What happened in the USA in 2018 could easily happen in India in 2019. The very fabric of democracy could break apart. As data scientists, we need to be aware of every issue in our field. And we owe it to the public – and to ourselves – to be honest and hold ourselves to the highest levels of integrity.

We could write more than thirty blog posts on this topic alone, but we digress.

4. Cloud AI-as-a-Service

[Figure: AI-as-a-Service overview, from http://www.digitaljournal.com]

To understand Cloud AI-as-a-Service, we need to recognize that maintaining an in-house AI analytics solution is overkill for most companies. It is much easier to outsource the construction, deployment and maintenance of an AI system to a company that provides it online, at a far lower cost than maintaining and updating an in-house version managed by a separate department with very hard-to-find and esoteric skills. So many start-ups have appeared in this area over the last year (over 100) that listing all of them would be a difficult task. Of course, as usual, all the big names are heavily involved.

This is a trend that has already manifested and will only continue to grow in popularity. There are already a number of major players in the AI-as-a-Service space, including but not limited to Google, IBM, Amazon, Nvidia, Oracle, and many, many more. In the upcoming year, companies without AI will fall and fail spectacularly; hence the importance of keeping AI open to the public as cheaply as possible. What will be the end result? Only time will tell.

5. Quantum Computing and AI (Bonus topic)

Quantum computing is very much an active research topic right now, and the country with the greatest advances is not the US but China. Even the EU has invested over 1 billion euros in its quest to build a feasible quantum computer. A little bit of info about quantum computing, in 5 crisp points:

  1. It has the potential to be the greatest quantum leap since the invention of the computer itself (pun intended).
  2. However, practical hardware difficulties have so far confined quantum computers to laboratory experiments.
  3. If a quantum computer that can manipulate 100-200 qubits (quantum bits) is built, much of the encryption used today could be broken quite easily.
  4. The difficulty of keeping single atoms in isolated states consistently (decoherence) makes current research more of an academic pursuit.
  5. Experts say a fully functional quantum computer could be 5-15 years away, but it would also herald a new era in the history of mankind.

In fact, some claim the greatest example of a quantum computer today is the human brain itself. If we develop quantum computing to practical levels, we might also gain the insight needed to create a truly intelligent computer.

Cognition in real world AI that is self-aware. How awesome is that?

[Figure: D-Wave quantum computer processor, from wikipedia.org]

Conclusion

So there you have it: the five most interesting and important trends that could become mainstream in 2019 (although the jury is still out on the quantum computer; it could work this year or ten years from now, but it's immensely exciting).

What are your thoughts? Do you find any of these topics worth further investigation? Feel free to comment and share!

For additional information, I strongly recommend the articles given below:

https://dimensionless.in/how-to-land-a-job-as-a-data-scientist-in-2019/
https://dimensionless.in/what-is-cloud-computing-aws-vs-gcs/

Alternate views or thoughts? Share your feelings below!