How will quantum computing affect data analytics?

Wim Stoop, Senior Director of Product Marketing, Cloudera

OCTOBER 13, 2023

In 2022 alone, the world produced and consumed a staggering 94 zettabytes of data. That almost unimaginable number is only expected to grow as adoption of the Internet of Things (IoT), artificial intelligence (AI), and 5G accelerates and as consumers’ appetite for data-driven products and services increases.

While consumers are demanding more targeted services, deriving intelligence from this mass of data is no easy task. For some organizations it is nearly impossible to find correlated information that has actionable value.

That’s because, even as the amount of data generated keeps growing, we are reaching the limits of the processing power of traditional computers. Moore’s Law, which predicts that the number of transistors on integrated circuits will double every two years, has proved remarkably resilient since it was coined in 1965, but those transistors are now as small as we can make them with existing technology.

Enter quantum computing, a field that can be difficult to explain, even for those with a background in physics.


What is quantum computing?

In short, quantum computing is a rapidly emerging technology that harnesses the laws of quantum mechanics to solve problems too complex for classical computers. While traditional computers can only encode information in bits that take the value of 1 or 0, quantum machines perform calculations with qubits. Qubits rely on the peculiarities of quantum mechanics (superposition and entanglement) to be in the 0 and 1 states at the same time, giving them the potential to process exponentially more data.
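To make the contrast with classical bits concrete, here is a minimal sketch, assuming NumPy and not drawn from the article, that represents a single qubit as a two-element vector of amplitudes, samples measurements from it, and shows why the classical description blows up exponentially as qubits are added.

```python
import numpy as np

# Hypothetical illustration: a single qubit stored as two amplitudes.
rng = np.random.default_rng(0)

# Equal superposition of |0> and |1> (what a Hadamard gate produces from |0>).
qubit = np.array([1.0, 1.0]) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(qubit) ** 2                      # [0.5, 0.5]

# Each measurement collapses the qubit to 0 or 1; repeat 1,000 shots.
shots = rng.choice([0, 1], size=1000, p=probs)
print("P(0) =", probs[0], "observed:", (shots == 0).mean())

# Describing n qubits classically takes 2**n amplitudes, so the state space
# grows exponentially with every qubit added.
print("Amplitudes needed to describe 54 qubits:", 2 ** 54)
```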

Some of the promises of quantum computing are to provide high-speed detection, analysis, integration, and diagnosis when dealing with huge, scattered data sets, and to quickly find patterns in enormous, unsorted data sets. 
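A classic illustration of that “find a needle in unsorted data” promise is Grover’s search, which locates a marked item among N entries in roughly √N steps instead of N. The article doesn’t name a specific algorithm, so treat the following classically simulated sketch (NumPy assumed) as illustrative of the idea rather than of any product.

```python
import numpy as np

# Simulated Grover search over N = 16 unsorted items (4 qubits), purely illustrative.
N = 16
marked = 11                                          # index of the item we want to find

# Start in an equal superposition over all N basis states.
state = np.full(N, 1 / np.sqrt(N))

iterations = int(np.round(np.pi / 4 * np.sqrt(N)))   # ~sqrt(N) Grover iterations
for _ in range(iterations):
    state[marked] *= -1                  # oracle: flip the sign of the marked amplitude
    state = 2 * state.mean() - state     # diffusion: reflect all amplitudes about their mean

print(f"P(measuring the marked item) = {state[marked] ** 2:.3f}")   # ~0.96 after 3 iterations
```

A classical scan of 16 unsorted items needs up to 16 checks; the simulated quantum routine concentrates almost all of the probability on the right answer after just 3 iterations, and the gap widens as N grows.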

Given these unmatched credentials, many expect quantum computing to revolutionize data analytics. Accenture’s Technology Vision 2022 identifies the technological breakthrough as one of the four key technology trends transforming the world, stating that the answer to the world’s “massive data conundrum” lies in quantum computing.

“These machines—including but not limited to quantum—are pushing Moore’s Law aside as they jump onto a curve of new compute capability,” the report states. “Industries are defined by their most intractable problems; as these new machines mature, they will help companies solve them.”

It’s easy to see why. Not only can quantum computers execute highly complex computations in seconds, but the speed and capabilities of quantum computing also have the potential to assist many areas of artificial intelligence, including machine learning, natural language processing, and predictive analytics.

For example, AI systems used to extract relevant historical information and current data from large databases can process much more data when quantum computing is employed, producing valuable knowledge that can be used to generate predictions.

The quantum race is on

There is already a race underway between the most prominent leaders in the industry to be the first to launch a viable quantum computer. 

Google has taken an early lead, announcing that it had achieved quantum supremacy in October 2019. The tech giant said its 54-qubit Sycamore processor could perform a calculation in 200 seconds that would have taken the world’s most powerful supercomputer 10,000 years. That would mean the calculation, which involved generating random numbers, is essentially impossible on a traditional, non-quantum computer.

In November 2022, IBM announced it had built the largest quantum computer yet, the 433-qubit Osprey. As the company put it, “The number of classical bits that would be necessary to represent a state on the IBM Osprey processor far exceeds the total number of atoms in the known universe.”
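A rough back-of-the-envelope check of that claim (my arithmetic, not IBM’s): describing an arbitrary 433-qubit state classically takes 2^433 complex amplitudes, while the observable universe is commonly estimated to hold on the order of 10^80 atoms.

```python
# Order-of-magnitude sketch of IBM's comparison (standard estimates, not IBM's own figures).
amplitudes = 2 ** 433          # amplitudes needed to describe an arbitrary 433-qubit state
atoms_in_universe = 10 ** 80   # common order-of-magnitude estimate

print(f"2**433 is roughly 10**{len(str(amplitudes)) - 1}")        # ~10**130
print(f"exceeding the atom count by ~10**{len(str(amplitudes // atoms_in_universe)) - 1}")  # ~10**50
```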

At the same time, physicists are still debating whether and when Moore’s Law will become obsolete, while new theories predicting the growth trajectory of quantum computing power have emerged over the years, from Rose’s Law to Neven’s Law.

Prepare for quantum analytics now

Although it may be years before quantum computing makes its way into most organizations or becomes a standard data analytics tool, scientists believe that now is the time for data analysts to prepare for the quantum future. 

In a 2022 research paper, Professor Yazhen Wang says that two areas of exploration for today’s data analysts are quantum-inspired analytics, that is, viewing information technologies from the vantage point of quantum mechanics to make gains in conventional systems, and benchmarking quantum technologies.

“Quantum data analytics may provide a new perspective to classic problems. That perspective opens the door for solutions that exploit quantum phenomena but has also inspired new ideas that emulate, rather than exploit such phenomena,” Wang writes.

“Though there is no quantum computer yet capable of implementing quantum data analytics, there is a large space for the quantum data analyst to continue to explore.”

Cloudera has always looked to data to make the impossible today possible tomorrow. With advances in quantum computing, we are looking forward to making the impossible possible within minutes. See how Cloudera Data Platform can help you prepare for the quantum future.

Article by

Wim Stoop

Wim Stoop is senior director, product marketing for Cloudera. In this role, he leads the marketing direction and strategic vision for Cloudera’s mission to let organizations turn data into business value at scale. Prior to Cloudera, Wim spent more than 20 years helping blue-chip companies such as IBM, BP, and HSBC solve their most data-intensive challenges in the context of their business objectives and usage scenarios. Wim is a regular speaker at industry events where organizations are deciding and defining their big data strategy and direction. Wim holds a degree in Chemical Process Technology from the Eindhoven University of Technology in the Netherlands.
