Although quantum computing is seen as the next generation of computing, many companies and governments are still trying to grasp its applications and meaning beyond the buzzwords. Without losing sight of the tragedy of the COVID-19 pandemic, the current crisis provides a valuable stage for zooming in on and questioning those potential applications of quantum computing in high-impact and complex situations. In a series of three articles we zoom in on three potential applications of quantum computing in light of the COVID-19 pandemic. In this second article we zoom in on the challenge of predicting the spread of a virus.

It is better to prevent than to cure. Preventing a pandemic or epidemic involves identifying the whereabouts and risks of zoonoses and predicting the spread of an existing outbreak. Organizations such as the One Health Institute at the School of Veterinary Medicine at the University of California, Davis are trying to understand the correlation between spillovers and animal-human interactions. Spillovers happen when a pathogen that is highly prevalent in an animal (simply put, fluids containing infectious cells) encounters a potential host (a human). Such a spillover can lead to a human-to-human transmissible disease (such as COVID-19), but it doesn't have to, as is the case with rabies. The One Health Institute's PREDICT Project has identified 1,200 viruses belonging to families known to have the potential to infect people and cause epidemics, along with 40 risk factors for those viruses to spill over and spread between humans. Identifying these viruses and risk factors required about 170,000 samples from animals and people in about 30 countries. As PREDICT expects that there are about 1.67 million yet-to-be-discovered viruses, both computing and sample-collecting challenges arise.

Another important factor in preventing (or managing) an epidemic is predicting the spread of an outbreak. Such predictions have to deal with numerous factors, including human behavior, social conditions, and the environment. Their outcomes give insight into quantities such as the expected number of cases and deaths, and based on these numbers, policymakers take (preventive) measures. However, combining this data and running different scenarios across multiple contexts to gauge the economic and non-economic effects of each measure remains a challenge. Running predictions such as these is similar to the Netflix Problem.

Predicting the evolution of the coronavirus is a machine learning application. Similar to predicting the evolution of the stock market, or recommending the next movie on Netflix, it looks for meaningful patterns in the data. In stock market predictions, covariance matrix analysis is used to suggest new asset trades or hedge portfolio items. In recommendation systems such as Netflix's, user behavior is categorized into different user groups by searching for essential features. Chances are that if person A has watched The Notebook, a recommendation for another romantic movie will be successful. In the case of the coronavirus, we face the same kind of problem: with limited information, we are trying to find the features that best describe the spread and transmission of the virus. Using these features, we want to predict the number of fatalities, the number of new infections, and the locations of outbreak zones.
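To make the covariance step concrete, here is a minimal Python sketch (the return figures are made up for illustration) that computes a covariance matrix of asset returns and inspects its dominant direction of common movement:

```python
import numpy as np

# Hypothetical daily returns for three assets (rows = days, columns = assets)
returns = np.array([
    [ 0.010, -0.002,  0.004],
    [-0.005,  0.007, -0.001],
    [ 0.003,  0.001,  0.002],
    [ 0.008, -0.004,  0.006],
])

# Covariance matrix: how strongly the assets move together
cov = np.cov(returns, rowvar=False)

# Eigenvectors of the covariance matrix are directions of common movement;
# the eigenvector with the largest eigenvalue is the dominant market factor
eigenvalues, eigenvectors = np.linalg.eigh(cov)
print(eigenvalues)        # ascending order; the last one dominates
print(eigenvectors[:, -1])
```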

Such recommendation systems typically rely on principal component analysis (PCA). PCA is a machine learning technique used for determining the features in a data set that have the most predictive value. With PCA, you can summarize the data in a small set of descriptive features, the largest principal components (in the Netflix example, this could be the genre of romantic dramas), and then use those for prediction.
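As a minimal sketch of what PCA looks like in practice (using a toy, made-up ratings matrix rather than real Netflix data):

```python
import numpy as np
from sklearn.decomposition import PCA

# Toy user-by-movie rating matrix; two "taste" groups are baked in
ratings = np.array([
    [5.0, 4.0, 1.0, 0.5],   # romance fans...
    [4.5, 5.0, 0.5, 1.0],
    [1.0, 0.5, 5.0, 4.5],   # ...versus action fans
    [0.5, 1.0, 4.0, 5.0],
])

# Keep only the two largest principal components
pca = PCA(n_components=2)
user_factors = pca.fit_transform(ratings)

print(pca.explained_variance_ratio_)  # variance captured per component
print(user_factors)                   # each user summarized by two numbers
```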

Calculating the principal components is a mathematical exercise that relies either on singular value decomposition (SVD) or eigenvalue decomposition. Computing the full SVD has a time complexity of O(n^3), where n is the size of the covariance matrix. For small datasets this is no problem, but for larger datasets it becomes intractable. Take, for example, the Netflix problem, where the datasets include tens of thousands of movies and millions of users, resulting in hundreds of millions of data points. Calculating all the principal components in this case becomes practically impossible.
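A back-of-the-envelope estimate shows why. Taking, for illustration, a dense matrix with $n \approx 10^6$ (on the order of the number of users) and a machine sustaining around $10^{12}$ operations per second:

$$ n^3 \approx 10^{18} \ \text{operations}, \qquad \frac{10^{18}\ \text{operations}}{10^{12}\ \text{operations/s}} = 10^{6}\ \text{s} \approx 11.6\ \text{days} $$

for a single decomposition, before the data even changes.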

Instead, an approximation is made: rather than determining all principal components, only the few most prominent components are computed. Using the most prominent principal components, an elementary prediction can be made that reflects the preferences of broad user groups. For specialized predictions, however, more principal components must be included. For example, a prediction of a movie that is not only in your favorite genre but also stars your favorite actor might be more successful.
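A minimal classical sketch of this approximation, using SciPy's truncated sparse SVD (the matrix here is randomly generated stand-in data, not real ratings):

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import svds

# Hypothetical sparse user-by-movie matrix: most entries (ratings) are missing
ratings = sparse_random(10_000, 2_000, density=0.01, random_state=0).tocsr()

# Compute only the k most prominent components instead of the full SVD
k = 10
U, s, Vt = svds(ratings, k=k)  # s holds the k largest singular values (ascending)

# Low-rank approximation used to fill in predicted ratings: U * diag(s) * Vt
predictions = U @ np.diag(s) @ Vt
print(s)
```

Increasing k refines the prediction at increasing cost, mirroring the trade-off described above.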

Quantum computers may be able to determine the principal components much more efficiently. Quantum principal component analysis (qPCA) efficiently exploits a quantum state's inherent structure, in a process called self-tomography, to reveal information about the eigenvectors corresponding to large eigenvalues. This process can be used to encode information about a recommendation model in a quantum state; the principal components are then found as the eigenvectors with the largest eigenvalues.
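Roughly sketched, following the qPCA proposal of Lloyd, Mohseni, and Rebentrost: the (normalized) covariance matrix $\Sigma$ is treated as a density matrix and applied as a Hamiltonian,

$$ \rho = \frac{\Sigma}{\operatorname{tr}\Sigma}, \qquad e^{-i\rho t} \ \text{(density matrix exponentiation)}, $$

and quantum phase estimation on $e^{-i\rho t}$, with $\rho$ itself as input, yields eigenvalue-eigenvector pairs $(\lambda_j, |u_j\rangle)$ with probability proportional to $\lambda_j$, so the largest principal components are exactly the ones most likely to be sampled.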

Often, the bottleneck for a quantum algorithm is the transformation of classical data into quantum data. However, a qPCA algorithm does not require access to the full dataset. Instead, it efficiently sub-samples the dataset by performing function calls only when needed. In a Grover-type operation, the dataset is sampled without the need for quantum RAM or an expensive transformation of classical data into quantum data.

Altogether, qPCA may provide an exponential speed-up over any classical algorithm. Using qPCA, the internal structure of the data is revealed in a handful of holistic variables with high predictive value. In Markovian systems, where the most recent state carries the most predictive value, we would want to recalculate the principal components regularly; in these systems, qPCA may have a significant impact. In the case of the stock market, this means that risk may be reduced by identifying relevant correlations. For Netflix, it means better suggestions for new series or movies.
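As a classical illustration of such regular recalculation (a minimal sketch with synthetic data; at scale a truncated solver, or in principle qPCA, would replace the SVD step):

```python
import numpy as np

def top_components(data: np.ndarray, k: int = 2) -> np.ndarray:
    """Return the k largest principal components of data (rows = observations)."""
    centered = data - data.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:k]  # right singular vectors = principal directions

# Synthetic stream of daily observations (e.g. indicators per region)
rng = np.random.default_rng(0)
stream = rng.normal(size=(365, 20))

window = 30  # assume only the most recent month is predictive (Markovian)
for day in range(window, stream.shape[0], 7):  # recompute weekly
    components = top_components(stream[day - window:day])
    # ...feed `components` into the downstream prediction model...
```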

The evolution of the coronavirus is a typical Markovian process. In the past few months, the news seemed to change the perspective daily. At the same time, accurate predictions of the number of fatalities or required ICU beds became even more critical. For these predictions, where regular and precise calculation of the principal components is needed, qPCA may have a significant impact.

Conclusion

Looking at the impact that COVID-19 is having on society, the economy, and healthcare, we can envision future use cases for quantum computing in vaccine development, optimization solutions, and identifying and managing the spread of viruses.

As the COVID-19 crisis drags on, the societal and financial effects are accumulating. For example, the US has seen its GDP plummet by roughly 30 percent, and COVID-19 has contributed to driving an additional 12 million people below the extreme poverty threshold in 2020. It is therefore reasonable to presume that investing in any solution that could shorten the next pandemic is worthwhile.

The challenge with aligning and allocating investments lies in the still largely unknown roadmap to clear uses of quantum computing. Many use cases are still to be defined, and quantum computing's real potential is only expected in about ten years' time. Nevertheless, there is value to be realized with quantum computing on a shorter horizon, as noisy intermediate-scale quantum (NISQ) computers might already speed up certain computations. Furthermore, both the number and stability of qubits on the one hand and the efficiency of quantum algorithms on the other are developing at a rapid rate. It is therefore to be expected that clear business cases will be presented within three years. Because quantum computers are a natural fit for quantum chemistry, we may expect a quantum advantage to be realized first in this domain.

Even though quantum computing’s added value is still a couple of years down the line, we should prepare for it. Without dismissing the great technological developments that have been made before and during the current pandemic, different skillsets will likely be required in applying quantum computing versus using classical computing. This stems from the fact that the very foundation of quantum computing is different, and therefore the layers building on that foundation, such as programming languages, will be different too. Implementation of middleware, infrastructure, and development tools will be complex and time-intensive, and necessary skills will be hard to find. Companies would be smart, facilitating early quantum enthusiasts, and promote them to create awareness and explore quantum use cases. In the long run, we will need a variety of profiles, including quantum algorithm experts, developers, testers, hardware engineers, and business developers.  If quantum computers are to become mainstream in ten years, students should enroll today.

This article has been co-authored by Renate Wolters and Julian van Velzen.