Monte Carlo, quantum computing’s killer app?


As the quantum computing revolution unfolds, companies, start-ups and academia are racing to find the killer use case. In the past year, quantum Monte Carlo (QMC) simulation has come up more and more as a viable candidate, with breakthroughs in both hardware and software development bringing a quantum advantage for quantum finance closer. Important trends are:

  • Roadmaps for hardware development have been published and estimate quantum advantage within a 4-7-year reach. See for example IBM and IonQ.
  • End-to-end hardware requirements have been estimated for pricing complex derivatives: a T-depth of 50 million and 8k qubits. Although this is beyond the reach of current devices, simple derivatives might be feasible at a gate depth of around 1k. These numbers put initial applications around the corner and a full-blown advantage on the roadmap.
  • Advances in algorithmic development continue to reduce the required gate depth and number of qubits. Examples are variational data loaders and iterative amplitude estimation (IAE).
  • There is an increasing focus on data orchestration, pipelines, and pre-processing, readying organisations for adoption. Financial institutions worldwide are also setting up teams that work on QMC implementation.

All these developments beg the question: what is the potential of quantum Monte Carlo, and should you look into it now?

Monte Carlo simulations are used extensively in finance to simulate the behaviour of stochastic processes. For certain problems, analytical models (such as the Black-Scholes equation) are available that allow the solution to be calculated at any moment in time. For many other problems, no such analytical model is available. Instead, the behaviour of financial products can be simulated by starting with a portfolio and then simulating the market behaviour. Two important examples are the following:

  • Derivatives pricing. Derivatives are financial products whose value is derived from underlying assets; examples are options, futures contracts and swaps. The underlying assets are modelled as stochastic variables: they behave according to some distribution function. To price derivatives, the behaviour of the underlying assets has to be modelled.
  • Risk management. To evaluate the risk of a portfolio, for example of interest rates or loans, simulations can be performed that model the behaviour of assets to discover losses on the complete portfolio. Stress tests can be performed to evaluate the performance of the portfolio under specified scenarios, and reverse stress tests can be performed to discover scenarios that lead to catastrophic portfolio performance.

Classical Monte Carlo simulation requires on the order of 1/ε² samples, where ε is the desired confidence-interval width. For large cases, this easily becomes prohibitive: at an accuracy of ε = 10⁻⁵, billions of samples are required. Even if workloads are parallelised on large clusters, this might not be feasible at an acceptable runtime or cost. Take for example the start of the COVID-19 crisis: some risk models might have taken months to run, and before completion, the stock market would have dropped twenty percent.
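The 1/ε² sample count follows from the 1/√n convergence of a Monte Carlo mean, which can be checked empirically on a toy integrand: quadrupling the sample size should roughly halve the standard error of the estimate.

```python
import math
import random
import statistics

def mc_estimate(n, rng):
    """Monte Carlo estimate of E[max(Z, 0)] for standard normal Z
    (a toy integrand; the exact value is 1/sqrt(2*pi))."""
    return sum(max(rng.gauss(0.0, 1.0), 0.0) for _ in range(n)) / n

rng = random.Random(0)
reps = 200  # repeat each estimate to measure its spread

# Standard error of the estimator at n and at 4n samples.
err_n  = statistics.stdev(mc_estimate(1_000, rng) for _ in range(reps))
err_4n = statistics.stdev(mc_estimate(4_000, rng) for _ in range(reps))

print(err_n / err_4n)  # close to 2, as the 1/sqrt(n) scaling predicts
```

Inverting the scaling gives the cost quoted above: halving the error quadruples the work, so reaching accuracy ε costs on the order of 1/ε² samples.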

Quantum Monte Carlo promises, in theory, a quadratic speedup over classical systems. Instead of the order of 1/ε² iterations on a classical system, on the order of 1/ε iterations on a quantum computer would attain the same accuracy. That means large risk models that took months to complete may become feasible within hours. However, it is unfortunately never as easy as it seems. Although sampling on a quantum computer is quadratically faster, a large overhead could completely diminish any quantum speedup. In practice, loading a market model as quantum data appears extremely difficult. There are a few workarounds for this problem, such as the data loaders announced by QCWare, or a variational procedure published by IBM, but it remains to be seen whether these work well on real problems.
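The gap between the two scalings can be put into numbers with a back-of-the-envelope sketch. This deliberately ignores constants and the data-loading overhead discussed above, so it is an upper bound on the promise, not a runtime prediction.

```python
import math

def classical_samples(eps):
    """Order-of-magnitude sample count for classical Monte Carlo: ~1/eps^2."""
    return math.ceil(1.0 / eps ** 2)

def quantum_samples(eps):
    """Order-of-magnitude iteration count for amplitude-estimation-based
    QMC: ~1/eps (constants and circuit overhead ignored)."""
    return math.ceil(1.0 / eps)

for eps in (1e-2, 1e-3, 1e-5):
    print(f"eps={eps:g}: classical ~ {classical_samples(eps):.0e}, "
          f"quantum ~ {quantum_samples(eps):.0e}")
```

At ε = 10⁻⁵ the classical count is in the tens of billions while the quantum count is in the hundreds of thousands; the open question is whether per-iteration overheads eat that factor back up.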

However, if quantum hardware and software continue to develop at the current pace, we can expect a bright future for quantum Monte Carlo applications. A business case can easily be made. If quantum Monte Carlo improves risk management simulations, the reserved capital required by compliance regulations could be reduced, freeing up capital for other uses. On the other hand, the derivatives market in Europe alone accounts for a notional amount of €660 trillion. Even a slightly inaccurate evaluation of this market can lead to a large offset from its actual value, which in turn leads to instability and risk. Given the huge potential for derivatives pricing and risk management, significant and deterministic speedups, and an industry in full gear to benefit from quantum, quantum Monte Carlo seems to be one of the killer applications.

However, before QMC works successfully in production, a lot of work remains to be done. As with any application, proper data pipelines need to be implemented first. Time series required for risk management need to be checked for stationarity, frequency and time period. If policy is adjusted towards daily risk management, data streams also have to be kept up to date. And if a quantum advantage is to be benchmarked, its classical counterpart must be benchmarked too. Additional steps, such as building the required infrastructure (given the hybrid cloud nature of quantum applications), addressing compliance regulations, and working out security considerations, have only just begun.

Given the huge potential of quantum Monte Carlo, you do not want to be late to the party. Deploying algorithms in productionised workflows is not easy, even more so when the technology stack is fundamentally different. Hence, if we want to benefit from quantum technology, now is the time to start.

Co-author of this article: Julian van Velzen



Julian likes to pioneer. Equipped with a master's degree in physics, he put Capgemini's quantum technology efforts on the map, and now leads the computing futures (bits/qubits/neurons) domain from within the group's CTIO++ community. Furthermore, he initiated and led project FARM, a big data solution for smallholder farmers in developing countries.

