Machine learning on today’s classical computers is already having a big impact, from powering chatbots to predicting the behaviour of financial markets. Researchers are excited to see what more quantum computers could do. In a paper published 14 December 2020 in Physical Review Research, CQT researchers and their collaborators in Greece and China describe a new approach to quantum machine learning.
So far, most proposals for quantum machine learning have been digital. As in traditional computing, data are assumed to be encoded in bits and processed with logic gates. The new idea is to use a quantum system’s natural quantum dynamics as a tool to learn complex functions and recognise patterns in data. This is an analogue rather than a digital approach. Previously, some groups had proposed hybrid approaches that mixed an analogue part into a digital scheme. This paper describes a fully analogue scheme.
The researchers showed that their proposed analogue approach can solve a classical machine learning task known as generative modelling. One example of an application of generative modelling is product recommendation, where the model recommends products to new customers based on the behaviour of previous customers, somewhat similar to how Netflix recommends movies based on earlier preferences. “We tried different example cases, including complicated data structures, and the learning was quite good,” says Principal Investigator Dimitris Angelakis, who led the team.
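To make the task concrete, here is a minimal classical sketch of generative modelling in the recommendation setting described above. It is an illustration of the task only, not the paper’s quantum scheme; the data and the independent-Bernoulli model are hypothetical.

```python
import numpy as np

# Toy generative model: learn the empirical distribution of binary
# "purchase patterns" and sample new, plausible patterns from it.
# Classical illustration of the task only - not the paper's method.

rng = np.random.default_rng(0)

# Hypothetical training data: rows are customers, columns are products (1 = bought)
data = np.array([
    [1, 1, 0, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
])

# Independent-Bernoulli model: estimate the probability of each product
p = data.mean(axis=0)

# Generate a plausible new pattern by sampling from the learned model
sample = (rng.random(4) < p).astype(int)
print("learned probabilities:", p)
print("generated pattern:   ", sample)
```

A real generative model would also capture correlations between products (customers who buy product 1 tend to buy product 2); capturing such complicated data structures is exactly where the quantum approach is claimed to do well.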
In their paper, the researchers analyse and discuss how the trainability of generic quantum systems – a measure of how well the system can be trained to learn different sets of data – is deeply connected with fundamental properties such as the amount of memory in the system’s quantum dynamics and the strength of interactions or disorder among its components.
Sidestepping digital gates
The team see some advantages to going analogue. For realistic problems, digital quantum machine learning algorithms would require hundreds of thousands of gates and many millions of qubits. This is beyond the capabilities of near-term quantum computers.
“The digital approach is prone to errors if one does not have a fault-tolerant quantum computer, which is still many years away,” says Dimitris. “If your gates are not exactly tuned to the right value, you get errors. These errors accumulate and blow up if the system is noisy, which is the case with current or near-term quantum computers.” Furthermore, writing digital algorithms for quantum hardware is not always a straightforward process.
Using the dynamics of quantum systems instead of gates for machine learning could work on existing or near-term platforms. “We strongly believe that the analogue approach will be easier to implement because we are using the quantum systems as they are naturally,” says Dimitris.
On top of their analytic work, the researchers ran simulations of a specific spin model, an Ising spin chain with ten qubits. The system’s natural dynamics include interactions between the spin qubits and between each qubit and an external magnetic field.
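The natural dynamics of such a chain can be sketched numerically. The snippet below evolves a transverse-field Ising chain under its own Hamiltonian; four qubits are used instead of the paper’s ten to keep the matrices small, and the coupling and field values are illustrative assumptions, not the paper’s parameters.

```python
import numpy as np

# Minimal sketch of the natural dynamics of a transverse-field Ising
# chain. Four qubits for brevity (the paper simulated ten); J and h
# below are illustrative values, not the paper's.

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def op_at(op, site, n):
    """Embed a single-qubit operator at `site` in an n-qubit chain."""
    mats = [op if k == site else I2 for k in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

n, J, h = 4, 1.0, 0.5
# H = -J * sum_i z_i z_{i+1} - h * sum_i x_i : qubit-qubit interactions
# plus the coupling of each qubit to an external transverse field
H = sum(-J * op_at(sz, i, n) @ op_at(sz, i + 1, n) for i in range(n - 1))
H = H + sum(-h * op_at(sx, i, n) for i in range(n))

# Evolve the all-up state |00...0> for time t under U = exp(-iHt)
evals, evecs = np.linalg.eigh(H)
psi0 = np.zeros(2**n, dtype=complex)
psi0[0] = 1.0
t = 1.0
psi_t = evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi0))

print("norm preserved:", np.isclose(np.linalg.norm(psi_t), 1.0))
```

In the analogue scheme, it is this free evolution, rather than a sequence of discrete gates, that processes the data.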
Quantum chaos and machine learning
It was not obvious that analogue quantum systems could be trained and, if they could, how to train them. Machine learning systems need a feedback loop – taking inputs, producing outputs and adjusting their predictions accordingly. For analogue quantum systems, it was not known how to implement such a loop. The researchers found a way by exploiting concepts from quantum chaos.
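The feedback-loop idea itself is familiar from classical machine learning. The schematic below shows the generic cycle – produce an output, compare it with the target, feed the error back into the parameters – for a toy linear model; the data and learning rate are made up for illustration, and the paper’s contribution is showing how such a loop can be realised for analogue quantum dynamics.

```python
import numpy as np

# Generic feedback loop: take input, produce output, compare with the
# target, adjust parameters. A schematic classical illustration only.

rng = np.random.default_rng(1)
x = rng.normal(size=50)
y_target = 3.0 * x + 1.0           # hypothetical data to learn

w, b, lr = 0.0, 0.0, 0.1
for _ in range(200):
    y_out = w * x + b              # produce output from input
    err = y_out - y_target         # compare output with target
    w -= lr * np.mean(err * x)     # feed the error back ...
    b -= lr * np.mean(err)         # ... to adjust the model

print(round(w, 2), round(b, 2))    # converges towards 3.0 and 1.0
```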
Chaos relates to the sensitivity of a system to its initial conditions, where a slight change in the initial conditions results in a very different output. The concept is often imagined as a flap of a butterfly’s wings causing a tornado – the ‘butterfly effect’ first suggested by the mathematician and meteorologist Edward Lorenz in the 1960s.
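This sensitivity is easy to demonstrate with the logistic map, a textbook chaotic system (chosen here as a classical stand-in; it is not from the paper). Two trajectories starting a billionth apart quickly become completely different.

```python
# Sensitivity to initial conditions ('butterfly effect') in the chaotic
# logistic map x -> r*x*(1-x) with r = 4. Classical textbook example.

def logistic_traj(x0, steps, r=4.0):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_traj(0.2, 40)
b = logistic_traj(0.2 + 1e-9, 40)            # shifted by one billionth
gap = [abs(p - q) for p, q in zip(a, b)]

print("initial gap:", gap[0])
print("largest gap over 40 steps:", max(gap))
```

The gap roughly doubles each step, so within a few dozen iterations the tiny initial difference grows to the full size of the system.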
“If a system is fully in the quantum chaos regime it is very sensitive to change and so it is very difficult to have an efficient feedback loop,” says Dimitris. “At the same time, chaos allows the system to explore the entire Hilbert space, which is important for capturing all possible data distributions to be learned. You need to be at the right spot.”
Dimitris and his coauthors calculated that they could optimise the machine learning by adjusting the ratio of interactions to disorder in the quantum system, which tunes the distance from the chaotic regime. At the right spot, the system can be ‘guided’ to a targeted state during the learning process, while still exploring and learning all the possible data fed into it.
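One standard numerical probe of how close a spin chain is to the chaotic regime is the mean adjacent-gap ratio ⟨r⟩ of its energy levels, roughly 0.39 for non-chaotic (Poisson) level statistics and roughly 0.53 for chaotic level repulsion. The sketch below computes ⟨r⟩ for an Ising chain with random local fields; it is a generic diagnostic with illustrative parameters, not the paper’s calculation.

```python
import numpy as np

# Mean adjacent-gap ratio <r> for an Ising chain with random z fields.
# A standard chaos diagnostic; chain size and couplings are illustrative.

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def op_at(op, site, n):
    """Embed a single-qubit operator at `site` in an n-qubit chain."""
    mats = [op if k == site else I2 for k in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def gap_ratio(n, J, disorder, rng):
    """Mean ratio of adjacent energy-level spacings, min(s1,s2)/max(s1,s2)."""
    fields = rng.uniform(-disorder, disorder, size=n)
    H = sum(J * op_at(sz, i, n) @ op_at(sz, i + 1, n) for i in range(n - 1))
    H = H + sum(op_at(sx, i, n) for i in range(n))            # transverse field
    H = H + sum(fields[i] * op_at(sz, i, n) for i in range(n))  # disorder
    E = np.linalg.eigvalsh(H)
    s = np.diff(E)
    r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
    return r.mean()

rng = np.random.default_rng(2)
results = {W: gap_ratio(8, 1.0, W, rng) for W in (0.5, 5.0)}
for W, r in results.items():
    print(f"disorder W={W}: <r> = {r:.2f}")
```

Sweeping the disorder strength in such a calculation is one way to locate the ‘right spot’ between orderly and fully chaotic dynamics.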
The team is now talking to experimental and industry collaborators to test their ideas on real hardware and real-world data.