Quantum computing has long promised to revolutionize the field of machine learning, but concrete advantages have been elusive—especially with current, noisy quantum devices. Now, a team of researchers from the University of Vienna, Quantinuum, and Politecnico di Milano has experimentally demonstrated a quantum-enhanced machine learning method that uses a photonic processor. Their approach doesn’t just match classical methods. It surpasses them, both in theoretical rigor and in experimental accuracy.
At the heart of the work is a quantum kernel technique that maps data into high-dimensional quantum feature spaces using photonic circuits. This setup enables efficient and accurate classification without requiring entanglement or overly complex hardware.
Quantum advantage for real-world tasks
Classical kernel methods have been a staple of machine learning for years. These models map data into higher-dimensional feature spaces where it becomes linearly separable, making classification tasks easier. However, pushing this to higher dimensions often creates computational bottlenecks and precision issues.
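The kernel trick behind these methods can be made concrete with a small sketch. The example below (illustrative, not from the paper) shows that a degree-2 polynomial kernel computed directly in input space equals an inner product under an explicit higher-dimensional feature map, so the high-dimensional space never has to be constructed:

```python
import numpy as np

# Illustrative sketch of the "kernel trick" for a degree-2 polynomial kernel:
# k(x, z) = (x . z)^2 equals an inner product in a higher-dimensional
# feature space, computed without ever building that space explicitly.

def poly2_kernel(x, z):
    """Kernel evaluated directly in the original input space."""
    return float(np.dot(x, z)) ** 2

def phi(x):
    """Explicit degree-2 feature map for 2-D input:
    phi(x) = (x1^2, x2^2, sqrt(2)*x1*x2)."""
    x1, x2 = x
    return np.array([x1 ** 2, x2 ** 2, np.sqrt(2) * x1 * x2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

print(poly2_kernel(x, z))              # kernel value in input space
print(float(np.dot(phi(x), phi(z))))   # identical value via the feature map
```

Both prints give the same number, which is the whole point: classification happens in the lifted space, but only kernel evaluations are ever computed. The quantum versions below replace this feature map with a quantum state preparation.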
This is where quantum kernels come in. By encoding input data into quantum states and processing them through photonic circuits, the research team showed they could outperform both traditional kernels—like the Gaussian and polynomial kernels—and more modern approaches like the neural tangent kernel.
Photonic implementation of quantum kernels
The experimental setup centers around a programmable photonic processor with six input and output modes. Data points are encoded into phase shifts that define unitary operations applied to Fock states of two indistinguishable photons. The interference patterns generated by these photons carry rich information about the relationships between data points.
To compare performance, the team implemented both quantum and classical versions of the kernel. The quantum kernel used indistinguishable photons (which interfere), while the classical kernel used distinguishable photons (no interference). This design made it possible to isolate the advantage contributed by quantum behavior.
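The quantum/classical contrast can be sketched numerically. The toy model below is an assumption-laden simplification, not the authors' exact circuit: data is encoded as phase shifts in a 4-mode interferometer, and the kernel is taken to be the probability of a fixed two-photon detection event. For indistinguishable photons the event amplitude is the permanent of a 2x2 submatrix of the circuit unitary (interference terms included); for distinguishable photons the probabilities add classically:

```python
import numpy as np

# Toy sketch (not the authors' exact circuit): a two-photon photonic kernel.
# Indistinguishable photons: amplitude = permanent of a 2x2 submatrix of U.
# Distinguishable photons: classical sum of squared moduli, no interference.

def beamsplitter(m, i, j):
    """50:50 beamsplitter acting on modes i, j of an m-mode interferometer."""
    U = np.eye(m, dtype=complex)
    U[i, i] = U[j, j] = 1 / np.sqrt(2)
    U[i, j] = 1 / np.sqrt(2)
    U[j, i] = -1 / np.sqrt(2)
    return U

def phases(m, thetas):
    """Diagonal phase shifts encoding the data point."""
    return np.diag(np.exp(1j * np.asarray(thetas, dtype=float)))

def circuit(x, m=4):
    # Illustrative layout: data phases sandwiched between beamsplitter layers.
    U = beamsplitter(m, 0, 1) @ beamsplitter(m, 2, 3)
    U = phases(m, x) @ U
    return beamsplitter(m, 1, 2) @ U

def event_prob(U, ins=(0, 1), outs=(0, 1), indistinguishable=True):
    A = U[np.ix_(outs, ins)]  # 2x2 submatrix for this detection event
    if indistinguishable:
        amp = A[0, 0] * A[1, 1] + A[0, 1] * A[1, 0]  # 2x2 permanent
        return abs(amp) ** 2
    P = np.abs(A) ** 2
    return P[0, 0] * P[1, 1] + P[0, 1] * P[1, 0]      # no interference

def kernel(x, z, indistinguishable=True):
    # One common construction: apply U(x) then U(z)^dagger and take the
    # probability of returning to the input photon configuration.
    U = circuit(z).conj().T @ circuit(x)
    return event_prob(U, indistinguishable=indistinguishable)

x = [0.3, 1.1, -0.4, 0.7]
z = [0.5, -0.2, 0.9, 0.0]
print("quantum  k(x,z):", kernel(x, z, True))
print("classical k(x,z):", kernel(x, z, False))
```

Note that k(x, x) = 1 in both cases, as a kernel should satisfy; the two constructions differ only off the diagonal, which is exactly the interference contribution the experiment isolates.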
Measuring performance differences
The researchers introduced a metric called geometric difference to quantify how much better the quantum kernel performs. This metric guided the selection of data labeling and classification tasks, allowing the quantum method to demonstrate its superiority. In classification tests, the quantum kernel achieved up to 100% accuracy, while the classical version maxed out at 94%.
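One widely used definition of the geometric difference, popularized by Huang et al. in the quantum machine learning literature (the paper's exact convention may differ), compares two kernel Gram matrices spectrally. A minimal sketch:

```python
import numpy as np

# Hedged sketch of the geometric difference g(K_c || K_q) between a classical
# and a quantum kernel matrix, following the definition popularized by
# Huang et al. (2021): g = sqrt( || sqrt(K_q) K_c^{-1} sqrt(K_q) || ),
# where ||.|| is the spectral norm (largest eigenvalue).

def sqrtm_psd(K):
    """Matrix square root of a positive semidefinite kernel matrix."""
    vals, vecs = np.linalg.eigh(K)
    vals = np.clip(vals, 0.0, None)
    return vecs @ np.diag(np.sqrt(vals)) @ vecs.T

def geometric_difference(K_c, K_q, reg=1e-9):
    """Large g: there exist labelings the quantum kernel fits but the
    classical one cannot. g near 1: no potential for advantage."""
    n = K_c.shape[0]
    root_q = sqrtm_psd(K_q)
    inv_c = np.linalg.inv(K_c + reg * np.eye(n))
    M = root_q @ inv_c @ root_q
    return float(np.sqrt(np.max(np.linalg.eigvalsh(M))))

# Identical kernels give g ~ 1, i.e. no room for advantage.
K = np.array([[1.0, 0.2], [0.2, 1.0]])
print(geometric_difference(K, K))
```

In the paper's workflow this quantity guides which labeling tasks to test: a task is chosen so the quantum kernel's geometry can actually express label structure the classical kernel cannot.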
Efficient, scalable design without entanglement
One of the most remarkable aspects of the system is that it doesn’t require entangling gates, which are difficult to implement and maintain. The nonlinearity required for classification is achieved purely through the interference patterns of single-photon states manipulated by thermal phase shifters and beamsplitters.
The system scales well for mid-sized problems, which is critical given the experimental resource limitations of current quantum devices. By focusing on coincidence counts and collision-free photon detection events, the researchers ensured accurate kernel estimation using manageable data volumes and experimental time.
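The sampling cost mentioned above can be sketched simply: each kernel entry is a detection-event probability, so it is estimated from a finite number of experimental runs, and the statistical error shrinks as one over the square root of the shot count. The probability value below is hypothetical, chosen only for illustration:

```python
import numpy as np

# Hedged sketch: estimating one kernel entry from coincidence statistics.
# Each entry is an event probability, so it is estimated by counting
# collision-free coincidence events over a finite number of runs ("shots").
# The standard error scales as 1/sqrt(shots).

rng = np.random.default_rng(0)

def estimate_kernel_entry(p_true, shots):
    """Simulate `shots` runs and return the observed event frequency."""
    return rng.binomial(shots, p_true) / shots

p = 0.83  # hypothetical true coincidence probability for one data pair
for shots in (100, 10_000):
    print(shots, estimate_kernel_entry(p, shots))
```

This is why collision-free coincidence counting matters experimentally: it keeps each kernel entry a directly observable frequency, so accuracy is set by measurement time rather than by post-processing.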
Better performance than neural networks
Compared to other classical approaches—including support vector machines and infinite-width neural networks—the quantum and even classical photonic kernels performed better for the tasks tested. This suggests that photonic systems, even without full entanglement, can handle meaningful machine learning workloads.
The measured fidelities support this performance: the quantum kernel achieved a fidelity of 0.9816 ± 0.0148, while the classical photonic kernel reached 0.9934 ± 0.0048. These numbers indicate that the photonic platform is capable of precise and repeatable computation.
This work opens new doors for hybrid systems that combine quantum and classical resources. One particularly compelling idea is using classical light for pretraining tasks, then switching to photonic quantum hardware for the computationally intense feature mapping. This would reduce energy use and speed up training times—making machine learning both faster and greener.
Researchers also note the potential of integrating quantum kernels into neuromorphic computing platforms. By leveraging photonic nonlinearities and programmable feedback loops, it may be possible to build analog processors that merge memory and computation, further enhancing efficiency.
Despite the limited size of current photonic processors, this experiment proves that even modest quantum systems can enhance real machine learning workflows. The photonic kernel approach doesn’t just replicate what classical systems can do—it finds smarter ways to do it, exploiting the unique properties of quantum mechanics to improve accuracy and efficiency.
The architecture demonstrated is reproducible with current hardware and paves the way for practical applications in areas such as image recognition, medical diagnosis, and natural language processing—especially when working with small datasets.