Researchers from Auburn University and Oak Ridge National Laboratory have developed a quantum approach to solve the balanced k-means clustering training problem using the D-Wave 2000Q adiabatic quantum computer. This method targets the global solution of the optimization problem, potentially outperforming classical algorithms that often converge to local optima.
The balanced k-means clustering model partitions data points into clusters of approximately equal size, a requirement in applications like network design and marketing. Classical approaches, such as Lloyd's algorithm, scale poorly for large datasets and guarantee only locally optimal solutions. The quantum formulation converts the problem into a Quadratic Unconstrained Binary Optimization (QUBO) problem, solvable via quantum annealing.
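To make the QUBO formulation concrete, here is a minimal sketch of how such a problem could be set up, under assumptions not spelled out in the article: binary variables x[i, j] indicate that point i belongs to cluster j, intra-cluster pairwise distances form the objective, and quadratic penalty terms enforce that each point gets exactly one cluster and each cluster holds N/k points. The function names, the penalty weight `alpha`, and the brute-force solver (standing in for the quantum annealer on a toy instance) are all illustrative, not the paper's actual implementation.

```python
import itertools
import numpy as np

def balanced_kmeans_qubo(X, k, alpha=None):
    """Build an illustrative QUBO matrix for balanced k-means.

    Binary variable x[i, j] = 1 iff point i is assigned to cluster j;
    its flattened index is i * k + j. Constant offsets are dropped,
    which does not change the minimizing assignment.
    """
    N = X.shape[0]
    # Pairwise squared Euclidean distances between all points.
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    if alpha is None:
        alpha = D.max() * N  # penalty weight chosen to dominate the objective
    Q = np.zeros((N * k, N * k))

    # Objective: sum of pairwise distances within each cluster.
    for j in range(k):
        for i in range(N):
            for i2 in range(i + 1, N):
                Q[i * k + j, i2 * k + j] += D[i, i2]

    # Constraint 1: each point in exactly one cluster,
    # alpha * (sum_j x[i, j] - 1)^2 expanded into QUBO terms.
    for i in range(N):
        for j in range(k):
            Q[i * k + j, i * k + j] += -alpha
            for j2 in range(j + 1, k):
                Q[i * k + j, i * k + j2] += 2 * alpha

    # Constraint 2: each cluster holds exactly m = N/k points,
    # alpha * (sum_i x[i, j] - m)^2 expanded into QUBO terms.
    m = N // k
    for j in range(k):
        for i in range(N):
            Q[i * k + j, i * k + j] += alpha * (1 - 2 * m)
            for i2 in range(i + 1, N):
                Q[i * k + j, i2 * k + j] += 2 * alpha
    return Q

def energy(Q, x):
    # QUBO energy x^T Q x for a binary vector x (Q upper-triangular).
    return x @ Q @ x

# Toy instance: 4 points forming two obvious balanced clusters.
# Brute force over all 2^(N*k) bit strings stands in for annealing.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
Q = balanced_kmeans_qubo(X, k=2)
best = min((np.array(bits) for bits in itertools.product([0, 1], repeat=8)),
           key=lambda x: energy(Q, x))
labels = best.reshape(4, 2).argmax(axis=1)  # cluster label per point
```

On this toy instance the minimum-energy bit string assigns the two nearby points at the origin to one cluster and the two points near (5, 5) to the other, with both clusters holding exactly N/k = 2 points. On real problems the same Q matrix would be handed to the annealer rather than enumerated.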
Theoretical analysis shows the quantum approach scales better on large datasets than classical balanced k-means algorithms, with a time complexity of O(N²kd) versus O(N³) for the classical method. Empirical tests on synthetic and benchmark datasets, including portions of the Iris dataset, demonstrated clustering performance comparable to the best classical algorithms, even on current imperfect hardware.
As quantum hardware improves in fidelity and scale, this approach could become a viable alternative for training clustering models efficiently, addressing NP-hard problems in machine learning.
Original Source
Read the complete research paper
About the Author
Guilherme A.
Former dentist (MD) from Brazil, 41 years old, husband, and AI enthusiast. In 2020, he transitioned from a decade-long career in dentistry to pursue his passion for technology, entrepreneurship, and helping others grow.
Connect on LinkedIn