
Quantum machine learning methods held the promise of outperforming their classical counterparts in various machine learning applications.
Within this dynamic landscape, a range of groundbreaking algorithms and techniques emerged. These
included the Quantum Support Vector Machine (QSVM) [4, 5], designed to enhance the capabilities of support
vector machines using quantum computing. Quantum Algorithms for Nearest-Neighbor Methods [6] contributed
to the advancement of pattern recognition and data analysis. Quantum Encoders [7] enabled the efficient
representation of classical data in quantum states, unlocking the potential for quantum data processing. Quantum-
Enhanced Feature Spaces expanded the capacity to capture intricate patterns within datasets. Additionally,
researchers ventured into the integration of artificial feed-forward neural networks into a quantum framework,
unveiling new possibilities for neural network applications. The era also witnessed the arrival of Deep Quantum
Neural Networks, which exemplified the vast potential of quantum computing in the training and operation of
deep neural networks.
These developments collectively marked a dynamic and transformative era in QML, characterized by active
exploration and the development of a diverse array of quantum algorithms and techniques aimed at enhancing
machine learning capabilities, ultimately reshaping the landscape of both quantum computing and machine
learning.
1.3 Why Quantum Computing?
Quantum mechanics, a branch of physics that emerged in the early 1900s, was originally developed to
elucidate the behaviors of atoms and particles on a minuscule scale. Its profound insights subsequently paved
the way for transformative technological advancements, including transistors, lasers, and magnetic resonance
imaging.
The notion of uniting quantum mechanics with information theory first surfaced in the 1970s, but it
remained relatively obscure until 1982, when physicist Richard Feynman delivered a compelling lecture. In
that lecture, Feynman contended that classical logic-based computing faced insurmountable challenges in
efficiently carrying out the calculations that describe quantum phenomena. He proposed that computing
systems grounded in quantum phenomena, designed to replicate other quantum phenomena, could potentially
circumvent these limitations. Although this visionary concept laid the groundwork for what would later become
the field of quantum simulation, it did not initially ignite widespread research efforts.
Quantum and classical computers share the overarching goal of problem-solving, yet their approaches to
data manipulation and problem-solving are inherently distinct. This section unveils the unique characteristics
of quantum computers by introducing two foundational principles of quantum mechanics that underpin their
functionality: superposition and entanglement.
Superposition, a seemingly paradoxical attribute of quantum objects such as electrons, enables them to occupy
multiple "states" concurrently. For instance, an electron can exist simultaneously in the lowest energy level of
an atom and in the first excited level. When an electron is prepared in such a superposition, there is some
probability of finding it in the lower state and some probability of finding it in the upper state. Only through
measurement does this superposition collapse, definitively placing the electron in one of the two states.
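In standard quantum-mechanical notation (given here purely as an illustration, with α and β denoting the amplitudes of the two levels), such a state can be written as
\[
|\psi\rangle = \alpha\,|\text{lower}\rangle + \beta\,|\text{upper}\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
\]
so that a measurement finds the electron in the lower level with probability |α|^2 and in the upper level with probability |β|^2.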
Understanding superposition provides the basis for comprehending the fundamental unit of information in
quantum computing: the qubit. In classical computing, bits, represented by transistors, have binary states of
0 and 1, signifying off and on. In contrast, qubits, realized for example by electrons, assign 0 and 1 to two states
analogous to the lower and upper energy levels discussed earlier. Qubits differ from classical bits in their ability
to exist in superpositions with varying probabilities, which quantum operations manipulate over the course of a
computation.
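To make this concrete, a single qubit can be simulated classically as a normalized two-component complex vector. The short NumPy sketch below (an illustrative aside, not a construction taken from the works cited above) prepares the definite state 0, applies a Hadamard gate as one example of a quantum operation, and reads off the resulting measurement probabilities.

import numpy as np

# A qubit as a normalized complex vector: entry 0 is the amplitude of state |0>, entry 1 that of |1>.
state = np.array([1.0 + 0.0j, 0.0 + 0.0j])    # the qubit starts in the definite state |0>

# The Hadamard gate, a standard single-qubit operation that creates an equal superposition.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

state = H @ state                              # apply the gate to the qubit

# Measurement probabilities are the squared magnitudes of the amplitudes.
probabilities = np.abs(state) ** 2
print(probabilities)                           # -> [0.5 0.5]: equal chance of observing 0 or 1

Any other single-qubit operation corresponds to a different unitary 2 x 2 matrix and reshapes these amplitudes, which is precisely the manipulation of superpositions referred to above.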