Quantum engineering
from Wikipedia
Colloidal quantum dots irradiated with UV light. Different sized quantum dots emit different colour light due to quantum confinement.

Quantum engineering is the development of technology that capitalizes on the laws of quantum mechanics. This type of engineering uses quantum mechanics to develop technologies such as quantum sensors and quantum computers.

History

From 2010 onwards, multiple governments have established programmes to explore quantum technologies,[1] such as the UK National Quantum Technologies Programme,[2] which created four quantum 'hubs'; the Centre for Quantum Technologies in Singapore; and QuTech, a Dutch centre working to develop a topological quantum computer.[3] In 2016, the European Union introduced the Quantum Technology Flagship,[4][5] a €1 billion, ten-year megaproject similar in size to earlier European Future and Emerging Technologies Flagship projects.[6][7] In December 2018, the United States passed the National Quantum Initiative Act, which provides a US$1 billion annual budget for quantum research.[8] China is building the world's largest quantum research facility with a planned investment of 76 billion yuan (approx. €10 billion).[9][10] The Indian government has also committed ₹8,000 crore (approx. US$1.02 billion) over five years to boost quantum technologies under its National Quantum Mission.[11]

In the private sector, large companies have made multiple investments in quantum technologies. Organizations such as Google, D-Wave Systems, and the University of California, Santa Barbara[12] have formed partnerships to develop quantum technology.

Applications

Secure communications

Quantum secure communication comprises methods that are hypothesised to be 'quantum safe' against quantum computing systems that could use Shor's algorithm to break current cryptography. It draws on several techniques, including quantum key distribution (QKD), quantum random number generation, quantum dense coding, and quantum teleportation.[13] QKD transmits information using entangled light in a way that makes any interception of the transmission obvious to the users. A quantum random number generator produces truly random numbers, unlike classical algorithms that merely imitate randomness.[14] Quantum dense coding uses one qubit in place of two classical bits, enhancing channel capacity through entanglement.[13] Quantum computing uses the qubit as its basic unit of information in place of the classical bit. Qubits are two-level quantum systems that can take the value 1, 0, or any superposition of these states.[15][16] A qubit cannot be copied, because measuring the properties of a quantum system disturbs it (the observer effect).[13] Quantum teleportation transfers the quantum state of a qubit over a long distance without the particle itself being sent directly.[13]
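
The logic of QKD can be illustrated with BB84, the canonical key-distribution protocol (BB84 is not named in the sources above; the following is a minimal classical simulation of its basis-sifting step, with illustrative function names):

```python
import secrets

def random_bits(n):
    """n uniformly random bits, standing in for a quantum RNG."""
    return [secrets.randbits(1) for _ in range(n)]

def bb84_sift(n=32):
    # Alice encodes random bits in randomly chosen bases (0 = rectilinear, 1 = diagonal).
    alice_bits, alice_bases = random_bits(n), random_bits(n)
    # Bob measures each photon in his own random basis.
    bob_bases = random_bits(n)
    # When bases match, Bob's measurement reproduces Alice's bit;
    # when they differ, quantum mechanics gives him a coin flip.
    bob_bits = [a if ab == bb else secrets.randbits(1)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Publicly comparing bases (not bits), both keep only the matching rounds.
    key_a = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    assert key_a == key_b  # sifted keys agree in the absence of an eavesdropper
    return key_a

print(bb84_sift())
```

Because measuring in the wrong basis randomizes the outcome, an eavesdropper who intercepts and re-measures the photons introduces detectable errors into the sifted key.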

Computing

Quantum computers are hypothesised to have a number of important uses in fields such as optimization and machine learning. They are perhaps best known for their expected ability to carry out Shor's algorithm, which can factorize the large numbers underpinning the security of much of today's encrypted data transmission. This could allow relatively small quantum computers to outperform the largest supercomputers on certain mathematical problems.[17]
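
Concretely, the quantum part of Shor's algorithm finds the period $r$ of $a^x \bmod N$; classical post-processing then extracts the factors via greatest common divisors. A minimal sketch, with the quantum subroutine replaced by brute-force period finding (illustrative code, not from the cited source):

```python
from math import gcd

def find_period(a, N):
    """Classical stand-in for the quantum period-finding subroutine:
    the smallest r > 0 with a**r % N == 1."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    assert gcd(a, N) == 1, "a sharing a factor with N already reveals one"
    r = find_period(a, N)
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None  # unlucky choice of a; retry with another
    # a**(r/2) - 1 and a**(r/2) + 1 each share a nontrivial factor with N
    return gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)

print(shor_factor(15, 7))  # period of 7 mod 15 is 4 -> factors (3, 5)
```

The classical period search above takes exponential time in the bit length of $N$; the quantum Fourier transform performs it efficiently, which is the entire source of the speedup.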

Quantum simulators are quantum computers intended to simulate a real-world system, such as a chemical compound.[18][19] The idea of quantum simulators was first published in 1982 by Richard Feynman.[16] Quantum simulators are simpler to build than general-purpose quantum computers because complete control over every component is not necessary.[18] Current quantum simulators under development include ultracold atoms in optical lattices, trapped ions, and arrays of superconducting qubits, among others.[18]
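
As a toy picture of what a quantum simulator computes, the sketch below evolves two spins under a transverse-field Ising Hamiltonian by direct matrix exponentiation; this brute-force approach works only because the Hilbert space here is four-dimensional, which is exactly the classical scaling problem a physical simulator avoids (assumes numpy and scipy; parameters are illustrative):

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices and a Kronecker-product helper
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron(*ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Transverse-field Ising Hamiltonian on two spins: H = -J Z1 Z2 - h (X1 + X2)
J, h = 1.0, 0.5
H = -J * kron(Z, Z) - h * (kron(X, I) + kron(I, X))

# Start in |00> and evolve; a quantum simulator realizes exp(-iHt) physically.
psi0 = np.array([1, 0, 0, 0], dtype=complex)
for t in (0.0, 0.5, 1.0):
    psi = expm(-1j * H * t) @ psi0
    probs = np.abs(psi) ** 2
    print(f"t={t:.1f}  P(|00>)={probs[0]:.3f}  P(|11>)={probs[3]:.3f}")
```

For $n$ spins the state vector has $2^n$ entries, so the matrix exponential quickly becomes intractable classically, while a simulator built from ultracold atoms or trapped ions evolves all amplitudes in parallel by construction.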

Machine learning

Quantum machine learning has also been proposed. Two examples are quantum clustering, in which quantum principles might be used to group data into clusters, and quantum autoencoders, which could compress and later reconstruct data.[17]

Sensors

Quantum sensing uses certain quantum features to make very precise measurements. Quantum systems such as neutral atoms and trapped ions are used as sensors because they are relatively easy to manipulate and to prepare in well-known states. Quantum sensors draw on a variety of different quantum systems to extract their measurements.[20] They are hoped to find applications in a wide variety of technologies, including positioning systems, communication technology, electric and magnetic field sensors, and gravimetry.[21]
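
A standard sensing primitive is Ramsey interferometry, in which an unknown field imprints a phase on a qubit that is read out as a population oscillation. The following simplified model (my own illustration; the coupling constant is a representative electron-spin value, not taken from the cited sources) shows how the measured probability encodes the field:

```python
import numpy as np

def ramsey_probability(phi):
    """Probability of measuring |0> after a pi/2 pulse, free evolution
    accumulating phase phi, and a second pi/2 pulse."""
    return np.cos(phi / 2.0) ** 2

# The phase grows linearly with the unknown field B and interrogation time t,
# phi = gamma * B * t, so longer coherence times mean finer field resolution.
gamma, t = 2 * np.pi * 28e9, 1e-6   # ~electron gyromagnetic ratio (Hz/T), 1 us
for B in (0.0, 5e-6, 10e-6):        # test fields in tesla
    phi = gamma * B * t
    print(f"B={B:.0e} T  phi={phi:.3f} rad  P(|0>)={ramsey_probability(phi):.3f}")
```

This is why sensor precision is tied directly to coherence: a longer interrogation time $t$ converts the same field into a larger, more easily resolved phase.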

Quantum sensors are being considered for use in civil engineering and metrology to help determine unknown underground conditions of a site, complementing ground-penetrating radar, magnetometry, electric resistivity, and acoustic measurements. This is done by taking measurements using quantum gravimetry.[22][23]

Quantum sensors are being considered in medicine to detect conductivity in arteries and organs, the firing of neurons, the progress of chemotherapy, and isotopes inside the body. Techniques include spin entanglement, the use of atomic spins as magnetic sensors, and squeezed light. The resulting information can help diagnose heart problems, malnutrition, early-stage osteoporosis, kidney disease, and certain cancers.[24]

Education programs

Quantum engineering is evolving into an engineering discipline in its own right. For example, ETH Zurich has initiated a Master of Science in Quantum Engineering, a joint venture between its electrical engineering department (D-ITET) and physics department (D-PHYS). EPFL offers a dedicated Master's program in Quantum Science and Engineering, combining coursework in quantum physics and engineering with research opportunities, and the University of Waterloo has launched integrated postgraduate engineering programs within the Institute for Quantum Computing.[25][26]

At the undergraduate level, some institutions have begun to offer programs. The Université de Sherbrooke offers a Bachelor of Science in quantum information,[27] the University of Waterloo offers a quantum specialization in its electrical engineering program, and the University of New South Wales offers a Bachelor of Quantum Engineering.[28] A report on the development of this bachelor's degree has been published in IEEE Transactions on Quantum Engineering.[29]

References

from Grokipedia
Quantum engineering is an interdisciplinary field that leverages quantum mechanical phenomena, such as superposition, entanglement, and quantum coherence, to design, fabricate, and control physical systems for technological applications including quantum computing, precision sensing, and quantum communication networks. Emerging from advances in quantum information science, it integrates expertise from physics, electrical engineering, materials science, and computer science to build scalable quantum devices that operate beyond classical limits. Key applications encompass quantum processors capable of simulating molecular interactions intractable for classical supercomputers, thus accelerating drug discovery and materials design; ultra-sensitive sensors for magnetic field detection and precision metrology; and communication protocols enabling provably secure data transmission resistant to attacks by quantum computers.

Notable achievements include the experimental realization of fault-tolerant quantum error correction in small-scale systems using superconducting qubits and trapped ions, which mitigates decoherence, a primary barrier to scalability, and the demonstration of quantum advantage in specific tasks, such as random circuit sampling completed in minutes versus millennia on classical hardware. These milestones, achieved through iterative engineering of cryogenic environments and nanoscale fabrication, underscore challenges such as error rates and qubit fidelity, yet affirm progress toward practical utility despite persistent hurdles in room-temperature operation and large-scale integration.

The field's defining characteristics include a reliance on empirical validation through cryogenic testing and on the probabilistic outcomes inherent to quantum measurement, distinguishing it from deterministic classical engineering. Controversies center on overoptimistic timelines for commercial viability amid funding-driven hype, though grounded assessments highlight incremental gains, with coherence times exceeding milliseconds and gate fidelities above 99%. Prioritizing first-principles modeling of quantum Hamiltonians, quantum engineering continues to evolve, with ongoing efforts in hybrid classical-quantum architectures poised to yield transformative impacts on hard optimization problems.

Definition and Fundamentals

Core Principles and Scope

Quantum engineering constitutes the interdisciplinary application of engineering methodologies to quantum systems, emphasizing the design, fabrication, control, and scaling of devices that exploit quantum mechanical principles for technological purposes. Unlike pure theoretical quantum physics, it prioritizes practical implementation, including the mitigation of decoherence and the achievement of fault-tolerant operations in real-world conditions. The field emerged as a distinct discipline in the late 2010s, driven by advances in quantum hardware, with programs established at institutions such as ETH Zurich and MIT to train engineers in bridging quantum theory and manufacturable systems.

At its core, quantum engineering relies on foundational quantum phenomena: superposition, wherein quantum states exist in multiple configurations simultaneously; entanglement, enabling correlated behaviors across distant particles; and interference, which underpins computational parallelism. Engineers apply control techniques to manipulate these states via precisely engineered Hamiltonians, often using feedback loops and cryogenic environments to preserve coherence times, typically on the order of microseconds to milliseconds for leading platforms like superconducting qubits. The discipline demands causal modeling of decoherence mechanisms in order to engineer robust quantum gates, with fidelities exceeding 99.9% as demonstrated in Google's 2019 Sycamore processor.

The scope extends beyond computing to encompass quantum sensing for precision metrology, achieving sensitivities surpassing classical limits by factors of 10^3 in magnetic field detection via nitrogen-vacancy centers in diamond, and quantum communication protocols like those securing data transmission over 1,200 km via satellite in China's Micius experiment of 2017. It also includes materials engineering for topological insulators and quantum dots, targeting applications in energy-efficient electronics. Challenges within this scope involve hybrid system integration and standardization, with ongoing efforts by bodies like the IEEE to define quantum hardware interfaces, reflecting the field's transition from proof-of-concept prototypes to industrially viable technologies as of 2025.
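
To relate the quoted coherence times and gate fidelities to usable circuit depth, here is a back-of-the-envelope sketch (round numbers assumed for illustration, not measured values from any cited device):

```python
# Assumed round numbers for a superconducting platform (illustrative only).
gate_fidelity = 0.999      # 99.9% per two-qubit gate
gate_time_s = 50e-9        # 50 ns per gate
T2_s = 100e-6              # 100 us coherence time

# Success probability of a depth-d circuit falls roughly as fidelity**d.
for depth in (100, 1000, 5000):
    p = gate_fidelity ** depth
    print(f"depth {depth:>5}: naive success probability ~ {p:.3f}")

# Coherence alone caps the sequential depth at about T2 / gate_time.
print(f"coherence-limited depth ~ {T2_s / gate_time_s:.0f} gates")
# Both limits motivate quantum error correction for deep circuits.
```

Even at 99.9% fidelity, a thousand-gate circuit succeeds only about a third of the time, which is why fault tolerance, rather than raw qubit count, dominates the engineering agenda.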

Distinction from Theoretical Quantum Physics

Quantum engineering diverges from theoretical quantum physics primarily in its emphasis on practical implementation and device fabrication rather than foundational modeling and prediction. Theoretical quantum physics seeks to elucidate the underlying principles of quantum mechanics, such as wave functions, operators, and probabilistic outcomes, through mathematical derivation and experimental validation of phenomena like superposition and entanglement. In contrast, quantum engineering leverages these established principles to design, construct, and optimize tangible systems that exploit quantum effects for functional purposes, addressing real-world constraints including material limitations and environmental interactions.

A core distinction lies in the handling of engineering-specific challenges absent from pure theory. While theoretical work predicts ideal behaviors under controlled assumptions, quantum engineers must contend with decoherence, the loss of quantum coherence due to interactions with the environment, and must develop techniques for error correction, qubit stabilization, and scalable architectures. For instance, quantum engineers fabricate hardware platforms, such as superconducting circuits or ion traps, to manipulate qubits reliably, integrating conventional engineering disciplines such as electrical engineering to achieve viability beyond prototypes. This applied focus transforms abstract quantum predictions into operable technologies, such as sensors or processors, where reliability and repeatability are paramount.

The interdisciplinary nature of quantum engineering further sets it apart, requiring not only quantum theory but also expertise in control systems and nanofabrication to realize devices that harness subtle quantum features like entanglement for practical utility. Theoretical quantum physics, by comparison, remains oriented toward hypothesis testing and paradigm refinement, often without immediate concern for manufacturability or integration into larger systems. This shift from conceptual exploration to engineered application has accelerated since the 2010s, driven by investments in quantum technologies.

Historical Development

Early Theoretical Foundations (1900s–1970s)

The theoretical foundations of quantum engineering trace back to the emergence of quantum mechanics, which provided the principles for manipulating matter and energy at atomic and subatomic scales. In December 1900, Max Planck resolved the ultraviolet catastrophe in black-body radiation by positing that electromagnetic energy is emitted and absorbed in discrete packets, or quanta, with energy $E = h\nu$, where $h$ is Planck's constant and $\nu$ is frequency; this hypothesis, initially viewed as a mathematical expedient, marked the inception of quantization as a physical reality. Building on this, Albert Einstein in 1905 explained the photoelectric effect by treating light as consisting of particle-like quanta (later termed photons), demonstrating wave-particle duality and earning him the 1921 Nobel Prize in Physics; this work empirically validated quantization beyond thermal radiation.

The "old quantum theory" phase from 1907 to 1924 incorporated ad hoc quantization rules into classical models, notably Niels Bohr's 1913 atomic model, which postulated stable electron orbits with quantized angular momentum $L = n\hbar$ (where $n$ is an integer and $\hbar = h/2\pi$), successfully predicting hydrogen spectral lines but failing for multi-electron atoms. Louis de Broglie's 1924 thesis extended wave-particle duality to matter, proposing that particles like electrons possess wavelengths $\lambda = h/p$ (with $p$ as momentum), experimentally confirmed by Davisson and Germer's 1927 electron diffraction. These developments culminated in the formulation of modern quantum mechanics: Werner Heisenberg's 1925 matrix mechanics, which used non-commuting operators to describe observables, and Erwin Schrödinger's 1926 wave equation $i\hbar \frac{\partial \psi}{\partial t} = \hat{H}\psi$, yielding equivalent probabilistic predictions via the wavefunction $\psi$.

By the 1930s, quantum mechanics integrated relativity through Paul Dirac's 1928 equation, predicting antiparticles such as the positron, while quantum field theory emerged to reconcile quantum rules with special relativity, as in quantum electrodynamics (QED). Post-World War II advancements included the renormalization techniques in QED by Richard Feynman, Julian Schwinger, and Sin-Itiro Tomonaga (1940s–1950s), achieving precise predictions such as the electron's anomalous magnetic moment to parts per billion. John Bell's 1964 theorem highlighted nonlocal correlations in entangled systems, challenging local realism and laying groundwork for quantum information concepts, though experimental verification awaited the 1980s. These theoretical pillars enabled later engineering pursuits by establishing predictive frameworks for coherent quantum states, superposition, and entanglement, despite ongoing debates over interpretations such as the Copenhagen interpretation versus its alternatives.

Emergence of Practical Concepts (1980s–2000s)

The 1980s saw the initial conceptualization of quantum computers as engineered computational tools, bridging theoretical quantum mechanics with practical device design. In 1982, Richard Feynman proposed that quantum mechanical computers could efficiently simulate quantum physical processes, which classical computers struggle to model due to exponential scaling in dimensionality. This insight emphasized the need for hardware exploiting superposition and interference, shifting focus from simulation limits to engineering coherent quantum states. Complementing this, Paul Benioff in 1980 described a Turing machine operating on reversible quantum mechanical Hamiltonians, while David Deutsch in 1985 formalized a universal quantum computer model, proving its capacity for any quantum computation via interference of quantum amplitudes. These frameworks highlighted engineering challenges like maintaining coherence against environmental decoherence, spurring interest in controllable quantum systems such as trapped particles and optical lattices.

The 1990s accelerated practical concepts through algorithms demonstrating quantum advantage, necessitating scalable qubit engineering. Peter Shor's 1994 polynomial-time algorithm for factoring large integers on a quantum computer revealed the potential to undermine classical cryptography, based on period-finding via the quantum Fourier transform, thus incentivizing experimental qubit arrays. Lov Grover's 1996 unstructured search algorithm provided quadratic speedup over classical exhaustive methods, further underscoring the utility of amplitude amplification in engineered quantum circuits. Concurrently, proposals like Ignacio Cirac and Peter Zoller's 1995 scheme for ion-trap quantum gates, using vibrational modes as buses, introduced architectures for entangling multiple qubits and addressed scalability via collective motion control. Error correction codes, such as Shor's 1995 nine-qubit scheme protecting against bit-flip and phase errors through redundancy and syndrome measurement, emerged as essential for fault-tolerant engineering, quantifying thresholds below which quantum advantage persists despite noise.

Experimental milestones validated these concepts, realizing rudimentary quantum hardware. In 1995, Christopher Monroe, David Wineland, and collaborators at NIST executed the first controlled-NOT gate with a trapped ion, laser-cooled to near its motional ground state, achieving state transfer with 96% fidelity and verifying two-qubit entanglement. By 1998, nuclear magnetic resonance (NMR) ensembles implemented the Deutsch-Jozsa algorithm on two effective qubits, distinguishing balanced from constant functions with near-perfect discrimination by leveraging liquid-state molecular spins for bulk coherence. Superconducting circuits advanced with Yasunobu Nakamura's 1999 charge qubit demonstration, exhibiting coherent oscillations with nanosecond coherence times in a Cooper pair box tuned via Josephson junctions. Into the 2000s, these platforms scaled modestly: a 2001 NMR experiment factored 15 using Shor's algorithm on seven qubits, while quantum dots (nanoscale confinements discovered in the early 1980s by Alexei Ekimov and Louis Brus, showing size-dependent emission from discrete energy levels) began enabling solid-state qubit proposals via spin or charge states. These proofs of principle established quantum engineering as the discipline of fabricating, isolating, and manipulating mesoscopic quantum systems against decoherence.

Modern Milestones and Acceleration (2010s–Present)

The 2010s marked a transition from foundational research to scaled engineering efforts in quantum technologies, driven by national initiatives and private investment exceeding $30 billion globally by 2023. In the United States, the National Quantum Initiative Act of 2018 established a coordinated federal program, allocating over $1.2 billion to accelerate quantum research and development across agencies such as NIST, NSF, and DOE, emphasizing hardware development and applications in computing, sensing, and communication. Similar programs, such as the European Union's Quantum Flagship launched in 2018 with €1 billion in funding, spurred collaborative development of scalable quantum systems. These efforts addressed bottlenecks like qubit coherence and error rates through interdisciplinary research and advances in cryogenic infrastructure.

In quantum computing hardware, superconducting qubit platforms achieved key scalability milestones. Google's 2019 demonstration of quantum supremacy using the 53-qubit Sycamore processor completed a random circuit sampling task in approximately 200 seconds, a task estimated to require 10,000 years on the fastest classical supercomputers at the time, validating engineered control over superposition and entanglement in noisy intermediate-scale systems. IBM advanced qubit architectures, releasing the 127-qubit Eagle processor in 2021 and the 433-qubit Osprey processor in 2022, with a roadmap targeting modular, error-corrected systems by 2029 featuring hundreds of logical qubits via surface code implementations. Trapped-ion and neutral-atom approaches also scaled, as seen in Quantinuum's 2024 entanglement of 50 logical qubits with over 98% fidelity, reducing error rates through dynamical decoupling and syndrome extraction.

Quantum communication engineering accelerated with satellite-based demonstrations of long-distance protocols. China's Micius satellite, launched in 2016 and operational from 2017, achieved quantum key distribution over 7,600 km between ground stations and distributed entangled photons over 1,200 km, confirming Bell inequality violations in space and enabling secure key rates of 1.1 kbit/s despite atmospheric losses. These experiments engineered free-space optical links to mitigate decoherence, paving the way for hybrid satellite-fiber networks. Ground-based quantum repeaters advanced with quantum memories based on rare-earth ions, extending repeater-free distances beyond 100 km by 2020.

Quantum sensing and metrology saw practical deployments leveraging nitrogen-vacancy (NV) centers in diamond for high-sensitivity magnetometry. Engineering optimizations, including ensemble NV initialization via optical pumping and microwave control, enabled nanoscale magnetic field detection with sensitivities below 1 nT/√Hz, applied in biomedical imaging and geophysics since the mid-2010s. Recent integrations with atomic clocks and interferometers have pushed precision metrology further, as in 2024 demonstrations of distributed quantum sensing networks regarded as precursors to gravitational wave detection. The 2022 Nobel Prize in Physics recognized foundational entanglement experiments underpinning these engineered sensors.

By the mid-2020s, focus shifted to fault-tolerant engineering. Google's 2024 surface code implementation achieved logical error rates below threshold (0.143% per cycle) using 105 physical qubits for one logical qubit, halving logical error probabilities with each increase in code distance. IBM's 2025 roadmap incorporates real-time error correction on hybrid classical-quantum processors, targeting utility-scale applications in optimization by 2026. These milestones reflect causal progress in mitigating decoherence via improved fabrication, such as isotopically pure substrates and higher-fidelity gates, though scalability remains constrained by cryogenic requirements and yield rates below 99.9% for multi-qubit operations.

Key Quantum Phenomena in Engineering

Qubits, Superposition, and Entanglement

In quantum engineering, qubits serve as the basic building blocks of quantum information processing systems, representing two-level quantum systems that encode information in states analogous to the classical bit values 0 and 1, but with the capacity to occupy superpositions thereof. Unlike classical bits, which remain definitively in one state, qubits are engineered using physical platforms such as superconducting Josephson junctions, ion traps, or neutral atoms, where the quantum state is manipulated via precise electromagnetic controls to achieve desired computational outcomes. This implementation lets the qubit function as a vector in a two-dimensional Hilbert space, typically denoted $\alpha|0\rangle + \beta|1\rangle$, where $\alpha$ and $\beta$ are complex amplitudes satisfying $|\alpha|^2 + |\beta|^2 = 1$.

Superposition, a core quantum mechanical principle, permits a single qubit to exist simultaneously in multiple states, allowing an ensemble of $n$ qubits to represent up to $2^n$ classical bit strings in parallel, which underpins the potential exponential speedup of quantum algorithms. In engineering practice, superposition is induced by applying operations like Hadamard gates, which rotate the qubit state from a basis vector into an equal superposition, as demonstrated in early experiments with superconducting qubits achieving superposition fidelities exceeding 99% under cryogenic conditions. This phenomenon is harnessed in quantum simulation tasks, where engineers design circuits that evolve superposed states to model molecular energies or optimization problems intractable for classical computers.

Entanglement arises when two or more qubits are correlated such that the joint state of the system cannot be expressed as a product of individual qubit states, leading to instantaneous dependencies between distant particles upon measurement, a feature Einstein termed "spooky action at a distance" but now routinely engineered for applications such as quantum communication. For instance, Bell states such as $\frac{1}{\sqrt{2}}(|00\rangle + |11\rangle)$ exhibit perfectly correlated measurement outcomes regardless of the separation between the qubits.
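
A minimal state-vector sketch in plain numpy (an illustration, not tied to any cited experiment) of preparing this Bell state by applying a Hadamard gate and then a CNOT:

```python
import numpy as np

# Computational basis state and gates
ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, put the first qubit in superposition, then entangle.
psi = np.kron(ket0, ket0)          # |00>
psi = np.kron(H, I) @ psi          # (|00> + |10>)/sqrt(2)
psi = CNOT @ psi                   # (|00> + |11>)/sqrt(2), a Bell state

# Measurement statistics: only the correlated outcomes 00 and 11 appear.
for label, amp in zip(("00", "01", "10", "11"), psi):
    print(f"P({label}) = {abs(amp)**2:.2f}")
```

The resulting amplitudes cannot be written as a product of two single-qubit states, which is precisely the definition of entanglement given above.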