
Understanding Quantum Optimization
Alejandro Giraldo
Ph.D. candidate in Computer Science and Engineering
Alejandro Giraldo is an Electronic Engineer with a master’s degree in Applied Mathematics and a Ph.D. candidate in Computer Science and Engineering at the University of Louisville. For the past seven years, he has conducted research in quantum computing, specialising in quantum optimisation algorithms and hybrid quantum–HPC methodologies. His work has been published in venues such as Springer and IEEE.
Alejandro has led interdisciplinary research initiatives that connect quantum technologies with engineering, education, and social innovation. His contributions have earned distinctions including the IDB–Japan Special Fund Award and the CEMEX-Tec Global Innovation Award, granted to the organisation he co-founded and directed. He also holds excellence certificates from the Qiskit Global Quantum Summer School and MIT’s Introduction to Quantum Computing, and has delivered invited talks on quantum computing at universities in Colombia and the United States.

The University of Louisville J.B. Speed Scientific School was founded in 1925 by Dr. William S. Speed and Mrs. Olive Speed Sackett, who established an endowment in honor of their father, the late James Breckinridge (J.B.) Speed, with the assistance of a grant from the James Breckinridge Speed Foundation. Funds from the endowment are still used to this day to supplement the Speed School’s activities. The four original Speed School departments (Chemical Engineering, Civil Engineering, Electrical Engineering, and Mechanical Engineering) and their B.Sc. programs were reviewed and accredited as part of the Engineers’ Council for Professional Development (ECPD) inaugural accreditation class in 1936.
Quantum Optimisation Today: From Foundations to Practical Impact
Alejandro, an electronic engineer with advanced training in applied mathematics and computer science, led the session. He brings seven years of research in quantum computing, with a focus on quantum optimisation algorithms and hybrid quantum-HPC methodologies. His applied work spans engineering, education and social innovation, with publications in IEEE and Springer venues.
What the briefing covered
The historical evolution of quantum computing
Core paradigms and concepts
The current technology landscape across theory, hardware, software and business
A deep dive on quantum optimisation, including algorithms, hardware approaches and a worked example
Practical limits, open questions and governance implications
Evolution of quantum computing
Key milestones included early conceptual foundations from Richard Feynman and Yuri Manin, the formalisation of the quantum Turing machine by David Deutsch in 1985, and the breakthrough impact of Shor’s algorithm on RSA cryptography. Around 2000, the field moved from theory to practice with the first multi-qubit systems.
Commercial and public access followed. D-Wave introduced quantum annealing systems in 2011. IBM made gate-based machines available via the cloud from 2016. Since then, progress has accelerated. Free public access now extends to gate-based systems with more than 100 qubits and large-scale annealers, with paid tiers reaching substantially larger devices. The pace is exciting yet operationally challenging.
Fundamental concepts and computing models
The talk reiterated that qubit states are superpositions of basis states in a Hilbert space, and that quantum computation exploits superposition, interference and entanglement. In practice, effective use depends more on the mathematical formalism than on a full physical interpretation.
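As a minimal illustration of that formalism, the sketch below (not taken from the talk) uses plain numpy state vectors: a Hadamard gate puts a single qubit into an equal superposition, and a CNOT then produces an entangled Bell state.

    import numpy as np

    ket0 = np.array([1.0, 0.0])           # |0>
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

    plus = H @ ket0                        # (|0> + |1>)/sqrt(2): equal superposition
    print(plus)                            # ~ [0.707, 0.707]

    # CNOT applied to the two-qubit state |+>|0> yields the Bell state
    # (|00> + |11>)/sqrt(2), which cannot be written as a product of
    # single-qubit states: that is the entanglement the talk refers to.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])
    bell = CNOT @ np.kron(plus, ket0)
    print(bell)                            # ~ [0.707, 0, 0, 0.707]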
Two principal approaches were contrasted:
Gate-based quantum computing
Discrete, controllable operations via quantum gates enable explicit manipulation of entanglement and interference. This is the model used by platforms such as IBM.
Quantum annealing
Continuous-time evolution under the adiabatic theorem is particularly useful for optimisation. It is the approach used by D-Wave. There is less control over intermediate states, but the method is already practical at scale.
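For context, the standard annealing schedule (only implicit in the summary above) interpolates between an easy-to-prepare mixing Hamiltonian and the problem Hamiltonian,

    H(s) = (1 − s) H_mixer + s H_problem,   with s swept slowly from 0 to 1,

so that the system tracks the instantaneous ground state and finishes in a low-energy configuration of the problem Hamiltonian. The labels H_mixer and H_problem are generic notation, not terms taken from the session.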
The current landscape
The ecosystem comprises four interdependent layers:
Theory: information theory, gate models, implementation requirements such as the DiVincenzo criteria, quantum memories
Hardware: superconducting qubits, trapped ions and photonics, each with distinct trade-offs
Software and frameworks: multiple competing toolchains with no clear winner
Business and integration: application development, hybrid cloud deployment and services
A notable shift is the reliance on classical infrastructure to orchestrate, optimise and evaluate quantum experiments, especially within hybrid workflows.
Why optimisation matters
Optimisation underpins logistics, infrastructure planning, cybersecurity, machine learning and AI. Classical approaches struggle with local minima, high-dimensional search spaces and tight resource constraints.
Quantum techniques add new mechanisms. Tunnelling and entanglement can help escape local minima. Hybrid quantum-classical pipelines can provide near-term benefits. The gains are often incremental, typically quadratic or otherwise polynomial rather than exponential, but they are valuable given scarce compute and time-critical decision making.
Pipelines and algorithms
Four implementation patterns were outlined:
Static quantum circuits: fixed circuits executed many times
Hybrid algorithms: iterative classical-quantum loops
Variational algorithms: parameterised circuits optimised by classical routines (a minimal loop is sketched after this list)
Fully quantum algorithms: dependent on quantum memory and largely theoretical today
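The following hedged, hardware-free Python sketch shows the shape of that variational loop. The "circuit" is replaced by its closed-form expectation value for a single RY rotation, where the expectation of Z is cos(theta), and plain gradient descent stands in for the classical optimiser; a real implementation would evaluate the expectation on a simulator or device instead.

    import numpy as np

    def expectation(theta):
        # Stand-in for a circuit evaluation: for RY(theta)|0>, <Z> = cos(theta).
        # On hardware this number would be estimated from repeated measurements.
        return np.cos(theta)

    def gradient(theta, shift=np.pi / 2):
        # Parameter-shift rule: the gradient from two expectation evaluations.
        return 0.5 * (expectation(theta + shift) - expectation(theta - shift))

    theta, lr = 0.1, 0.4                   # initial parameter and learning rate
    for _ in range(50):                    # classical optimisation loop
        theta -= lr * gradient(theta)

    print(theta, expectation(theta))       # theta -> pi, <Z> -> -1 (the minimum)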
Algorithmic kernels discussed included QAOA, Grover-based optimisation, quantum gradient and Hessian estimation, quantum walks, Gibbs sampling and phase estimation. Quantum annealing is currently the most practical route for real optimisation tasks. Gate-based methods are progressing but remain more sensitive to noise.
Worked example: facility location
A maximal covering location problem illustrated the workflow:
Map a linear optimisation problem to a QUBO
Execute using either quantum annealing or gate-based methods
Leverage a characteristic quantum advantage: the ability to surface multiple optimal solutions without prior enumeration, something many classical solvers do not provide by default (a small, hardware-free sketch of the QUBO step follows this list)
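The sketch below illustrates only the QUBO step on an invented toy instance (four candidate sites, four demand points, a budget of two facilities); the data, penalty weight and variable names are assumptions for illustration, not the problem used in the session. Brute-forcing the resulting QUBO already exposes several tied optimal solutions, the behaviour an annealer or QAOA sampler would surface directly.

    from itertools import product

    # Toy data: demand point -> the candidate sites that would cover it.
    coverage = {0: (0, 1), 1: (0, 1), 2: (2, 3), 3: (2, 3)}
    n_sites, budget, penalty = 4, 2, 4.0

    Q = {}                                 # QUBO coefficients: (i, j) -> weight

    def add(i, j, w):
        key = (min(i, j), max(i, j))
        Q[key] = Q.get(key, 0.0) + w

    # Reward coverage: a demand covered by sites {a, b} contributes
    # -(x_a + x_b - x_a * x_b), which is quadratic in the binary variables.
    for a, b in coverage.values():
        add(a, a, -1.0)
        add(b, b, -1.0)
        add(a, b, 1.0)

    # Enforce "open exactly `budget` sites" via the penalty (sum_j x_j - budget)^2.
    for i in range(n_sites):
        add(i, i, penalty * (1 - 2 * budget))
        for j in range(i + 1, n_sites):
            add(i, j, 2 * penalty)
    offset = penalty * budget ** 2

    def energy(x):
        return sum(w * x[i] * x[j] for (i, j), w in Q.items()) + offset

    # Brute force the 2^4 assignments; an annealer or QAOA run would sample instead.
    best = min(energy(x) for x in product((0, 1), repeat=n_sites))
    optima = [x for x in product((0, 1), repeat=n_sites) if energy(x) == best]
    print(best, optima)                    # -4.0 and four tied optimal site pairs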
The session referenced repositories and reproducible experiments to encourage hands-on exploration.
Governance, AI and open questions
Discussion highlighted that most governments are not yet prepared for the governance implications of AI systems enhanced by quantum-accelerated optimisation. Regulation typically lags capability. Europe appears more proactive, while other regions are more reactive.
On emerging research directions such as real-valued Schrödinger formulations, error mitigation and hot-swappable qubits, the emphasis was on empirical validation over theory alone.
Overall assessment
The session presented quantum optimisation as:
Technically mature for targeted, hybrid use today
Strategically important for AI, infrastructure and security
Constrained by noise, hardware variability and governance gaps
The overarching message is clear: quantum advantage will arrive incrementally through hybrid systems and applied optimisation, not through a single, sudden theoretical breakthrough.