Abstract
Through proofs and small-scale implementations, quantum computing has shown the potential to provide significant speedups in certain applications, such as search and matrix calculations. Recent library developments have introduced hybrid quantum-classical compute models, in which quantum processing units (QPUs) serve as additional hardware accelerators for classical computers. While these developments have opened the prospect of applying quantum computing to machine learning tasks, near- and mid-term quantum computing still faces many limitations. If implemented carefully, however, the advantages of quantum algorithms could accelerate current machine learning models. In this work, a hybrid quantum-classical model is designed to solve a gradient descent problem. The quantum HHL algorithm solves a system of linear equations, and the quantum swap test circuit then extracts the Euclidean distance between a test point and the quantum solution. This distance is passed to a classical gradient descent algorithm to reduce the number of iterations the gradient descent algorithm requires to converge on a solution.
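The hybrid pipeline described in the abstract can be emulated classically to see how the pieces fit together. In this minimal NumPy sketch (all function names are illustrative, not taken from the thesis), a direct linear solve stands in for the HHL algorithm, and an exact overlap computation stands in for the statistical estimate a swap test circuit would produce:

```python
import numpy as np

def solve_linear_system(A, b):
    # Classical stand-in for the HHL algorithm: return x with A x = b.
    return np.linalg.solve(A, b)

def swap_test_overlap(u, v):
    # Classical stand-in for the swap test: the circuit estimates
    # |<u|v>|^2 for normalized states; here we compute it exactly.
    u_n = u / np.linalg.norm(u)
    v_n = v / np.linalg.norm(v)
    return abs(np.dot(u_n, v_n)) ** 2

def euclidean_distance_from_overlap(u, v):
    # For real, normalized vectors, |u - v|^2 = 2 - 2 <u|v>, so the
    # Euclidean distance follows from the swap-test overlap estimate.
    overlap = np.sqrt(swap_test_overlap(u, v))
    return np.sqrt(max(0.0, 2.0 - 2.0 * overlap))

# One hybrid step: "quantumly" solve the system, then extract the
# distance between a test point and the (normalized) solution state.
A = np.array([[2.0, 0.0], [0.0, 1.0]])  # toy system, chosen arbitrarily
b = np.array([1.0, 1.0])
x = solve_linear_system(A, b)            # HHL stand-in: x = [0.5, 1.0]
t = np.array([1.0, 0.0])                 # hypothetical test point
dist = euclidean_distance_from_overlap(x, t)
# dist would then be handed to the classical gradient-descent routine.
```

The key point of the sketch is the identity in `euclidean_distance_from_overlap`: on real hardware the overlap would come from repeated swap-test measurements rather than an exact dot product, so `dist` would carry statistical noise.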
Library of Congress Subject Headings
Quantum computing; Neural networks (Computer science); Regression analysis
Publication Date
5-2023
Document Type
Thesis
Student Type
Graduate
Degree Name
Computer Engineering (MS)
Department, Program, or Center
Computer Engineering (KGCOE)
Advisor
Sonia Lopez Alarcon
Advisor/Committee Member
Corey Merkel
Advisor/Committee Member
Nathan Cahill
Recommended Citation
Hoffnagle, Martin A., "Quantum Acceleration of Linear Regression for Artificial Neural Networks" (2023). Thesis. Rochester Institute of Technology. Accessed from
https://repository.rit.edu/theses/11437
Campus
RIT – Main Campus
Plan Codes
CMPE-MS