https://www.youtube.com/watch?v=lZ3bPUKo5zc&list=PLUl4u3cNGP...
It's long, and the subject matter is intimidating at times, but watch, re-watch, then go deep by finding papers on subjects like superposition and entanglement, which are the key quantum phenomena that unlock quantum computing.
It also helps to understand a bit about how the various qubit modalities are physically operated and controlled (e.g. how a program turns into qubit rotations, readouts, and other instruction executions). Some are superconducting chips driven by microwave pulses, some trap ions or neutral atoms and manipulate their states with lasers, and some are photonic chips routing light through gates, among a handful of other modalities in industry and academia.
IBM's Qiskit platform may still have tooling, simulators, and visualizers that help you write a program and step through the operations on the qubit(s) managed by the program:
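As a minimal sketch of that write-a-program-and-step-through workflow in Qiskit (the imports assume a recent Qiskit release; the package layout has shifted between versions, so treat them as an assumption):

    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    # Build a two-qubit circuit: Hadamard on qubit 0, then CNOT(0 -> 1).
    qc = QuantumCircuit(2)
    qc.h(0)
    qc.cx(0, 1)

    # Simulate the ideal (noiseless) state after the gates.
    state = Statevector.from_instruction(qc)
    print(qc.draw())   # ASCII drawing of the circuit
    print(state)       # amplitudes of the Bell state (|00> + |11>)/sqrt(2)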
It's worth noting: the book assumes a fair bit of mathematical background, especially in linear algebra. If you don't have the equivalent of an undergrad CS/math/physics degree (with some linear algebra), it may be better to start with gentler sources.
One such gentler source is the free online text I wrote with Andy Matuschak -- https://quantum.country. I'm sure there are others which are very good, but perhaps that's helpful!
Both books focus on foundations of the field, and don't cover recent innovations -- the book with Ike Chuang is 26 years old! Still, many of the foundations have remained quite stable.
For computation models, the circuit model and measurement-based computation cover most real work. Aaronson’s Quantum Computing Since Democritus and Nielsen & Chuang explain why quantum differs from classical (interference, amplitudes, complexity limits).
For computers/architecture, think of qubits as noisy analog components and error correction as how digital reliability is built on top. Preskill’s NISQ notes are very clear here.
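A toy way to see the "digital reliability from noisy parts" point is the 3-bit repetition code. This is a classical analogy in plain Python, not a real quantum code, but it shows the key effect: the logical error rate falls to roughly 3p^2 when each physical bit flips with probability p.

    import random

    def logical_error_rate(p, trials=100_000):
        """Encode one bit into three copies, flip each copy with prob p, majority-vote decode."""
        errors = 0
        for _ in range(trials):
            flips = [1 if random.random() < p else 0 for _ in range(3)]  # 1 = copy corrupted
            if sum(flips) >= 2:      # majority of copies corrupted -> logical error
                errors += 1
        return errors / trials

    for p in (0.2, 0.1, 0.05):
        print(p, logical_error_rate(p))   # roughly 3*p**2 for small p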
For programming, most work is circuit construction and simulation on classical hardware (Qiskit, Cirq). That’s normal and expected.
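For example, the Cirq version of "build a circuit, sample it on a classical simulator" is only a few lines (a minimal sketch using Cirq's built-in Simulator):

    import cirq

    # Bell-state circuit: H on qubit 0, CNOT(0 -> 1), then measure both qubits.
    q0, q1 = cirq.LineQubit.range(2)
    circuit = cirq.Circuit(
        cirq.H(q0),
        cirq.CNOT(q0, q1),
        cirq.measure(q0, q1, key="m"),
    )

    # Sample 1000 shots on the classical simulator.
    result = cirq.Simulator().run(circuit, repetitions=1000)
    print(result.histogram(key="m"))   # counts concentrated on 00 and 11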
Beyond Shor, look at Grover, phase estimation, and variational algorithms—they show how quantum advantage might appear, even if it’s limited today.
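As a concrete illustration of the Grover part, here is a tiny NumPy sketch of one Grover iteration on 2 qubits with |11> marked (for N = 4 and a single marked item, one iteration already lands on the target with probability 1):

    import numpy as np

    N = 4                                        # 2 qubits -> 4 basis states
    s = np.full(N, 1 / np.sqrt(N))               # uniform superposition (H on every qubit)
    oracle = np.diag([1, 1, 1, -1])              # phase-flip the marked state |11>
    diffusion = 2 * np.outer(s, s) - np.eye(N)   # inversion about the mean

    state = s
    state = diffusion @ (oracle @ state)         # one Grover iteration (optimal for N = 4)
    print(np.abs(state) ** 2)                    # [0, 0, 0, 1]: all probability on |11>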
Quantum Mechanics and Quantum Computation by Umesh Vazirani (UC Berkeley course) - https://youtube.com/playlist?list=PL74Rel4IAsETUwZS_Se_P-fSE...
It's old, but really good.
Another nice one is:
Introduction to Classical and Quantum Computation by Wong - https://www.thomaswong.net/introduction-to-classical-and-qua... [PDF]
These are really nice.
My favorite QM book is the one by Eisberg and Resnick. I recommend it to others.
There are some nice recommendations in this thread:
- Nielsen, Chuang
- quantum.country by Nielsen
- The IBM Qiskit ecosystem, community, platform, etc. are active and welcoming
Manning Publications has some books on the theme; it's worth searching through them.
2) The physics/architecture/organization depends heavily on the type of computer being discussed. In classical computing, one "type" of computer has won the arms race. This is not yet the case for quantum computers: there are several different physical platforms through which people are trying to generate computation, including trapped ions, superconducting qubits, photonics, quantum dots, neutral atoms, etc.
3) There are several ways to simulate quantum computation on classical hardware. Perhaps the most common is through something like IBM's Qiskit, where you keep track of all the degrees of freedom of the quantum computer throughout the computation and apply quantum logic gates in circuits. Another, more complicated method is tensor network simulation, which gives efficient classical simulators for a restricted subset of quantum states.
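At its simplest, the first kind of simulation is just linear algebra on a 2^n-component state vector, which is also why memory blows up with qubit count and why tensor-network methods are attractive. A bare-bones sketch in NumPy:

    import numpy as np

    I = np.eye(2)
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    # Two qubits -> the full state is a 2**2 = 4 component vector, starting in |00>.
    state = np.zeros(4)
    state[0] = 1.0

    # Apply H to qubit 0 (tensored with identity on qubit 1), then CNOT(0 -> 1).
    state = np.kron(H, I) @ state
    state = CNOT @ state
    print(state)   # [0.707..., 0, 0, 0.707...]: the Bell state (|00> + |11>)/sqrt(2)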
4) In terms of research, one particularly interesting (although I'm biased by working in the field) application is quantum algorithms for nuclear/high-energy physics. Classical methods (Lattice QCD) suffer from extreme computational drawbacks (factorial scaling in the number of quarks, NP-hard Monte Carlo sign problems), and one potential way around this is using quantum computers instead of classical computers to simulate nuclear systems ("The best model of a cat is another cat; the best model of a quantum system is another quantum system").
If you're interested in learning more about QC, I would highly recommend looking at Nielsen and Chuang's "Quantum Computation and Quantum Information", it's essentially the standard primer on the world of quantum computation.
The online tutorial [2] is a good followup, especially if you want to understand Clifford gates / stabilizer states, which are important for quantum error correction.
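If "stabilizer state" sounds abstract, a quick NumPy check makes the idea concrete: the Bell state is the joint +1 eigenstate of the commuting Paulis X⊗X and Z⊗Z, and a stabilizer simulator tracks exactly such Pauli generators instead of the full state vector.

    import numpy as np

    X = np.array([[0, 1], [1, 0]])
    Z = np.array([[1, 0], [0, -1]])
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

    # Both stabilizer generators leave the Bell state unchanged.
    for name, P in [("XX", np.kron(X, X)), ("ZZ", np.kron(Z, Z))]:
        print(name, "stabilizes the Bell state:", np.allclose(P @ bell, bell))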
If you have a more theoretical bent, you may enjoy learning about the ZX-calculus [3] - I found this useful for understanding how measurement-based quantum computing is supposed to work.
[1] https://cs.uwaterloo.ca/~watrous/QC-notes/QC-notes.pdf [2] https://qubit.guide/ [3] https://zxcalculus.com/
1/ Digital and analog - where digital equals qubits and analog equals photonics, diamonds, or a range of other bit replacements.
2/ Qubits and gates are the building blocks and operations in digital. Photons, diamonds, electrons, and so on are the bits in analog; you can encode information in any of these in various ways.
3/ Strawberry Fields for analog QC, and IBM's Qiskit for digital (a small sketch follows below).
I work on photonic integrated circuits, adapting them to remove physical limitations on capacity such as heat and information loss.
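For the continuous-variable/photonic side, a minimal Strawberry Fields sketch looks roughly like this (a sketch of the library's Program/Engine workflow from memory; version differences are possible, so treat the details as assumptions):

    import strawberryfields as sf
    from strawberryfields import ops

    # Two optical modes: squeeze one, mix them on a beamsplitter, count photons.
    prog = sf.Program(2)
    with prog.context as q:
        ops.Sgate(0.5) | q[0]                 # squeezing on mode 0
        ops.BSgate(0.4, 0.0) | (q[0], q[1])   # beamsplitter between modes 0 and 1
        ops.MeasureFock() | q                 # photon-number measurement on both modes

    eng = sf.Engine("fock", backend_options={"cutoff_dim": 6})
    result = eng.run(prog)
    print(result.samples)                     # photon counts per mode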
Goes through qubits, state vectors, and Grover's algorithm in a highly visual and intuitive fashion. It doesn't discuss the underlying quantum mechanics in depth, but it does mention and link out to resources for the interested viewer to delve deeper.
More mathy: A. Yu. Kitaev, A. H. Shen, M. N. Vyalyi, "Classical and Quantum Computation"
A killer app: Peter Shor, "Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer" (a toy sketch of the classical period-to-factors step follows below)
Some course notes: https://math.mit.edu/~shor/435-LN/
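To see why the Shor paper is the killer app: the quantum part only finds the period r of a^x mod N, and the rest is classical number theory. A toy sketch of that classical step, with the period found by brute force instead of a quantum computer:

    from math import gcd

    def classical_period(a, N):
        """Brute-force the order r of a modulo N (the step Shor's algorithm does quantumly)."""
        x, r = a % N, 1
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    N, a = 15, 7                      # small toy example, gcd(a, N) = 1
    r = classical_period(a, N)        # r = 4
    assert r % 2 == 0                 # need an even period for the factoring trick
    f1 = gcd(a ** (r // 2) - 1, N)    # gcd(48, 15) = 3
    f2 = gcd(a ** (r // 2) + 1, N)    # gcd(50, 15) = 5
    print(N, "=", f1, "*", f2)        # 15 = 3 * 5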