Not at all. In a classical computer, the memory is based on bits of information which can either[1] be in state 0 or state 1, and (assuming everything has been designed correctly) won’t exist in any state other than the allowed ones. Most classical computers work with binary.
In quantum computing, the memory is based on qubits, which are two-level quantum systems. A qubit can be in any linear combination of the quantum states |0⟩ and |1⟩, i.e. Ψ = α|0⟩ + β|1⟩ with complex coefficients α and β (normalized so that |α|^2 + |β|^2 = 1). Since any complex number can be written as α = |α|e^jθ, the two amplitudes can carry a phase with respect to each other, not just a magnitude. Geometrically, each qubit of memory can “live” anywhere on the Bloch sphere, with |0⟩ at the “north pole” and |1⟩ at the “south pole”.
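To make the geometry concrete, here’s a small numpy sketch (my own, not from any quantum computing library; the helper name bloch_coordinates is just for illustration) that builds a normalized state α|0⟩ + β|1⟩ with a relative phase and maps it to a point on the Bloch sphere:

```python
import numpy as np

# A qubit state |psi> = alpha|0> + beta|1> with complex amplitudes,
# normalized so that |alpha|^2 + |beta|^2 = 1.
alpha = 1 / np.sqrt(2)
beta = np.exp(1j * np.pi / 4) / np.sqrt(2)   # relative phase of 45 degrees

state = np.array([alpha, beta])
assert np.isclose(np.linalg.norm(state), 1.0)

def bloch_coordinates(psi):
    """Map a normalized two-component state to (x, y, z) on the Bloch sphere."""
    a, b = psi
    x = 2 * (np.conj(a) * b).real
    y = 2 * (np.conj(a) * b).imag
    z = abs(a) ** 2 - abs(b) ** 2
    return x, y, z

print(bloch_coordinates(state))   # ~ (0.707, 0.707, 0.0), a point on the equator
# |0> maps to (0, 0, 1) (north pole) and |1> to (0, 0, -1) (south pole).
```

The relative phase θ moves the point around the sphere’s vertical axis, while the balance between |α| and |β| moves it between the poles.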
Quantum computing requires a whole new set of gates, and there are issues with coherence that I frankly don’t 100% understand yet. And qubits are a whole lot harder to make than classical bits. But if we can find a way to make qubits available to everyone the way classical bits are, then we’ll be able to get a lot more computing power.
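To give a rough idea of what “a whole new set of gates” looks like: single-qubit gates are just 2×2 unitary matrices acting on the (α, β) amplitude vector. The sketch below is plain numpy, not any particular quantum computing framework; it applies a Hadamard gate to |0⟩ and then a phase gate to the result:

```python
import numpy as np

# Single-qubit gates are 2x2 unitary matrices acting on the (alpha, beta) vector.
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)     # Hadamard: |0> -> (|0> + |1>)/sqrt(2)
S = np.array([[1, 0],
              [0, 1j]])                  # phase gate: rotates |1>'s amplitude by 90 degrees

ket0 = np.array([1, 0], dtype=complex)   # the state |0>

psi = H @ ket0                           # equal superposition of |0> and |1>
psi = S @ psi                            # same magnitudes, but now with a relative phase

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(psi)                 # [0.707+0.j, 0.+0.707j]
print(np.abs(psi) ** 2)    # [0.5, 0.5]
```

Unlike most classical logic gates, these are all reversible (unitary), which is part of why the gate set has to be rethought from scratch.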
The hardware works due to a quantum mechanical effect, but it is not “quantum” hardware because it doesn’t implement a two-level quantum system.
[1] Classical computers can be designed with N-ary digit memory (for example, ternary can take states 0, 1, and 2), but binary is easier to design for.
> then we’ll be able to get a lot more computing power.
I think that’s not quite true; it depends on what you want to calculate. Some problems have more efficient quantum algorithms (famously, breaking RSA and other crypto algorithms). But something like matrix multiplication probably won’t benefit.
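For context on the RSA example: the quantum part of Shor’s algorithm is finding the period r of a^x mod N efficiently; turning that period into factors is easy classical post-processing. The sketch below is my own illustration (the function names are made up), and it brute-forces the period, which is exactly the step that has no known efficient classical algorithm for large N:

```python
from math import gcd

def order(a, N):
    """Brute-force the multiplicative order r of a mod N, i.e. the smallest r
    with a^r = 1 (mod N). This is the step a quantum computer does efficiently;
    the best known classical algorithms take super-polynomial time in the bit
    length of N."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_from_period(N, a):
    """Given a coprime to N, try to extract nontrivial factors from its period."""
    r = order(a, N)
    if r % 2 == 1:
        return None                      # odd period: retry with a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                      # trivial square root: retry with a different a
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_from_period(15, 7))         # (3, 5), the factors of 15
```

That reduction is why factoring-based cryptosystems like RSA are the poster child for quantum speedup, while for something like general matrix multiplication no comparable exponential speedup is known.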
It’s actually expected that matrix inversion will see a polynomial speedup, but with all the overhead of quantum computing, we only really get excited about exponential speedups like the one for RSA decryption.