As someone who did most of IBM's quantum course, the difficulty didn't come from the code-writing part. It came from the applications. How do I use this to speed up database access? How do I use this to better compute graphics? Can you make a neural net better with this?
Giving people useful applications will generate more interest than you will need.
> How do I use this to speed up database access? How do I use this to better compute graphics? Can you make a neural net better with this?
It seems that you have missed some of the basics of quantum computing.
For quantum computing to work, the system must be isolated from the environment for the duration of the computation. You can't use it for anything interactive like graphics.
Any practical problem worth solving with a quantum computer is one that would take something like years, or millions of years, to solve on a classical computer. Do you have a database index where a search takes years, or a graphics problem where producing the output takes years?
Also consider the number of qubits needed. For database search you need a quantum computer with the same order of magnitude of qubits as the size of the database index.
Practical problems for quantum computing are mostly scientific problems like simulating chemistry.
There are/seem to be very specific scientific applications, mostly around simulation of other systems whose Hamiltonians map nicely onto that of the computer. For almost everyone on the planet that isn't very useful, but for that small handful it can be a game-changer.
… doesn’t contribute much here, but worth knowing! It’s exciting either way.
Importantly, Bitcoin and Ethereum are (inadvertently) almost quantum-proof (because the addresses are hashes of the public keys), just not quite (because the public keys still get exposed during transactions). But at least the latter should be fixable (because the community is quite used to changes to the protocol anyway).
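For anyone who hasn't looked at the address format, here's a rough Python sketch of that point (the version byte, checksum, and Base58Check encoding are left out, and the key bytes are made up), just to show that a P2PKH-style address only commits to a hash of the public key:

    import hashlib

    def hash160(pubkey_bytes: bytes) -> bytes:
        """RIPEMD-160(SHA-256(pubkey)): the hash a P2PKH address commits to.
        Note: ripemd160 availability depends on your OpenSSL build."""
        sha = hashlib.sha256(pubkey_bytes).digest()
        return hashlib.new("ripemd160", sha).digest()

    # Hypothetical compressed public key (33 bytes), purely for illustration.
    pubkey = bytes.fromhex("02" + "11" * 32)
    print(hash160(pubkey).hex())  # the value the address is derived from

Shor's algorithm needs the public key itself, and that only appears on-chain when a transaction spending from the address is signed, which is exactly the remaining exposure mentioned above.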
Yeah, looking at quantum computing (QC) from a programmer's perspective, I see issues with abstraction at multiple levels. Basically, what's going on is that it's too narrow (tightly coupled to its implementation details) but also too broad (generalized over many disciplines) to easily transform into simpler domains that are easier for humans to reason about.
So, for example, in computer science we can explain basic data structures like binary trees and hash tables from whichever context a student understands. We can use pointers and references from C-style imperative languages. We can use car and cdr from Lisp-style functional languages. We can even jump up to higher levels of abstraction with associative arrays in JavaScript. Or we can build the whole thing in a key-value store like Redis, or even in a SQL database or x86 assembly if that floats our boat.
But with QC, even its "assembly language" of Hilbert spaces and tensors is already so difficult to understand, much less apply, that the terminology feels like gatekeeping. There's no way to easily identify the smoking gun(s) which separate a QC simulation from qubits in the real world. It's like asking an assembly language teacher how a full adder works, but they throw up their hands and say that first we must understand Fermi levels and electron diffusion across a PN junction. Which while important, doesn't answer the student's question.
What's needed are simple transforms to go from any existing formula/algorithm to its "optimized" QC equivalent. I don't know if that can be done in a general way, like how we can switch from the time domain to the frequency domain with a Fourier transform, or take the derivative of an equation. Maybe it's more like an integral where the solution must be "found" by composing known integrals from a dictionary using heuristics.
But without those transforms, I see us being stuck in a QC winter for the foreseeable future. My words are imperfect, but I know precisely what I'm getting at here. I can't run with QC the way I would with any other programming language, and it really bothers me. To the point of, I'll probably have to actually learn it from first principles and write it up from a programmer's perspective! Someday.
“The remaining probability was distributed over 01 and 10, which should not be a part of the Bell state. As we discussed before, this phenomenon is due to errors inherent to the quantum processor. As the backend technology improves we expect to get better results.”
I have to object: while not many algorithms exist, one of the most famous ones, Grover's algorithm [0], is also one of the most general and quite powerful. It reduces search over 2^n items to about 2^(n/2) steps while using n qubits. That's a square-root speedup over the classical algorithm.
[0] https://en.wikipedia.org/wiki/Grover%27s_algorithm
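As a tiny concrete illustration of that n-qubits-for-2^n-items structure, here's a minimal two-qubit Grover search in Qiskit (assuming Qiskit is installed; the oracle marks |11> and at this size a single iteration suffices):

    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    # Toy Grover search over 2^2 = 4 items, with |11> as the marked item.
    qc = QuantumCircuit(2)
    qc.h([0, 1])        # uniform superposition over all four items

    # Oracle: flip the phase of the marked state |11>
    qc.cz(0, 1)

    # Diffuser: reflect the amplitudes about their mean
    qc.h([0, 1])
    qc.z([0, 1])
    qc.cz(0, 1)
    qc.h([0, 1])

    # For n = 2, one iteration already puts all probability on the marked item.
    print(Statevector.from_instruction(qc).probabilities_dict())
    # {'11': ~1.0}

Of course, at this scale a classical loop over four items is instant; the quadratic speedup only starts to matter for astronomically large search spaces, which circles back to the qubit-count objection above.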
You're asking whether they can do marginally better at things that classical computers already do reasonably well, while their whole point is that they (one day, hopefully) can do things that current computers can't do at all (at scale, at least). Most of the speculative applications I've heard of are in science, cryptography, and AI/ML.
QPUs are effectively (and theoretically) high-speed, specialized coprocessors at best. There's no way they're going to be doing anything you mentioned any time soon because that's not what they're good at or built for.
IBM has (had?) a quantum course? Was it a free online course? And if one wants to take it for curiosity and not expecting to use the skills (immediately although maybe in the future) would you recommend it?
Factoring large integers in order to decrypt RSA. Quantum computations do not produce numbers, only probability distributions. This is fine if all you are trying to do is factor integers since the answer can be easily verified.
This is why massive amounts of money are being thrown at quantum computing.
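To make the "easily verified" point concrete: in Shor's algorithm only the order-finding step is quantum (and probabilistic); the classical post-processing and the final check are cheap. Here's a rough Python sketch for N = 15 where the quantum step is faked with classical brute force, purely to show how a probabilistic answer gets turned into verified factors:

    from math import gcd

    def find_order_classically(a: int, N: int) -> int:
        """Stand-in for the quantum order-finding step: the smallest r with
        a^r = 1 (mod N). This brute force is exponential; making it fast
        is exactly what the quantum part is for."""
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    def shor_postprocess(a: int, N: int):
        """Classical post-processing: turn the order r into candidate factors,
        then verify them with a plain multiplication."""
        r = find_order_classically(a, N)
        if r % 2 != 0 or pow(a, r // 2, N) == N - 1:
            return None  # unlucky choice of a; pick another and retry
        p = gcd(pow(a, r // 2) - 1, N)
        q = gcd(pow(a, r // 2) + 1, N)
        if p * q == N and 1 < p < N:  # verification is trivial
            return p, q
        return None

    print(shor_postprocess(7, 15))  # (3, 5)

Even if the quantum device returns a wrong or noisy sample most of the time, you just repeat until the cheap verification succeeds.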
> Tequila is an Extensible Quantum Information and Learning Architecture where the main goal is to simplify and accelerate implementation of new ideas for quantum algorithms. It operates on abstract data structures allowing the formulation, combination, automatic differentiation and optimization of generalized objectives. Tequila can execute the underlying quantum expectation values on state of the art simulators as well as on real quantum devices.
> Welcome to the Tequila Calculator Tutorial. In this tutorial, you will learn how to create a quantum circuit that simulates addition using Tequila. We also compare the performance of various backends that Tequila uses.
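For anyone curious what those "abstract data structures" look like in practice, here's a minimal sketch along the lines of Tequila's README (assuming the tequila package and one of its simulator backends are installed; gate and Pauli constructors as named in its docs):

    import tequila as tq

    # Build an abstract circuit: Bell-state preparation.
    U = tq.gates.H(target=0) + tq.gates.CNOT(control=0, target=1)

    # Wrap a measurement as an objective: expectation value of Z0 * Z1.
    H = tq.paulis.Z(0) * tq.paulis.Z(1)
    E = tq.ExpectationValue(H=H, U=U)

    # Evaluate on whichever simulator backend is installed (or pass backend=...).
    print(tq.simulate(E))  # ~ +1.0: the two qubits are perfectly correlated

The addition tutorial builds on the same pieces, just with larger circuits and a comparison across backends.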
A problem that could possibly be quantum-optimized and an existing set of solutions: Scheduling
Optimize the scheduling problem better than e.g. SLURM does, and then generate a good-enough classical solution that executes in polynomial space on classical computers.
> Slurm uses a best fit algorithm based on Hilbert curve scheduling or fat tree network topology in order to optimize locality of task assignments on parallel computers.[2]
Additional applications and use cases: "Employee Scheduling" > "Ask HN: What algorithms should I research to code a conference scheduling app"
https://news.ycombinator.com/item?id=22589911
> [...] the Hilbert curve scheduling method turns a multidimensional task allocation problem into a one-dimensional space filling problem using Hilbert curves, assigning related tasks to locations with higher levels of proximity.[1] Other space filling curves may also be used in various computing applications for similar purposes.[2]
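To make the quoted idea concrete, here's a rough Python version of the standard (x, y) -> curve-distance conversion, adapted from the pseudocode on the Hilbert curve Wikipedia page (grid side n must be a power of two). Sorting tasks by this index is what produces the locality the Slurm docs describe; this is only an illustration, not Slurm's actual code:

    def rot(n, x, y, rx, ry):
        """Rotate/flip a quadrant so sub-curves are traversed in the right orientation."""
        if ry == 0:
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        return x, y

    def xy2d(n, x, y):
        """Map a point (x, y) on an n x n grid (n a power of two) to its distance
        along the Hilbert curve; nearby distances mean nearby grid points."""
        d, s = 0, n // 2
        while s > 0:
            rx = 1 if (x & s) > 0 else 0
            ry = 1 if (y & s) > 0 else 0
            d += s * s * ((3 * rx) ^ ry)
            x, y = rot(n, x, y, rx, ry)
            s //= 2
        return d

    # Order some hypothetical node coordinates along the curve so that
    # neighbours in the ordering are also close on the 2-D grid.
    nodes = [(3, 0), (0, 1), (1, 1), (2, 2)]
    print(sorted(nodes, key=lambda p: xy2d(4, *p)))
    # [(1, 1), (0, 1), (2, 2), (3, 0)]

Whether a quantum optimizer can beat this kind of heuristic on real workloads is exactly the open question.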
I'll copy-paste what I wrote before about Quantum Country:
This doesn't provide solutions to the problems and doesn't show you how to do the mathematical calculations. It assumes you know proof-based linear algebra, and if you're a math beginner, the level this is written at will be beyond you.
"... are shorthands for the vectors encoding the two basis states of a two dimensional vector space." How many beginner's know what a basis of a vector space is -- how about unitary operators -- Eigen stuff?
The word "beginner" does a disservice to the level you need to be at to start wadding into quantum stuff. At a minimum, you better not be a beginner of linear algebra!
There is, imo, no better way to discourage people than saying this stuff is for "beginners".
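For what it's worth, those "shorthands" unpack to something quite small; here's the whole vocabulary of that quoted sentence in plain NumPy, which may be less intimidating than the notation (a sketch for illustration, not a substitute for the linear algebra):

    import numpy as np

    # |0> and |1> are just the two standard basis vectors of C^2.
    ket0 = np.array([1, 0], dtype=complex)
    ket1 = np.array([0, 1], dtype=complex)

    # A "unitary operator" is just a matrix U with U^dagger U = I.
    # The Hadamard gate is the classic example:
    H = np.array([[1, 1],
                  [1, -1]], dtype=complex) / np.sqrt(2)

    print(np.allclose(H.conj().T @ H, np.eye(2)))  # True: H is unitary
    print(H @ ket0)  # (|0> + |1>)/sqrt(2), an equal superposition

That said, the broader point stands: to do anything non-trivial you still need eigenvalues, tensor products, and the rest.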
Not trolling: is there really a need to train a cohort of quantum programmers? Can these computers actually do something better than their classical counterparts, right now?
> Can these computers actually do something better than their classical counterparts, right now?
You could say quantum computers do something better than classical computers, but you cannot say they can do something useful better than classical computers. At present, “Quantum supremacy” only applies to toy problems.
Thanks, interesting idea but it seems the practical applicability is still far away.
> One of the most celebrated results in quantum computing is the development of a quantum algorithm for factorization that works in time polynomial in n. This algorithm, due to Peter Shor and known as Shor's algorithm, runs in O(n^3 log n) time and uses O(n^2 log n log log n) gates. The first experimental implementation of this algorithm on a quantum computer was reported in 2001, when the number 15 was factored. The largest integer factored by Shor's algorithm so far is 21.
Quantum supremacy, while a very interesting experiment and a milestone in quantum computing, proves hardly anything about usefulness of quantum computers.
It "only" proves that quantum computers are the best quantum computers, and classic ones can't efficiently simulate them. It say nothing about useful operations.
From a theoretical point of view, this is evidence that quantum computing is a more powerful model of computation. I think it would be hard to argue that a model of computation that can solve more problems is any less useful. Applications like quantum simulation do appear to be difficult on a classical computer yet efficiently computable on a quantum computer.
They are programmable, and that makes them drastically more interesting than a single-parameter aerodynamic analog computer like the Boeing, from the point of view of computational complexity. It does not make them "useful" yet, but it is a big milestone. See https://scottaaronson.blog/?p=5460
Could you elaborate on your last paragraph? Proofs of concept that cannot solve "useful" problems seem like incredibly important milestones to me? Or is your frustration that they are occasionally presented by overly enthusiastic engineers as more than "useless" technology demonstrators (with that frustration I would agree)?
The second half of the article I shared covers the parametrization question you raised.
A successful technology does not need to show theoretical supremacy. Its proponents can simply show the useful services and products it makes.
> Proofs of concept that cannot solve "useful" problems seem like incredibly important milestones to me?
Sure, just don't claim supremacy until you can back it up with an actually reasonable definition of computation.
> Or is your frustration that they are occasionally presented by overly enthusiastic engineers as more than "useless" technology demonstrators (with that frustration I would agree)?
It is hard to find any news about quantum computing that is not wildly exaggerated.
> The second half of the article I shared covers the parametrization question you raised.
His argument may apply to the teapot example, but for, say, a test flight to fine-tune the autopilot behavior, there are plenty of inputs that can be set. I'd argue the aerodynamic-computer supremacy example still stands.
To your last point: I think I disagree, because your analog computer (like all analog computers) addresses one fixed-size instance of a problem, not a scalable family of problems. You cannot make a computational-complexity big-O statement about your analog computer, but you can about boson sampling and random circuit sampling. This is a crucial milestone in the creation of a physical realization of a computational device.
In your first paragraph, while I am sympathetic to your annoyance (pardon my imprecise choice of words), I disagree there too, mostly because I can imagine the same argument being used against Babbage, Lovelace, and Turing, who did their theoretical (and failed experimental) work decades to a century before a scalable classical computer existed.
I, for one, welcome any learning. So even if I never actually get to program a quantum computer, why would I not want to know how? Why should the ability/knowledge be hidden from anybody?
Unfortunately, every page of the document has the words "Just accepted" stamped across it, making it almost unreadable (as in: it can be read, but it's painful).