this post was submitted on 23 Jun 2024
Technology
[–] Ephera@lemmy.ml 0 points 5 months ago (2 children)

Quantum computers won't displace traditional computers. There are certain niche use cases where quantum computers could become wildly faster in the future, but for most of the calculations we do today they're just unreliable. So they'll mostly coexist.

[–] amanda@aggregatet.org 0 points 5 months ago* (last edited 5 months ago) (1 children)

Presumably you’d have a QPU in your regular computer, like other accelerators for graphics etc., or possibly a tiny one for cryptography integrated into the CPU.

[–] Tinidril@midwest.social 0 points 5 months ago (1 children)

There would have to be some kind of currently unforeseen breakthrough before something like that would be even remotely possible. In all likelihood, quantum computing will stay in specialized data centers. For the problems quantum computers would solve, there's really no advantage to having one locally anyway.

[–] amanda@aggregatet.org 0 points 5 months ago (1 children)

I assume we need a lot of breakthroughs to even have useful quantum computing at all, but sure.

Isn’t quantum encryption interesting for end users?

[–] hades@lemm.ee 0 points 5 months ago

Quantum encryption isn't something quantum computers can even do. It's not just about transforming bits into other bits; it's about building entirely new security properties on top of physical properties of matter.

So, even if it is interesting for end users, they would need dedicated hardware anyway.
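For context, the best-known quantum-encryption scheme is quantum key distribution (BB84). A toy classical simulation of its basis-sifting step is sketched below; note this only mimics the bookkeeping, since the actual security comes from measuring photons on dedicated hardware, which is exactly the commenter's point:

```python
import random

random.seed(42)

def bb84_sift(n_bits=1000):
    """Toy BB84 basis sifting: Alice and Bob each pick a random
    measurement basis per bit and keep only the bits where their
    bases happened to match."""
    alice_bits = [random.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [random.choice("+x") for _ in range(n_bits)]
    bob_bases = [random.choice("+x") for _ in range(n_bits)]

    # When the bases match, Bob's measurement reproduces Alice's bit,
    # so those positions form the shared secret key.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

key = bb84_sift()
# Bases match about half the time, so the sifted key is roughly 500 bits.
```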

[–] UraniumBlazer@lemm.ee 0 points 5 months ago

In other words, like GPUs. GPUs suck ass at complex sequential calculations. They do, however, work great for a large number of simple calculations in parallel, which is exactly what graphics processing needs.
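The contrast above can be sketched in a few lines. The NumPy example below runs on the CPU, but the shape of the two workloads is the point: one trivial operation over millions of independent values (GPU-friendly), versus a long chain of dependent steps where each iteration needs the previous result (GPU-hostile):

```python
import numpy as np

# GPU-friendly workload: the same trivial operation applied to a million
# independent values at once (think: shading every pixel of a frame).
pixels = np.random.rand(1_000_000)
brightened = np.clip(pixels * 1.2, 0.0, 1.0)  # one simple op, massive parallelism

# GPU-hostile workload: a long chain of dependent steps; no iteration can
# start before the previous one finishes, so nothing can run in parallel.
x = 0.3
for _ in range(1_000_000):
    x = 4.0 * x * (1.0 - x)  # logistic map: inherently sequential
```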