[–] gearheart@lemm.ee 0 points 4 months ago (3 children)

This would be funny if it happened to Nvidia.

Hope Intel recovers from this. Imagine if Nvidia was the only consumer hardware manufacturer…

No one wants that.

[–] mlg@lemmy.world 0 points 4 months ago (1 children)

> This would be funny if it happened to Nvidia.
>
> Hope Intel recovers from this. Imagine if Nvidia was the only consumer hardware manufacturer…

Lol, there was a reason Xbox 360s had a whopping 54% failure rate and every OEM was getting sued in the late 2000s for chip defects.

[–] SomethingBurger@jlai.lu 0 points 4 months ago (3 children)

Isn't the 360's failure rate due to MS rushing to release it before the PS3?

[–] hardcoreufo@lemmy.world 0 points 4 months ago

I think the 360 failed for the same reason lots of early/mid-2000s PCs failed: chips lifting off the board after the move away from leaded solder. Over time the solder formulations improved, and we don't see that as much anymore. At least that's the way I recall it.

[–] Kyrgizion@lemmy.world 0 points 4 months ago (2 children)

AFAIK the cooling was faulty or insufficient which burned the chips out.

[–] icedterminal@lemmy.world 0 points 4 months ago (1 children)

Tagging on here: both the first-model PS3 and the Xbox 360 were hot boxes with insufficient cooling. Both got too hot too fast for their cooling solutions to keep up, and the resulting thermal stress weakened the chips' solder joints until they eventually cracked.

[–] john89@lemmy.ca 0 points 4 months ago* (last edited 4 months ago) (1 children)

Owner of an original 60GB PS3 here.

It got very hot and eventually stopped working. It was under warranty and I got an 80GB replacement for $200 less, but I lost backwards compatibility, which really sucked because I had sold my PS2 to get a PS3.

[–] lennivelkant@discuss.tchncs.de 0 points 4 months ago (1 children)

Why would you want backwards compatibility? To play games you already own and like instead of buying new ones? Now now, don't be ridiculous.

Sarcasm aside, I do wonder how technically challenging it is to keep a system backwards-compatible. I understand console games are written for specific hardware specs, but I'd assume newer hardware still understands the old instructions. It could be an OS question, but again, I'd assume they'd develop the newer version on top of the old one, so I don't know why it wouldn't support the old features anymore.
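
One wrinkle is that consoles have switched instruction sets between generations (the PS2's MIPS-based Emotion Engine vs. the PS3's PowerPC-based Cell), so the newer CPU literally doesn't understand the old instructions; they have to be emulated. A toy dispatch loop in C shows the shape of that work; the opcodes are invented for illustration, not any real console's ISA:

```c
#include <stdint.h>
#include <stdio.h>

/* Invented guest opcodes -- purely illustrative, not a real ISA. */
enum { OP_HALT = 0, OP_LOADI = 1, OP_ADD = 2 };

int main(void) {
    /* Guest "program": r0 = 2; r1 = 3; r0 += r1; halt. */
    const uint8_t code[] = { OP_LOADI, 0, 2, OP_LOADI, 1, 3,
                             OP_ADD, 0, 1, OP_HALT };
    uint32_t regs[4] = { 0 };
    size_t pc = 0;

    for (;;) {
        uint8_t op = code[pc++];                        /* fetch */
        switch (op) {                                   /* decode + execute */
        case OP_LOADI: { uint8_t r = code[pc++]; regs[r] = code[pc++]; break; }
        case OP_ADD:   { uint8_t d = code[pc++]; uint8_t s = code[pc++];
                         regs[d] += regs[s]; break; }
        case OP_HALT:  printf("r0 = %u\n", regs[0]);    /* prints r0 = 5 */
                       return 0;
        default:       fprintf(stderr, "bad opcode %u\n", op); return 1;
        }
    }
}
```

Every guest instruction costs the host several real instructions, which is why full-speed software emulation of a console is much harder than just "running the old code."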

I don't want to cynically claim that it's only done for profit reasons, and I'm certainly out of my depth on the topic of developing an entire console system, so I want to assume there's something I just don't know about, but I'm curious what that might be.

[–] john89@lemmy.ca 0 points 4 months ago

It's my understanding that backwards-compatible PS3s actually had PS2 hardware in them.

We can play PS2 and PS1 games if they are downloaded from the store, so emulation isn't an issue. I think Sony looked at the data and saw they would make more money removing backwards compatibility, so that's what they did.

Thankfully the PS3 was my last console before standards got even lower and they started charging an additional fee to use my internet.

[–] NecroParagon@lemm.ee 0 points 4 months ago

Intercooler + wet towel got me about 30 minutes on Verruckt

[–] brucethemoose@lemmy.world 0 points 4 months ago* (last edited 4 months ago) (1 children)

> This would be funny if it happened to Nvidia.

It kinda has, with Fermi, lol. The GTX 480 was… something.

Same reason too. They pushed the voltage too hard, to the point of stupidity.

Nvidia does not compete in this market though, as much as they'd like to. They do not make x86 CPUs, and frankly Intel is hard to displace since they have their own fab capacity. AMD can't take the whole market themselves because there simply isn't enough TSMC/Samsung capacity to go around.

[–] Kyrgizion@lemmy.world 0 points 4 months ago (1 children)

There's also Intel holding the x86 patents and AMD holding the x86-64 patents, which they cross-license. Those two aren't going anywhere yet.

[–] wax@feddit.nu 0 points 4 months ago* (last edited 4 months ago)

Actually, it looks like the base patents have expired. All the extensions (SSE, AVX, etc.) are still in effect, though.
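
Those extensions are separate, detectable feature sets layered on the base ISA. As a quick illustration, here's a minimal runtime probe; it's a sketch using the GCC/Clang builtin `__builtin_cpu_supports`, x86-only:

```c
#include <stdio.h>

/* Probe a few x86 extension sets at runtime (GCC/Clang builtins). */
int main(void) {
    __builtin_cpu_init(); /* harmless to call explicitly */
    printf("sse2: %s\n", __builtin_cpu_supports("sse2") ? "yes" : "no");
    printf("avx:  %s\n", __builtin_cpu_supports("avx")  ? "yes" : "no");
    printf("avx2: %s\n", __builtin_cpu_supports("avx2") ? "yes" : "no");
    return 0;
}
```

Any x86-64 chip runs the base instructions, but each of these sets is optional, which is why a binary built for AVX2 won't run on a CPU that only reports sse2.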

[–] Poem_for_your_sprog@lemmy.world 0 points 4 months ago

Me too, so it keeps AMD on their toes.