this post was submitted on 05 Jun 2024

Technology

[–] _sideffect@lemmy.world 0 points 5 months ago

Lmao, stupidity

[–] BombOmOm@lemmy.world 0 points 5 months ago (1 children)

Selling shovels during a gold rush is the best way to get rich. :)

[–] RecallMadness@lemmy.nz 0 points 5 months ago

While suing everyone else that makes shovel handles that work with your shovel heads.

[–] WhyDoYouPersist@lemmy.world 0 points 5 months ago (1 children)

Fuck this stupid world we've built.

[–] dogslayeggs@lemmy.world 0 points 5 months ago (6 children)

I didn't know there were that many PC gamers out there. /s

Seriously, though, the pivot from making video cards to investing in AI and crypto is kinda genius. The crypto thing mostly fell into their laps, but they leaned in. The AI thing, though, I'm not sure how they decided to focus on it or who first pitched the idea to the board, but that was business genius.

[–] chrash0@lemmy.world 0 points 5 months ago

same as with crypto. the software community started using GPUs for deep learning, and they were just meeting that demand

[–] dkc@lemmy.world 0 points 5 months ago (1 children)

To your point, when you look at both crypto and AI I see a common theme. They both need a lot of computation, call it super computing. Nvidia makes products that provide a lot of compute. Until Nvidia’s competitors catch up I think they’ll do fine as more applications that require a lot of computation are found.

Basically, I think of Nvidia as a super computer company. When I think of them this way their position makes more sense.

[–] kromem@lemmy.world 0 points 5 months ago* (last edited 5 months ago) (1 children)

They were doing that for years before it became popular. The same tech for video graphics just so happened to be useful for AI and big data, and they doubled down on supporting enterprise and research efforts in that when it was a tiny field before their competitors did, and continued to specialize as it grew.

Supporting niche uses of your product can sometimes pay off if that niche hits the lottery.

[–] webghost0101@sopuli.xyz 0 points 5 months ago

Hardware made for heavy computing being good at stuff like this isn't all that shocking, though. The biggest gamble is whether the new technology will take off at all. Nvidia, just like Google, has the capital to diversify: bet on all the horses at once and drop the losers later.

[–] swayevenly@lemm.ee 0 points 5 months ago

DLSS was a necessity to deliver performance gains at a pace their raw hardware couldn't keep up with.

[–] slacktoid@lemmy.ml 0 points 5 months ago

To their credit they've been pushing GPGPUs for a while. They did position themselves well for accelerators. Doesn't mean they don't suck.

[–] RecallMadness@lemmy.nz 0 points 5 months ago* (last edited 5 months ago)

They were first to market with a decent GPGPU toolkit (CUDA) which built them a pretty sizeable userbase.

Then when competitors caught up, they made it as hard as possible to transition away from their ecosystem.

Like Apple, but worse.

I guess they learned from their Gaming heyday that not controlling the abstraction layer (eg OpenGL, DirectX, etc) means they can’t do lock in.

[–] K1nsey6@lemmy.world 0 points 5 months ago

Pelosi's insider trading is paying off for her.

[–] flamingo_pinyata@sopuli.xyz 0 points 5 months ago (2 children)

Time to sell Nvidia stock. Congrats to Huang for pulling it off. Get out when you're on top.

[–] eager_eagle@lemmy.world 0 points 5 months ago

imagine how many leather jackets he can buy now

[–] Lemonyoda@feddit.de 0 points 5 months ago

This is not how you do shares... :o

[–] flop_leash_973@lemmy.world 0 points 5 months ago (2 children)

The real game now is how long it will last before the hype dies and the floor falls out of "AI", taking a good chunk of their stock gains with it.

[–] Damage@feddit.it 0 points 5 months ago

Well, they also make good silicon that is apparently useful for many different things; that may not change. If it's good for the next fad as well, they'll just stay on top.

[–] bamboo@lemm.ee 0 points 5 months ago (1 children)

I don’t think generative AI is going anywhere anytime soon. The hype will eventually die down, but it’s already proved its usefulness in many tasks.

[–] neshura@bookwormstory.social 0 points 5 months ago (2 children)

Is AI useful? Maybe. But is it profitable? AI will go the same way .com did: there will be a massive crash, and at the end of it you'll see who actually had their pants on.

[–] Nighed@sffa.community 0 points 5 months ago (1 children)

Nvidia IS making a profit on it though. It's the whole "in a gold rush, sell shovels" thing.

[–] neshura@bookwormstory.social 0 points 5 months ago

My point is more that their revenue stream will temporarily take a giant hit during that crash: when everyone is busy going bankrupt, the few AI companies that do make a profit have better things to do than buy new accelerators right that instant.

[–] bamboo@lemm.ee 0 points 5 months ago (2 children)

It can be quite profitable. A ChatGPT subscription is $20/m right now, or $240/year. A software engineer in the US is between $200k and $1m with all benefits and support costs considered. If that $200k engineer can use ChatGPT to save 2.5 hours in a year, then it pays for itself.
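Rough back-of-the-envelope on that claim, with illustrative figures (the ~2,080 work hours/year is an assumption, not something stated above):

```python
# Break-even: how many saved engineer-hours cover a ChatGPT subscription?
subscription_per_year = 20 * 12        # $20/month
engineer_cost_per_year = 200_000       # low end, fully loaded (illustrative)
work_hours_per_year = 2_080            # ~40 h/week * 52 weeks (assumed)

hourly_cost = engineer_cost_per_year / work_hours_per_year
break_even_hours = subscription_per_year / hourly_cost
print(f"${hourly_cost:.0f}/h -> break-even at {break_even_hours:.1f} h/year")
```

At the $1m end of the range, the break-even drops to roughly half an hour a year.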

[–] neshura@bookwormstory.social 0 points 5 months ago (1 children)

It's quite funny that you think ChatGPT is making a profit on that $20 subscription if it's replacing a software dev.

The bust won't come because it's not profitable to use AI, but because the companies selling the service cannot do so at rates that are both profitable and actually marketable. Case in point: OpenAI has not made a single cent of profit so far (or at least has not reported one). The way AI is currently shoved in everywhere is not sustainable, because the cost of running an AI model cannot be recouped by most of these new platforms.

[–] bamboo@lemm.ee 0 points 5 months ago (1 children)

OpenAI is a non-profit. Further, US tech companies usually take many years to become profitable. It’s called reinvesting revenue, more companies should be doing that instead of stock buybacks.

Let’s suppose hosted LLMs like ChatGPT aren’t financially sustainable and go bust, though. As a user, you can also just run them locally, and as smaller models improve, this is becoming more and more popular. It’s likely how Apple will be integrating LLMs into their devices, at least in part, and Microsoft is going that route with “Copilot+ PCs” that start shipping next week.

Integration aside, you can run 70B models on an overpriced $5k MacBook Pro today that are maybe half as useful as ChatGPT. The cost to do so exceeds a ChatGPT subscription, but to use my numbers from before, a $5k MacBook Pro running Llama 3 70B would only have to save an engineer one hour per week to pay for itself in the first year. In subsequent years only the electricity costs would matter, which for a current-gen MacBook Pro would be about equivalent to the ChatGPT subscription in expensive energy markets like Europe, or half that or less in the US.

In short, you can buy overpriced Apple hardware to run your LLMs, do so with high energy prices, and it’s still super cheap compared to a single engineer such that saving 1 hour per week would still pay for itself in the first year.
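The same arithmetic, sketched for the local-hardware case (same assumed $200k fully-loaded cost and ~2,080 work hours/year; the $5k figure is the MacBook Pro price from above):

```python
# Break-even for a one-time $5k hardware purchase instead of a subscription
hardware_cost = 5_000                  # up-front, year one only
engineer_cost_per_year = 200_000       # fully loaded (illustrative)
work_hours_per_year = 2_080            # ~40 h/week * 52 weeks (assumed)

hourly_cost = engineer_cost_per_year / work_hours_per_year
break_even_hours = hardware_cost / hourly_cost   # hours saved to recoup cost
hours_per_week = break_even_hours / 52
print(f"break-even: {break_even_hours:.0f} h in year one (~{hours_per_week:.1f} h/week)")
```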

[–] neshura@bookwormstory.social 0 points 5 months ago

Yeah I don't know why you keep going on about people using AI when my point was entirely that most of the companies offering AI services don't have a sustainable business model. Being able to do that work locally if anything strengthens my point.

[–] frezik@midwest.social 0 points 5 months ago* (last edited 5 months ago) (1 children)

I've seen pull requests filled with ChatGPT code. I consider my dev job pretty safe.

[–] Zatore@lemm.ee 0 points 5 months ago

I'm holding on at least till the stock split

[–] filister@lemmy.world 0 points 5 months ago (1 children)
[–] photonic_sorcerer@lemmy.dbzer0.com 0 points 5 months ago (1 children)

Nvidia and other chipmakers produce actual, useful products. They'll be sitting pretty after the bubble pops.

[–] mal3oon@lemmy.world 0 points 5 months ago (3 children)

Their main growth driver is data centers; when that demand dries up within a couple of years, the bubble will pop. Especially if the underlying neural network architectures change, the need for this much raw performance will decrease.

[–] photonic_sorcerer@lemmy.dbzer0.com 0 points 5 months ago (1 children)

Then we'll all get cheaper GPUs! Oh no!

[–] venusaur@lemmy.world 0 points 5 months ago (1 children)

AI is this decade's .com boom. Brace yourself for the crash.

[–] zewm@lemmy.world 0 points 5 months ago* (last edited 5 months ago) (4 children)

All that value and they still can’t get their video cards to work worth a shit in Linux.

[–] victorz@lemmy.world 0 points 5 months ago

I've been using a 2080 Super since 2020 and it's been mostly gravy. Granted, I've not been using anything Wayland-related. But I'm gaming on Steam and shit and it works wonderfully. Better performance than on Windows, though there is some slight audio delay, a few milliseconds over Windows.

I've been looking to switch to Hyprland but it was a bit glitchy with gaming and screen sharing sometimes so I'm holding off on that until I jump over to the AMD ship. It'll be sweet.

[–] mal3oon@lemmy.world 0 points 5 months ago

What card are you using? Their Linux support in the past few years has been impressive. They even have open source drivers now (still beta). And thanks to Proton, gaming is seamless on Linux. I don't see the issue you're describing?

[–] dev_null@lemmy.ml 0 points 5 months ago (2 children)

I've been using Nvidia cards on Linux for many years and never had issues. I did have issues with the laptop cards (Optimus switching), but on the desktop it was always flawless for me.

[–] accideath@lemmy.world 0 points 5 months ago (1 children)

I mean, they work. But the drivers aren’t as feature-complete as AMD's or Intel's. Wayland support was a strict no until very recently, gamescope support is still very hit-and-miss, and they're less stable than the competition. They’re completely usable though. My 1650 runs well, most of the time.

[–] dev_null@lemmy.ml 0 points 5 months ago* (last edited 5 months ago) (1 children)

When I was in the market for a new card 2 years ago I looked into AMD, but learned that they don't work as well as Nvidia for GPU passthrough to VMs, which I need to work. I'd love to switch because Nvidia is a shit company, but AMD GPU's just don't work for my use case.

I'm curious though because I don't know what I'm missing. What are the features in AMD drivers that make it more complete?

[–] accideath@lemmy.world 0 points 5 months ago

As I said, AMD works much better with Wayland and gamescope, and thus has, for example, HDR and VRR support. Besides that, their Linux drivers are open source and more stable.

But to my knowledge, AMD GPUs pass through just fine to VMs? What was your problem with them?

[–] zewm@lemmy.world 0 points 5 months ago (4 children)

I guess you aren’t using Wayland. It’s abysmal with Wayland. Especially electron apps. They just flicker and crash.

[–] DaPorkchop_@lemmy.ml 0 points 5 months ago (5 children)

Why does everyone always complain about Nvidia support on Linux? I've been using Nvidia GPUs on Ubuntu and Debian for years and it has never required any more effort than 'sudo apt install nvidia-driver'.

[–] zewm@lemmy.world 0 points 5 months ago (2 children)

It’s not difficult to install the drivers. I recently had to swap out my 3090 for an AMD card because Wayland just crashes and works poorly with Nvidia.

[–] Setnof@feddit.de 0 points 5 months ago

“Valuable”

[–] MonkderDritte@feddit.de 0 points 5 months ago

So they rip customers off? Got it.

[–] frezik@midwest.social 0 points 5 months ago (1 children)

Last year's Nvidia keynote at Computex had Jensen trying to get the audience to have an awkward, AI-generated sing along. The market thought this was great and sent the market cap over $1T.

For this year's keynote, Jensen wandered the stage like he was looking for his cat while rambling about language models. The market thinks this is great and sent the market cap over $3T.

For the second biggest company on Earth, he is a shockingly bad speaker, and completely ill prepared. For some reason, the market loves this guy.

[–] FlyingSquid@lemmy.world 0 points 5 months ago (1 children)

Is it that the market loves him or is it that a CEO's keynote isn't really that big a deal and is mostly an ego-stroking event?

Because I'm guessing what the market actually loves is the new products that are announced.

[–] frezik@midwest.social 0 points 5 months ago (6 children)

That's the thing: no new products were announced.

[–] FlyingSquid@lemmy.world 0 points 5 months ago

I take back what I said in that case.
