this post was submitted on 13 Nov 2024
669 points (94.9% liked)

[–] EleventhHour@lemmy.world 10 points 1 week ago

Apparently, there was only so much IP to steal

[–] jpablo68@infosec.pub 10 points 1 week ago (1 children)

I just want a portable, self-hosted LLM for specific tasks like programming or language learning.

[–] plixel@programming.dev 9 points 1 week ago (1 children)

You can install Ollama in a Docker container and use it to pull models to run locally. Some are really small and still pretty effective: Llama 3.2, for example, is only 3B parameters, and some models are as small as 1B. You can access it through the terminal, or use something like Open WebUI for a more "ChatGPT"-like interface.
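For anyone curious, here's roughly what talking to a local Ollama instance looks like once it's running — a minimal Python sketch, assuming Ollama is listening on its default port (11434) and you've already done `ollama pull llama3.2` (the prompt and model tag are just examples):

```python
import json
import urllib.request

# Ollama's local HTTP API, on its default port.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3.2",  # a ~3B model; swap in any model you've pulled
    "prompt": "Explain list comprehensions in Python in two sentences.",
    "stream": False,      # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Send the request and print the generated text from the JSON response.
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Open WebUI just points at that same local API, so you can use the terminal and the browser interface against one install.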

[–] TankovayaDiviziya@lemmy.world 8 points 1 week ago (1 children)

Short the AI stocks before they crash!

[–] aesthelete@lemmy.world 8 points 1 week ago

I hope it all burns.

[–] iAvicenna@lemmy.world 7 points 1 week ago (4 children)

So long, see you all in the next hype cycle. Any guesses?

[–] _bcron_@lemmy.world 7 points 1 week ago* (last edited 1 week ago) (3 children)

It'll implode, but there are much larger elephants in the room: geopolitical dumbassery and the suddenly transient nature of the CHIPS Act are two biggies.

Third, the high-flying growth stocks, the blue-sky darlings: they're flaky. In a downturn, growth is worth 0 fucking dollars; throw that shit in a dumpster and rotate into staples. People can push off a phone upgrade or a new TV and cut down on subscriptions, but they'll always need Pampers.

The thing propping up AI and semis is an arms race between those high-flying tech companies, so this whole thing is even more prone to imploding than tech itself, since a ton of the revenue comes from tech. A sensitive sector supported by an already sensitive sector. A house of cards with NVDA sitting right at the tippy top. Apple, Facebook, those kinds of companies: when they start trimming back, it's over.

But it's one of those things that's anyone's guess. Just when you think it's not even possible for everything to still have steam, one of the big guys like TSMC posts some really delightful earnings and it gets another second wind, for the 29th time.

Definitely a house of cards tho, and suddenly a lot more precarious, because nobody knows how policy will affect the industry or the market as a whole.

They say shipping is the bellwether of the economy, and there's a lot of truth to that. I think semis are now the bellwether of growth. Sit back and watch the change in the wind.

[–] Decker108@lemmy.ml 7 points 1 week ago

Nice, looking forward to it! So much money and time wasted on pipe dreams and hype. We need to get back to some actually useful innovation.

[–] nl4real@lemmy.world 7 points 1 week ago

Fingers crossed.

[–] Somecall_metim@lemmy.dbzer0.com 6 points 1 week ago (1 children)

The tech priests of Mars were right; death to abominable intelligence.

[–] j4p@lemm.ee 6 points 1 week ago

Sigh. I hope LLMs get dropped from the AI hype bandwagon, because I do think they have some really cool use cases, and I love just running my little local models. Cutting government spending like a madman, writing the next great American novel, or eliminating actual jobs are not among those use cases.

[–] Buffalox@lemmy.world 6 points 1 week ago (5 children)

Seems to me the rationale is flawed. Even if it isn't strong or general AI, LLM-based AI has found a lot of uses. I also don't recognize the claimed ignorance, among people working with it, about the limitations of current AI models.

[–] ohwhatfollyisman@lemmy.world 10 points 1 week ago (1 children)

While you may be right, one would think the problem lies in the overestimated perception of LLMs' abilities, leading to misplaced investor confidence, which in turn leads to a bubble ready to burst.

[–] kromem@lemmy.world 5 points 1 week ago

Oh nice, another Gary Marcus "AI is hitting a wall" post.

Like his "Deep Learning Is Hitting a Wall" post on March 10th, 2022.

Indeed, not much has changed in the world of deep learning between spring 2022 and now.

No new model releases.

No leaps beyond what was expected.

\s

Gary Marcus is like a reverse Cassandra.

Consistently wrong, and yet regularly listened to, amplified, and believed.

[–] finitebanjo@lemmy.world 5 points 1 week ago

There's no bracing for this; the OpenAI CEO said the same thing like a year ago, and people are still shovelling money at this dumpster fire today.
