this post was submitted on 04 Jun 2024

[–] TheFeatureCreature@lemmy.world 0 points 4 months ago (4 children)

On the plus side, the industry is rapidly moving towards locally-run AI models specifically because they don't want to purchase and run fleets of these absurd things or any other expensive hardware.

[–] Norgur@fedia.io 0 points 4 months ago (2 children)

Yeah, I think the author misses the point in regard to power consumption. Companies will not buy loads of these and use them in addition to existing hardware. They will buy these to get rid of current hardware. It's not clear (yet) if that will increase, decrease or not affect power consumption.

[–] kayazere@feddit.nl 0 points 4 months ago (1 children)

Even where companies are replacing existing hardware, the hardware being retired uses less power than what replaces it. So whether it is additional hardware or not, there will be an increase in energy demand, which is bad for the climate.

[–] Norgur@fedia.io 0 points 4 months ago

I have personally worked on a project where we replaced several older nodes in datacenters with only one modern one. The new node used more power than two of the older nodes combined, but since we were shutting down 15-20 of them, we saved a lot of power. Not every replacement is 1:1; most aren't.
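
As a rough back-of-the-envelope sketch of that kind of consolidation (the node counts and wattages below are assumptions for illustration, not the real figures from that project):

```python
# Hypothetical consolidation: the new node draws more than any single old
# node, but it replaces many of them outright. All numbers are assumed.
old_node_watts = 400      # assumed draw of one legacy node
new_node_watts = 900      # assumed draw of the replacement (more than two old nodes combined)
old_nodes_retired = 18    # somewhere in the 15-20 range mentioned above

before = old_node_watts * old_nodes_retired   # 7,200 W
after = new_node_watts                        # 900 W
saved = before - after
print(f"Before: {before} W, after: {after} W, saved: {saved} W "
      f"({100 * saved / before:.0f}% reduction)")
```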

[–] themoonisacheese@sh.itjust.works 0 points 4 months ago

The lack of last-last-gen hardware on the used market suggests this isn't true. Even if it were available, buyers would run it, and overall energy consumption would still increase. It's not like old hardware disappears after it's replaced with newer models.

[–] FrostyCaveman@lemm.ee 0 points 4 months ago (1 children)

Every cloud has a silver lining it seems (heheh)

[–] lemmylommy@lemmy.world 0 points 4 months ago

You can still upload the results to the cloud

[–] MudMan@fedia.io 0 points 4 months ago (17 children)

The tragic irony of the kind of misinformed article this is linking to is that the server farms that would be running this stuff are fairly efficient. The water is reused and recycled, the heat is often used for other applications. Because wasting fewer resources is cheaper than wasting more resources.

But all those locally-run models on laptop CPUs and desktop GPUs? That's grid power being turned into heat and vented into a home (probably with air conditioning on).

The weird AI panic, driven by an attempt to repurpose the popular anti-crypto arguments whether they matched the new challenges or not, is going to PR this tech into wasting way more energy than it would otherwise by distributing it over billions of computer devices paid by individual users. And nobody is going to notice or care.

I do hate our media landscape sometimes.

[–] Chee_Koala@lemmy.world 0 points 4 months ago (1 children)

But efficiency is not the only consideration; privacy and self-reliance are important facets as well. Your argument about efficient computing is 100% valid, but there is a lot more to it.

[–] MudMan@fedia.io 0 points 4 months ago (1 children)

Oh, absolutely. There are plenty of good reasons to run any application locally, and a generative ML model is just another application. Some will make more sense running from server, some from client. That's not the issue.

My frustration is with the fact that a knee-jerk reaction took all the 100% valid concerns about wasteful power consumption on crypto and copy-pasted them to AI because they had so much fun dunking on cryptobros they didn't have time for nuance. So instead of solving the problem they added incentive for the tech companies owning this stuff to pass the hardware and power cost to the consumer (which they were always going to do) and minimize the perception of "costly water-chugging power-hungry server farms".

It's very dumb. The entire conversation around this has been so dumb from every angle, from the idiot techbros announcing the singularity to the straight-faced arguments that machine learning models are copy-pasting things they find on the Internet to the hyperbolic figures on potential energy and water cost. Every single valid concern or morsel of opportunity has been blown way out of reasonable proportion.

It's a model of how our entire way of interacting with each other and with the world has changed online and I hate it with my entire self.

[–] iarigby@lemmy.world 0 points 4 months ago (1 children)

Thanks for the perspective. I despise the way the generative models destroy income for entry-level artists, the unhealthy amount they are used to avoid learning and homework in schools, and how none of the productivity gains will be shared with the working class. So my view around it is incredibly biased, and when I hear any argument that puts AI in a bad light I accept it without enough critical thinking.

[–] Norgur@fedia.io 0 points 4 months ago (1 children)

From what I've learned over the years: AI isn't likely to destroy income for entry-level artists. It destroys the quagmires those artists got stuck in. The artists this will replace first and foremost are those creating elevator music, unassuming PowerPoint presentation backgrounds, stock photos of coffee mugs. All those things where you really don't need anything specific and don't really want to think about anything.

Now look how much is being paid for those artworks by the customers on Shutterstock and the like. Almost nothing. Now imagine what Shutterstock pays their artists. Fuck all is what. Artists might get a shred of credit here and there, a few pennies, and that's that. The market AI is “disrupting” as they say, is a self-exploitative freelancing hellhole. Most of those artists cannot live off their work, and to be frank: Their work isn't worth enough to most people to pay them the money they'd need to live.

Yet while they chase the carrot dangling in front of them, dreaming of fame and of collecting enough notoriety through that work to one day do their real art instead of interchangeable throwaway stuff made to fit any situation at once, corporations continue to bleed them dry, not allowing them any progress whatsoever. Or do you know who made the last image of a coffee mug you saw in some advert?

The artists who manage to make a living (digital and analog) are those who manage to cultivate a following. Be that through Patreon, art exhibitions, whatever. Those artists will continue to make a living because people want them to do exactly what they do, not an imitation of it. They will continue to get commissioned because people want their specific style and ideas.

So in reality, it doesn't really destroy artists; it replaces one corpo-hellhole (freelance artist) with another (freelance AI trainer/prompter/etc.).

[–] iarigby@lemmy.world 0 points 4 months ago (1 children)

I will keep that perspective in mind, thank you. I am very held back by my own resistance and pushback against AI developments, and it is very hard to warm up to something being shoved at us by these huge malicious corporations and not be worried about how they will use it against us.

It sounds like one of the most impressive things in recent history and something that would fill me with joy and excitement but we’re in such a hostile environment that I am missing out on all that. I haven’t even managed to get myself to warm up to at least trying one out.

[–] Norgur@fedia.io 0 points 4 months ago (1 children)

It's really not that exciting. Quite the opposite. The rush for AI in everything is absolutely bonkers, since those LLMs are just stupid as fuck and not suited for any sort of productive performance they get hyped up to achieve.

[–] iarigby@lemmy.world 0 points 4 months ago (1 children)

Ah, so you were only annoyed that people are against doing the stupid computations in the datacenter, leaving us with the less efficient grid-powered local version instead?

[–] Norgur@fedia.io 0 points 4 months ago

I'm annoyed that we're going crazy because computers manage to spew out bullshit that vaguely sounds like the bullshit humans spew out, yet is somehow even less intelligent. At the same time, people think this empty yapping is more accurate and totally a miracle, while all it really shows is that computers are good at patterns, and that language and information follow patterns. Go figure.

I'm annoyed that Silicon Valley tech evangelists get away with breaking every law they fucking want, once again in the creation of those tools.

Yet I'm neither worried about the ecological impact nor about the impact on the workforce. Yes, jobs will shift, but that has been clear as day since I was a kid. I don't even necessarily think “AI” will be the huge game changer it's made out to be.

When they run out of training data (which is fueled by slave labor, because of fucking course it is) or AIs start ingesting too many AI-generated texts, the models we have today just collapse, disintegrating into a blabbering mess.

[–] rottingleaf@lemmy.zip 0 points 4 months ago (2 children)

The weird AI panic, driven by an attempt to repurpose the popular anti-crypto arguments whether they matched the new challenges or not, is going to PR this tech into wasting way more energy than it would otherwise by distributing it over billions of computer devices paid by individual users. And nobody is going to notice or care.

I think the idea was that these things are a bad idea, locally or otherwise, if you don't control them.

[–] MudMan@fedia.io 0 points 4 months ago

No, it wasn't. Here's how I know: all the valid concerns that came up about how additional regulation would disproportionately stifle open source alternatives were immediately ignored by the vocal online critics (and by the corporate techbros overhyping sci-fi apocalypses). And then, when open alternatives appeared anyway, nobody on the critical side considered them appropriate or even a lesser evil. The narrative didn't move one bit.

Because it wasn't about open versus closed; it was a tribal fight, like all the tribal fights we keep having, stoked by greed on one end and viral outrage on the other. It's excruciating to watch.

[–] RedWeasel@lemmy.world 0 points 4 months ago

I wouldn't say bad, but generative AI and LLMs are definitely underbaked, and shoving everything under the sun into them is going to create garbage in, garbage out. And using them for customer support, where they will inevitably either offer bad advice or open you up to lawsuits, seems shortsighted to say the least.

They were calling the rest of it machine learning (ML) a couple of years ago. There are valid uses for ML, though. Image/video upscaling and image search are a couple of examples.

[–] Melvin_Ferd@lemmy.world 0 points 4 months ago (1 children)

Modern media scares me more than AI

[–] MudMan@fedia.io 0 points 4 months ago (1 children)

Honestly, a lot of the effects people attribute to "AI" as understood by this polemic are ongoing and got ignited by algorithmic searches first and then supercharged by social media. If anything, there are some ways in which the moral AI panic is actually triggering regulation that should have existed for ages.

[–] Melvin_Ferd@lemmy.world 0 points 4 months ago (1 children)

Regulation is only going to prevent regular people from benefiting from AI while keeping it as a tool for the upper crust to continue to benefit. Artists are a Trojan horse on this.

[–] MudMan@fedia.io 0 points 4 months ago (1 children)

We're thinking about different "regulation", and that's another place where extreme opinions have nuked the ground into glass.

Absolutely yeah, techbros are playing up the risks because they hope regulators looking for a cheap win will suddenly increase the cost for competitors, lock out open alternatives and grandfather them in as the only valid stewards of this supposedly apocalyptic technology. We probably shouldn't allow that.

But "maybe don't make an app that makes porn out of social media pictures of your underage ex girlfriend at the touch of a button" is probably reasonable, AI or no AI.

Software use needs some regulation, like everything else does. That doesn't mean we need to sell the regulation to disingenuous corporations.

[–] Melvin_Ferd@lemmy.world 0 points 4 months ago* (last edited 4 months ago) (1 children)

We already have laws that protect people when porn is made of them without consent. AI should be a tool that's as free and open as possible to use and build upon. Regulation is only going to turn it into a tool for the haves and restrict the have-nots. Of course you're going to see justifiable reasons, just like protecting children made sense during the satanic panics. Abuse does happen in daycares across the country. Satanists do exist. Pen pineapple apple pen.

It's not like you control these things by making arguments that make no sense. They're structured to ensure you agree with them, especially during the early-phase rollout; otherwise it would just become something that, again, never pans out the way we fear. The media is there to generate the fear and the arguments to convince us to hobble ourselves.

[–] MudMan@fedia.io 0 points 4 months ago (1 children)

No, that's not true at all. That's the exact same argument that the fearmongers are using to claim that traditional copyright already covers the use cases of media as AI training materials and so training is copying.

It's not. We have to acknowledge that there is some novel element to these, and novel elements may require novel frameworks. I think IP and copyright are broken anyway, but if the thing that makes us rethink them is the idea that machines can learn from a piece of media without storing a copy and spit out a similar output... well, we may need to look at that.

And if there is a significant change in how easily accessible, realistic or widespread certain abusive practices are we may need some adjustments there.

But that's not the same as saying that AI is going to get us to Terminator within five years and so Sam Altman is the only savior that can keep the grail of knowledge away from bad actors. Regulation should be effective and enable people to compete in the novel areas where there is opportunity.

Both of those things can be true at the same time. I promise you don't need to take the maximalist approach. You don't even need to take sides at all. That's the frustrating part of this whole thing.

[–] Melvin_Ferd@lemmy.world 0 points 4 months ago* (last edited 4 months ago) (2 children)

I think we should stop applying broken and primitive regulations and laws created before any of this technology and ability was ever even dreamed of. Sorry to say, but I don't want to protect the lowly artist over the ability for people to collaborate and advance our knowledge and understanding. I want to see copyright, IP and other such laws removed entirely.

We should have moved more towards the open sharing of all information. We have unnecessarily recreated all the problems of the predigital age and made them worse.

If it were up to me, I would abolish copyright and IP laws. I would make every corner of the internet a place for sharing data and information, and anyone putting their work online would need to accept that it will be recreated, shared and improved upon. We all should have moved in a different direction than the one we're in now.

[–] BrianTheeBiscuiteer@lemmy.world 0 points 4 months ago (1 children)

Efficiency at the consumer level is poor, but industry uses more total energy than consumers.

[–] MudMan@fedia.io 0 points 4 months ago

Yeeeeah, you're gonna have to break down that math for me.

Because if an output takes some amount of processing to generate, and your energy cost per unit of compute is higher, then we're either missing something in that equation or we're breaking the laws of thermodynamics.

If the argument is that the industry uses more total energy because it keeps the training in-house or because it does more of the compute at this point in time, that doesn't change things much, does it? The more of those tasks get offloaded to the end user, the more the balance will shift for generating outputs. As for training, that's a fixed cost. Technically, the more you use a model, the more that cost spreads out per query, and it's not like distributing the training load itself among user-level hardware would make its energy cost go down.
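
To make the amortization point concrete, here's a toy calculation (every number is a made-up assumption for illustration; real training and per-query costs vary wildly by model and hardware):

```python
# Toy amortization of a fixed training cost over the queries a model serves.
# All figures are assumptions for illustration only.
training_energy_kwh = 1_000_000    # assumed one-off cost to train the model
energy_per_query_kwh = 0.001       # assumed marginal cost to serve one query

for queries in (1_000_000, 100_000_000, 10_000_000_000):
    total_kwh = training_energy_kwh + queries * energy_per_query_kwh
    print(f"{queries:>14,} queries -> {total_kwh / queries * 1000:,.1f} Wh per query")
```

The fixed training cost dominates at first, then fades toward the marginal per-query cost as usage grows, which is the "spreads out per query" point above.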

The reality of it is that the entire thing was more than a bit demagogic. People are mad at the energy cost of chatbot search and image generation, but not at the same cost of image generation for videogame upscaling or frame interpolation, even if they're using the same methods and hardware. Like I said earlier, it's all carryover from the crypto outrage more than it is anything else.

[–] Dariusmiles2123@sh.itjust.works 0 points 4 months ago (1 children)

The article is really interesting and all your comments too.

For now I have a negative bias towards AI as I only see its downsides, but I can see that not everyone thinks like me and it’s great to share knowledge and understanding.

[–] mPony@lemmy.world 0 points 4 months ago (4 children)

This article is one of the most down-to-earth, realistic observations on technology I've ever read. Utterly striking as well.

Go Read This Article.

[–] TheBest@midwest.social 0 points 4 months ago (2 children)

Agreed. Stop scrolling the comments and go read it, random reader.

I used to get so excited by tech advances, but now I've gotten to the point where it's still cool and a fascinating application of science... but this stuff is legitimately existential. The author raises great points around it.

[–] Dkarma@lemmy.world 0 points 4 months ago (5 children)

This article is a regurgitation of every tech article since the microchip. There is literally nothing new here. Tech makes labor obsolete. Tech never considers the ramifications of tech.

These things have been known since the beginning of tech.

[–] akwd169@sh.itjust.works 0 points 4 months ago (1 children)

What about the climate impact? You didn't even address that. That's the worst part of the AI boom: we're already way in the red on climate change, and this is going to accelerate the problem rather than slow or stop it (let alone reverse it).

[–] Not_mikey@slrpnk.net 0 points 4 months ago (3 children)

That's a very solvable problem, though. AI can easily be run off green energy, and a lot of the new data centers being built are utilizing it; tons are popping up in Seattle with its abundance of hydro energy. Compare that to meat production or combustion-based transportation, which face a much harder transition, and this seems way less of an existential problem than the author makes it out to be.

Also, most of the energy needed is for training, which can be done at any time, so it can be run during off-peak hours. It can also absorb the surpluses from solar energy in the middle of the day that can otherwise put strain on the grid.

This is all assuming it's done right, which it may not be, and that could deepen the ditch we're already in, but the technology itself isn't inherently bad.

[–] dustyData@lemmy.world 0 points 4 months ago* (last edited 4 months ago)

AI can easily be run off green energy

This is all assuming it’s done right

That right there is the problem. I don't trust any tech CEO to do the right thing ever, because historically they haven't. For every single technological advancement since the industrial revolution brought forth by the corporate class, masses of people have had to beat them up and shed blood to get them to stop being assholes for a beat and abuse and murder people a little less.

[–] groet@infosec.pub 0 points 4 months ago

It doesn't matter if AI is run on green energy as long as other things are still running on fossil fuels. There is a limit to how fast renewable energy sources can be built, and if the power consumption of AI eats up all of that growth, then the amount of fossil energy used doesn't change.

No increase in energy consumption is green, because it forces something else to run on fossil energy for longer.
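
As a toy illustration of that accounting (all numbers are made up; the point is only that new demand growing as fast as renewable supply leaves the fossil share untouched):

```python
# Toy grid model: if new AI demand grows exactly as fast as renewable supply,
# fossil generation never shrinks. Every figure is an illustrative assumption.
fossil = 100                    # fossil generation today, arbitrary units
renewable = 50                  # renewable generation today
demand = fossil + renewable     # total demand today

renewable_growth_per_year = 10
new_ai_demand_per_year = 10     # assumed to match renewable growth exactly

for year in range(1, 4):
    renewable += renewable_growth_per_year
    demand += new_ai_demand_per_year
    fossil = demand - renewable
    print(f"Year {year}: fossil generation still at {fossil} units")
```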

[–] demonsword@lemmy.world 0 points 4 months ago

I think the worst part of Huang's keynote wasn't that none of this mattered, it's that I don't think anyone in Huang's position is really thinking about any of this at all. I hope they're not, which at least means it's possible they can be convinced to change course. The alternative is that they do not care, which is a far darker problem for the world.

well yeah... they just don't care, after all the climate crisis is somebody else's problem... and what really matters is that the line goes up next quarter, mankind's future be damned

[–] Mrkawfee@lemmy.world 0 points 4 months ago (1 children)
[–] JDPoZ@lemmy.world 0 points 4 months ago

But without even just the cool space station to just stare at longingly…

[–] treadful@lemmy.zip 0 points 4 months ago (10 children)

All these issues are valid and need solving but I'm kind of tired of people implying we shouldn't do certain work because of efficiency.

And tech gets all the scrutiny for some reason (its transparency?). I can't recall the last time I saw an article on industrial machine efficiency and how we should just stop producing whatever.

What we really need to do is find ways to improve the efficiency of all work while moving towards carbon neutrality. All work is valid.

If I want to compute pi for no reason or drive to the Grand Canyon for lunch, I should be able to do so.

[–] Esqplorer@lemmy.zip 0 points 4 months ago

Anyone with experience in corporate operations will tell you the ROI on process changes is dramatically higher than on technology. People invent so many stupid and dangerous ways to "improve" their work area. The worst part is that it just takes a little orchestration to understand their needs and use that creativity to everyone's benefit.

[–] rimu@piefed.social 0 points 4 months ago (3 children)

Efficiency??

This is about the total amount of emissions, not the emissions-per-unit-of-compute (or whatever).

[–] Telodzrum@lemmy.world 0 points 4 months ago (1 children)

lol at tech’s transparency. You have an availability heuristic issue with your thought process. Every other industry has similar critiques. Your media diet is leading you to false conclusions.

[–] treadful@lemmy.zip 0 points 4 months ago

We're literally in a technology community followed by tons of industry outsiders, and there's a similar one on every other aggregation site. I don't see any of that for things like plastics manufacturers, furniture makers, or miners. So yeah, I'd say transparency for the general public tends to be higher in tech than in most other industries.

[–] drawerair@lemmy.world 0 points 4 months ago* (last edited 4 months ago)

I like that the writer thought about climate change. I think it's been one of the biggest global issues for a long time. I hope there'll be increasing use of sustainable energy in the coming years, not just for data centers but for the whole tech world.

I think a digital waiter doesn't need a rendered human face. We have food-ordering kiosks. Those aren't AI. I think those suffice. A self-checkout grocery kiosk doesn't need a face either.

I think "client help" is an area where AI can at least assist. Imagine a firm that's been operating for decades and has encountered so many kinds of client complaints. It can feed all that data to a large language model. With that model responding to most of the client complaints, the firm can reduce the number of its client support people. The model will pass on complaints that are very complex, or that it doesn't know how to address, to the client support people.
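
A minimal sketch of that kind of triage flow might look like the following (the model call here is a hypothetical stand-in, not any particular vendor's API, and the confidence threshold is an assumption):

```python
# Hypothetical support-triage flow: a model drafts replies to routine
# complaints; anything it isn't confident about goes to a human agent.
ESCALATION_THRESHOLD = 0.8  # assumed confidence cutoff

def llm_answer(text: str) -> tuple[str, float]:
    """Stand-in for a real model call; returns (draft reply, confidence)."""
    if "refund" in text.lower():
        return "Your refund has been initiated and should arrive within 5 days.", 0.95
    return "Sorry, I'm not sure how to help with that.", 0.30

def handle_complaint(text: str) -> str:
    draft, confidence = llm_answer(text)
    if confidence < ESCALATION_THRESHOLD:
        return "Escalated to a human support agent."   # complex or unknown case
    return draft

print(handle_complaint("Where is my refund?"))
print(handle_complaint("My order arrived broken and on fire."))
```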

Idk whether the government or the public should stop AI from taking human jobs or let it happen. I'm torn. Optimistically, workers can find new jobs. But we should imagine that at least one human will be laid off and won't be able to find a new job. He'll be jobless for months. He'll have an epic headache as he can't pay next month's bills.

[–] Teppichbrand@feddit.de 0 points 4 months ago (10 children)

Innovation is a scam; it breeds endless bullshit we keep buying and talking about like 10-year-olds with the latest gimmick.
Look, we replaced this button with A TOUCHSCREEN!
Look! This artificial face has PORES NOW!
LOOK! This coffee machine costs $2,000 now and uses PROPRIETARY SUPER-EXPENSIVE CAPSULES!!
We need progress, which is harder to achieve because it takes a paradigm shift on an individual and social level. It's much less gadgety.

[–] electric_nan@lemmy.ml 0 points 4 months ago (2 children)

Boiling the oceans for deepfake porn, scamcoins and worse web search.

[–] sudo42@lemmy.world 0 points 4 months ago (5 children)

So if each GPU takes 1,800W, isn’t that the equivalent of what a handheld hair dryer consumes?
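
Peak draw is indeed in hair-dryer territory; the difference is the duty cycle. A rough comparison (the usage times below are assumptions):

```python
# A 1,800 W GPU and a 1,800 W hair dryer draw similar peak power, but a
# datacenter GPU can run around the clock. Usage times are assumptions.
gpu_watts = 1_800
dryer_watts = 1_800

gpu_hours_per_day = 24            # assumed near-continuous utilization
dryer_hours_per_day = 10 / 60     # assumed ten minutes of use per day

gpu_kwh_per_day = gpu_watts * gpu_hours_per_day / 1_000        # 43.2 kWh
dryer_kwh_per_day = dryer_watts * dryer_hours_per_day / 1_000  # 0.3 kWh
print(f"GPU: {gpu_kwh_per_day:.1f} kWh/day, hair dryer: {dryer_kwh_per_day:.1f} kWh/day "
      f"(~{gpu_kwh_per_day / dryer_kwh_per_day:.0f}x)")
```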
