this post was submitted on 19 Aug 2024
Technology

Popular iPad design app Procreate is coming out against generative AI, and has vowed never to introduce generative AI features into its products. The company said on its website that although machine learning is a “compelling technology with a lot of merit,” the current path that generative AI is on is wrong for its platform. 

Procreate goes on to say that it’s not chasing a technology that is a threat to human creativity, even though this may make the company “seem at risk of being left behind.”

Procreate CEO James Cuda released an even stronger statement against the technology in a video posted to X on Monday.

top 50 comments
[–] Imgonnatrythis@sh.itjust.works 0 points 3 months ago

So definitely gonna have AI baked in by next year.

[–] eager_eagle@lemmy.world 0 points 3 months ago (2 children)
[–] brucethemoose@lemmy.world 0 points 3 months ago

The more you buy, the more you save!

[–] Plopp@lemmy.world 0 points 3 months ago (1 children)

I don't trust them. They better fire him and hire a Jim Abacus.

[–] Gork@lemm.ee 0 points 3 months ago (1 children)

The CEO should ideally have the exact same name as the company. Like Tim Apple.

Or Sam Sung.

[–] MossyFeathers@pawb.social 0 points 3 months ago (1 children)

Doug Bowser of Nintendo springs to mind. Also Gary Bowser, the guy they used the US courts to make an example of.

[–] Jerkface@lemmy.world 0 points 3 months ago

Is that what he was saying in Mario 64? "So long, Gary Bowser!"

[–] li10@feddit.uk 0 points 3 months ago (3 children)

Ironically, I think AI may prove to be most useful in video games.

Not to outright replace writers, but to let them focus on feeding backstory to an AI so it essentially becomes the characters they've created.

I just think it’s going to be inevitable and the only possible option for a game where the player truly chooses the story.

I just can’t be interested in multiple choice games where you know that your choice doesn’t matter. If a character dies from option a, then option b, c, and d kill them as well.

Realising that as a kid instantly ruined Telltale games for me, but I think AI used in the right way could solve that problem, at least to some degree.

Something like using an LLM to make actually unique side quests in a Skyrim-esque game could be interesting.

The side quest/bounty quest shit in something like Starfield was fucking awful because it was like, 5 of the same damn things. Something capable of making at least unique sounding quests would be a shockingly good use of the tech.
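To make that concrete, here's a toy sketch of how a template-based "radiant" quest system works (the templates and slot values below are invented for illustration, not taken from any actual game):

```python
import itertools

# A toy radiant-quest generator: a few fixed templates with fill-in slots.
TEMPLATES = [
    "Clear the {place} of {enemy}",
    "Retrieve the {item} from the {place}",
    "Deliver the {item} to {npc}",
]
SLOTS = {
    "place": ["abandoned mine", "derelict outpost"],
    "enemy": ["pirates", "spacers"],
    "item": ["artifact", "data slate"],
    "npc": ["the quartermaster"],
}

def all_quests():
    """Every quest the system can ever produce: templates x slot values."""
    quests = []
    for template in TEMPLATES:
        # Slot names actually used by this template, in SLOTS order.
        names = [n for n in SLOTS if "{" + n + "}" in template]
        for combo in itertools.product(*(SLOTS[n] for n in names)):
            quests.append(template.format(**dict(zip(names, combo))))
    return quests
```

This system can only ever produce ten distinct quests before everything repeats, which is the "5 of the same damn things" feeling. An LLM conditioned on world lore could, in principle, vary both the structure and the wording instead of just swapping nouns.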

[–] MossyFeathers@pawb.social 0 points 3 months ago (1 children)

Honestly, I think that...

  • AI is going to revolutionize the game industry.

  • AI is going to kill the game industry as it currently exists.

  • Generative AI will lead to a lot of real-time effects and mechanics that are currently impossible, like endless quests that don't feel hollow, realistic procedural generation that can convincingly create everything from random clutter to entire galaxies, true photorealistic graphics (look up gaussian splatting, it's pretty cool), convincing real-time art filters (imagine a 3d game that looks like an animated Van Gogh painting), and so on.

  • Generative AI is going to result in a hell of a lot of layoffs and will likely ruin people's lives.

  • Generative AI will eventually open the door to small groups of devs being able to compete with AAA releases on all metrics.

  • Generative AI will make studios with thousands of employees obsolete. This is a double-edged sword. Fewer employees means fewer ideas; but on the other hand, you get a more accurate vision of what the director originally intended. Fewer employees will also mean that you will likely have to be a genuinely creative person to get ahead, instead of someone who knows how to use Maya or Photoshop but is otherwise creatively bankrupt. Your contribution matters far more in a studio of <50 than it does in a studio of >5,000; as such, your creative skill will matter more.

  • A lot of people will have to be retrained because they will no longer be creative enough to make a living off of making games.

Tbh, I think game development is one of the few places that generative AI will actually have a significant benefit; however I also think it will completely scramble the industry once it starts being widely adopted, and it'll be a long time before the dust settles.

[–] mke@lemmy.world 0 points 3 months ago (1 children)

I've no idea where you're getting these predictions from. I think some of them are fundamentally flawed if not outright incorrect, and don't reflect real life trends of generative AI development and applications.

Gonna finish this comment in a few, please wait.

[–] MossyFeathers@pawb.social 0 points 3 months ago (1 children)

I think the big difference is that you seem to think that AI has peaked or is near its peak potential, while I think AI is still just getting started. Will generative AI ever progress beyond being a gimmick? I don't know, but I suspect it will eventually.

For example, indies do not have the budget to license expensive actors (e.g. Call of Duty, Cyberpunk 2077), brands (e.g. racing games), and so on. GenAI will not change this. Hell, GenAI will certainly not pay for global advertising.

Admittedly I had not thought about the licensing and advertising aspect. That's a bit of a blind spot for me because it's not something I tend to care about. You're correct there.

If hiring them is no longer advantageous due to financial incentives to adopt AI, that's not their fault for being insufficiently creative.

I mean, maybe I could have phrased it better, but what else are you gonna do? They have to make a living somehow and if they can't get hired in the game industry anymore, you gotta help them find somewhere else they can work.

[–] mke@lemmy.world 0 points 3 months ago* (last edited 3 months ago) (1 children)

I think the big difference is that you seem to think that AI has peaked or is near its peak potential, while I think AI is still just getting started.

That's a fair assessment. I'm still not sure if popular AI tech is on an exponential or a sigmoid curve, but I tend towards the latter. Note, however, that the industry at large is starting to believe it's just not worth it. Even worse, the entities at the forefront of AI are unsustainable—they're burning brightly right now, but the cash flow required to keep a reaction on this scale going is simply too large. If you've got time and are willing, please check the linked article by Ed (burst damage).

I mean, maybe I could have phrased it better, but what else are you gonna do?

My bad, I try to trim down the fat while editing, but I accidentally removed things I shouldn't have. As I said, it's a nitpick, and I understand the importance of helping those who find themselves unhirable. Maybe it's just me, but I thought it came across a little mean, even if that wasn't your intent. I try to gently "poke" folks when I see stuff like this because artists get enough undeserved crap already.

[–] brucethemoose@lemmy.world 0 points 3 months ago* (last edited 3 months ago)

Yeah, ultimately a lot of devs are trying to make "story generators" that rely on the user's imagination to fill in the blanks, hence RimWorld is so popular.

There's a business/technical model where "local" LLMs would kinda work for this too, if you set it up like the Kobold Horde: the dev hosts a few GPU instances for players whose machines can't handle the local LLM, while users with beefy PCs also generate responses for other users (optionally, at low priority) in a self-hosted horde.

[–] RobotToaster@mander.xyz 0 points 3 months ago* (last edited 3 months ago) (2 children)

Didn't Krita say the same thing at one time?

It's currently one of the best programs to generate AI art using self hosted models.

[–] SynopsisTantilize@lemm.ee 0 points 3 months ago (1 children)
[–] greybeard@lemmy.one 0 points 3 months ago

Generate images with self hosted models, or integrate it with art programs? Because yes to both.

[–] MossyFeathers@pawb.social 0 points 3 months ago

I think that's kinda comparing apples to oranges. Krita is FOSS, and FOSS developers can be just as affected by community pressure as proprietary developers; possibly moreso. I dunno the circumstances around Krita's decision to walk back and include AI, but I speculate it may have come from community pressure. Procreate isn't FOSS so the community has a much harder time forcing their hand (the community can't exactly fork the code and push everyone to migrate to a pro-AI version of Procreate). The other side of this, however, is that as proprietary developers, they feel more pressure from money.

My prediction is that they'll stick to this as long as it's profitable. If they break away from it then it's either because the CEO was replaced with a more profit-hungry CEO, they're no longer profitable and they believe adding AI would fix that, or they believe they've found a use for AI that wouldn't sacrifice creativity.

[–] MossyFeathers@pawb.social 0 points 3 months ago (2 children)

Wow, I'm actually kinda impressed. I'm not sure I'm 100% behind their stance, but it's better than companies that blindly chase profits.

Tbh I think generative AI can be used creatively and artistically, but knowing how to use generative AI doesn't automatically make you creative or artistic. It's like making someone paint a picture for you. Just making someone paint a picture for you doesn't make you an artist, but an artist could say something by making someone paint for them. To put it another way, the AI element has to be more than just a means to an end; it has to justify itself somehow.

"But normal artists don't have to justify themselves!"

You're right! That's because it's assumed that the amount of time, effort and practice required to create art "manually" leads the artist to think deeply about their artwork before and during its creation; and 99% of the time, that's completely true (the other 1% is "eye candy" like Kinkade, which is what AI is 99% of the time). Most people don't understand this because they've never truly attempted to make "art"; artists, however, obsess over the details. You think that red truck in the bottom corner was "just there"? No, the artist probably put it there for a reason. Hell, there's likely a reason the truck is red. Maybe the artist wanted to say something about red trucks, or maybe the truck just looked better in red. Either way, that was a decision the artist had to make.

That said, AI can do some really cool stuff that would take humans years to reproduce, or would be extremely tedious and mind-numbing. A good example I recently came across is using AI to split music into stems or even into individual instruments. This makes it a lot easier for DJs, musicians and producers to get clean samples. It also makes it significantly easier for people to make custom tracks for Fuser (that's how I found out about it).

I guess what I'm trying to say is that I don't think they should write off AI entirely, but instead try to think of areas where AI would help artists. Maybe you use it to let people rescale their artwork without having to redraw blurry lines. Maybe it's AI designed to separate photographs into individual pieces for collages. Maybe it's an AI designed to interpolate animation frames better than human-written algorithms. AI can do a lot of stuff other than just making eye candy.

That said, I think rejecting generative AI entirely is better than blindly chasing the money, so good on you.

[–] pycorax@lemmy.world 0 points 3 months ago

They specifically called out generative AI though. Stuff like separating photographs to individual pieces doesn't require generative AI specifically. Machine learning models that fall into the general umbrella of AI already exist for object segmentation.

[–] FaceDeer@fedia.io 0 points 3 months ago (1 children)

They're chasing profit too, though. "Taking a stand" means they're advertising, trying to differentiate themselves from their competitors and draw in people who hold anti-AI views.

That will last until that segment of users becomes too small to be worth trying to base their business on.

[–] mke@lemmy.world 0 points 3 months ago* (last edited 3 months ago)

Well, sounds great. I almost wish more companies would advertise to that market, really.

It's like... I know you're lying, and I know you probably don't actually care, but some of your competitors couldn't even be bothered to do this much. Those companies thought shitting on things I care about to maximize profits was the better strategy. I'll take that into consideration in my future decisions.

And if the situation changes, if they turn around and go full in on generative AI, we'll just have to consider that too. That's life.

Of course, I believe using alternatives that are more resistant to these kinds of market trends (community built software, perhaps?) would be ideal, but it's not always an option.

[–] chiisana@lemmy.chiisana.net 0 points 3 months ago

Never, eh? Well, someone won't exist under the same name/promise in a decade or two.

[–] net00@lemm.ee 0 points 3 months ago (6 children)

Built on a foundation of theft

Sums up all AI

[–] douglasg14b@lemmy.world 0 points 3 months ago

No, it sums up a very specific type of AI...

Blanket statements are dumb.

[–] BumpingFuglies@lemmy.zip 0 points 3 months ago (2 children)

Can you explain how you came to that conclusion?

The way I understand it, generative AI training is more like a single person analyzing art at impossibly fast speeds, then using said art as inspiration to create new art at impossibly fast speeds.

[–] Clasm@ttrpg.network 0 points 3 months ago (2 children)

The art isn't so much being made as being copied and pasted in a way that might convince you it's new.

Since the AI cannot create a new style or genre on its own, without source material that already exists to train it, and that source material is often scraped up off of databases, often against the will and intent of the original creators, it is seen as theft.

Especially if the artists were in no way compensated.

[–] paw@feddit.org 0 points 3 months ago (1 children)

To add to your excellent comment:

It does not ask if it can copy the art nor does it attribute its generated art with: "this art was inspired by ..."

I can understand why creators are unhappy with this situation.

[–] ReCursing@lemmings.world 0 points 3 months ago

Do you go into a gallery and scream "THIS ART WAS INSPIRED BY PICASSO. WHY DOESN'T IT SAY THAT! THIS IS THEFT!"? No, I suspect you don't, because that would be stupid. That's what you sound like here.

[–] FatCrab@lemmy.one 0 points 3 months ago (4 children)

This is absolutely wrong about how something like SD generates outputs. Relationships between atomic parts of an image are encoded into the model from across all training inputs. There is no copying and pasting. Now whether you think extracting these relationships from images you can otherwise access constitutes some sort of theft is one thing, but characterizing generative models as copying and pasting scraped image pieces is just utterly incorrect.
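A loose analogy for "relationships encoded, not copies stored" (using linear regression, which is obviously not a diffusion model, just the smallest possible example of the same principle): fitting distills a pattern shared across all training inputs into a few parameters, and no individual training example survives inside the model.

```python
def fit_line(points):
    """Ordinary least squares for y = a*x + b over (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# 1,000 "training examples" collapse into just two learned numbers.
data = [(x, 2 * x + 1) for x in range(1000)]
a, b = fit_line(data)   # a ~ 2.0, b ~ 1.0
```

The two learned parameters reproduce the underlying pattern, but none of the 1,000 points is stored in them. Whether extracting such relationships from copyrighted inputs without permission is acceptable is the actual open question.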

[–] gap_betweenus@lemmy.world 0 points 3 months ago (1 children)

By this logic, photography is just painting done at an impossibly high speed - but for some reason we make a distinction between what humans make and what machines make.

[–] ReCursing@lemmings.world 0 points 3 months ago (3 children)

Amusingly, every argument against ai art was made against photography over a hundred years ago, and I bet you own a camera - possibly even on the device you wrote your stupid comment on!

[–] Mirodir@discuss.tchncs.de 0 points 3 months ago (1 children)

Does it? I worked on training a classifier and a generative model on freely available galaxy images taken by Hubble and labelled in a citizen science approach. Where's the theft?

[–] unsignedbit@lemm.ee 0 points 3 months ago (1 children)

Hard to say. Model training is cumulative, and training a model from scratch is costly, so models often build on earlier ones. Your input may not infringe copyright, but the inputs used before or after it may have.

[–] Mirodir@discuss.tchncs.de 0 points 3 months ago

I trained the generative models all from scratch. Pretrained models are not that helpful when it's important to accurately capture very domain specific features.

One of the classifiers I tried was based on zoobot with a custom head. Assuming the publications around zoobot are truthful, it was trained exclusively on similar data from a multitude of different sky surveys.
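"From scratch" here just means no pretrained weights are loaded anywhere; the model starts from random initialization and learns only from the domain data. A toy stand-in (synthetic two-feature data, nothing like actual galaxy images or the zoobot architecture):

```python
import math
import random

def train_from_scratch(data, lr=0.1, epochs=200):
    """Logistic regression via SGD, starting from random weights --
    no pretrained model is involved at any point."""
    random.seed(0)
    w = [random.uniform(-0.1, 0.1) for _ in range(len(data[0][0]))]
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1 / (1 + math.exp(-z))       # sigmoid
            g = p - y                        # gradient of log loss w.r.t. z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def accuracy(w, b, data):
    correct = 0
    for x, y in data:
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        correct += int((z > 0) == (y == 1))
    return correct / len(data)

# Synthetic "morphology" data: class 1 when the two features sum above 1.
random.seed(1)
data = []
for _ in range(200):
    x = [random.random(), random.random()]
    data.append((x, 1 if x[0] + x[1] > 1 else 0))
w, b = train_from_scratch(data)
```

The trade-off is exactly the one described: from-scratch training captures domain-specific features faithfully, but it needs enough labelled data and compute to get there on its own.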

[–] ribhu@lemmy.world 0 points 3 months ago

That's a blanket statement. While I understand the sentiment, what about the thousands of "AIs" trained on private, proprietary data for personal or private use by organizations that own the said data. It's the not the technology but the lack of regulation and misaligned incentives.

[–] PrinzKasper@feddit.org 0 points 3 months ago (1 children)

I assume you mean all generative AI? Because I don't think AI that autonomously learns to play Super Mario is theft https://youtu.be/qv6UVOQ0F44

[–] ReCursing@lemmings.world 0 points 3 months ago

Nope. Stop with the luddite lies please

[–] net00@lemm.ee 0 points 3 months ago (1 children)

Built on a foundation of theft

Sums up all AI

[–] TheGrandNagus@lemmy.world 0 points 3 months ago

That's a very reactionary take, IMO.

There's plenty of AI out there that's not built on theft. You can train them solely on your own data if you want them to. There's open source models out there trained only on data they were expressly given consent to use.

You can get machine learning algorithms to learn how to play basic games completely on their own, etc.

[–] demesisx@infosec.pub 0 points 3 months ago (2 children)

Procreate is amazing. I bought it for my neurodivergent daughter and used it as a non-destructive coloring book.

I'd grab a line drawing of a character that she wanted to color from a Google image search, add it to the background layer, lock the background so she can't accidentally move or erase it, then have her color on the layer above using multiply blend mode so the black lines can't be painted over. She got to the point where she prefers to have the colorized version alongside the black and white so she can grab the colors from the original and do fun stuff like mimic its shading and copy-paste in elements that might have been too difficult for her to render. Honestly, she barely speaks, but in that program she's better than most adults already, even at age 8. Her work looks utterly perfect and she knows a lot of advanced blending and cloning techniques that traditional media artists don't usually know.

[–] Allero@lemmy.today 0 points 3 months ago* (last edited 3 months ago) (1 children)

That's heartwarming. Good luck to her! (and you)

You're a great techno-parent

[–] mindlight@lemm.ee 0 points 3 months ago

While an honorable move, "never" doesn't exist in a world based on quarterly financials...

[–] daniskarma@lemmy.dbzer0.com 0 points 3 months ago (4 children)

As with everything the problem is not AI technology the problem is capitalism.

End capitalism and suddenly being able to share and openly use everyone's work for free becomes a beautiful thing.

[–] exocortex@discuss.tchncs.de 0 points 3 months ago* (last edited 3 months ago)

I agree, but as long as we still have capitalism, I support measures that at least slow down its destructiveness. AI is like a new power tool in capitalism's arsenal for dismantling our humanity. Sure, we can use it for cool things as well, but right now it's mostly used to automate the stuff that makes us human - art, music and so on. Not useful stuff like loading the dishwasher for me; more like writing a letter to invite my friends to my birthday. Very cool. But maybe the work I put in doing this myself is what makes my friends feel appreciated?

Edit: It's also nice to at least have an app that takes this maximalist approach. Then people can choose. If they're half-assing it there will be more and more ai-features creeping in over time. One compromise after the next until it's like all the other apps. It's also important to have such a maximalist stand in order to gauge the scale in a way.

[–] Allero@lemmy.today 0 points 3 months ago

This, over and over again.

Going against AI is being a luddite, not aware of the core underlying issue.

[–] Blackmist@feddit.uk 0 points 3 months ago (22 children)

Generative AI steals art.

Procreate's customers are artists.

Stands to reason you don't piss your customer base off.

[–] Aceticon@lemmy.world 0 points 3 months ago* (last edited 3 months ago) (4 children)

No doubt his decision was helped by the fact that you can't really fit full image generation AI on iPads - for example Stable Diffusion needs at the very least 6GB of GPU memory to work.
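The back-of-the-envelope math behind that figure (parameter counts below are approximate publicly reported values for Stable Diffusion 1.x, not exact numbers):

```python
# Rough VRAM estimate for Stable Diffusion 1.x running at fp16.
unet_params = 860e6        # U-Net, ~860M parameters
text_encoder_params = 123e6  # CLIP text encoder
vae_params = 84e6          # VAE
bytes_per_param = 2        # fp16 = 2 bytes per parameter

weights_gb = (unet_params + text_encoder_params + vae_params) * bytes_per_param / 1e9
# ~2.1 GB for the weights alone; activations, attention buffers, and the
# latent workspace during sampling push peak usage toward the 4-6 GB range.
```

Apple's unified-memory iPads technically share RAM between CPU and GPU, but sustained peak loads like that are still a poor fit for a tablet app.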

That said, since what they sell is a design app, I applaud him for siding with the interests of at least some of his users.

PS: Is it just me who finds it funny that the guy's last name is "Cuda", and CUDA is the Nvidia technology for running computing on their GPUs and hence widely used for this kind of AI?

[–] MashedTech@lemmy.world 0 points 3 months ago (1 children)

For commercial products, they're all run in the cloud anyway.

[–] Aceticon@lemmy.world 0 points 3 months ago* (last edited 3 months ago)

Good point, I had forgotten that :/
