this post was submitted on 18 Sep 2024
But that's exactly my point. Synthetic data is made by AI, but it doesn't cause collapse. The people who keep repeating the "AI fed on AI inevitably dies!" headline are ignorant of how this actually works, of the details that actually matter when it comes to what causes model collapse.
If people want to oppose AI and wish for its downfall, fine, that's their opinion. But they should do so based on actual real data, not an imaginary story they pass around among themselves. Model collapse isn't a real threat to the continuing development of AI. At worst, it's just another checkbox that AI trainers need to tick on their "am I ready to start this training run?" checklist, alongside "have I paid my electricity bill?"
It was, before we had AI. It turns out that's another aspect of synthetic data creation that can be greatly assisted by automation.
For example, the Nemotron-4 AI family that NVIDIA released a few months back is specifically intended for creating synthetic data for LLM training. It consists of two LLMs: Nemotron-4 Instruct (which generates the training data) and Nemotron-4 Reward (which curates it). It's not a fully automated process yet, but the requirement for human labor is drastically reduced.
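To make the generate-then-curate pattern concrete, here's a minimal toy sketch in Python. This is not NVIDIA's actual API; the `generate_candidates`, `reward`, and `curate` functions are stand-ins I've invented for illustration, with string length as a dummy quality score in place of a real reward model.

```python
# Toy sketch of a two-model synthetic data pipeline:
# one model proposes candidate training examples, a second scores them,
# and only high-scoring candidates survive into the dataset.

def generate_candidates(prompts):
    """Stand-in for an instruct model: emit one candidate response per prompt."""
    return [f"Response to: {p}" for p in prompts]

def reward(example):
    """Stand-in for a reward model: score a candidate (higher is better).
    Real pipelines score attributes like helpfulness and correctness;
    here length is just a dummy proxy."""
    return len(example)

def curate(candidates, threshold):
    """Keep only candidates whose reward clears the threshold."""
    return [c for c in candidates if reward(c) >= threshold]

prompts = ["Explain DNS", "What is a mutex?"]
candidates = generate_candidates(prompts)
synthetic_dataset = curate(candidates, threshold=20)
```

The point of the structure is that the curation step is where collapse is avoided: low-quality generations never make it into the training mix, regardless of whether they came from a human or a model.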
But that guarantee isn't needed. AI-generated data isn't a magical poison pill that kills anything that tries to train on it. Bad data is bad, of course, but that's true whether it's AI-generated or not. The same process of filtering good training data from bad training data can work on either.