No, a few million hits from bots is routine for anything that's public-facing at all. Others have posted in this thread (and in others like it; this article has been making the rounds a lot in the past few days) that even the most basic sites get that sort of bot traffic, and that avoiding the "infinite maze" aspect is just a matter of setting a recursion depth limit on the crawler (sketched below).
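As an aside, here is roughly what that depth limit looks like in practice. This is a minimal illustrative sketch using only Python's standard library, not any particular crawler's code; `MAX_DEPTH` and the function names are made up:

```python
# Minimal, illustrative depth-limited crawler using only the standard
# library. MAX_DEPTH is a made-up value; real crawlers tune this.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

MAX_DEPTH = 5  # hard cap: an "infinite maze" can never trap us deeper than this

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(url, depth=0, seen=None):
    seen = set() if seen is None else seen
    if depth > MAX_DEPTH or url in seen:
        return  # depth limit reached (or page already visited): stop here
    seen.add(url)
    try:
        html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
    except OSError:
        return
    parser = LinkExtractor()
    parser.feed(html)
    for link in parser.links:
        target = urljoin(url, link)
        if urlparse(target).netloc == urlparse(url).netloc:  # stay on-site
            crawl(target, depth + 1, seen)
```

However deep the maze goes, the `depth > MAX_DEPTH` check means the crawler only ever fetches a bounded number of pages from it.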
As for AI training, the access log says nothing about that. As I said, AI training sets are no longer made by just dumping giant piles of randomly scraped text into a model. If a trainer scraped one of those "infinite maze" sites, the quality of the resulting data would be checked, and if it was generated by anything cheap enough for the site to actually run, it would almost certainly be discarded as junk.
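To make "checked" concrete, a crude per-document quality gate might look something like the sketch below. The heuristics and thresholds are invented for illustration; real training pipelines use far more sophisticated filters and classifiers:

```python
# Crude sketch of a per-document "junk" check of the kind a training
# pipeline might run. The thresholds are invented for illustration.
def looks_like_junk(text: str) -> bool:
    words = text.split()
    if len(words) < 50:
        return True  # too short to be worth keeping
    # Cheap generated babble often has odd lexical statistics, e.g. a
    # very low ratio of unique words to total words.
    if len(set(words)) / len(words) < 0.2:
        return True
    # No sentence-ending punctuation at all is another common tell of
    # machine-generated filler.
    if not any(ch in text for ch in ".!?"):
        return True
    return False

# The trainer filters each document once, offline and in bulk.
docs = ["some scraped page text ..."]  # placeholder corpus
kept = [d for d in docs if not looks_like_junk(d)]
```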
The main angle is not to "poison" the training set; it is to waste time, energy, and resources. The site loads deliberately slowly and produces garbage, which has to be filtered out (sketched below).
As I said: not a silver bullet. But at least some threads were tied up collecting garbage, painfully slowly. And since the data is useless, whatever their cleanup process is has more work to do. It might even be tricked into discarding the whole website, because the signal-to-noise ratio is so bad.
So I would still say the author achieved his goal.
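A toy sketch of the mechanism described above: an endpoint that drip-feeds generated filler, deliberately slowly, with links leading deeper into the maze. Every name and number here is made up, and a real tarpit is considerably more elaborate:

```python
# Toy tarpit: serve generated filler, deliberately slowly, with links
# leading deeper into the maze. All names and numbers are made up.
import random
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

WORDS = "lorem ipsum dolor sit amet consectetur adipiscing elit".split()

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        try:
            for _ in range(100):
                sentence = " ".join(random.choices(WORDS, k=12))
                link = f'<a href="/{random.getrandbits(32):x}">more</a>'
                self.wfile.write(f"<p>{sentence} {link}</p>".encode())
                self.wfile.flush()
                time.sleep(1)  # the "deliberately slow" part: tie the bot up
        except (BrokenPipeError, ConnectionResetError):
            pass  # bot gave up; nothing of value was lost on our side

if __name__ == "__main__":
    HTTPServer(("", 8080), TarpitHandler).serve_forever()
```

Each random link points back into the same handler, so every page a bot fetches yields more links into the maze, and each fetch costs the bot well over a minute of wall-clock time.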
The site producing the nonsense has to generate lots of it every time a bot comes along, while the trainers only have to filter it once. And as others have pointed out, it's likely easy for an automated filter to spot. I don't see it as a clear win.