[–] mozz@mbin.grits.dev 34 points 4 months ago (2 children)

In addition to probation, the teens will also be required to attend classes on gender and equality, as well as on the "responsible use of information and communication technologies,"

What?

Have you not interacted with teenage boys?

I can hardly think of a better way to teach them that there are no consequences and that they can keep doing this as long as they smirk and say they’re sorry whenever they get caught

[–] OsrsNeedsF2P@lemmy.ml 46 points 4 months ago (1 children)

The minors were charged with 20 counts of creating child sex abuse images and 20 counts of offenses against their victims’ moral integrity

Punishment or not, those charges are still scary. I think the probation and courses are a good addition.

[–] mozz@mbin.grits.dev 10 points 4 months ago (1 children)

I don’t think those are additions, I think those are the punishments for those charges, in full. I could be wrong but that’s how I read it.

[–] Chozo@fedia.io 31 points 4 months ago (1 children)

Teens sentenced in Spain were between the ages of 13 and 15. According to the Guardian, Spanish law prevented sentencing of minors under 14, but the youth court "can force them to take part in rehabilitation courses."

Some of them are too young to receive real sentencing. It's important to remember that they're children, too.

[–] jonne@infosec.pub 19 points 4 months ago (2 children)

Yeah, it's probably more important to make sure we don't have child porn generation machines available to anyone online.

[–] zurohki@aussie.zone 14 points 4 months ago (1 children)

Since anyone can download and train their own AI, that ship has probably sailed.

[–] jonne@infosec.pub 11 points 4 months ago (1 children)

I think training your own image generator on existing child porn is probably beyond most high schoolers. I'd be happy if at least commercial options were held responsible for distributing generated CP, which is already illegal BTW.

[–] cygnus@lemmy.ca 14 points 4 months ago (2 children)

I don't think the models are trained on CP. They're likely trained on widely-available porn.

[–] zurohki@aussie.zone 9 points 4 months ago

This. If you ask an image generator for a bed in the shape of a pineapple, it probably has no pineapple-shaped beds in its training data but it has pineapples and beds and can mash the concepts together.
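
To make that concrete: here's a minimal sketch of what prompt-driven concept mashing looks like in practice, assuming the Hugging Face diffusers library and the public CompVis/stable-diffusion-v1-4 checkpoint (both my assumptions for illustration, not anything from the article):

```python
# Minimal sketch: composing two concepts ("pineapple" and "bed") that the
# model almost certainly never saw combined in its training data.
# Assumes the `diffusers` library and a public Stable Diffusion checkpoint.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",   # assumed public checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # requires a CUDA-capable GPU

# The prompt asks for a combination the model has to improvise.
image = pipe("a bed in the shape of a pineapple").images[0]
image.save("pineapple_bed.png")
```

The model generalizes from concepts it learned separately, which is exactly the kind of recombination the comments above are worried about.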

[–] Onihikage@beehaw.org 1 points 4 months ago (1 children)

Technically, any model trained on LAION-5B before December 2023 was trained on CSAM.

But yeah, I expect any porn model trained on a sufficient diversity of adult actors could be used to make convincing CP even without having it in the training data. AI image generation is basically the digital equivalent of a chainsaw - a tool for a particular messy job that can really hurt people if used incorrectly. You wouldn't let a typical kid run around unattended with one, that's for sure.

[–] cygnus@lemmy.ca 2 points 4 months ago

I expect any porn model trained on a sufficient diversity of adult actors could be used to make convincing CP even without having it in the training data.

I know I'm wading into the danger zone here, but let's also remember we're talking about teenagers. A 15-year-old's body type, for example, will be closer to an 18-year-old's than a 5-year-old's, so the perfectly legal porn model would work just fine for that, uh, purpose.

[–] mozz@mbin.grits.dev 3 points 4 months ago* (last edited 4 months ago)

Good luck with that

I mean, you can do a significant amount by making it illegal to offer it on the open web, which might be the way to go, but creating awesome things that can only be had by going outside the law carries its own long-term consequences

[–] unconfirmedsourcesDOTgov@lemmy.sdf.org 28 points 4 months ago (2 children)

I disagree. These children are minors, and their behavior, while abhorrent, betrays a fundamental lack of perspective and empathy.

I've been a teenage boy before and I did some bone-headed things. Maybe not this bad, but still, I agree with the judge in this instance that it would be inappropriate to impose permanent consequences on these kids before their life even gets started because they were stupid, horny, teenage boys.

Even if we assume that these kids don't all have well-meaning parents who will impose their own punishments, having a probation officer in high school is not going to help with popularity. On top of that, mandatory classes that force these boys to evaluate the situation from another perspective seem like a great add-on.

I know it doesn't feel like justice, but our goal as a society shouldn't be to dole out maximum punishment in every instance. The goal is to allow all of us to peacefully coexist and contribute to society - throwing children in a dark hole somewhere to be forgotten isn't going to help with that.

Having said all of the above, it feels like a good time to emphasize that we still don't have any good ideas for solving the core problem here, which is the malicious use of this technology that was dumped on society without any regard for the types of problems that it would create, and entirely without a plan to add guard rails. While I'm far from the only one considering this problem, it should be clear enough by now that dragging our feet on creating regulation isn't getting us any closer to a solution.

At a minimum it feels like we need to implement a mandatory class on the responsible use of technology, but the obvious question there is how to keep the material relevant. Maybe it's something that tech companies could be mandated to provide to all users under 18 - a brief, recurring training (could be a video, idc) and assessment that minors would have to complete quarterly to demonstrate that they understand their responsibilities.
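
To show the mechanics of that last idea (not to endorse any particular policy), here's a minimal sketch of a quarterly-completion check, assuming a hypothetical User record with a birth date and a last-completion date — every name here is made up for illustration:

```python
# Minimal sketch of the proposed quarterly training requirement for minors.
# The User record, field names, and 18-year cutoff are illustrative
# assumptions, not something specified in the thread or by any real platform.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class User:
    user_id: str
    birth_date: date
    last_training_completed: Optional[date] = None

def is_minor(user: User, today: date) -> bool:
    """True if the user is under 18 on `today`."""
    b = user.birth_date
    age = today.year - b.year - ((today.month, today.day) < (b.month, b.day))
    return age < 18

def quarter(d: date) -> tuple:
    """Map a date to a (year, calendar quarter) pair."""
    return (d.year, (d.month - 1) // 3 + 1)

def training_required(user: User, today: date) -> bool:
    """A minor must complete the training once per calendar quarter."""
    if not is_minor(user, today):
        return False
    if user.last_training_completed is None:
        return True
    return quarter(user.last_training_completed) != quarter(today)

# Example: a 15-year-old who last completed the training in Q1 is flagged in Q3.
teen = User("u123", birth_date=date(2009, 5, 1),
            last_training_completed=date(2024, 2, 10))
assert training_required(teen, date(2024, 7, 1))
```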

[–] mozz@mbin.grits.dev 18 points 4 months ago* (last edited 4 months ago)

I've been a teenage boy before and I did some bone-headed things. Maybe not this bad, but still, I agree with the judge in this instance that it would be inappropriate to impose permanent consequences on these kids before their life even gets started because they were stupid, horny, teenage boys.

Completely agree with 100% of this

I’m just saying that I think the answer lies somewhere between “take some classes and promise not to do it again” and “adult prison”. They inflicted significant harm on another human being, in a way so serious that we all agreed it should be illegal. Yes, I know that probably wasn’t the intent on their part. But this kind of “oh, I just got horny and kind of didn’t care / wasn’t focused on what the impact was” is not a thing you wanna teach them there’s wiggle room with, as long as they make sure to apologize about it after.

Community service? Home arrest? Juvenile detention for 21 days? Fuckin something? I’m not saying put them in the hole.

[–] kent_eh@lemmy.ca 7 points 4 months ago

I've been a teenage boy before and I did some bone-headed things

Same.

I would be surprised if anyone with the same history didn't do at least a few completely boneheaded things at some point in their youth.