this post was submitted on 10 Jul 2024
170 points (100.0% liked)
Technology
As far as I understand, it's not the tools used that make this illegal, but the realism/accuracy of the final product, regardless of how it was produced.
If you had high proficiency with manual Photoshop and produced similar-quality fakes, you'd be committing the same crime(s).
and
The thing is, AI tools are becoming more and more accessible to teens. Time, effort, and skill are no longer roadblocks to creating these images, which leaves very, very little standing in an irresponsible teenager's way...
Which still seems kinda dumb. How realistic is too realistic? You could make a legal standard of "photography-like" or something, just to define who to convict, but you still haven't really justified why.
The sentence in this case is just classes, though, so I'll leave my pitchfork in the shed.
Did... Did you just ask why creating photo-realistic sexually explicit material of real children should be illegal?
Keep in mind these were other kids their age. We're not talking about pedo stuff here.
All the recent stuff about deepfakes feels a bit moral-panic-y to me. I think we should have a better reason than just ick before anyone gets thrown in jail.
Do you want an explanation of why creating and sharing sexually explicit material of other people without consent is problematic and damaging, and especially for children?
This is a really good idea. Perhaps this is what should be happening in the first place rather than resorting to direct legal enforcement, which can be problematic and damaging, especially for children.
If you can't understand that sharing naked photos of people is bad, then you probably should have to face the court system.
Like what? I don't care how horny you are as a teenager, it takes a real fucking idiot, and a huge shitstain to go and share those photos. They absolutely deserve the book being thrown at them.
Yes.
I can see why we'd prohibit it, but somehow doing it in writing without involving the subject is pretty accepted (see: every fanfiction involving characters played by a specific real actor), and mentally doing it is like an informal human right.
I'm honestly not trying to be obtuse here. It seems arbitrary to me. People have pictured me in all kinds of horrifying situations, I'm sure (probably more violent than sexy, but still). I'm not bothered, nor would I be if they made a collection of depictions (unless they sent some to me).
They shared sexually explicit images in WhatsApp groups. Do you consider that similar to having personal thoughts nobody will know of, or to written stories?
Have you dismissed that quote? I don't know where to start explaining how it's different from what you described, because of how far off it is. I have no idea where the baseline is to argue from.
Humans are social creatures. We form groups and want to be part of groups. Teens are especially vulnerable, with a developing personality, developing social norms, and a need for social belonging. Breaking norms and violating common personal boundaries and a person's control over their self-expression and self-presentation is deeply violating in such a vulnerable phase of life.
They didn't create a personal collection. They shared in their social groups.