A. My understanding is that an underage teen who takes a nude selfie and sends it can be charged with distributing "child porn." This is a serious problem and should not be true. If other people forward it, they should be charged, but a teenager should not get in legal trouble for taking a nude selfie and sending it privately.
B. We need to culturally get over a lot of our hangups about certain things. His girlfriend was attending church. The church should be telling kids "This is not worth killing yourself over."
C. We somehow need to come up with reasonable accommodation for the reality that teens have phones and budding sexualities and these two things are colliding horrifically in a legal and cultural system strongly rooted in assuming "child porn" is made solely by abusive adults and not willingly by underage teens who don't think it's a big deal at the time the photo is snapped.
Taking a nude selfie should be safer than baring your body in person. No risk of STDs or pregnancy is involved. And yet we've turned it into this hugely dangerous thing for anyone who hasn't spent years thinking about sexual morality and isn't prepared to say to law enforcement and the world, "Nudes of me? Big fucking deal."
No idea how to further any of those things, but this is hardly the first article I've read about teen selfies leading to extremely harsh consequences that I think morally should not happen, and that I think have roots in cultural and legal systems going bad places, in part because we aren't keeping up with the times.
"Meta has a portal for police to file requests to preserve records of accounts connected to criminal investigations. Like other social media companies, it has to hold the records—including emails, IP addresses, message transcripts and general usage history—for 90 days. It only hands over user data if it’s ordered to do so by a court.
There’s one way to expedite the request: file it as an emergency, meaning a child could be harmed or there’s risk of death. Larson believed this case qualified. He told Meta that a 17-year-old was already dead, and there was a high probability other kids were in danger, too.
*Meta declined his request within an hour, he says. “The request you submitted does not rise to the level of an emergency,” the company responded.*"
Also, iMessage really does not seem to be well designed to handle abuse. Your only option is to block, which is a couple of taps away from a conversation. The block does not instantly take effect cross-device, so you still get messages on your Apple Watch until you reboot it. Your only option for reporting anything is when deleting a conversation, and all you can do is say it's spam.
(1) Most young people are insecure as anything and have unhealthy self-esteem.
(2) Culturally the west is very (and I mean very) ashamed of nudity.
Problem (1) isn't something that can be addressed half-assed. The kid will need a healthy family, friends, and a support system. They will need to be free of traumatic influences... Problem (2), though, I see as more practical to tackle.
If you look at European countries, they are much more naturist regarding nudity. The Germans have their Freikörperkultur (FKK), a "free body culture" movement, and there are many places where you can go and do activities nude. Mixed-gender saunas where people are naked are common. Then there's Japan, where parents commonly bathe with their children. It teaches them not to be ashamed of their bodies.
We don't really have anything like that in the west. It's really quite dangerous, because it comes down to this: do we expect literal teenagers to practice good opsec when adults can't even get that shit right?
Are they supposed to monitor the content of the chats? Some would call that eavesdropping.
The sad fact is that desperate people exist. And these desperate people are willing to do despicable things to make a buck.
[1] https://www.michigan.gov/mdhhs/-/media/Project/Websites/mdhh...
https://www.bbc.com/news/world-australia-68720247
I'm not really confident this is solvable with law enforcement in a world where the police press release says the suspect was "located in a slum in Nigeria with a population of 25 million people."
There is a strong network to support rape victims, and it should be used for such cases.
I.e. extortion.
Beyond it being a truly despicable crime, it's interesting how this extortion hasn't changed much since the early 2000s; the only differences are that it's done on a larger scale and is probably more often done for money than for other nefarious reasons.
https://archive.ph/I9jm2