DoreenMichele
Some thoughts I've had for years:

A. My understanding is an underage teen who takes a nude selfie and sends it can be charged with distributing "child porn." This is a serious problem and should not be true. If other people forward it, they should be charged, but a teenager should not get in legal trouble for making a nude selfie and sending it privately.

B. We need to culturally get over a lot of our hangups about certain things. His girlfriend was attending church. The church should be telling kids "This is not worth killing yourself over."

C. We somehow need to come up with reasonable accommodation for the reality that teens have phones and budding sexualities and these two things are colliding horrifically in a legal and cultural system strongly rooted in assuming "child porn" is made solely by abusive adults and not willingly by underage teens who don't think it's a big deal at the time the photo is snapped.

Taking a nude selfie should be safer than baring your body in person. No risk of STDs or pregnancy is involved. And yet we've turned it into this hugely dangerous thing for anyone who hasn't spent years thinking about sexual morality and isn't prepared to say to law enforcement and the world "Nudes of me? Big fucking deal."

No idea how to further any of those things, but this is hardly the first article I've read about teen selfies leading to extremely harsh consequences that I think morally should not happen, and that I think have roots in cultural and legal systems going bad places, in part because we aren't keeping up with the times.

SoftTalker
In this day and age, if you fall for something like that, just refuse to pay and say you'll tell people it's an AI fake. I think the days of this sort of scam being very effective are over, or will be soon.
dockerd
Meta didn't accept an emergency request from the police and required a county magistrate's order before it would assist in the investigation.

"Meta has a portal for police to file requests to preserve records of accounts connected to criminal investigations. Like other social media companies, it has to hold the records—including emails, IP addresses, message transcripts and general usage history—for 90 days. It only hands over user data if it’s ordered to do so by a court.

There’s one way to expedite the request: file it as an emergency, meaning a child could be harmed or there’s risk of death. Larson believed this case qualified. He told Meta that a 17-year-old was already dead, and there was a high probability other kids were in danger, too.

*Meta declined his request within an hour, he says. “The request you submitted does not rise to the level of an emergency,” the company responded.*"

abrookewood
God this is awful. If you have kids, go talk to them. I've already spoken to mine. Aside from the usual (don't send nudes; be careful on the Internet), I also told them that if something like this does happen, it isn't worth your life: everyone masturbates, it isn't anything shameful and I'll still love you anyway.
MarkSweep
I’m not a teenage boy, but I got targeted by one of these. While it was misdirected (they were trying to blackmail me with someone else’s photos), the whole thing was quite disturbing.

Also, iMessage really does not seem to be well designed to handle abuse. Your only option is to block, which is a couple of taps away from a conversation. The block does not take effect instantly across devices, so you still get messages on your Apple Watch until you reboot it. Your only option for reporting anything is when deleting a conversation, and all you can do then is say it's spam.

andersa
I don't understand what would motivate someone to ever send an explicit photo of themselves to a stranger. The whole premise of this scam makes no sense. Like, is this not the most basic of the basics of online safety?
junon
In my opinion, those two (or three) deserve way longer than 15 years. This wasn't some act of desperation. This was just pure, unfiltered evil that IMO cannot be "corrected".
Uptrenda
I see this as a twofold problem:

(1) Most young people are insecure as anything and have unhealthy self-esteem.

(2) Culturally the west is very (and I mean very) ashamed of nudity.

Problem (1) isn't something that can be addressed half-assed. The kid will need a healthy family, friends, and a support system. They will need to be free of traumatic influences... Problem (2) I see as more practical.

If you look at European countries, they are much more naturist regarding nudity. The Germans have Freikörperkultur (FKK), the free body culture movement, and there are many places where you can go and do activities nude. Mixed-gender saunas where people are naked are common. Then there's Japan, where parents commonly bathe with their children. It teaches them not to be ashamed of their bodies.

We don't really have anything like that in the west. It's really quite dangerous, because it's just like: do we expect literal teenagers to practice good opsec when adults can't even get that shit right?

gttalbot
Why would Facebook not just block messages from Nigeria to rando North American towns at this point? Shouldn't the network analysis required to detect this sort of crime ring be a slam-dunk at this point? I don't get how this is still even possible.
boffinAudio
This could all be solved if the OS vendors would add a 'share my child's screen' function to their operating systems. These incidents happen in isolation - make it possible for an adult to actually .. you know .. have oversight over their children's online activities and it would be less dangerous.
hannofcart
While my heart goes out to all the victims' families, what are platforms like Facebook supposed to do here?

Are they supposed to monitor the content of the chats? Some would call that eavesdropping.

The sad fact is that desperate people exist. And these desperate people are willing to do despicable things to make a buck.

analyte123
The victim's parents would technically have been guilty of a felony if this happened today, for failing to lock up the firearm when a minor then uses it to injure another person or themselves [1]. Michigan's safe storage law was not in effect in 2022, but the point stands: guns really need to be locked up when there are teens around. For some reason the narrative has focused on 6-year-olds accidentally picking up a gun, but that type of accidental child firearm death is absolutely dwarfed by teen firearm suicides.

[1] https://www.michigan.gov/mdhhs/-/media/Project/Websites/mdhh...

aaron695
Same story in Australia: a 16-year-old boy killed himself, with a different pair of men in Lagos responsible.

https://www.bbc.com/news/world-australia-68720247

I'm not really confident this is solvable with law enforcement in a world where the police press release is: "located in a slum in Nigeria with a population of 25 million people".

throehcifj
Why is it called "scam" and "fraud"? It is typical revenge porn, sexual abuse and distribution of child pornography.

There is a strong network to support rape victims, and it should be used for such cases.

0dayz
Paywall, but making a quick assumption: this is people impersonating girls/guys to lure young guys into sending nudes, then threatening that the nudes will be sent everywhere, maybe even sent to women/young girls so that when the authorities crack down it'll be the young guy facing the crime, and all of this can supposedly be averted if the victim pays.

I.e. extortion.

Beyond it being a truly despicable crime, it's interesting how this kind of extortion hasn't changed much since the early 2000s; the only difference is that it's done on a larger scale and probably more for money than for other nefarious reasons.

fromthegut
Black Mirror. S3 E3