"...but i have 200 followers on twitter so I can't leave :/"

submitted by edited

https://lemmy.ca/pictrs/image/6a2de774-8a20-4149-a8dc-10ed1e71c70e.jpeg

1,000 AI child porn pictures an hour is the current estimate being made on Musk’s Twitter, now known as shit.

Some are suggesting Musk is creating lists of child porn makers on Twitter, who don’t realise they’re being logged, 👈 so they can be blackmailed into criminal acts, like Putin compromised Trump. 👈


12 Comments

Thank you for this; unfortunately, it will be very useful.

Except it’s not a person but serious government institutions in many countries, somehow.

Pre-Musk Twitter had a real-life CP problem. Musk’s Twitter has an AI CP problem… Progress?

Before Grok, Elon made reporting CSAM a lot harder for some reason, so it became more common.

Before Elon, Twitter had a problem with not removing reported CSAM at all. It’s not good vs. bad; it’s shit all the way down, just a different flavor.

But at least I’ve heard of fewer people stumbling into CSAM. Then again, judging by your avatar, you might just be defending your beloved AI slop at all costs.

Don’t blame the tool; blame the user for misusing that tool.

Still, guardrails should be set up, which Epstein-island visitor Elon “on Mars the age of consent could be 14” Musk is clearly not interested in…

“Still, guardrails should be set up”

I don’t disagree. Anti-nudity filters in most common image generation models have been available since day one.

Elon fucked up hard.

They were fine with the Nazism, and they will be fine with this as well.

Network effects and vendor lock-in are a hell of a drug.