To everyone building AI systems: I'm asking you to take responsibility for the technology you create.

Deepfakes and AI-generated explicit images of real people, including K-pop idols from BLACKPINK, Stray Kids, Enhypen, and countless other groups, are causing real harm. These images violate privacy, destroy trust, and traumatize the people targeted. This isn't "fan edits." This isn't "creativity." This is non-consensual sexual content, and it needs to be treated with the seriousness it deserves.

I'm asking you to:

- Build stronger protections against generating explicit images of real people
- Detect and block deepfake nudity before it spreads
- Add clearer reporting tools so harmful content can be removed quickly
- Enforce strict policies that protect public figures, minors, and everyday people
- Prioritize consent, safety, and dignity in every model you release

AI should never be a weapon used to humiliate or violate someone. You have the power, and the responsibility, to make sure your technology doesn't become one.

Please do better. People deserve to feel safe.

- A concerned fan who cares about the humans behind the spotlight