I don’t mean to sound dismissive; your frustration is completely understandable.
That said, this does follow a very old and well-documented pattern: build a consumer image generation tool, and a significant portion of users will try to push it toward sexual or nude imagery, especially involving women.
Even companies with massive resources struggle here. Try generating anything even mildly suggestive involving women with ChatGPT and see how many hoops you have to jump through, and that’s after multiple layers of prompt and output filtering.
At that point, content moderation becomes an arms race. Keyword filters, rate limits, paid tiers, moderation APIs: users will route around all of it. Without huge ongoing investment, it’s a battle that’s very hard to win.
So your conclusion that the problem isn’t the product but the market resonates. A B2B pivot makes a lot of sense, because the incentives and user behavior are fundamentally different.
Edit: Even OpenAI seems to be acknowledging the limits here and has indicated plans for some form of adult mode next year. It will be interesting to see whether that also includes more relaxed image generation policies.
Yes, earlier we were fully focused on B2C for the image-editing app. At the time, AI photo transformation was trending hard, so adoption was fast.
But a big chunk of users started misusing the paid plans—jailbreaking prompts and pushing boundaries. Even though we’re using Google’s Nano Banana model (which doesn’t generate full NSFW), it can still produce partial outputs like lingerie-style images. That still feels wrong, especially when people misuse images that aren’t theirs.
Because of that, shifting to B2B makes more sense. The Nano Banana Pro model really shines there—strong text rendering, true 4K quality, and it solves real business use cases. I’m confident we’ll attract higher-quality users and build something more sustainable in the B2B segment.
I built picxstudio.com, an AI image generator. Got $850 in revenue. Then users started abusing it for NSFW content.
The problem started 3 years ago with my first image gen app. Used Stable Diffusion with ComfyUI on Azure GPU VMs. Users from Asia uploaded photos of women trying to generate nude images. I tried every safeguard. They jailbroke all of them. I shut it down.
This year I tried again. Used Replicate and Fal AI instead of self-hosting. Built proper UI. Used Nano Banana model for headshots and viral images. Small team curating prompts. Started getting traction.
Same problem came back. NSFW abuse.
What I tried:
- Content moderation APIs: users found edge cases.
- Banned keywords: they used synonyms.
- Removed the free tier: they paid anyway.
- Manual review: doesn't scale.
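The blocklist failure mode is easy to reproduce. Here's a minimal sketch (with a hypothetical keyword list) of why banned keywords lose to synonyms and simple obfuscation:

```python
import re

# Hypothetical blocklist, the first thing most of us reach for.
BLOCKED = {"nude", "nsfw", "naked"}

def is_blocked(prompt: str) -> bool:
    """Return True if any blocked keyword appears as a whole word."""
    words = re.findall(r"[a-z]+", prompt.lower())
    return any(w in BLOCKED for w in words)

# A direct prompt is caught...
assert is_blocked("generate a nude photo")
# ...but a synonym or spaced-out spelling walks straight past it.
assert not is_blocked("generate an unclothed photo")
assert not is_blocked("generate a n u d e photo")
```

Every new synonym you add just pushes users to the next one, which is exactly the arms race described above.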
The truth is B2C image generation attracts the wrong crowd. The product wasn't the problem. The market was.
Now pivoting to B2B. Businesses want product photos, branded visuals, AI headshots. They don't upload random photos of women.
What we're building different:
- Brand Kit: save your brand URL, we extract logo/colors/fonts, and every image stays on-brand.
- 4 Variations: generate using GPT, Claude, Gemini, Mistral. Each has unique strengths.
- Team workspace without per-seat pricing.
- No monthly subscription: pay for credits, not monthly fees.
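To illustrate the Brand Kit idea, here's a stdlib-only sketch of one piece of it: pulling the most-used hex colors out of a page's markup. (This is an assumption about the approach, not the actual implementation; real logo and font extraction needs much more than this.)

```python
import re
from collections import Counter

def extract_brand_colors(html: str, top_n: int = 3) -> list[str]:
    """Return the most frequent 6-digit hex color codes in the markup."""
    colors = re.findall(r"#[0-9a-fA-F]{6}\b", html)
    counts = Counter(c.lower() for c in colors)
    return [color for color, _ in counts.most_common(top_n)]

# Toy page: the repeated accent color should rank first.
page = """
<style>
  .hero { background: #ff6600; color: #ffffff; }
  .btn  { background: #ff6600; border: 1px solid #222222; }
</style>
"""
assert extract_brand_colors(page)[0] == "#ff6600"
```

Counting occurrences is a cheap heuristic for "brand accent color"; a production version would also weight colors by the elements they style.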
Questions for HN:
- Anyone dealt with NSFW abuse in consumer products? How did you handle it?
- Anyone done a B2C-to-B2B pivot? What worked?
- Any content moderation tools I'm missing?
If you failed at something I'm about to try, I want to hear it.
What if you had allowed that usage a while longer instead of pivoting? I faced a spam issue from RMG companies; we let them keep using the product at double the price, and after a few weeks they stopped on their own.
I was thinking the same thing. Once we remove free credits and move fully to paid plans, we’ll automatically reduce NSFW usage. We can also increase pricing—maybe even double it.
B2C is tough because pricing becomes a blocker; users just want to casually edit images and won’t pay much. But if we shift focus to B2B, the pricing becomes affordable for them because it’s tied to real business value. Brands don’t see it as editing images—they see it as saving time and money.