Why should we have an AI to re-clothe images of naked people, rather than just hiding them from feeds entirely?

If your AI tool can detect a naked person to clothe them, it can detect a naked person to hide the post altogether.
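
To make that argument concrete, here is a minimal sketch of the idea in Python: the detection step is shared, and only the action taken afterwards differs. The names used here (detect_nudity, reclothe, moderate) are hypothetical placeholders, not dignifAI's or Apple's actual APIs.

```python
from dataclasses import dataclass


@dataclass
class Post:
    image_bytes: bytes
    hidden: bool = False


def detect_nudity(image_bytes: bytes) -> bool:
    """Placeholder for any nudity / sensitive-content classifier."""
    raise NotImplementedError


def reclothe(image_bytes: bytes) -> bytes:
    """Placeholder for an image-editing model that adds clothing."""
    raise NotImplementedError


def moderate(post: Post, policy: str = "hide") -> Post:
    # The detection step is identical for both policies;
    # only the action taken on a positive result changes.
    if detect_nudity(post.image_bytes):
        if policy == "hide":
            post.hidden = True
        elif policy == "reclothe":
            post.image_bytes = reclothe(post.image_bytes)
    return post
```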

Discussion

Any idea how the dignifai detection component performs in real time versus the iOS sensitive-content detector?

I have no idea, unfortunately.

Perhaps you don't want to hide them; you just don't want to get to know them that intimately.

Maybe so. I have a hard time seeing that use case, but that may just be a blind spot on my part.

Well, you could, at worst, say it's the equivalent of the thug life meme makeovers people were getting a couple of years ago.

Instead of making them look cooler, you make them look more modest.

[attached image, 408x555]

I could see a use case for AI in producing family-friendly music albums. Make modest album covers and automagically filter out explicit content. Make it easy for record labels to put out non-explicit versions of music.

I mute guys on here because they post semi-naked pics. I could just put a digital T-shirt on them and unmute them.

The "sensitive content" filter is better than nothing, but I often click on things, anyway, because it's usually not sensitive content. I've accidentally applied it, myself, and sometimes the AI is off.

With dignify AI, you can just look at all of the pictures and still see what each picture is about.