I'm still trying to think through the ramifications of DignifAI.
In some ways, it seems not that different from using AI to make deepfake porn of someone. Just because the end is different doesn't automatically make the means moral.
Additionally, this could contribute to the lie some men buy into that says "It's not my fault I'm lustful, it's those scantily-clad women." It could become a crutch to replace true virtue. Our goal should be to see everyone as human beings, regardless of how they're dressed (or not).
On the other hand, a tool like this would make an excellent self-defense for kids or adults who just want to use the internet without being exposed to pornography at random. It could catch what slips past ad blockers and other web content filters.
Lastly, applying a tool like this to social media would decrease click rates for those people who are trying to sell their sexuality online. Reduce the incentives, and fewer people will engage in that kind of vicious behavior.
Thoughts?
I think it's less like making deepfake porn and more like making a political deepfake.
You're not trying to sexually violate them.
It's a protest against degeneracy and a promotion of beauty.
Also a safety filter for women and children. These people (unfortunately, men are also like this) are often quite disgusting.
Imagine being able to flip a switch and all of the private bits remain private forever. Like a visual mute button.
We can't stop them from posting it, but they can't force us to look at it.
It would also help against deepfake porn, by putting clothes back on the poor girl.
Can we reverse-deepfake deepfake-porn?
Change the video to everyone fully-dressed and having a lovely tea party in the garden?
I think the better option would just be to eliminate deepfakes entirely, but where that's not possible, this is a decent alternative.
Hmm. I don't know.
I think changing their images to something wholesome is a powerful political statement.
I'm tired of Christians always giving way to public vice. There should be push-back from us.
We have an obligation to our children to not let the nation turn into a gigantic red light district.
Sex work is seedy.

On the other hand, it may be a more powerful statement to prevent those images from ever surfacing on social media feeds.
If the images are simply "reclothed," the account owners still get likes, reposts, comments, etc. Instead, let's just make it so it doesn't pay to post smut. Every such post should just die a quiet death in obscurity because it never surfaces on anyone's feed.
Adjust the incentives to adjust behavior.
Some of these people have already deleted their accounts.
Those images used to not be allowed, but the standards kept being lowered, bit by bit, and nobody protested. Well, someone finally started protesting, and they now have one of the most popular accounts on X.
I suspect the images will soon start being disallowed, again.
Enough is enough.
I'm not so sure a political deepfake is okay, either.
Regardless, I definitely see the value as a safety filter. Use the AI to fight the AI.
Political deepfakes fall under caricature laws and are explicitly protected under free speech.
I do think they should be clearly marked, though, as it is getting difficult to discern the difference.
I'm thinking less about laws as such and more about what is morally right or wrong. Perhaps it should be legal to do political deepfakes, but does that qualify morally as lying? Maybe. It's certainly different when the deepfake is not obvious, unlike a political cartoon, impression, or funny video/audio edit.
Germany has a strong culture of political impersonation cabaret, so we see this as a digital version of that. It just has to be identifiable as a fake.
It's acting, not lying.
Well, I suppose you could argue that acting is always lying.
In acting, the audience knows that they're seeing an actor, not the real person. That's the difference.
Deepfakes present themselves as the real person, in some cases.
That's why they have to be identifiable, by law.
Otherwise, the acting can be too convincing.
Like having a lifelike puppet.
Why should we have an AI to re-clothe images of naked people, rather than just hiding them from feeds entirely?
If your AI tool can detect a naked person to clothe them, it can detect a naked person to hide the post altogether.
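To make that point concrete, here is a minimal Python sketch, assuming hypothetical score_nudity() and reclothe() functions (neither is DignifAI's actual API): the same detector can drive either policy, re-clothing the image or suppressing the post entirely.

from typing import Optional

NUDITY_THRESHOLD = 0.8  # assumed cutoff; a real deployment would tune this

def score_nudity(image_bytes: bytes) -> float:
    # Hypothetical stand-in for whatever classifier the tool actually uses.
    raise NotImplementedError

def reclothe(image_bytes: bytes) -> bytes:
    # Hypothetical stand-in for the inpainting / "re-clothing" step.
    raise NotImplementedError

def filter_post(image_bytes: bytes, hide_instead: bool) -> Optional[bytes]:
    # Same detector, two policies: re-clothe the image or hide the post.
    if score_nudity(image_bytes) < NUDITY_THRESHOLD:
        return image_bytes  # clean enough, show unchanged
    return None if hide_instead else reclothe(image_bytes)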
Any idea how the DignifAI detection component performs in real time versus the iOS sensitive-content detector?
I have no idea, unfortunately.
Perhaps you don't want to hide them, you just don't want to get to know them that intimately.
Maybe so. I have a hard time seeing that use case, but that may just be a blind spot on my part.
Well, you could, at worst, say that it's the equivalent of the thug life meme makeovers people were getting a couple of years ago.
Instead of making them look cooler, you make them look more modest.
I could see a use case for AI in producing family-friendly music albums. Make modest album covers and automagically filter out explicit content. Make it easy for record labels to put out non-explicit versions of music.
I mute guys on here because they post semi-naked pics. I could just put a digital T-shirt on them and unmute them.
The "sensitive content" filter is better than nothing, but I often click on things, anyway, because it's usually not sensitive content. I've accidentally applied it, myself, and sometimes the AI is off.
With DignifAI, you can just look at all of the pictures and still see what each one is about.
Also would like to point out that a lot of these images aren't even of human beings.