Adequate Porn Watcher AI

The purpose of the '''APW_AI''' is to provide safety and security to its users, who can briefly upload a model they have obtained of themselves; the APW_AI will then either report <font color="green">'''nothing matching found'''</font> or conclude that <font color="red">'''something matching found'''</font>.
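A minimal sketch of the kind of check this could entail, assuming an embedding-based comparison of the uploaded model against an index built from suspect material. The function names, the cosine-similarity measure and the threshold value are illustrative assumptions, not part of the APW_AI description itself.

<syntaxhighlight lang="python">
import numpy as np

# Assumed similarity cutoff above which the verdict is "something matching found".
MATCH_THRESHOLD = 0.85


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_model(user_embedding: np.ndarray,
                suspect_index: list[np.ndarray]) -> str:
    """Return an APW_AI-style verdict for a user's uploaded model.

    user_embedding: embedding derived from the model the user uploaded of themselves.
    suspect_index: embeddings derived from material the service scans.
    """
    for suspect_embedding in suspect_index:
        if cosine_similarity(user_embedding, suspect_embedding) >= MATCH_THRESHOLD:
            return "something matching found"
    return "nothing matching found"
</syntaxhighlight>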


If people are '''able to check''' whether there is '''synthetic porn''' that looks like them, the products of the synthetic hate-illustration industrialists lose much of their destructive potential: attacks that do happen are less damaging because they may be exposed by the APW_AI, which decimates the monetary value of these disinformation weapons to the criminals.


Looking up whether matches are found for '''anyone else's model''' is '''forbidden''', and this should probably be enforced with a facial biometric app that checks that the model you want checked is of you and that you are awake.
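A minimal sketch of such a gate, assuming hypothetical helpers: a lookup is permitted only if a live facial capture matches the uploaded model and a liveness (awake) check has passed. The threshold and function names are assumptions for illustration only.

<syntaxhighlight lang="python">
import numpy as np

# Assumed similarity cutoff for treating the live capture and the model as the same person.
SAME_PERSON_THRESHOLD = 0.9


def is_same_person(live_selfie_embedding: np.ndarray,
                   model_embedding: np.ndarray) -> bool:
    """Compare the live facial capture against the model the user wants checked."""
    sim = float(np.dot(live_selfie_embedding, model_embedding)
                / (np.linalg.norm(live_selfie_embedding) * np.linalg.norm(model_embedding)))
    return sim >= SAME_PERSON_THRESHOLD


def may_perform_lookup(live_selfie_embedding: np.ndarray,
                       model_embedding: np.ndarray,
                       liveness_passed: bool) -> bool:
    """Allow the APW_AI lookup only for one's own model and a live, awake user."""
    return liveness_passed and is_same_person(live_selfie_embedding, model_embedding)
</syntaxhighlight>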