<center><big>The latest version of this article can be found in [https://stop-synthetic-filth.org/wiki/How_to_protect_yourself_and_others_from_covert_modeling '''How to protect yourself and others from covert modeling''' at the stop-synthetic-filth.org wiki]</big></center>

[[File:ESPER_LightCage.jpg|thumb|left|480px|'''Do not agree''' and '''do not be fooled''' into '''[[Glossary#Reflectance capture|having your reflectance field captured]]''' on a '''[[Glossary#Light stage|light stage]]''', such as the ESPER LightCage in the picture.]]

[[File:Muslim woman in Yemen.jpg|thumb|right|300px|{{Q|I feel pretty confident that mister photograph man will not be selling much of my data to the '''[[Glossary#No camera|no camera]]''' scene.|Honestly made up quote|the protecting power of e.g. [[Glossary#Niqāb|niqāb]]}}]]
[[File:20180613 Folkemodet Bornholm burka happening 0118 (42739707262).jpg|thumb|left|400px|Some humans in '''[[Glossary#Burqa|burqa]]s''' at the Bornholm burqa happening]]
= Help in case of appearance theft =

If involuntary fake pornography of you shows up in Google, see the '''[https://support.google.com/websearch/answer/9116649?hl=en information on removing involuntary fake pornography from Google at support.google.com]'''. To request removal, use the '''[https://support.google.com/websearch/troubleshooter/3111061#ts=2889054%2C2889099%2C2889064%2C9171203 form for removing involuntary fake pornography at support.google.com]''' and select 'I want to remove: A fake nude or sexually explicit picture or video of myself'.
Google added “'''involuntary synthetic pornographic imagery'''” to its '''ban list''' in September 2018, allowing anyone to request that the search engine block results that falsely depict them as “nude or in a sexually explicit situation.”<ref name="WashingtonPost2018">Harwell, Drew (2018-12-30). [https://www.washingtonpost.com/technology/2018/12/30/fake-porn-videos-are-being-weaponized-harass-humiliate-women-everybody-is-potential-target/ "Fake-porn videos are being weaponized to harass and humiliate women: 'Everybody is a potential target'"]. The Washington Post. Retrieved 2019-10-13.</ref>

On October 3, 2019, '''California outlawed''' with [https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 '''AB-602'''] the use of [[w:human image synthesis]] technologies to make '''fake pornography without the consent''' of the people depicted. The law was authored by Assembly member [[w:Marc Berman]].<ref name="CNET2019">Mihalcik, Carrie (2019-10-04). [https://www.cnet.com/news/california-laws-seek-to-crack-down-on-deepfakes-in-politics-and-porn/ "California laws seek to crack down on deepfakes in politics and porn"]. CNET. Retrieved 2019-10-13.</ref>
= Protect your appearance from covert modeling =

If '''[[Glossary#Media forensics|media forensics]]''' proves beyond suspicion that the media in question is genuine, or if a '''credible witness''' to its creation '''is found''', the media should be considered evidence.
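As an illustration of what a first-pass forensic check can look like, below is a minimal Python sketch of '''error level analysis''' (ELA), one classical image-forensics signal: the image is recompressed as a JPEG and the per-pixel difference is examined, because regions edited after the original save often recompress differently from the rest. This is only an illustrative sketch, not a method endorsed by this wiki, and it proves nothing on its own; the file names and the <code>quality</code> parameter are assumptions, and real forensic examination combines many such signals with expert review.

<syntaxhighlight lang="python">
# Illustrative sketch only: error level analysis (ELA) with Pillow.
# Regions edited after a JPEG's original save often recompress
# differently, so they show up as brighter patches in the scaled
# difference image.
import io
from PIL import Image, ImageChops

def error_level_analysis(path, quality=90):  # 'quality' is an assumed, tunable value
    original = Image.open(path).convert("RGB")
    buf = io.BytesIO()
    original.save(buf, "JPEG", quality=quality)  # recompress in memory
    buf.seek(0)
    recompressed = Image.open(buf).convert("RGB")
    diff = ImageChops.difference(original, recompressed)
    # Differences are usually faint; scale them up so they are visible.
    max_diff = max(high for _, high in diff.getextrema()) or 1
    return diff.point(lambda px: min(255, int(px * 255.0 / max_diff)))

# Hypothetical usage: bright patches in the output hint at local edits.
# error_level_analysis("suspect.jpg").save("suspect_ela.png")
</syntaxhighlight>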
= References =

<references/>