<center><big>Latest version of this article can be found in [https://stop-synthetic-filth.org/wiki/How_to_protect_yourself_and_others_from_covert_modeling '''How to protect yourself and others from covert modeling''' at the stop-synthetic-filth.org wiki]</big></center>
[[File:ESPER_LightCage.jpg|thumb|left|480px|'''Do not agree to''' and '''do not be fooled into''' '''[[Glossary#Reflectance capture|having your reflectance field captured]]''' on a '''[[Glossary#Light stage|light stage]]''', such as the ESPER LightCage in the picture.]]
[[File:Muslim woman in Yemen.jpg|thumb|right|300px|{{Q|I feel pretty confident that mister photograph man will not be selling much of my data to the '''[[Glossary#No camera|no camera]]''' scene.|Honestly made up quote|the protecting power of e.g. [[Glossary#Niqāb|niqāb]]}}]]
|access-date= 2019-10-13
|quote= In September [of 2018], Google added “involuntary synthetic pornographic imagery” to its ban list}}
-->
On 3 October 2019 '''California outlawed''', with bill [https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 AB-602], the use of [[w:human image synthesis]] technologies to make '''fake pornography without the consent''' of the people depicted. The law was authored by Assembly member [[w:Marc Berman]].<ref name="CNET2019">https://www.cnet.com/news/california-laws-seek-to-crack-down-on-deepfakes-in-politics-and-porn/</ref>
<!--
{{cite web
| last = Mihalcik
| first = Carrie
| title = California laws seek to crack down on deepfakes in politics and porn
| website = [[cnet.com]]
| publisher = [[CNET]]
| date = 2019-10-04
| url = https://www.cnet.com/news/california-laws-seek-to-crack-down-on-deepfakes-in-politics-and-porn/
| access-date = 2019-10-13 }}
</ref>
-->