


Digital sound-alikes: Difference between revisions

A '''digital sound-alike''' is a synthesized recording of a person's speech that human listeners cannot distinguish from a recording of that person's actual voice.
== Documented digital sound-alike attacks ==

As of '''2019''', Symantec research knows of three cases where digital sound-alike technology '''has been used for crimes'''.<ref name="WaPo2019">
{{cite web
|url= https://www.washingtonpost.com/technology/2019/09/04/an-artificial-intelligence-first-voice-mimicking-software-reportedly-used-major-theft/
|title= An artificial-intelligence first: Voice-mimicking software reportedly used in a major theft
|last= Harwell
|first= Drew
|date= 2019-09-04
|website= washingtonpost.com
|publisher= The Washington Post
|access-date= 2019-09-08
|quote= }}
</ref>


Living people can defend themselves against a digital sound-alike by denying the things it says, if the recordings are presented to them, but dead people cannot. Digital sound-alikes offer criminals new disinformation attack vectors and wreak havoc on provability.