But the authors argued that the technology already exists, and its capabilities are important to expose so that governments and companies can proactively consider privacy risks and the need for safeguards and regulations.
"It's certainly unsettling. Like any new tool, if it gets into the wrong hands, it can be used for ill purposes," said Nick Rule, an associate professor of psychology at the University of Toronto, who has published research on the science of gaydar. "If you can start profiling people based on their appearance, then identifying them and doing horrible things to them, that's really bad."