The news comes from two Stanford University professors, who point out that when translating Spanish-language news into English, Google's AI translation used to automatically render sentences describing women as "he said" or "he wrote."
“Flawed algorithms can amplify biases through feedback loops,” professors James Zou and Londa Schiebinger wrote in a paper titled “AI can be sexist and racist - it's time to make it fair.” “Each time a translation program defaults to ‘he said,’ it increases the relative frequency of the masculine pronoun on the web - potentially reversing hard-won advances toward equality.”
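The feedback loop the professors describe can be illustrated with a toy simulation (this is not code from their paper, just a hypothetical sketch): if a translator always emits the majority pronoun for ambiguous sentences and its output flows back into the web text it later learns from, an initially small skew compounds.

```python
# Toy model of a bias feedback loop (hypothetical, for illustration only).
# A translator picks whichever pronoun dominates its training corpus;
# its output is then mixed back into the corpus for the next round.
def retrain(masc_share: float, output_weight: float = 0.5) -> float:
    """One round: emit the majority pronoun for every ambiguous sentence,
    then blend that output back into the corpus."""
    emitted = 1.0 if masc_share >= 0.5 else 0.0  # always "he" once "he" dominates
    return (1 - output_weight) * masc_share + output_weight * emitted

share = 0.55  # corpus starts only slightly masculine-skewed
for _ in range(10):
    share = retrain(share)
print(round(share, 4))  # after 10 rounds the skew is nearly total
```

The exact numbers are arbitrary; the point is the direction of travel — each "he said" default makes the next default more likely, which is the dynamic the paper warns about.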
According to Google, the flaw, which “inadvertently replicated gender biases that already existed,” was learned from already-translated examples online.
After acknowledging and remedying the issue, the tech giant announced plans to extend gender-specific translations to more languages and to launch them on its other Translate surfaces, such as iOS and Android.