
Human bias in robo-recruitment systems

30 December 2017

Advances in artificial intelligence and the use of big data are changing the way many large companies recruit for entry-level and junior management positions. These days, graduates’ CVs may well have to impress an algorithm rather than an HR executive.

“There’s been a dramatic increase in the use of automation in [high] volume selection processes over the past two years,” says Sophie Meaney, managing director, client solutions and strategic development at Amberjack, which provides and advises on automated recruitment processes.

While algorithms supposedly treat each application equally, experts are divided about whether so-called robo-recruitment promises an end to human bias in the selection process — or whether it may in fact reinforce it.

“AI systems are not all equal,” says Loren Larsen, chief technology officer for HireVue, which has developed an automated video interview analysis system. It has been used by companies including Unilever, the consumer goods group, Vodafone, the telecoms company, and Urban Outfitters, the retailer. “I think you have to look [at] the science team behind the work,” says Mr Larsen.

The problem, experts say, is that to find the best candidates an algorithm has first to be told what “good” looks like in any given organisation. Even if it is not fed criteria that seem discriminatory, an efficient machine-learning system will quickly be able to replicate the characteristics of existing workers. If an organisation has favoured white male graduates from prestigious universities, the algorithm will learn to select more of the same.
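
To see why, consider a minimal sketch, ours rather than any vendor's actual system, in which a model is trained on synthetic historical hiring decisions that leaned on a "prestige university" flag. The learned weights reproduce that preference even though nobody programmed it in:

```python
# A minimal, illustrative sketch (not any real recruiter's system): a model
# trained on past hiring decisions learns to reproduce them. The synthetic
# data below is invented for demonstration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Hypothetical historical applicants: a "prestige university" flag and a
# genuine skill score. Suppose past recruiters leaned heavily on prestige.
prestige = rng.integers(0, 2, n)
skill = rng.normal(0, 1, n)
hired = (1.5 * prestige + 0.3 * skill + rng.normal(0, 1, n)) > 1.0

model = LogisticRegression().fit(np.column_stack([prestige, skill]), hired)

# The learned coefficients mirror the old bias: prestige dominates skill,
# so the "objective" algorithm keeps selecting more of the same.
print(dict(zip(["prestige", "skill"], model.coef_[0].round(2))))
```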

The growing reliance on automation to judge suitability for everything from a loan to a job, or even probation in the criminal justice system, worries Yuriy Brun, an associate professor specialising in software engineering at the University of Massachusetts.

“A lot of the time a company will put out software but they don’t know if it is discriminatory,” he says. He points to the Compas tool, used in several US states to help assess a person’s likelihood of reoffending, which has been reported to discriminate against African Americans.

Prof Brun explains that, given the use of big data, algorithms will inevitably learn to discriminate. “People see that this is a really important problem. There’s a real danger of making things worse than they already are,” he says. His concern led him to co-develop a tool that tests systems for signs of bias.
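
One widely used testing idea, sketched below with an invented stand-in model rather than Prof Brun's actual tool, is to hold every other input fixed, flip only a protected attribute, and count how often the decision changes:

```python
# A minimal sketch of counterfactual bias testing; the "model" being
# audited here is a hypothetical stand-in written to exhibit bias.
import random

def under_test(applicant: dict) -> bool:
    """Hypothetical black-box hiring model being audited."""
    score = applicant["skill"] + (0.8 if applicant["gender"] == "male" else 0.0)
    return score > 1.0

def causal_bias_rate(model, trials: int = 10_000) -> float:
    """Fraction of random applicants whose outcome flips when only the
    protected attribute changes."""
    flips = 0
    for _ in range(trials):
        applicant = {"skill": random.gauss(0, 1), "gender": "male"}
        counterfactual = {**applicant, "gender": "female"}
        flips += model(applicant) != model(counterfactual)
    return flips / trials

print(f"outcome flips on gender alone: {causal_bias_rate(under_test):.1%}")
```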

Many of those working with robo-recruiters are more optimistic. Kate Glazebrook, chief executive of Applied, a hiring platform, says her mission is to encourage hiring managers to move away from what she calls “proxies for quality” — indicators such as schools or universities — and move to more evidence-based methods.

“In general, the more you can make the hiring process relevant, the more likely that you will get the right person for the job,” she says.

Applied anonymises tests that candidates complete online and feeds them, question by question, to human assessors. Every stage of the process has been designed to strip out bias.
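
As a rough illustration of that design, with field names and records invented for the example rather than taken from Applied's actual pipeline, anonymising the applications and regrouping answers by question might look like this:

```python
# A minimal sketch of anonymise-then-chunk assessment; the records and
# field names are invented for illustration only.
import random
from collections import defaultdict

applications = [
    {"name": "A. Jones", "university": "X", "answers": ["answer 1a", "answer 1b"]},
    {"name": "B. Smith", "university": "Y", "answers": ["answer 2a", "answer 2b"]},
]

# 1. Strip identifying fields, keeping only an opaque candidate id.
anonymised = [
    {"cid": i, "answers": app["answers"]} for i, app in enumerate(applications)
]

# 2. Regroup by question and shuffle, so each assessor scores one question
#    across all candidates without seeing who wrote which answer.
by_question = defaultdict(list)
for app in anonymised:
    for q_index, answer in enumerate(app["answers"]):
        by_question[q_index].append((app["cid"], answer))
for batch in by_question.values():
    random.shuffle(batch)

for q_index, batch in sorted(by_question.items()):
    print(f"question {q_index}: {batch}")
```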

With the same aim, Unilever decided in 2016 to switch to a more automated process for its graduate-level entry programme, which has about 300,000 applicants a year for 800 positions.

Unilever worked with Amberjack, HireVue and Pymetrics, another high-volume recruitment company, which developed a game-based test in which candidates are scored on their ability to take risks and learn from mistakes, as well as on emotional intelligence.

Unilever says the process has increased the ethnic diversity of its shortlisted candidates and has been more successful at selecting candidates who will eventually be hired.

“The things that we can do right now are stunning, but not as stunning as we’re going to be able to do next year or the year after,” says Mr Larsen.

Still, robo-recruiters must be regularly tested in case bias has crept in, says Frida Polli, chief executive of Pymetrics. “The majority of algorithmic tools are most likely perpetuating bias. The good ones should have auditing.”
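
A basic form of such an audit, sketched here with illustrative numbers and the US “four-fifths rule” heuristic as its threshold, simply compares selection rates across groups at regular intervals:

```python
# A minimal sketch of a routine selection-rate audit; group labels,
# counts and the 0.8 threshold are illustrative only.
def selection_rates(outcomes):
    """outcomes: iterable of (group, was_selected) pairs."""
    totals, selected = {}, {}
    for group, ok in outcomes:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + ok
    return {g: selected[g] / totals[g] for g in totals}

def adverse_impact_ratio(outcomes):
    rates = selection_rates(outcomes)
    return min(rates.values()) / max(rates.values())

audit = ([("group_a", True)] * 40 + [("group_a", False)] * 60
         + [("group_b", True)] * 25 + [("group_b", False)] * 75)

ratio = adverse_impact_ratio(audit)
print(f"adverse impact ratio: {ratio:.2f} "
      f"({'flag for review' if ratio < 0.8 else 'within the 4/5 heuristic'})")
```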
