
How to fight invisible discrimination


29 November 2016

Six months ago, tech entrepreneur Rohan Gilkes tried to rent a cabin in Idaho over the July 4 weekend, using the website Airbnb. All seemed well, until the host told him her plans had changed: she needed to use the cabin herself. Then a friend of Rohan’s tried to book the same cabin on the same weekend, and his booking was immediately accepted. Rohan’s friend is white; Rohan is black.


This is not a one-off. Late last year, three researchers from Harvard Business School — Benjamin Edelman, Michael Luca and Dan Svirsky — published a working paper with experimental evidence of discrimination. Using fake profiles to request accommodation, the researchers found that applicants with distinctively African-American names were 16 per cent less likely to have their bookings accepted. Edelman and Luca have also published evidence that black hosts receive lower incomes than whites while letting out very similar properties on Airbnb. The hashtag #AirbnbWhileBlack has started to circulate.

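The "16 per cent less likely" figure is a relative gap between two acceptance rates, which is how audit studies of this kind are usually summarised. As a rough illustration of the core arithmetic, here is a minimal TypeScript sketch with invented tallies (chosen so the relative gap happens to come out at 16 per cent); the actual paper works from its own data and adds statistical controls, so this is the method in miniature, not a reproduction of the study.

```typescript
// Sketch of how an audit study summarises a booking-acceptance gap.
// The tallies below are invented for illustration; they are not the study's data.

interface Arm {
  requests: number; // booking requests sent from this group of profiles
  accepted: number; // requests that hosts accepted
}

const whiteNames: Arm = { requests: 1000, accepted: 500 }; // hypothetical
const blackNames: Arm = { requests: 1000, accepted: 420 }; // hypothetical

const rate = (arm: Arm): number => arm.accepted / arm.requests;

// Two-proportion z-statistic: is the gap larger than chance alone would explain?
function twoProportionZ(a: Arm, b: Arm): number {
  const pooled = (a.accepted + b.accepted) / (a.requests + b.requests);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / a.requests + 1 / b.requests));
  return (rate(a) - rate(b)) / se;
}

const relativeGap = 1 - rate(blackNames) / rate(whiteNames);
console.log(`absolute gap: ${((rate(whiteNames) - rate(blackNames)) * 100).toFixed(1)} points`);
console.log(`relative gap: ${(relativeGap * 100).toFixed(0)} per cent`); // 16 per cent
console.log(`z = ${twoProportionZ(whiteNames, blackNames).toFixed(2)}`);
```

With these invented numbers the gap is 8 percentage points in absolute terms but 16 per cent in relative terms, the form in which such findings are usually quoted.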

Can anything be done to prevent such discrimination? It’s not a straightforward problem. Airbnb condemns racial discrimination but, by making names and photographs such a prominent feature of its website, it makes discrimination, conscious or unconscious, very easy.


“It’s a cheap way to build trust,” says researcher Michael Luca. But, he adds, it “invites discrimination”.


Of course there’s plenty of discrimination to be found elsewhere. Other studies have used photographs of goods such as iPods and baseball cards being held in a person’s hand. On Craigslist and eBay, such goods sell for less if held in a black hand than a white one. An unpleasant finding — although in such cases it’s easy to use a photograph with no hand visible at all.


The Harvard Business School team have produced a browser plug-in called “Debias Yourself”. People who install the plug-in and then surf Airbnb will find that names and photographs have been hidden. It’s a nice idea, although one suspects that it will not be used by those who need it most. Airbnb could impose the system anyway but that is unlikely to prove tempting.

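Mechanically, a plug-in of this kind is just a content script that rewrites the page before the viewer reads it. The sketch below shows the general approach, assuming hypothetical CSS selectors rather than Airbnb's real markup; the actual "Debias Yourself" extension may well be built differently.

```typescript
// Minimal sketch of a content script that hides identity cues on a listings page.
// The selectors are hypothetical placeholders, not Airbnb's real markup.

const IDENTITY_SELECTORS = [
  "img.host-avatar", // hypothetical: host profile photo
  "span.host-name",  // hypothetical: host display name
];

function debias(root: ParentNode): void {
  for (const selector of IDENTITY_SELECTORS) {
    root.querySelectorAll<HTMLElement>(selector).forEach((el) => {
      if (el instanceof HTMLImageElement) {
        el.style.visibility = "hidden"; // blank the photo, keep the layout
      } else {
        el.textContent = "Host";        // replace the name with a neutral label
      }
    });
  }
}

// Listing pages render incrementally, so re-apply whenever new nodes arrive.
new MutationObserver(() => debias(document)).observe(document.body, {
  childList: true,
  subtree: true,
});

debias(document);
```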

However, says Luca, there are more subtle ways in which the platform could discourage discrimination. For example, it could make profile portraits less prominent, delaying the appearance of a portrait until further along in the process of making a booking. And it could nudge hosts into using an “instant book” system that accelerates and depersonalises the booking process. (The company recently released a report describing efforts to deal with the problem.)


But if the Airbnb situation has shone a spotlight on unconscious (and conscious) bias, there are even more important manifestations elsewhere in the economy. A classic study by economists Marianne Bertrand and Sendhil Mullainathan used fake CVs to apply for jobs. CVs that used distinctively African-American names were significantly less likely to lead to an interview than identical applications with names that could be perceived as white.


Perhaps the grimmest feature of the Bertrand/Mullainathan study was the discovery that well-qualified black applicants were treated no better than poorly qualified ones. As a young black student, then, one might ask: why bother studying when nobody will look past your skin colour? And so racism can create a self-reinforcing loop.


What to do?


One approach, as with “Debias Yourself”, is to remove irrelevant information: if a person’s skin colour or gender is irrelevant, then why reveal it to recruiters? The basic idea behind “Debias Yourself” was proven in a study by economists Cecilia Rouse and Claudia Goldin. Using a careful statistical design, Rouse and Goldin showed that when leading professional orchestras began to audition musicians behind a screen, the recruitment of women surged.


Importantly, blind auditions weren’t introduced to fight discrimination against women — orchestras didn’t think such discrimination was a pressing concern. Instead, they were a way of preventing recruiters from favouring the pupils of influential teachers. Yet a process designed to fight nepotism and favouritism ended up fighting sexism too.


 . . . 

A new start-up, “Applied”, is taking these insights into the broader job market. “Applied” is a spin-off from the UK Cabinet Office, the Behavioural Insights Team and Nesta, a charity that supports innovation; the idea is to use some simple technological fixes to combat a variety of biases.


A straightforward job application form is a breeding ground for discrimination and cognitive error. It starts with a name — giving clues to nationality, ethnicity and gender — and then presents a sequence of answers that are likely to be read as one big stew of facts. A single answer, good or bad, colours our perception of everything else, a tendency called the halo effect.


A recruiter using “Applied” will see “chunked” and “anonymised” details — answers to the application questions from different applicants, presented in a randomised order and without indications of race or gender. Meanwhile, other recruiters will see the same answers, but shuffled differently. As a result, says Kate Glazebrook of “Applied”, various biases simply won’t have a chance to emerge.

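As a sketch of what "chunked" and "anonymised" review might look like (the names and types here are illustrative assumptions, not Applied's implementation):

```typescript
// Sketch of chunked, anonymised, per-reviewer shuffled review.
// Types and names are illustrative assumptions, not Applied's real code.

interface Application {
  id: string;        // opaque id so scores can be joined up later
  name: string;      // collected, but never shown to reviewers
  answers: string[]; // one answer per application question
}

interface AnonymousAnswer {
  applicationId: string;
  text: string;
}

// Fisher-Yates shuffle; each call produces a fresh order, so each
// reviewer sees the same answers arranged differently.
function shuffle<T>(items: T[]): T[] {
  const out = [...items];
  for (let i = out.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [out[i], out[j]] = [out[j], out[i]];
  }
  return out;
}

// For one question, give a reviewer every applicant's answer to that
// question, stripped of identity and in an order unique to that reviewer.
function chunkForReviewer(apps: Application[], questionIndex: number): AnonymousAnswer[] {
  return shuffle(
    apps.map((app) => ({
      applicationId: app.id,
      text: app.answers[questionIndex],
    })),
  );
}
```

Scoring answers question by question, with identity stripped and a fresh order for every reviewer, denies the halo effect a foothold: no single strong or weak answer can colour the reading of the rest.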

When the Behavioural Insights Team ran its most recent recruitment round, applicants were rated using both the new process and a more traditional CV-based approach. The best of the shuffled, anonymised applications were more diverse, and much better predictors of which candidates would impress on the assessment day. Too early to declare victory — but a promising start.

 

