Facebook is once again in the spotlight over the way it moderates content on its platform, after a newspaper investigation revealed the guidance the company gives its staff on handling posts involving violence, hate speech, child abuse and pornography.
The training manuals, which were published by the Guardian on Monday, reveal how the social media group’s 4,500 global moderators judge when to remove or allow offensive content.
They show that posts threatening to kill Donald Trump, the US president, are banned because heads of state are considered “vulnerable”, while violent threats against other people can be permitted.
For example, the manuals guide moderators that it is permissible to allow users to post messages such as “to snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat”, on the grounds that the company does not consider them “credible” threats.
The manuals also show how the platform, which now has 1.94bn users, will allow the livestreaming of attempts at self-harm because Facebook “doesn’t want to censor or punish people in distress who are attempting suicide”.
Facebook said safety organisations had advised the social media group that leaving posts of this nature online allowed people to seek help.
The revelations come as the group is under mounting pressure from politicians and campaigners who argue that it should take more responsibility for the content that appears on its website.
Last month, Facebook removed a video of a man in Thailand who murdered his baby daughter before killing himself. Two days later, another man in the US livestreamed his suicide by gunshot.
The guidelines only come into play after users flag potentially offensive content to Facebook. The company says it already has automated systems in place to stop the publication of certain types of material such as child sex abuse and terrorism.
Facebook confirmed the authenticity of the manuals, some of which were reproduced by the Guardian, but added that some of the material was out of date.
The company said the manuals were drawn up by a group of “highly trained people” with regular advice and input from external groups such as safety campaigners and non-governmental organisations. It added that the material was under constant review, with the group holding weekly meetings.
Monika Bickert, Facebook’s head of global policy management, admitted that there were “grey areas” in policing content on its website.
“For instance the line between satire and humour and inappropriate content is sometimes very grey,” Ms Bickert told the Guardian. “It’s very difficult to decide whether some things belong on the site or not.”
In the UK, politicians have been stepping up the pressure on the social network. Last week, the Conservative party manifesto set out plans to reform the way technology and social media groups operate.
“We want social media companies to do more to help redress the balance and will take action to make sure they do,” said Theresa May, prime minister.
A report by the Culture, Media and Sport select committee said earlier this month that “the biggest and richest social media companies are shamefully far from taking sufficient action to tackle illegal or dangerous content”.
Charlie Beckett, director of Polis, a media think-tank based at the London School of Economics, said he was reassured that the leaks suggested Facebook was taking the issue of content on its platform “seriously”.
But he added: “Facebook is making taste decisions. What we may find offensive in the UK is likely to be very different to what people in Saudi Arabia find offensive. I’m concerned that freedom of expression campaigners will say even if it’s offensive, why not have it out there?”