It is hard these days to keep up with the scandals, large and small, involving social networks such as Facebook and Twitter. From inadvertently helping Russia to undermine elections to finding themselves exploited by extremists and pornographers, the networks lurch from one trouble to the next.
The latest is YouTube failing to stop videos of children being commented on by paedophiles, while letting advertisements appear alongside them. Only months after Alphabet’s video platform faced an advertiser boycott over extremist videos and had to apologise humbly, companies such as Diageo and Mars are again removing ads.
Each scandal produces fresh calls for networks to be treated like publishers of news, who are responsible for everything that appears under their names. Each one forces them to tighten their “community standards” further and to hire more content checkers. By next year, Facebook intends to employ 20,000 people in “community operations”, its censorship division.
Tempting as it is for publications that have lost much of their digital advertising to internet giants to believe the two should be treated as exact equivalents, the idea is flawed: Facebook is not just a newspaper with 2.1bn readers. But being a platform does not absolve the networks of responsibility. The opposite, in fact: it makes their burden heavier.
A better way to think of Russian political ads, extremist videos, fake news and all the rest is as pollutants of common resources, albeit ones that are privately owned. The term for this is the tragedy of the commons. Ecosystems that are openly shared by entire communities tend to get despoiled.
Garrett Hardin, the US ecologist and philosopher who coined the phrase in 1968, warned that “the inherent logic of the commons remorselessly generates tragedy”, adding gloomily: “Ruin is the destination toward which all men rush, each pursuing his own best interest in a society that believes in the freedom of the commons.”
His prime example was the overgrazing of common land when too many farmers and shepherds sought to use it as a source of free feed for their animals. He also cited companies polluting the environment with sewage, chemical and other waste rather than cleaning up their own mess. Rational self-interest led to the commons becoming barren or dirty.
Here lies the threat to social networks. They set themselves up as commons, offering hundreds of millions of people open access to publish “user-generated content” and share photos with others. That in turn produced a network effect: people needed to use Facebook or its peers to communicate.
But they attract bad actors as well: people and organisations who exploit free resources for money or out of perverted motives. These are the polluters of the digital commons, and with them come over-grazers: people guilty of lesser sins, such as shouting loudly to gain attention or attacking others.
As Hardin noted, this is inevitable. The digital commons fosters communal benefits that go beyond anything a publisher in the traditional sense can offer. The fact that YouTube is open and free allows all kinds of creativity to flourish in ways the entertainment industry does not enable. The tragedy is that it also empowers pornographers and propagandists for terror.
So when Mark Zuckerberg, Facebook’s founder, denounced Russia’s fake news factory — “What they did is wrong and we’re not going to stand for it” — he sounded like the police chief in Casablanca who professes to be shocked that gambling is going on in a casino. Mr Zuckerberg’s mission of “bringing us all together as a global community” is laudable but it invites trouble.
Hardin was a pessimist about commons, arguing that there was no technical solution and that the only remedy was “mutual coercion, mutually agreed upon by the majority”. The equivalent for Facebook, Twitter and YouTube would be to become much more like publishers, imposing tight rules about entry and behaviour rather than their current openness.
They resist this partly because it would bring stricter legal liability and partly because they want to remain as commons. But every time a scandal occurs, they have to reinforce their editorial defences and come closer to the kind of content monitoring that would change their nature.
It would cross the dividing line if they reviewed everything before allowing it to be published, rather than removing offensive material when alerted. Defying Hardin, they aspire to a technical solution: using artificial intelligence to identify copyright infringements and worse before their users or other organisations flag them for review.
More than 75 per cent of extremist videos taken down by YouTube are identified by algorithms, while Facebook now automatically finds 99 per cent of the Isis and al-Qaeda material it removes. It is like having an automated fence around a territory to sort exploiters from legitimate entrants.
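The approach described here, screening content with models before users or advertisers flag it, can be illustrated with a short sketch. The Python below is a hypothetical illustration rather than any platform's actual pipeline: the classifier, the thresholds and the three routing outcomes are assumptions made for the example, standing in for the machine-learning systems the column alludes to.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable


class Decision(Enum):
    REMOVE = "remove"          # high-confidence violation: blocked before publication
    HUMAN_REVIEW = "review"    # uncertain: queued for a human content checker
    PUBLISH = "publish"        # low risk: published, but still open to user flagging


@dataclass
class Upload:
    upload_id: str
    content: bytes


def triage(upload: Upload,
           classifier: Callable[[bytes], float],
           remove_threshold: float = 0.95,   # illustrative cut-offs, not real values
           review_threshold: float = 0.60) -> Decision:
    """Route an upload according to a model's estimated probability of violation."""
    risk = classifier(upload.content)
    if risk >= remove_threshold:
        return Decision.REMOVE
    if risk >= review_threshold:
        return Decision.HUMAN_REVIEW
    return Decision.PUBLISH


if __name__ == "__main__":
    # Stand-in classifier: a real system would score video, audio and text
    # signals with trained models; here the "risk" is faked from payload size.
    def fake_classifier(content: bytes) -> float:
        return min(len(content) / 100.0, 1.0)

    for payload in (b"x" * 99, b"x" * 70, b"x" * 10):
        print(triage(Upload(upload_id="demo", content=payload), fake_classifier))
```

The dividing line the column describes is simply where such a check runs: before publication, as in this sketch, or only after a user or advertiser has flagged the material.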
Machines cannot solve everything, though. If they could exclude all miscreants, the commons would turn into something else. The vision of an unfettered community is alluring but utopias are always vulnerable.
[email protected]