15 Months of Fresh Hell Inside Facebook


The streets of Davos, Switzerland, were iced over on the night of January 25, 2018, which added a slight element of danger to the prospect of trekking to the Hotel Seehof for George Soros' annual banquet. The aged financier has a tradition of hosting a dinner at the World Economic Forum, where he regales tycoons, ministers, and journalists with his thoughts on the state of the world. That night he began by warning, in his quiet, shaking Hungarian accent, about nuclear war and climate change. Then he turned to his next idea of a global menace: Google and Facebook. "Mining and oil companies exploit the physical environment; social media companies exploit the social environment," he said. "The owners of the platform giants consider themselves the masters of the universe, but in fact they are slaves to preserving their dominant position … Davos is a good place to announce that their days are numbered."

Across town, a group of senior Facebook executives, including COO Sheryl Sandberg and vice president of global communications Elliot Schrage, had set up a temporary headquarters near the base of the mountain where Thomas Mann put his fictional sanatorium. The world's biggest companies often establish receiving rooms at the world's most important annual gathering of the elite, but this year Facebook's pavilion wasn't the usual airy showroom. It was more like a bunker, one that saw a succession of tense meetings with the same tycoons, ministers, and journalists who had nodded along to Soros' broadside.

Over the previous year, Facebook's stock had gone up as usual, but its reputation was rapidly sinking toward junk bond status. The world had learned how Russian intelligence operatives used the platform to manipulate US voters. Genocidal monks in Myanmar and a despot in the Philippines had taken a liking to it. The company's mid-level employees were getting both crankier and more emboldened, and critics everywhere were arguing that Facebook's tools fostered tribalism and outrage. That argument gained credence with every utterance of Donald Trump, who had arrived in Davos that morning, the outrageous tribalist skunk at the globalists' garden party.

CEO Mark Zuckerberg had recently pledged to spend 2018 trying to fix Facebook. But even the company's nascent attempts at reform were being viewed as a possible declaration of war on democratic institutions. Earlier that month, Facebook had unveiled a major change to its News Feed rankings to favor what the company called "meaningful social interactions." News Feed is the core of Facebook: the central stream of baby pictures, news stories, New Age koans, and Russian-made memes showing Satan endorsing Hillary Clinton. The changes would favor interactions between friends, which meant they would disfavor stories published by media companies. The company promised, though, that the blow would be softened somewhat for local news and for publications that scored high on a user-driven metric of "trustworthiness."

Davos gave many media executives their first chance to confront Facebook's leaders about these changes. And so, one after another, testy publishers and editors trudged across Davos to Facebook's headquarters throughout the week, ice cleats strapped to their boots, seeking clarity. Facebook had become a capricious, godlike force in the lives of news organizations; it fed them about a third of their referral traffic while devouring an ever-larger share of the advertising revenue the media industry depends on. And now this. Why? Why would a company plagued by fake news turn its back on real news? What would Facebook's algorithm deem trustworthy? Would media executives even get to see their own scores?

Facebook didn't have ready answers to all of these questions; certainly not ones it wanted to give. The last one in particular, about the trustworthiness scores, quickly inspired a heated debate between the company's executives at Davos and their colleagues in Menlo Park. Some leaders, including Schrage, wanted to tell publishers their scores. It was only fair. Also in agreement was Campbell Brown, the company's chief liaison with news publishers, whose job description includes absorbing some of the impact when Facebook and the news industry crash into each other.

But the engineers and product managers back in California said it was folly. Adam Mosseri, then the head of News Feed, argued in emails that publishers would game the system if they knew their scores. Besides, publishers were too unsophisticated to understand the methodology, and the scores would constantly change anyway. Worst of all, the company didn't yet have a reliable measure of trustworthiness.

Heated emails flew back and forth between Switzerland and Menlo Park. Solutions were proposed and shot down. It was a classic Facebook dilemma. The company's algorithms make choices so complex and interdependent that it's hard for anyone to fully grasp them. If you explain some of what is happening, people get confused. They also tend to obsess over tiny factors in huge equations. So in this case, as in so many others over the years, Facebook chose opacity. Nothing would be revealed in Davos, and nothing would be revealed afterward. The media execs would walk away unsatisfied.

After Soros' speech that Thursday night, those editors and publishers returned to their hotels, where many wrote, edited, or at least read the flood of news about the billionaire's tirade. The phrase "their days are numbered" appeared in article after article. The next day, Sandberg sent Schrage an email asking if he knew whether Soros had shorted Facebook's stock.

Far from Davos, meanwhile, Facebook's product engineers got down to the precise, algorithmic business of implementing Zuckerberg's vision. If you want to promote trustworthy news for billions of people, you first have to specify what is trustworthy and what is news. Facebook was having a hard time with both. To define trustworthiness, the company was testing how people responded to surveys about their impressions of different publishers. To define news, the engineers pulled a classification system left over from a previous project, one that defined the category as stories involving "politics, crime, or tragedy."

That particular choice, which meant the algorithm would be less kind to all kinds of other news, from health and science to technology and sports, wasn't something Facebook executives discussed with media leaders in Davos. And though it went through reviews with senior managers, not everyone at the company knew about it either. When one Facebook executive learned about it recently in a briefing with a lower-level engineer, they said they "nearly fell on the fucking floor."

The confusing rollout of meaningful social interactions, marked by internal dissent, fierce external criticism, genuine efforts at reform, and foolish mistakes, set the stage for Facebook's 2018. This is the story of that annus horribilis, based on interviews with 65 current and former employees. It is ultimately a story about the biggest shifts ever to take place inside the world's largest social network. But it is also about a company trapped by its own pathologies and, perversely, by the inexorable logic of its own recipe for success.

Facebook's powerful network effects have kept advertisers from fleeing, and overall user numbers remain healthy if you count the people on Instagram, which Facebook owns. But the company's original culture and mission kept creating a series of brutal debts that came due with regularity over the past 16 months. The company floundered, dissembled, and apologized. Even when it told the truth, people didn't believe it. Critics appeared on all sides, demanding changes that ranged from the essential to the contradictory to the impossible. As the crises multiplied and diverged, even the company's own solutions began to cannibalize one another.
The most crucial episode in this story, the crisis that cut deepest, began not long after Davos, when reporters from The New York Times, The Guardian, and Britain's Channel 4 News came calling. They had learned some troubling things about a shady British firm called Cambridge Analytica, and they had some questions.

II。

It was, in some ways, an old story. Back in 2014, a young academic at Cambridge University named Aleksandr Kogan built a personality questionnaire app called thisisyourdigitallife. A few hundred thousand people signed up, giving Kogan access not only to their Facebook data but also, because of Facebook's loose privacy policies at the time, to the data of up to 87 million people in their combined friend networks. Rather than simply using all of that data for research purposes, which he had permission to do, Kogan passed it on to Cambridge Analytica, a strategic consulting firm that talked a big game about its ability to model and manipulate human behavior for political clients. In December 2015, The Guardian reported that Cambridge Analytica had used the data to help Ted Cruz's presidential campaign, at which point Facebook demanded the data be deleted.

This much Facebook knew in the early months of 2018. The company also knew, because everyone knew, that Cambridge Analytica had gone on to work with the Trump campaign after Ted Cruz dropped out of the race. Some people at Facebook worried that the story of their company's relationship with Cambridge Analytica was not over. One former Facebook communications official remembers being warned by a manager in the summer of 2017 that unresolved elements of the Cambridge Analytica story remained a grave vulnerability. No one at Facebook, however, knew exactly when or where the unexploded ordnance would go off. "The company doesn't know yet what it doesn't know," the manager said. (The manager now denies saying so.)

The company first heard in late February that The Guardian had a story coming, but the department in charge of formulating a response was a house divided. In the fall, Facebook had hired a brilliant but fiery veteran of tech-industry PR named Rachel Whetstone. She'd come over from Uber to run communications for Facebook's WhatsApp, Instagram, and Messenger. Soon she was traveling with Zuckerberg to public events, joining Sandberg's senior management meetings, and making decisions, like which outside public relations firms to cut or retain, that normally rested with those officially in charge of Facebook's 300-person communications shop. The staff quickly sorted into fans and haters.

And so a confused and fractious communications team huddled with management to debate how to respond to The Guardian's reporters. The standard approach would have been to correct misinformation or errors and spin the company's side of the story. Facebook ultimately chose another tack. It would front-run the press: dump a pile of information into public view on the eve of the story's publication, hoping to upstage it. It's a tactic with a short-term benefit but a long-term cost. Investigative journalists are like pit bulls. Kick them once and they'll never trust you again.

Facebook's decision to take that risk, according to multiple people involved, was a close call. But on the night of Friday, March 16, the company announced it was suspending Cambridge Analytica from its platform. It was a fateful choice. "It's why The Times hates us," one senior executive says. Another communications official says, "For the last year, I've had to talk to reporters worried that we were going to front-run them. It's the worst. Whatever the calculus, it wasn't worth it."

The tactic also didn't work. The next day the story, supercharged by a pink-haired, charismatic whistle-blower named Christopher Wylie, exploded in Europe and the United States. Wylie, a former Cambridge Analytica employee, claimed that the company had not deleted the data it had taken from Facebook and that it may have used the data to swing the American presidential election. The first sentence of The Guardian's report blared that this was "one of the tech giant's biggest ever data breaches" and that Cambridge Analytica had used the data "to build a powerful software program to predict and influence choices at the ballot box."

The story was a witch's brew of Russian operatives, privacy violations, murky data, and Donald Trump. It touched on nearly every anxiety of the moment. Politicians demanded regulation; users called for boycotts. In a single day, Facebook lost $36 billion in market capitalization. Because many of its employees were compensated based on the stock's performance, the drop did not go unnoticed in Menlo Park.

To this emotional story, Facebook had a programmer's rational response. Nearly every fact in The Guardian's opening paragraph was misleading, its leaders believed. The company hadn't been breached; an academic had fairly downloaded data with permission and then unfairly handed it off. And the software Cambridge Analytica built was not powerful, nor could it predict or influence choices at the ballot box.

But none of that mattered. When a Facebook executive named Alex Stamos tried to argue on Twitter that the word breach was being misused, he was swatted down. He soon deleted his tweets. His position was right, but who cares? If someone points a gun at you and holds up a sign that says hand's up, you shouldn't worry about the apostrophe. The story was the first of many to illustrate one of the central ironies of Facebook's struggles: The company's algorithms helped sustain a news ecosystem that prioritizes outrage, and that news ecosystem was learning to direct outrage at Facebook.

As the story spread, the company started to melt down. Former employees remember scenes of chaos, with exhausted executives slipping in and out of Zuckerberg's private conference room, known as the Aquarium, and Sandberg's conference room, whose name, Only Good News, seemed increasingly incongruous. One employee remembers cans and snack wrappers everywhere; the door to the Aquarium would swing open and you could see people with their heads in their hands and feel the warmth of all the body heat. After saying too much before the story ran, the company said too little afterward. Senior managers begged Sandberg and Zuckerberg to confront the issue publicly. Both stayed silent.

"We had hundreds of reporters flooding our inboxes, and we had nothing to tell them," says a member of the communications staff at the time. "I remember walking to one of the cafeterias and overhearing other Facebookers say, 'Why aren't we saying anything? Why is nothing happening?'"

According to numerous people who were involved, many factors contributed to Facebook's baffling decision to stay mute for five days. Executives didn't want a repeat of Zuckerberg's ignominious performance after the 2016 election, when, mostly off the cuff, he had proclaimed it "a pretty crazy idea" to think fake news had affected the result. And they continued to believe that people would realize Cambridge Analytica's data was useless. According to one executive: "You can just buy all this fucking stuff, all this data, from third-party ad networks all over the world. There are ways to get far more privacy-violating data from all those data brokers than you could by stealing it from Facebook."

"Those five days were very, very long," says Sandberg, who now acknowledges the delay was a mistake. The company became paralyzed, she says, because it didn't know all the facts; it thought Cambridge Analytica had deleted the data. And it didn't have a specific problem to fix. The loose privacy policies that had allowed Kogan to collect so much data had been tightened years before. "We didn't know how to respond in a system of imperfect information," she says.

Facebook's other problem was that it didn't understand the wealth of antipathy that had built up against it over the previous two years. Its prime decisionmakers had run the same playbook successfully for a decade and a half: do what they thought was best for the platform's growth (often at the expense of user privacy), apologize if someone complained, and keep pushing forward. Or, as the old slogan went: Move fast and break things. Now the public believed Facebook had broken Western democracy. This privacy violation, unlike so many others before it, was not one that people would simply get over.

Finally, on Wednesday, the company decided Zuckerberg should give a television interview. After snubbing CBS and PBS, the company summoned a CNN reporter whom the communications staff trusted to be reasonable. The network's camera crews were treated like potential spies, and one communications officer remembers being required to monitor them even when they went to the bathroom. (Facebook now says this was not company protocol.) In the interview, Zuckerberg apologized. But he was also specific: There would be audits and far stricter rules for anyone wanting access to Facebook data. Facebook would build a tool to let users know whether their data had ended up with Cambridge Analytica. And he pledged that Facebook would make sure a debacle like this never happened again.

A flurry of other interviews followed. That Wednesday, WIRED got a quiet heads-up that we'd be able to chat with Zuckerberg in the late afternoon. At about 4:45 pm, his head of communications called to say he would be on the phone at 5. In that interview, Zuckerberg apologized again. But he brightened when he turned to one of the topics that, according to people close to him, truly engages his imagination: using artificial intelligence to keep humans from polluting Facebook. This was less a response to the Cambridge Analytica scandal than to the backlog of accusations, accumulating since 2016, that Facebook had become a cesspool of toxic virality, but it was a problem he actually relished figuring out how to solve. He didn't think AI could entirely eliminate hate speech or nudity or spam, but it could get close. "My understanding with food safety is there's a certain amount of dust that can get into the chicken as it's going through the processing, and it's not a large amount—it needs to be a very small amount," he told WIRED.

The interviews were just the warmup for Zuckerberg's next challenge: a set of public, televised appearances in April before three congressional committees to answer questions about Cambridge Analytica and a string of other scandals. Members of Congress had been calling on him to testify for about a year, and he had successfully avoided them. Now it was game time, and much of Facebook was terrified about how it would go.

As it turned out, most of the lawmakers proved astonishingly uninformed, and the CEO spent most of the day ably swatting away softballs. Back home, some Facebook employees stood at their cubicles and cheered. When a plodding Senator Orrin Hatch asked how, exactly, Facebook made money while offering its service for free, Zuckerberg responded confidently, "Senator, we run ads," a line that was soon emblazoned on T-shirts in Menlo Park.

Adam Maida

III。

The Saturday after the Cambridge Analytica scandal broke, Sandberg told Molly Cutler, a top lawyer at Facebook, to create a crisis response team. Make sure we never have a delayed response to a major issue like that again, Sandberg said. She put Cutler's new desk next to her own, to guarantee that Cutler would have the clout to get division heads to work with her. "I started the role that Monday," Cutler says. "I never went back to my old desk. After a couple of weeks, someone on the legal team messaged me and said, 'You want us to pack up your things? It seems like you aren't coming back.'"

Then Sandberg and Zuckerberg began making a huge show of hiring humans to keep watch over the platform. Soon you couldn't sit through a briefing or meet an executive without being told about the tens of thousands of content moderators who had joined the company. By the end of 2018, about 30,000 people were working on safety and security, roughly the number of newsroom employees at all the newspapers in the United States. Of those, about 15,000 were content reviewers, mostly contractors, employed at more than 20 giant review factories around the world.

Facebook was also working hard to write clear rules for enforcing its basic policies, effectively drafting a constitution for the platform's 1.5 billion daily users. The instructions for moderating hate speech alone run to more than 200 pages. Moderators must complete 80 hours of training before they can start. Among other things, they must be fluent in emoji; they study, for example, a document showing that a crown, roses, and dollar signs may mean a pimp is offering up prostitutes. About 100 people across the company meet every Tuesday to review the policies. A similar group meets every Friday to review content-policy enforcement screwups, like the time in early July when the company flagged the Declaration of Independence as hate speech.

The company hired all of these people in large part because of pressure from its critics. It was also the company's fate that those same critics would discover that moderating content on Facebook can be a miserable, soul-scorching job. As Casey Newton reported in an investigation for The Verge, the average content moderator at a Facebook contractor's outpost in Arizona makes $28,000 per year, and many of them say the work has given them PTSD-like symptoms. Others have spent so much time looking at conspiracy theories that they've become believers themselves.

Ultimately, Facebook knows the job will have to be done primarily by machines, which is the company's preference anyway. Machines can browse porn all day without burning out, and they haven't yet learned to unionize. So the company has simultaneously been making an enormous effort, led by CTO Mike Schroepfer, to create AI systems that can identify, at scale, the content Facebook wants to zap from its platform: spam, nudity, hate speech, ISIS propaganda, videos of children being put in washing machines. An even trickier goal is to identify the stuff Facebook wants to demote but not eliminate, like misleading clickbait garbage. Over the past several years, the core AI team at Facebook has doubled in size annually.

Even a basic machine-learning system can fairly reliably identify and block pornography or images of graphic violence. Hate speech is much harder. A sentence can be hateful or prideful depending on who says it. "You ain't my bitch, then bitch you're done" could be a death threat, an inspiration, or a Cardi B lyric. Now imagine trying to decode a similarly complex line in Spanish, Mandarin, or Burmese. False news is equally tricky. Facebook doesn't want lies or bull on the platform. But it knows that truth can be a kaleidoscope. Well-meaning people get things wrong on the internet; malevolent actors sometimes get things right.

Schroepfer's job is to get Facebook's AI up to snuff at catching this extremely ambiguous content. The tools and the success rates vary by category. But the basic technique is roughly the same: You need a corpus of data that has already been classified, and then you train the machines on it. For spam and nudity those databases already exist, created by hand in more innocent days, when the threats online were fake Viagra and goatse memes, not Vladimir Putin and Nazis. For the other categories, you need to construct the labeled datasets yourself, ideally without hiring an army of humans to do it.
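The labeled-corpus-then-train loop described above can be sketched in miniature. The toy dataset and the word-counting Naive Bayes classifier below are illustrative stand-ins, not Facebook's actual systems, which operate at vastly larger scale and with far more sophisticated models:

```python
# Minimal sketch of "classify a corpus by hand, then train a machine on it."
# Hypothetical labels and examples; a real system would use millions of posts.
from collections import Counter
import math

def tokenize(text):
    return text.lower().split()

def train(labeled_posts):
    """labeled_posts: list of (text, label) pairs, e.g. ("buy pills now", "spam")."""
    word_counts = {}          # label -> Counter of word frequencies
    label_counts = Counter()  # label -> number of training posts
    for text, label in labeled_posts:
        label_counts[label] += 1
        word_counts.setdefault(label, Counter()).update(tokenize(text))
    return word_counts, label_counts

def classify(model, text):
    """Pick the label with the highest log prior + log likelihood (add-one smoothing)."""
    word_counts, label_counts = model
    total = sum(label_counts.values())
    vocab = {w for counts in word_counts.values() for w in counts}
    best_label, best_score = None, -math.inf
    for label in label_counts:
        score = math.log(label_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in tokenize(text):
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# The hand-labeled corpus: the part created "by hand in more innocent days."
corpus = [
    ("cheap pills buy now", "spam"),
    ("click here free money", "spam"),
    ("lunch with grandma today", "ok"),
    ("photos from our hiking trip", "ok"),
]
model = train(corpus)
print(classify(model, "buy cheap pills"))    # classified as spam
print(classify(model, "trip photos today"))  # classified as ok
```

The hard part, as the article notes, is not this training loop but assembling the labeled corpus in the first place for categories like hate speech, where no hand-built database exists.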

One idea Schroepfer discussed enthusiastically with WIRED involves starting with just a few examples of content that humans have identified as hate speech and then using AI to generate similar content and label it at the same time. Like a scientist bioengineering both rodents and rat terriers, this approach would use software to both create and identify ever-more-complex slurs, insults, and racist bile. Eventually the terriers, specially trained on these super-rats, could be set loose across all of Facebook.

The company's work on screening content with AI began in earnest about three years ago. Facebook quickly found success classifying spam and posts supporting terrorism. Now more than 99 percent of content created in those categories is identified before any human on the platform flags it. Sex, as in the rest of human life, is more complicated; the success rate for identifying nudity is 96 percent. Hate speech is tougher still: Facebook finds just 52 percent of it before users do.

These are the kinds of problems Facebook executives love to talk about. They involve math and logic, and the people who work at the company are among the most logical you'll ever meet. But Cambridge Analytica was mostly a privacy scandal, while Facebook's most visible response to it was to ramp up content moderation aimed at keeping the platform safe and civil. Sometimes, though, the two big values involved, privacy and civility, come into opposition. If you give people ways to keep their data completely secret, you also create secret tunnels where rats can scurry around undetected.

In other words, every choice involves a trade-off, and every trade-off means some value has been spurned. And every value you spurn, particularly when you're Facebook in 2018, means a hammer is going to come down on your head.

IV。

Crises offer opportunities. They force you to make some changes, but they also provide cover for the changes you've long wanted to make. Four weeks after Zuckerberg's testimony before Congress, the company initiated the biggest reshuffle in its history. About a dozen executives shifted chairs. Most significantly, Chris Cox, longtime head of Facebook's core product, known internally as the Blue App, would now also oversee WhatsApp and Instagram. Cox was perhaps Zuckerberg's closest and most trusted confidant, and the move looked like succession planning. Adam Mosseri moved over to run product at Instagram.

Instagram, founded in 2010 by Kevin Systrom and Mike Krieger, was acquired by Facebook in 2012 for $1 billion. The price seemed ludicrously high at the time: That much money for a company with 13 employees? Soon it would seem ludicrously low: just a billion dollars for the fastest-growing social network in the world? Internally, Facebook at first watched Instagram's relentless growth with pride. But, according to some, pride turned to suspicion as the pupil's success matched and then surpassed the professor's.

Systrom's glowing press coverage didn't help. In 2014, according to someone directly involved, Zuckerberg ordered that no other executives sit for magazine profiles without Sandberg's approval. Some remember this as a move to make it harder for rivals to identify employees to poach; others remember it as a direct effort to contain Systrom. Facebook's executives also believed that Instagram's growth was cannibalizing the Blue App. In 2017, Cox's team showed senior executives data suggesting that people were sharing less inside the Blue App partly because of Instagram. To some, this sounded like they were simply presenting a problem to solve. Others were stunned and took it as a sign that management at Facebook cared more about the product it had birthed than the one it had adopted.

When the Cambridge Analytica scandal hit, Instagram founders Kevin Systrom and Mike Krieger were already worried that Zuckerberg was souring on them.

To most people at Instagram, and to some at Facebook as well, the idea that the photo-sharing app's growth could be seen as trouble was absurd. Yes, people were using the Blue App less and Instagram more. But that didn't mean Instagram was poaching users. Maybe people leaving the Blue App would have spent their time on Snapchat, or watching Netflix, or mowing their lawns. And if Instagram was growing fast, maybe it was because the product was good? Instagram had its problems—bullying, shaming, FOMO, propaganda, corrupt micro-influencers—but its internal architecture had helped it avoid some of the demons that plagued the industry. Posts are hard to reshare, which slows virality. External links are harder to embed, which keeps out the fake-news providers. Minimalist design also minimized problems. For years, Systrom and Krieger took pride in keeping Instagram free of hamburgers: the icons made of three horizontal lines that open a menu in the corner of a screen. Facebook has hamburgers, and other menus, all over the place.

Systrom and Krieger also seemed to have anticipated the techlash before their colleagues in Menlo Park did. Even before Trump's election, Instagram had made fighting toxic comments its top priority, and it launched an AI filtering system in June 2017. By the spring of 2018, the company was working on a product to alert users that "you're all caught up" when they'd seen all the new posts in their feed. In other words: put your damn phone down and talk to your friends. It might be a counterintuitive way to grow, but goodwill does pay off in the long run. Sacrificing growth for other goals, though, wasn't the Facebook way.

According to people familiar with their thinking, by the time the Cambridge Analytica scandal hit, Systrom and Krieger were already worried that Zuckerberg was souring on them. They had been allowed to run their company reasonably independently for six years, but now Zuckerberg was exerting more control and making more requests. When conversations about the reorganization began, Instagram's founders pushed for Mosseri. They liked him, and they viewed him as the most trustworthy member of Zuckerberg's inner circle. He had a design background and a mathematical mind. They were losing autonomy, so they might as well get the most trusted emissary from the mothership. Or, as Lyndon Johnson said of J. Edgar Hoover, "It's probably better to have him inside the tent pissing out than outside the tent pissing in."

Meanwhile, the founders of WhatsApp, Brian Acton and Jan Koum, had already left Facebook's tent and begun pissing in. Zuckerberg had acquired the encrypted messaging platform in 2014 for $19 billion, but the two cultures never fully meshed. The two sides couldn't agree on how to make money—WhatsApp's end-to-end encryption wasn't originally designed to support targeted ads—and they had other differences as well. WhatsApp insisted on having its own conference rooms, and, in the perfect metaphor for the two companies' diverging attitudes over privacy, WhatsApp employees had special bathroom stalls designed with doors that went down to the floor, unlike the standard ones used by the rest of Facebook.

Eventually the battles became too much for Acton and Koum, who had also come to believe that Facebook no longer intended to leave them alone. Acton quit and started funding a competing messaging platform called Signal. During the Cambridge Analytica scandal, he tweeted, “It is time. #deletefacebook.” Soon afterward, Koum, who held a seat on Facebook’s board, announced that he too was quitting, to play more Ultimate Frisbee and work on his collection of air-cooled Porsches.

The departure of the WhatsApp founders created a brief spasm of bad press. But now Acton and Koum were gone, Mosseri was in place, and Cox was running all three messaging platforms. And that meant Facebook could truly pursue its most ambitious and important idea of 2018: bringing all those platforms together into something new.

V.

By the late spring, news organizations—even as they jockeyed for scoops about the latest meltdown in Menlo Park—were starting to buckle under the pain caused by Facebook’s algorithmic changes. Back in May of 2017, according to Parse.ly, Facebook drove about 40 percent of all outside traffic to news publishers. A year later it was down to 25 percent. Publishers that weren’t in the category “politics, crime, or tragedy” were hit much harder.

At WIRED, the month after an image of a bruised Zuckerberg appeared on the cover, the numbers were even more stark. One day, traffic from Facebook suddenly dropped by 90 percent, and for four weeks it stayed there. After protestations, emails, and a raised eyebrow or two about the coincidence, Facebook finally got to the bottom of it. An ad run by a liquor advertiser, targeted at WIRED readers, had been mistakenly categorized as engagement bait by the platform. In response, the algorithm had let all the air out of WIRED’s tires. The publication could post whatever it wanted, but few would read it. Once the error was identified, traffic soared back. It was a reminder that journalists are just sharecroppers on Facebook’s giant farm. And sometimes conditions on the farm can change without warning.

Inside Facebook, of course, it was not surprising that traffic to publishers went down after the pivot to “meaningful social interactions.” That outcome was the point. It meant people would be spending more time on posts created by their friends and family, the genuinely unique content that Facebook offers. According to multiple Facebook employees, a handful of executives considered it a small plus, too, that the news industry was feeling a little pain after all its negative coverage. The company denies this—“no one at Facebook is rooting against the news industry,” says Anne Kornblut, the company’s director of news partnerships—but, in any case, by early May the pain seemed to have become perhaps excessive. A number of stories appeared in the press about the damage done by the algorithmic changes. And so Sheryl Sandberg, who colleagues say often responds with agitation to negative news stories, sent an email on May 7 calling a meeting of her top lieutenants.

That kicked off a wide-ranging conversation that ensued over the next two months. The key question was whether the company should introduce new factors into its algorithm to help serious publications. The product team working on news wanted Facebook to increase the amount of public content—things shared by news organizations, businesses, celebrities—allowed in News Feed. They also wanted the company to provide stronger boosts to publishers deemed trustworthy, and they suggested the company hire a large team of human curators to elevate the highest-quality news inside of News Feed. The company discussed setting up a new section on the app entirely for news and directed a team to quietly work on developing it; one of the team’s ambitions was to try to build a competitor to Apple News.

Some of the company’s most senior execs, notably Chris Cox, agreed that Facebook needed to give serious publishers a leg up. Others pushed back, especially Joel Kaplan, a former deputy chief of staff to George W. Bush who was now Facebook’s vice president of global public policy. Supporting high-quality outlets would inevitably make it look like the platform was supporting liberals, which could lead to trouble in Washington, a town run mainly by conservatives. Breitbart and the Daily Caller, Kaplan argued, deserved protections too. At the end of the climactic meeting, on July 9, Zuckerberg sided with Kaplan and announced that he was tabling the decision about adding ways to boost publishers, effectively killing the plan. To one person involved in the meeting, it seemed like a sign of shifting power. Cox had lost and Kaplan had won. Either way, Facebook’s overall traffic to news organizations continued to plummet.

VI.

That same evening, Donald Trump announced that he had a new pick for the Supreme Court: Brett Kavanaugh. As the choice was announced, Joel Kaplan stood in the background at the White House, smiling. Kaplan and Kavanaugh had become friends in the Bush White House, and their families had become intertwined. They had taken part in each other’s weddings; their wives were best friends; their kids rode bikes together. No one at Facebook seemed to really notice or care, and a tweet pointing out Kaplan’s attendance was retweeted a mere 13 times.

Meanwhile, the dynamics inside the communications department had gotten even worse. Elliot Schrage had announced that he was going to leave his post as VP of global communications. So the company had begun looking for his replacement; it focused on interviewing candidates from the political world, including Denis McDonough and Lisa Monaco, former senior officials in the Obama administration. But Rachel Whetstone also declared that she wanted the job. At least two other executives said they would quit if she got it.

The need for leadership in communications only became more apparent on July 11, when John Hegeman, the new head of News Feed, was asked in an interview why the company didn’t ban Alex Jones’ InfoWars from the platform. The honest answer would probably have been to just admit that Facebook gives a rather wide berth to the far right because it’s so worried about being called liberal. Hegeman, though, went with the following: “We created Facebook to be a place where different people can have a voice. And different publishers have very different points of view.”

This, predictably, didn’t go over well with the segments of the news media that actually try to tell the truth and that have never, as Alex Jones has done, reported that the children massacred at Sandy Hook were actors. Public fury ensued. Most of Facebook didn’t want to respond. But Whetstone decided it was worth a try. She took to the @facebook account—which one executive involved in the decision called “a big fucking marshmallow we shouldn’t ever use like this”—and started tweeting at the company’s critics.

“Sorry you feel that way,” she typed to one, and explained that, instead of banning pages that peddle false information, Facebook demotes them. The tweet was very quickly ratioed, a Twitter term of art for a statement that no one likes and that receives more comments than retweets. Whetstone, as @facebook, also declared that just as many pages on the left pump out misinformation as on the right. That tweet got badly ratioed too.

Five days later, Zuckerberg sat down for an interview with Kara Swisher, the influential editor of Recode. Whetstone was in charge of prep. Before Zuckerberg headed to the microphone, Whetstone supplied him with a list of rough talking points, including one that inexplicably violated the first rule of American civic discourse: Don’t invoke the Holocaust while trying to make a nuanced point.

About 20 minutes into the interview, while ambling through his answer to a question about Alex Jones, Zuckerberg declared, “I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down, because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong.” Sometimes, Zuckerberg added, he himself makes errors in public statements.

The comment was absurd: People who deny that the Holocaust happened generally aren’t just slipping up in the midst of a good-faith intellectual disagreement. They’re spreading anti-Semitic hate—intentionally. Soon the company announced that it had taken a closer look at Jones’ activity on the platform and had finally chosen to ban him. His past sins, Facebook decided, had crossed into the domain of standards violations.

Eventually another candidate for the top PR job was brought into the headquarters in Menlo Park: Nick Clegg, former deputy prime minister of the UK. Perhaps in an effort to disguise himself—or perhaps because he had decided to go aggressively Silicon Valley casual—he showed up in jeans, sneakers, and an untucked shirt. His interviews must have gone better than his disguise, though, as he was hired over the luminaries from Washington. “What makes him incredibly well qualified,” said Caryn Marooney, the company’s VP of communications, “is that he helped run a country.”

Adam Maida

VII.

At the end of July, Facebook was scheduled to report its quarterly earnings in a call to investors. The numbers were not going to be good; Facebook’s user base had grown more slowly than ever, and revenue growth was taking a huge hit from the company’s investments in hardening the platform against abuse. But in advance of the call, the company’s leaders were nursing an additional concern: how to put Insta­gram in its place. According to someone who saw the relevant communications, Zuckerberg and his closest lieutenants were debating via email whether to say, essentially, that Insta­gram owed its spectacular growth not primarily to its founders and vision but to its relationship with Facebook.

Zuckerberg wanted to include a line to this effect in his script for the call. Whetstone counseled him not to, or at least to temper it with praise for Insta­gram’s founding team. In the end, Zuckerberg’s script declared, “We believe Insta­gram has been able to use Facebook’s infrastructure to grow more than twice as quickly as it would have on its own. A big congratulations to the Insta­gram team—and to all the teams across our company that have contributed to this success.”

After the call—with its payload of bad news about growth and investment—Facebook’s stock dropped by nearly 20 percent. But Zuckerberg didn’t forget about Insta­gram. A few days later he asked his head of growth, Javier Olivan, to draw up a list of all the ways Facebook supported Insta­gram: running ads for it on the Blue App; including link-backs when someone posted a photo on Insta­gram and then cross-published it in Facebook News Feed; allowing Insta­gram to access a new user’s Facebook connections in order to recommend people to follow. Once he had the list, Zuckerberg conveyed to Insta­gram’s leaders that he was pulling away the supports. Facebook had given Insta­gram servers, health insurance, and the best engineers in the world. Now Insta­gram was just being asked to give a little back—and to help seal off the vents that were allowing people to leak away from the Blue App.

Systrom soon posted a memo to his entire staff explaining Zuckerberg’s decision to turn off supports for traffic to Insta­gram. He disagreed with the move, but he was committed to the changes and was telling his staff that they had to go along. The memo “was like a flame going up inside the company,” a former senior manager says. The document also enraged Facebook, which was terrified it would leak. Systrom soon departed on paternity leave.

The tensions didn’t let up. In the middle of August, Facebook prototyped a location-­tracking service inside of Insta­gram, the kind of privacy intrusion that Insta­gram’s management team had long resisted. In August, a hamburger menu appeared. “It felt very personal,” says a senior Insta­gram employee who spent the month implementing the changes. It felt particularly wrong, the employee says, because Facebook is a data-driven company, and the data strongly suggested that Insta­gram’s growth was good for everyone.

The Instagram founders' unhappiness with Facebook stemmed from tensions that had brewed over many years and had boiled over in the past six months.

Friends of Systrom and Krieger say the strife was wearing on the founders too. According to someone who heard the conversation, Systrom openly wondered whether Zuckerberg was treating him the way Donald Trump was treating Jeff Sessions: making life miserable in hopes that he’d quit without having to be fired. Insta­gram’s managers also believed that Facebook was being miserly about their budget. In past years they had been able to almost double their number of engineers. In the summer of 2018 they were told that their growth rate would drop to less than half of that.

When it was time for Systrom to return from paternity leave, the two founders decided to make the leave permanent. They made the decision quickly, but it was far from impulsive. According to someone familiar with their thinking, their unhappiness with Facebook stemmed from tensions that had brewed over many years and had boiled over in the past six months.

And so, on a Monday morning, Systrom and Krieger went into Chris Cox's office and told him the news. Systrom and Krieger then notified their team about the decision. Somehow the information reached Mike Isaac, a reporter at The New York Times, before it reached the communications teams for either Facebook or Insta­gram. The story appeared online a few hours later, as Insta­gram's head of communications was on a flight circling above New York City.

After the announcement, Systrom and Krieger decided to play nice. Soon there was a lovely photograph of the two founders smiling next to Mosseri, the obvious choice to replace them. And then they headed off into the unknown to take time off, decompress, and figure out what comes next. Systrom and Krieger told friends they both wanted to get back into coding after so many years away from it. If you need a new job, it’s good to learn how to code.

VIII.

Just a few days after Systrom and Krieger quit, Joel Kaplan roared into the news. His dear friend Brett Kavanaugh was now not just a conservative appellate judge with Federalist Society views on Roe v. Wade; he had become an alleged sexual assailant, purported gang rapist, and national symbol of toxic masculinity to somewhere between 49 and 51 percent of the country. As the charges multiplied, Kaplan’s wife, Laura Cox Kaplan, became one of the most prominent women defending him: She appeared on Fox News and asked, “What does it mean for men in the future? It’s very serious and very troubling.” She also spoke at an #IStandWithBrett press conference that was live­streamed on Breitbart.

On September 27, Kavanaugh appeared before the Senate Judiciary Committee after four hours of wrenching recollections by his primary accuser, Christine Blasey Ford. Laura Cox Kaplan sat right behind him as the hearing descended into rage and recrimination. Joel Kaplan sat one row back, stoic and thoughtful, directly in view of the cameras broadcasting the scene to the world.

Kaplan isn’t widely known outside of Facebook. But he’s not anonymous, and he wasn’t wearing a fake mustache. As Kavanaugh testified, journalists started tweeting a screenshot of the tableau. At a meeting in Menlo Park, executives passed around a phone showing one of these tweets and stared, mouths agape. None of them knew Kaplan was going to be there. The man who was supposed to smooth over Facebook’s political dramas had inserted the company right into the middle of one.

Kaplan had long been friends with Sandberg; they’d even dated as undergraduates at Harvard. But despite rumors to the contrary, he had told neither her nor Zuckerberg that he would be at the hearing, much less that he would be sitting in the gallery of supporters behind the star witness. “He’s too smart to do that,” one executive who works with him says. “That way, Joel gets to go. Facebook gets to remind people that it employs Republicans. Sheryl gets to be shocked. And Mark gets to denounce it.”

If that was the plan, it worked to perfection. Soon Facebook’s internal message boards were lighting up with employees mortified at what Kaplan had done. Management’s initial response was limp and lame: A communications officer told the staff that Kaplan attended the hearing as part of a planned day off in his personal capacity. That wasn’t a good move. Someone visited the human resources portal and noted that he hadn’t filed to take the day off.

What Facebook Fears

In some ways, the world’s largest social network is stronger than ever, with record revenue of $55.8 billion in 2018. But Facebook has also never been more threatened. Here are some dangers that could knock it down.

US Antitrust Regulation
In March, Democratic presidential candidate Elizabeth Warren proposed severing Instagram and WhatsApp from Facebook, joining the growing chorus of people who want to chop the company down to size. Even US attorney general William Barr has hinted at probing tech’s “huge behemoths.” But for now, antitrust talk remains talk—much of it posted to Facebook.

Federal Privacy Crackdowns
Facebook and the Federal Trade Commission are negotiating a settlement over whether the company's conduct, including with Cambridge Analytica, violated a 2011 consent decree regarding user privacy. According to The New York Times, federal prosecutors have also begun a criminal investigation into Facebook's data-sharing deals with other technology companies.

European Regulators
While America debates whether to take aim at Facebook, Europe swings axes. In 2018, the EU’s General Data Protection Regulation forced Facebook to allow users to access and delete more of their data. Then this February, Germany ordered the company to stop harvesting web-browsing data without users’ consent, effectively outlawing much of the company’s ad business.

User Exodus
Although a fifth of the globe uses Facebook every day, the number of adult users in the US has largely stagnated. The decline is even more precipitous among teenagers. (Granted, many of them are switching to Instagram.) But network effects are powerful things: People swarmed to Facebook because everyone else was there; they might also swarm for the exits.

The hearings were on a Thursday. A week and a day later, Facebook called an all-hands to discuss what had happened. The giant cafeteria in Facebook’s headquarters was cleared to create space for a town hall. Hundreds of chairs were arranged with three aisles to accommodate people with questions and comments. Most of them were from women who came forward to recount their own experiences of sexual assault, harassment, and abuse.

Zuckerberg, Sandberg, and other members of management were standing on the right side of the stage, facing the audience and the moderator. Whenever a question was asked of one of them, they would stand up and take the mic. Kaplan appeared via video conference looking, according to one viewer, like a hostage trying to smile while his captors stood just offscreen. Another participant described him as “looking like someone had just shot his dog in the face.” This participant added, “I don’t think there was a single male participant, except for Zuckerberg looking down and sad onstage and Kaplan looking dumbfounded on the screen.”

Employees who watched expressed different emotions. Some felt empowered and moved by the voices of women in a company where top management is overwhelmingly male. Another said, “My eyes rolled to the back of my head” watching people make specific personnel demands of Zuckerberg, including that Kaplan undergo sensitivity training. For much of the staff, it was cathartic. Facebook was finally reckoning, in a way, with the #MeToo movement and the profound bias toward men in Silicon Valley. For others it all seemed ludicrous, narcissistic, and emblematic of the liberal, politically correct bubble that the company occupies. A guy had sat in silence to support his best friend who had been nominated to the Supreme Court; as a consequence, he needed to be publicly flogged?

In the days after the hearings, Facebook organized small group discussions, led by managers, in which 10 or so people got together to discuss the issue. There were tears, grievances, emotions, debate. “It was a really bizarre confluence of a lot of issues that were popped in the zit that was the SCOTUS hearing,” one participant says. Kaplan, though, seemed to have moved on. The day after his appearance on the conference call, he hosted a party to celebrate Kavanaugh’s lifetime appointment. Some colleagues were aghast. According to one who had taken his side during the town hall, this was a step too far. That was “just spiking the football,” they said. Sandberg was more forgiving. “It’s his house,” she told WIRED. “That is a very different decision than sitting at a public hearing.”

In a year during which Facebook made endless errors, Kaplan’s insertion of the company into a political maelstrom seemed like one of the clumsiest. But in retrospect, Facebook executives aren’t sure that Kaplan did lasting harm. His blunder opened up a series of useful conversations in a workplace that had long focused more on coding than inclusion. Also, according to another executive, the episode and the press that followed surely helped appease the company’s would-be regulators. It’s useful to remind the Republicans who run most of Washington that Facebook isn’t staffed entirely by snowflakes and libs.

IX.

That summer and early fall weren’t kind to the team at Facebook charged with managing the company’s relationship with the news industry. At least two product managers on the team quit, telling colleagues they had done so because of the company’s cavalier attitude toward the media. In August, a jet-lagged Campbell Brown gave a presentation to publishers in Australia in which she declared that they could either work together to create new digital business models or not. If they didn’t, well, she’d be unfortunately holding hands with their dying business, like in a hospice. Her off-the-record comments were put on the record by The Australian, a publication owned by Rupert Murdoch, a canny and persistent antagonist of Facebook.

In September, however, the news team managed to convince Zuckerberg to start administering ice water to the parched executives of the news industry. That month, Tom Alison, one of the team’s leaders, circulated a document to most of Facebook’s senior managers; it began by proclaiming that, on news, “we lack clear strategy and alignment.”

Then, at a meeting of the company’s leaders, Alison made a series of recommendations, including that Facebook should expand its definition of news—and its algorithmic boosts—beyond just the category of “politics, crime, or tragedy.” Stories about politics were bound to do well in the Trump era, no matter how Facebook tweaked its algorithm. But the company could tell that the changes it had introduced at the beginning of the year hadn’t had the intended effect of slowing the political venom pulsing through the platform. In fact, by giving a slight tailwind to politics, tragedy, and crime, Facebook had helped build a news ecosystem that resembled the front pages of a tempestuous tabloid. Or, for that matter, the front page of FoxNews.com. That fall, Fox was netting more engagement on Facebook than any other English-language publisher; its list of most-shared stories was a goulash of politics, crime, and tragedy. (The network’s three most-shared posts that month were an article alleging that China was burning bibles, another about a Bill Clinton rape accuser, and a third that featured Laura Cox Kaplan and #IStandWithBrett.)


Politics, Crime, or Tragedy?

In early 2018, Facebook’s algorithm started demoting posts shared by businesses and publishers. But because of an obscure choice by Facebook engineers, stories involving “politics, crime, or tragedy” were shielded somewhat from the blow—which had a big effect on the news ecosystem inside the social network.

Source: Parse.ly

That September meeting was a moment when Facebook decided to start paying indulgences to make up for some of its sins against journalism. It decided to put hundreds of millions of dollars toward supporting local news, the sector of the industry most disrupted by Silicon Valley; Brown would lead the effort, which would involve helping to find sustainable new business models for journalism. Alison proposed that the company move ahead with the plan hatched in June to create an entirely new section on the Facebook app for news. And, crucially, the company committed to developing new classifiers that would expand the definition of news beyond “politics, crime, or tragedy.”

Zuckerberg didn’t sign off on everything all at once. But people left the room feeling like he had subscribed. Facebook had spent much of the year holding the media industry upside down by the feet. Now Facebook was setting it down and handing it a wad of cash.

As Facebook veered from crisis to crisis, something else was starting to happen: The tools the company had built were beginning to work. The three biggest initiatives for the year had been integrating WhatsApp, Instagram, and the Blue App into a more seamless entity; eliminating toxic content; and refocusing News Feed on meaningful social interactions. The company was making progress on all fronts. The apps were becoming a family, partly through divorce and arranged marriage but a family nonetheless. Toxic content was indeed disappearing from the platform. In September, economists at Stanford and New York University revealed research estimating that user interactions with fake news on the platform had declined by 65 percent from their peak in December 2016 to the summer of 2018. On Twitter, meanwhile, the number had climbed.

There wasn’t much time, however, for anyone to absorb the good news. Right after the Kavanaugh hearings, the company announced that, for the first time, it had been badly breached. In an Ocean’s 11–style heist, hackers had figured out an ingenious way to take control of user accounts through a quirk in a feature that makes it easier for people to play Happy Birthday videos for their friends. The breach was both serious and absurd, and it pointed to a deep problem with Facebook. By adding so many features to boost engagement, it had created vectors for intrusion. One virtue of simple products is that they are simpler to defend.

X.

Given the sheer number of people who accused Facebook of breaking democracy in 2016, the company approached the November 2018 US midterm elections with trepidation. It worried that the tools of the platform made it easier for candidates to suppress votes than get them out. And it knew that Russian operatives were studying AI as closely as the engineers on Mike Schroepfer’s team.

So in preparation for Brazil’s October 28 presidential election and the US midterms nine days later, the company created what it called “election war rooms”—a term despised by at least some of the actual combat veterans at the company. The rooms were partly a media prop, but still, three dozen people worked nearly around the clock inside of them to minimize false news and other integrity issues across the platform. Ultimately the elections passed with little incident, perhaps because Facebook did a good job, perhaps because a US Cyber Command operation temporarily knocked Russia’s primary troll farm offline.

Facebook got a boost of good press from the effort, but the company in 2018 was like a football team that follows every hard-fought victory with a butt fumble and a 30-point loss. In mid-November, The New York Times published an impressively reported stem-winder about trouble at the company. The most damning revelation was that Facebook had hired an opposition research firm called Definers to investigate, among other things, whether George Soros was funding groups critical of the company. Definers was also directly connected to a dubious news operation whose stories were often picked up by Breitbart.

After the story broke, Zuckerberg plausibly declared that he knew nothing about Definers. Sandberg, less plausibly, did the same. Numerous people inside the company were convinced that she entirely understood what Definers did, though she strongly maintains that she did not. Meanwhile, Schrage, who had announced his resignation but never actually left, decided to take the fall. He declared that the Definers project was his fault; it was his communications department that had hired the firm, he said. But several Facebook employees who spoke with WIRED believe that Schrage’s assumption of responsibility was just a way to gain favor with Sandberg.

Inside Facebook, people were furious at Sandberg, believing she had asked them to dissemble on her behalf with her Definers denials. Sandberg, like everyone, is human. She’s brilliant, inspirational, and more organized than Marie Kondo. Once, on a cross-country plane ride back from a conference, a former Facebook executive watched her quietly spend five hours sending thank-you notes to everyone she’d met at the event—while everyone else was chatting and drinking. But Sandberg also has a temper, an ego, and a detailed memory for subordinates she thinks have made mistakes. For years, no one had a negative word to say about her. She was a highly successful feminist icon, the best-selling author of Lean In, running operations at one of the most powerful companies in the world. And she had done so under immense personal strain since her husband died in 2015.

But resentment had been building for years, and after the Definers mess the dam collapsed. She was pummeled in the Times, in The Washington Post, on Breitbart, and in WIRED. Former employees who had refrained from criticizing her in interviews conducted with WIRED in 2017 relayed anecdotes about her intimidation tactics and penchant for retribution in 2018. She was slammed after a speech in Munich. She even got dinged by Michelle Obama, who told a sold-out crowd at the Barclays Center in Brooklyn on December 1, “It’s not always enough to lean in, because that shit doesn’t work all the time.”

Everywhere, in fact, it was becoming harder to be a Facebook employee. Attrition increased from 2017, though Facebook says it was still below the industry norm, and people stopped broadcasting their place of employment. The company’s head of cybersecurity policy was swatted in his Palo Alto home. “When I joined Facebook in 2016, my mom was so proud of me, and I could walk around with my Facebook backpack all over the world and people would stop and say, ‘It’s so cool that you worked for Facebook.’ That’s not the case anymore,” a former product manager says. “It made it hard to go home for Thanksgiving.”

XI.

By the holidays in 2018, Facebook was beginning to seem like Monty Python’s Black Knight: hacked down to a torso hopping on one leg but still filled with confidence. The Alex Jones, Holocaust, Kaplan, hack, and Definers scandals had all happened in four months. The heads of WhatsApp and Instagram had quit. The stock price was at its lowest level in nearly two years. In the middle of that, Facebook chose to launch a video chat service called Portal. Reviewers thought it was great, except for the fact that Facebook had designed it, which made them fear it was essentially a spycam for people’s houses. Even internal tests at Facebook had shown that people responded to a description of the product better when they didn’t know who had made it.

Two weeks later, the Black Knight lost his other leg. A British member of parliament named Damian Collins had obtained hundreds of pages of internal Facebook emails from 2012 through 2015. Ironically, his committee had gotten them from a sleazy company that helped people search for photos of Facebook users in bikinis. But one of Facebook’s superpowers in 2018 was the ability to turn any critic, no matter how absurd, into a media hero. And so, without much warning, Collins released them to the world.


The emails, many of them between Zuckerberg and top executives, lent a brutally concrete validation to the idea that Facebook promoted growth at the expense of almost any other value. In one message from 2015, an employee acknowledged that collecting the call logs of Android users is a “pretty high-risk thing to do from a PR perspective.” He said he could imagine the news stories about Facebook invading people’s private lives “in ever more terrifying ways.” But, he added, “it appears that the growth team will charge ahead and do it.” (It did.)

Perhaps the most telling email is a message from a then executive named Sam Lessin to Zuckerberg that epitomizes Facebook’s penchant for self-justification. The company, Lessin wrote, could be ruthless and committed to social good at the same time, because they are essentially the same thing: “Our mission is to make the world more open and connected and the only way we can do that is with the best people and the best infrastructure, which requires that we make a lot of money / be very profitable.”

The message also highlighted another of the company’s original sins: its assertion that if you just give people better tools for sharing, the world will be a better place. That’s just false. Sometimes Facebook makes the world more open and connected; sometimes it makes it more closed and disaffected. Despots and demagogues have proven to be just as adept at using Facebook as democrats and dreamers. Like the communications innovations before it—the printing press, the telephone, the internet itself—Facebook is a revolutionary tool. But human nature has stayed the same.

XII.

Perhaps the oddest single day in Facebook’s recent history came on January 30, 2019. A story had just appeared on TechCrunch reporting yet another apparent sin against privacy: For two years, Facebook had been conducting market research with an app that paid you in return for sucking private data from your phone. Facebook could read your social media posts, your emoji sexts, and your browser history. Your soul, or at least whatever part of it you put into your phone, was worth up to $20 a month.

Other big tech companies do research of this sort as well. But the program sounded creepy, particularly with the revelation that people as young as 13 could join with a parent’s permission. Worse, Facebook seemed to have deployed the app while wearing a ski mask and gloves to hide its fingerprints. Apple had banned such research apps from its main App Store, but Facebook had fashioned a workaround: Apple allows companies to develop their own in-house iPhone apps for use solely by employees—for booking conference rooms, testing beta versions of products, and the like. Facebook used one of these internal apps to disseminate its market research tool to the public.

Apple cares a lot about privacy, and it cares that you know it cares about privacy. It also likes to ensure that people honor its rules. So shortly after the story was published, Apple responded by shutting down all of Facebook’s in-house iPhone apps. By the middle of that Wednesday afternoon, parts of Facebook’s campus stopped functioning. Applications that enabled employees to book meetings, see cafeteria menus, and catch the right shuttle bus flickered out. Employees around the world suddenly couldn’t communicate via messenger with each other on their phones. The mood internally shifted between outraged and amused—with employees joking that they had missed their meetings because of Tim Cook. Facebook’s cavalier approach to privacy had now poltergeisted itself on the company’s own lunch menus.

But then something else happened. A few hours after Facebook’s engineers wandered back from their mystery meals, Facebook held an earnings call. Profits, after a months-long slump, had hit a new record. The number of daily users in Canada and the US, after stagnating for three quarters, had risen slightly. The stock surged, and suddenly all seemed well in the world. Inside a conference room called Relativity, Zuckerberg smiled and told research analysts about all the company’s success. At the same table sat Caryn Marooney, the company’s head of communications. “It felt like the old Mark,” she said. “This sense of ‘We’re going to fix a lot of things and build a lot of things.’ ” Employees couldn’t get their shuttle bus schedules, but within 24 hours the company was worth about $50 billion more than it had been worth the day before.

Less than a week after the boffo earnings call, the company gathered for another all-hands. The heads of security and ads spoke about their work and the pride they take in it. Nick Clegg told everyone that they had to start seeing themselves the way the world sees them, not the way they would like to be perceived. It seemed to observers as though management actually had its act together after a long time of looking like a man in lead boots trying to cross a lightly frozen lake. “It was a combination of realistic and optimistic that we hadn’t gotten right in two years,” one executive says.

Soon it was back to bedlam, though. Shortly after the all-hands, a parliamentary committee in the UK published a report calling the company a bunch of “digital gangsters.” A German regulatory authority cracked down on a significant portion of the company’s ad business. And news broke that the FTC in Washington was negotiating with the company and reportedly considering a multibillion-dollar fine due in part to Cambridge Analytica. Later, Democratic presidential hopeful Elizabeth Warren published a proposal to break Facebook apart. She promoted her idea with ads on Facebook, using a modified version of the company’s logo—an act specifically banned by Facebook’s terms of service. Naturally, the company spotted the violation and took the ads down. Warren quickly denounced the move as censorship, even as Facebook restored the ads.

It was the perfect Facebook moment for a new year. By enforcing its own rules, the company had created an outrage cycle about Facebook—inside of a larger outrage cycle about Facebook.

XIII.

This January, George Soros gave another speech on a freezing night in Davos. This time he described a different menace to the world: China. The most populous country on earth, he said, is building AI systems that could become tools for totalitarian control. “For open societies,” he said, “they pose a mortal threat.” He described the world as in the midst of a cold war. Afterward, one of the authors of this article asked him which side Facebook and Google are on. “Facebook and the others are on the side of their own profits,” the financier answered.

The response epitomized one of the most common critiques of the company now: Everything it does is based on its own interests and enrichment. The massive efforts at reform are cynical and deceptive. Yes, the company’s privacy settings are much clearer now than a year ago, and certain advertisers can no longer target users based on their age, gender, or race, but those changes were made at gunpoint. The company’s AI filters help, sure, but they exist to placate advertisers who don’t want their detergent ads next to jihadist videos. The company says it has abandoned “Move fast and break things” as its motto, but the guest Wi-Fi password at headquarters remains “M0vefast.” Sandberg and Zuckerberg continue to apologize, but the apologies seem practiced and insincere.

At a deeper level, critics note that Facebook continues to pay for its original sin of ignoring privacy and fixating on growth. And then there’s the existential question of whether the company’s business model is even compatible with its stated mission: The idea of Facebook is to bring people together, but the business model only works by slicing and dicing users into small groups for the sake of ad targeting. Is it possible to have those two things work simultaneously?

To its credit, though, Facebook has addressed some of its deepest issues. For years, smart critics have bemoaned the perverse incentives created by Facebook’s annual bonus program, which pays people in large part based on the company hitting growth targets. In February, that policy was changed. Everyone is now given bonuses based on how well the company achieves its goals on a metric of social good.

Another deep critique is that Facebook simply sped up the flow of information to a point where society couldn’t handle it. Now the company has started to slow it down. The company’s fake-news fighters focus on information that’s going viral. WhatsApp has been reengineered to limit the number of people with whom any message can be shared. And internally, according to several employees, people communicate better than they did a year ago. The world might not be getting more open and connected, but at least Facebook’s internal operations are.


In early March, Zuckerberg announced that Facebook would, from then on, follow an entirely different philosophy. He published a 3,200-word treatise explaining that the company that had spent more than a decade playing fast and loose with privacy would now prioritize it. Messages would be encrypted end to end. Servers would not be located in authoritarian countries. And much of this would happen with a further integration of Facebook, WhatsApp, and Instagram. Rather than WhatsApp becoming more like Facebook, it sounded like Facebook was going to become more like WhatsApp. When asked by WIRED how hard it would be to reorganize the company around the new vision, Zuckerberg said, “You have no idea how hard it is.”

Just how hard it was became clear the next week. As Facebook knows well, every choice involves a trade-off, and every trade-off involves a cost. The decision to prioritize encryption and interoperability meant, in some ways, a decision to deprioritize safety and civility. According to people involved in the decision, Chris Cox, long Zuckerberg’s most trusted lieutenant, disagreed with the direction. The company was finally figuring out how to combat hate speech and false news; it was breaking bread with the media after years of hostility. Now Facebook was setting itself up to both solve and create all kinds of new problems. And so in the middle of March, Cox announced that he was leaving. A few hours after the news broke, a shooter in New Zealand livestreamed on Facebook his murderous attack on a mosque.

Sandberg says that much of her job these days involves harm prevention; she’s also overseeing the various audits and investigations of the company’s missteps. “It’s going to take real time to go backwards,” she told WIRED, “and figure out everything that could have happened.”

Zuckerberg, meanwhile, remains obsessed with moving forward. In a note to his followers to start the year, he said one of his goals was to host a series of conversations about technology: “I’m going to put myself out there more.” The first such event, a conversation with the internet law scholar Jonathan Zittrain, took place at Harvard Law School in late winter. Near the end of their exchange, Zittrain asked Zuckerberg what Facebook might look like 10 or so years from now. The CEO mused about developing a device that would allow humans to type by thinking. It sounded incredibly cool at first. But by the time he was done, it sounded like he was describing a tool that would allow Facebook to read people’s minds. Zittrain cut in dryly: “The Fifth Amendment implications are staggering.” Zuckerberg suddenly appeared to understand that perhaps mind-reading technology is the last thing the CEO of Facebook should be talking about right now. “Presumably this would be something someone would choose to use,” he said, before adding, “I don’t know how we got onto this.”


Nicholas Thompson (@nxthompson) is WIRED’s editor in chief. Fred Vogelstein (@fvogelstein) is a contributing editor at the magazine.

This article appears in the May issue.

Let us know what you think about this article. Submit a letter to the editor at mail@wired.com.


