Across town, a huddle of Facebook's top executives, including COO Sheryl Sandberg and VP of global communications Elliot Schrage, had set up a temporary headquarters near the base of the mountain where Thomas Mann put his fictional sanatorium. The world's biggest companies often establish receiving rooms at the world's biggest elite confab, but this year Facebook's pavilion was not the usual airy showcase. It was more like a bunker, one that hosted a succession of tense meetings with the same tycoons, ministers, and journalists who had nodded along to Soros's broadside.
CEO Mark Zuckerberg had recently pledged to spend 2018 trying to fix Facebook. But even the company's nascent attempts at reform were being cast as a possible declaration of war on democratic institutions. Earlier that month, Facebook had unveiled a major change to its News Feed rankings to favor what the company called "meaningful social interactions." News Feed is the core of Facebook, the central stream through which flow baby pictures, press reports, New Age koans, and Russian-made memes showing Satan endorsing Hillary Clinton. The changes would favor interactions between friends, which meant they would disfavor stories published by media companies. The company promised, though, that the blow would be softened somewhat for local news and for publications that scored high on a user-driven metric of "trustworthiness."
Davos gave many media executives their first chance to confront Facebook's leaders about these changes. And so, one after another, testy publishers and editors trudged down Davos Platz to Facebook's headquarters all week, ice cleats strapped to their boots, seeking clarity. Facebook had become a capricious, godlike force in the lives of news organizations; it fed them about a third of their referral traffic while devouring an ever-greater share of the advertising revenue the media industry depends on. And now this. Why? Why would a company beset by fake news deal a blow to real news? What would Facebook's algorithm deem trustworthy? Would media executives even get to see their own scores?
Facebook didn't have ready answers to all of these questions; certainly not ones it wanted to give. The last one in particular, about the trustworthiness scores, quickly sparked a heated debate between the company's executives in Davos and their colleagues in Menlo Park. Some leaders, including Schrage, wanted to tell publishers their scores. It was only fair. Also in agreement was Campbell Brown, the company's chief liaison with news publishers, whose job description includes absorbing some of the impact when Facebook and the news industry crash into each other.
But the engineers and product managers back in California said it was folly. Adam Mosseri, then head of News Feed, argued in emails that publishers would game the system if they knew their scores. Plus, they were too unsophisticated to understand the methodology, and the scores would constantly change anyway. Worse still, the company didn't yet have a reliable measure of trustworthiness at hand.
Far from Davos, meanwhile, Facebook's product engineers got down to the precise, algorithmic business of implementing Zuckerberg's vision. If you want to promote trustworthy news for billions of people, you first have to specify what is trustworthy and what is news. Facebook was having a hard time with both. To define trustworthiness, the company was testing how people responded to surveys about their impressions of different publishers. To define news, the engineers pulled a classification system from a previous project, one that pegged the category as stories involving "politics, crime, or tragedy."
That particular choice, which meant the algorithm would be less kind to all kinds of other news, from health and science to technology and sports, was not something Facebook execs discussed with media leaders in Davos. And though it went through reviews with senior managers, not everyone at the company even knew about it. When one Facebook executive learned of it recently, in a briefing with a lower-level engineer, they say they "nearly fell on the fucking floor."
The confusing rollout of meaningful social interactions, marked by internal dissent, blistering external criticism, genuine efforts at reform, and foolish mistakes, set the stage for Facebook's 2018. This is the story of that annus horribilis, based on interviews with 65 current and former employees. It is, ultimately, a story about the biggest shifts ever to take place inside the world's biggest social network. But it is also a story about a company trapped by its own pathologies and, perversely, by the inexorable logic of its own recipe for success.
The most critical episode in this story, the most wrenching crisis, began not long after Davos, when reporters from The New York Times, The Guardian, and Britain's Channel 4 News came calling. They had learned some troubling things about a shadowy British firm called Cambridge Analytica, and they had questions.
It was, in some ways, an old story. Back in 2014, a young academic at Cambridge University named Aleksandr Kogan built a personality questionnaire app called thisisyourdigitallife. A few hundred thousand people signed up, giving Kogan access not only to their Facebook data but, because of Facebook's loose privacy policies at the time, to that of up to 87 million people in their combined friend networks. Rather than simply use all of that data for research purposes, which he had permission to do, Kogan passed the trove on to Cambridge Analytica, a strategic consulting firm that talked a big game about its ability to model and manipulate human behavior for political clients. In December 2015, The Guardian reported that Cambridge Analytica had used this data to help Ted Cruz's presidential campaign, at which point Facebook demanded that the data be deleted.
This much Facebook knew in the early months of 2018. The company also knew, because everyone knew, that Cambridge Analytica had gone on to work with the Trump campaign after Ted Cruz dropped out of the race. And some people at Facebook worried that the story of their company's relationship with Cambridge Analytica was not over. One former Facebook communications official remembers being warned by a manager in the summer of 2017 that unresolved elements of the Cambridge Analytica story remained a serious vulnerability. No one at Facebook, however, knew exactly when or where the unexploded ordnance would go off. "The company doesn't know yet what it doesn't know," the manager said. (The manager now denies saying so.)
The company first heard, in late February, that the Times and The Guardian had a story coming, but the department in charge of formulating a response was a house divided. In the fall, Facebook had hired a brilliant but fiery veteran of tech-industry PR named Rachel Whetstone. She had come over from Uber to run communications for Facebook's WhatsApp, Instagram, and Messenger. Soon she was traveling with Zuckerberg to public events, joining Sandberg's senior management meetings, and making decisions, like cutting or retaining outside PR firms, that normally rested with those officially in charge of Facebook's 300-person communications shop. The staff quickly sorted into fans and haters.
And so a confused and fractious communications team huddled with management to discuss how to respond to the Times and Guardian reporters. The standard approach would have been to correct misinformation or errors and spin the company's side of the story. Facebook ultimately chose another tack: it would front-run the press, dumping a bunch of information into the public domain on the eve of the stories' publication in the hope of upstaging them. It's a tactic with short-term benefits but long-term costs. Investigative journalists are like pit bulls. Kick them once and they'll never trust you again.
Facebook's decision to take that risk, according to multiple people involved, was a close call. But on the night of Friday, March 16, the company announced that it was suspending Cambridge Analytica from its platform. It was a fateful choice. "It's why the Times hates us," one senior executive says. Another communications official says: "For the last year I've had to talk to reporters worried that we were going to front-run them. It's the worst. Whatever the calculus, it wasn't worth it."
The tactic didn't work anyway. The next day the story, centered on a charismatic, pink-haired whistle-blower named Christopher Wylie, exploded across Europe and the United States. Wylie, a former Cambridge Analytica employee, alleged that the firm had not deleted the data it had taken from Facebook and that it may have used the data to swing the American presidential election. The first sentence of The Guardian's report blared that this was "one of the tech giant's biggest ever data breaches" and that Cambridge Analytica had used the data "to build a powerful software program to predict and influence choices at the ballot box."
The story was a witch's brew of Russian operatives, privacy violations, confusing data, and Donald Trump. It touched on nearly every anxiety of the moment. Politicians demanded regulation; users called for boycotts. In a single day, Facebook lost $36 billion in market capitalization. Because many employees are compensated based on the stock's performance, the drop did not go unnoticed in Menlo Park.
To this emotional story, Facebook had a programmer's rational response. Nearly every fact in The Guardian's opening paragraph, its leaders believed, was misleading. The company hadn't been breached; an academic had fairly downloaded data with permission and then unfairly handed it off. And the software Cambridge Analytica built was not powerful, nor could it predict or influence choices at the ballot box.
But none of that mattered. When a Facebook executive named Alex Stamos tried to argue on Twitter that the word breach was being misused, he was swatted down. He soon deleted his tweets. His position was right, but who cares? If someone points a gun at you and holds up a sign that says hand's up, you shouldn't worry about the apostrophe. The story was the first of many that illuminated one of the central ironies of Facebook's struggles: the company's algorithms helped sustain a news ecosystem that prioritizes outrage, and that news ecosystem was learning how to direct outrage at Facebook.
"Those five days were very, very long," says Sandberg, who now acknowledges that the delay was a mistake. The company became paralyzed, she says, because it didn't know all the facts; it thought Cambridge Analytica had deleted the data. And it didn't have a specific problem to solve. The loose privacy policies that had allowed Kogan to collect so much data had been tightened years before. "We didn't know how to respond in a system of imperfect information," she says.
Facebook's other problem was that it didn't understand the wealth of antipathy that had built up against it over the previous two years. Its prime decisionmakers had run the same playbook successfully for a decade and a half: do what they thought was best for the platform's growth (often at the expense of user privacy), apologize if someone complained, and keep pushing forward. Or, as the old slogan went: move fast and break things. Now the public believed Facebook had broken Western democracy. This privacy violation, unlike the many others before it, was not one that people would simply get over.
Finally, on Wednesday, the company decided that Zuckerberg should give a television interview. After snubbing CBS and PBS, it summoned a CNN reporter whom the communications staff trusted to be reasonable. The network's camera crews were treated like potential spies; one communications official remembers being required to keep watch over them even when they went to the bathroom. (Facebook now says this was not company protocol.) In the interview, Zuckerberg apologized. But he was also specific: there would be audits and far more restrictive rules for anyone wanting access to Facebook data. Facebook would build a tool to let users know whether their data had ended up with Cambridge Analytica. And he pledged that Facebook would make sure a debacle like this never happened again.
A flurry of other interviews followed. That Wednesday, WIRED got a quiet heads-up that we would get to chat with Zuckerberg in the late afternoon. At about 4:45 pm his head of communications called to say he would be on the phone at 5. In that interview Zuckerberg apologized again. But he brightened when he turned to one of the topics that, according to people close to him, truly engages his imagination: using artificial intelligence to keep humans from polluting Facebook. This was less a response to the Cambridge Analytica scandal than to the backlog of accusations, accumulating since 2016, that Facebook had become a cesspool of toxic virality, but it was a problem he actually relished figuring out how to solve. He didn't think AI could entirely eliminate hate speech or nudity or spam, but it could get close. "My understanding with food safety is there's a certain amount of dust that can get into the chicken as it's going through the processing, and it's not a large amount; it needs to be a very small amount," he told WIRED.
On the Saturday after the Cambridge Analytica scandal broke, Sandberg told Molly Cutler, a top lawyer at Facebook, to create a crisis response team. Make sure we never have a delayed response to a major issue like that again, Sandberg said. She put Cutler's new desk next to her own, to guarantee that Cutler would have the clout to get department heads to work with her. "I started the role that Monday," Cutler says. "I never went back to my old desk. After a few weeks, someone on the legal team messaged me and said, 'Do you want us to pack up your things? It seems like you're not coming back.'"
Ultimately, Facebook knew the work would have to be done mostly by machines, which is the company's preference anyway. Machines can scan porn all day without needing a break, and they haven't yet learned to unionize. And so the company simultaneously mounted a huge effort, led by CTO Mike Schroepfer, to build artificial intelligence systems that can, at scale, identify the content Facebook wants to zap from its platform, including spam, nudity, hate speech, ISIS propaganda, and videos of children being put in washing machines. An even trickier goal was to identify the stuff Facebook wants to demote but not eliminate, like misleading clickbait crap. Over the past several years, the core AI team at Facebook has doubled in size annually.
Even a basic machine-learning system can fairly reliably identify and block pornography or images of graphic violence. Hate speech is much harder. A sentence can be hateful or prideful depending on who says it. "You're not my bitch, then bitch you're done" could be a death threat, an inspiration, or a Cardi B lyric. Now imagine trying to decode a similarly complex set of references in Spanish, Mandarin, or Burmese. False news is just as tricky. Facebook doesn't want lies or bull on the platform. But it knows that truth can be a kaleidoscope. Well-meaning people get things wrong on the internet; malevolent actors sometimes get things right.
Schroepfer's job was to get Facebook's AI up to snuff at sniffing out this deeply ambiguous content. The tools and the success rates vary by category, but the basic technique is roughly the same: you need a corpus of data that has already been labeled, and then you train the machines on it. For spam and nudity those databases already exist, built by hand in more innocent days, when the threats online were fake Viagra and goatse memes, not Vladimir Putin and Nazis. In other categories you have to build the labeled datasets yourself, ideally without hiring an army of humans to do it.
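The label-then-train loop described above can be sketched in miniature. Here is a toy Naive Bayes text classifier; every example text, label, and function name is invented for illustration, and it bears no relation to Facebook's actual systems, which operate at vastly greater scale and sophistication:

```python
import math
from collections import Counter, defaultdict

# A tiny hand-labeled corpus: the kind of dataset that must exist
# before any training can happen. (All examples here are made up.)
LABELED = [
    ("cheap viagra click now", "spam"),
    ("win free money click here", "spam"),
    ("free pills buy now", "spam"),
    ("lunch at noon tomorrow", "ok"),
    ("photos from the birthday party", "ok"),
    ("meeting notes attached", "ok"),
]

def train(examples):
    """Count word frequencies per class (multinomial Naive Bayes)."""
    word_counts = defaultdict(Counter)
    class_counts = Counter()
    for text, label in examples:
        class_counts[label] += 1
        word_counts[label].update(text.split())
    return word_counts, class_counts

def classify(text, word_counts, class_counts):
    """Return the class with the highest log-probability, using add-one smoothing."""
    vocab = {w for counts in word_counts.values() for w in counts}
    total_docs = sum(class_counts.values())
    best_label, best_score = None, float("-inf")
    for label in class_counts:
        score = math.log(class_counts[label] / total_docs)  # class prior
        n_words = sum(word_counts[label].values())
        for word in text.split():
            score += math.log(
                (word_counts[label][word] + 1) / (n_words + len(vocab))
            )
        if score > best_score:
            best_label, best_score = label, score
    return best_label

word_counts, class_counts = train(LABELED)
print(classify("free viagra now", word_counts, class_counts))        # -> spam
print(classify("party photos tomorrow", word_counts, class_counts))  # -> ok
```

The sketch also shows why labeled data is the bottleneck the article describes: the model is only as good as the corpus it counts, and for categories like hate speech no clean, pre-existing corpus exists.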
These are the kinds of problems Facebook executives love to talk about. They involve math and logic, and the people who work at the company are among the most logical you'll ever meet. But Cambridge Analytica was mostly a privacy scandal, and Facebook's most visible response to it was to ramp up content moderation aimed at keeping the platform safe and civil. Sometimes, though, the two big values involved, privacy and civility, come into opposition. If you give people ways to keep their data completely secret, you also create secret tunnels where rats can scurry around undetected.
Every choice, in other words, involves a trade-off, and every trade-off means some value has been spurned. And every value you spurn, particularly when you're Facebook in 2018, means a hammer is going to come down on your head.
Crises offer opportunities. They force you to make some changes, but they also provide cover for changes you've long wanted to make. Four weeks after Zuckerberg testified before Congress, the company launched the biggest reshuffle in its history. A dozen or so executives shifted chairs. Most important, Chris Cox, longtime head of Facebook's core product, known internally as the Blue App, would now oversee WhatsApp and Instagram as well. Cox was perhaps Zuckerberg's closest and most trusted confidant, and the move looked like succession planning. Adam Mosseri moved over to run product at Instagram.
Instagram, founded by Kevin Systrom and Mike Krieger in 2010, had been acquired by Facebook in 2012 for $1 billion. At the time the price looked ludicrously high: that much money for a company with 13 employees? Before long it looked ludicrously low: a mere billion dollars for the world's fastest-growing social network? Internally, Facebook at first watched Instagram's relentless growth with pride. But, according to some, pride turned to suspicion as the pupil's success matched and then exceeded the professor's.
Systrom's glowing press coverage didn't help. In 2014, according to people directly involved, Zuckerberg ordered that no other executive sit for a magazine profile without Sandberg's approval. Some remember the edict as a way to make it harder for competitors to identify employees to poach; others remember it as a direct effort to contain Systrom. Facebook's executives also came to believe that Instagram's growth was cannibalizing the Blue App. In 2017, Cox's team showed senior managers data indicating that people were sharing less inside the Blue App, in part because of Instagram. To some, this sounded like they were simply flagging a problem to solve. Others were stunned, and took it as a sign that Facebook's management cared more about the product it had built than the one it had bought.
By the time the Cambridge Analytica scandal broke, Instagram founders Kevin Systrom and Mike Krieger were already worried that Zuckerberg was souring on them.
To most of Instagram, and to some at Facebook, the idea that the photo-sharing app's growth could be seen as trouble at all was perverse. Yes, people were using both the Blue App and Instagram. But that didn't mean Instagram was poaching users. Maybe people who drifted away from the Blue App would have spent their time on Snapchat, or watching Netflix, or mowing their lawns. And if Instagram was growing fast, maybe it was because the product was good? Instagram had its problems, including bullying, shaming, FOMO, propaganda, and corrupt micro-influencers, but its internal architecture had helped it avoid some of the demons that haunted the industry. Posts are hard to reshare, which slows virality. External links are harder to embed, which keeps out fake-news providers. The minimalist design also minimized problems. For years, Systrom and Krieger took pride in keeping Instagram free of hamburgers: menus, represented by an icon of three horizontal lines, that open in a corner of the screen. Facebook has hamburgers and other menus all over the place.
Meanwhile, WhatsApp founders Brian Acton and Jan Koum had already decamped from Facebook's tent and started setting fires. Zuckerberg had acquired the encrypted messaging platform in 2014 for $19 billion, but the cultures never fully meshed. The two sides couldn’t agree on how to make money—WhatsApp’s end-to-end encryption wasn’t originally designed to support targeted ads—and they had other differences as well. WhatsApp insisted on having its own conference rooms, and, in the perfect metaphor for the two companies’ diverging attitudes over privacy, WhatsApp employees had special bathroom stalls designed with doors that went down to the floor, unlike the standard ones used by the rest of Facebook.
Eventually the battles became too much for Acton and Koum, who had also come to believe that Facebook no longer intended to leave them alone. Acton quit and started funding a competing messaging platform called Signal. During the Cambridge Analytica scandal, he tweeted, “It is time. #deletefacebook.” Soon afterward, Koum, who held a seat on Facebook’s board, announced that he too was quitting, to play more Ultimate Frisbee and work on his collection of air-cooled Porsches.
The departure of the WhatsApp founders created a brief spasm of bad press. But now Acton and Koum were gone, Mosseri was in place, and Cox was running all three messaging platforms. And that meant Facebook could truly pursue its most ambitious and important idea of 2018: bringing all those platforms together into something new.
By the late spring, news organizations—even as they jockeyed for scoops about the latest meltdown in Menlo Park—were starting to buckle under the pain caused by Facebook’s algorithmic changes. Back in May of 2017, according to Parse.ly, Facebook drove about 40 percent of all outside traffic to news publishers. A year later it was down to 25 percent. Publishers that weren’t in the category “politics, crime, or tragedy” were hit much harder.
At WIRED, the month after an image of a bruised Zuckerberg appeared on the cover, the numbers were even more stark. One day, traffic from Facebook suddenly dropped by 90 percent, and for four weeks it stayed there. After protestations, emails, and a raised eyebrow or two about the coincidence, Facebook finally got to the bottom of it. An ad run by a liquor advertiser, targeted at WIRED readers, had been mistakenly categorized as engagement bait by the platform. In response, the algorithm had let all the air out of WIRED’s tires. The publication could post whatever it wanted, but few would read it. Once the error was identified, traffic soared back. It was a reminder that journalists are just sharecroppers on Facebook’s giant farm. And sometimes conditions on the farm can change without warning.
Inside Facebook, of course, it was not surprising that traffic to publishers went down after the pivot to “meaningful social interactions.” That outcome was the point. It meant people would be spending more time on posts created by their friends and family, the genuinely unique content that Facebook offers. According to multiple Facebook employees, a handful of executives considered it a small plus, too, that the news industry was feeling a little pain after all its negative coverage. The company denies this—“no one at Facebook is rooting against the news industry,” says Anne Kornblut, the company’s director of news partnerships—but, in any case, by early May the pain seemed to have become perhaps excessive. A number of stories appeared in the press about the damage done by the algorithmic changes. And so Sheryl Sandberg, who colleagues say often responds with agitation to negative news stories, sent an email on May 7 calling a meeting of her top lieutenants.
That kicked off a wide-ranging conversation that ensued over the next two months. The key question was whether the company should introduce new factors into its algorithm to help serious publications. The product team working on news wanted Facebook to increase the amount of public content—things shared by news organizations, businesses, celebrities—allowed in News Feed. They also wanted the company to provide stronger boosts to publishers deemed trustworthy, and they suggested the company hire a large team of human curators to elevate the highest-quality news inside of News Feed. The company discussed setting up a new section on the app entirely for news and directed a team to quietly work on developing it; one of the team’s ambitions was to try to build a competitor to Apple News.
Some of the company’s most senior execs, notably Chris Cox, agreed that Facebook needed to give serious publishers a leg up. Others pushed back, especially Joel Kaplan, a former deputy chief of staff to George W. Bush who was now Facebook’s vice president of global public policy. Supporting high-quality outlets would inevitably make it look like the platform was supporting liberals, which could lead to trouble in Washington, a town run mainly by conservatives. Breitbart and the Daily Caller, Kaplan argued, deserved protections too. At the end of the climactic meeting, on July 9, Zuckerberg sided with Kaplan and announced that he was tabling the decision about adding ways to boost publishers, effectively killing the plan. To one person involved in the meeting, it seemed like a sign of shifting power. Cox had lost and Kaplan had won. Either way, Facebook’s overall traffic to news organizations continued to plummet.
That same evening, Donald Trump announced that he had a new pick for the Supreme Court: Brett Kavanaugh. As the choice was announced, Joel Kaplan stood in the background at the White House, smiling. Kaplan and Kavanaugh had become friends in the Bush White House, and their families had become intertwined. They had taken part in each other’s weddings; their wives were best friends; their kids rode bikes together. No one at Facebook seemed to really notice or care, and a tweet pointing out Kaplan’s attendance was retweeted a mere 13 times.
Meanwhile, the dynamics inside the communications department had gotten even worse. Elliot Schrage had announced that he was going to leave his post as VP of global communications. So the company had begun looking for his replacement; it focused on interviewing candidates from the political world, including Denis McDonough and Lisa Monaco, former senior officials in the Obama administration. But Rachel Whetstone also declared that she wanted the job. At least two other executives said they would quit if she got it.
The need for leadership in communications only became more apparent on July 11, when John Hegeman, the new head of News Feed, was asked in an interview why the company didn’t ban Alex Jones’ InfoWars from the platform. The honest answer would probably have been to just admit that Facebook gives a rather wide berth to the far right because it’s so worried about being called liberal. Hegeman, though, went with the following: “We created Facebook to be a place where different people can have a voice. And different publishers have very different points of view.”
This, predictably, didn’t go over well with the segments of the news media that actually try to tell the truth and that have never, as Alex Jones has done, reported that the children massacred at Sandy Hook were actors. Public fury ensued. Most of Facebook didn’t want to respond. But Whetstone decided it was worth a try. She took to the @facebook account—which one executive involved in the decision called “a big fucking marshmallow we shouldn’t ever use like this”—and started tweeting at the company’s critics.
“Sorry you feel that way,” she typed to one, and explained that, instead of banning pages that peddle false information, Facebook demotes them. The tweet was very quickly ratioed, a Twitter term of art for a statement that no one likes and that receives more comments than retweets. Whetstone, as @facebook, also declared that just as many pages on the left pump out misinformation as on the right. That tweet got badly ratioed too.
Five days later, Zuckerberg sat down for an interview with Kara Swisher, the influential editor of Recode. Whetstone was in charge of prep. Before Zuckerberg headed to the microphone, Whetstone supplied him with a list of rough talking points, including one that inexplicably violated the first rule of American civic discourse: Don’t invoke the Holocaust while trying to make a nuanced point.
About 20 minutes into the interview, while ambling through his answer to a question about Alex Jones, Zuckerberg declared, “I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down, because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong.” Sometimes, Zuckerberg added, he himself makes errors in public statements.
The comment was absurd: People who deny that the Holocaust happened generally aren’t just slipping up in the midst of a good-faith intellectual disagreement. They’re spreading anti-Semitic hate—intentionally. Soon the company announced that it had taken a closer look at Jones’ activity on the platform and had finally chosen to ban him. His past sins, Facebook decided, had crossed into the domain of standards violations.
Eventually another candidate for the top PR job was brought into the headquarters in Menlo Park: Nick Clegg, former deputy prime minister of the UK. Perhaps in an effort to disguise himself—or perhaps because he had decided to go aggressively Silicon Valley casual—he showed up in jeans, sneakers, and an untucked shirt. His interviews must have gone better than his disguise, though, as he was hired over the luminaries from Washington. “What makes him incredibly well qualified,” said Caryn Marooney, the company’s VP of communications, “is that he helped run a country.”
At the end of July, Facebook was scheduled to report its quarterly earnings in a call to investors. The numbers were not going to be good; Facebook’s user base had grown more slowly than ever, and revenue growth was taking a huge hit from the company’s investments in hardening the platform against abuse. But in advance of the call, the company’s leaders were nursing an additional concern: how to put Instagram in its place. According to someone who saw the relevant communications, Zuckerberg and his closest lieutenants were debating via email whether to say, essentially, that Instagram owed its spectacular growth not primarily to its founders and vision but to its relationship with Facebook.
Zuckerberg wanted to include a line to this effect in his script for the call. Whetstone counseled him not to, or at least to temper it with praise for Instagram’s founding team. In the end, Zuckerberg’s script declared, “We believe Instagram has been able to use Facebook’s infrastructure to grow more than twice as quickly as it would have on its own. A big congratulations to the Instagram team—and to all the teams across our company that have contributed to this success.”
After the call—with its payload of bad news about growth and investment—Facebook’s stock dropped by nearly 20 percent. But Zuckerberg didn’t forget about Instagram. A few days later he asked his head of growth, Javier Olivan, to draw up a list of all the ways Facebook supported Instagram: running ads for it on the Blue App; including link-backs when someone posted a photo on Instagram and then cross-published it in Facebook News Feed; allowing Instagram to access a new user’s Facebook connections in order to recommend people to follow. Once he had the list, Zuckerberg conveyed to Instagram’s leaders that he was pulling away the supports. Facebook had given Instagram servers, health insurance, and the best engineers in the world. Now Instagram was just being asked to give a little back—and to help seal off the vents that were allowing people to leak away from the Blue App.
Systrom soon posted a memo to his entire staff explaining Zuckerberg’s decision to turn off supports for traffic to Instagram. He disagreed with the move, but he was committed to the changes and was telling his staff that they had to go along. The memo “was like a flame going up inside the company,” a former senior manager says. The document also enraged Facebook, which was terrified it would leak. Systrom soon departed on paternity leave.
The tensions didn’t let up. In the middle of August, Facebook prototyped a location-tracking service inside of Instagram, the kind of privacy intrusion that Instagram’s management team had long resisted. In August, a hamburger menu appeared. “It felt very personal,” says a senior Instagram employee who spent the month implementing the changes. It felt particularly wrong, the employee says, because Facebook is a data-driven company, and the data strongly suggested that Instagram’s growth was good for everyone.
Friends of Systrom and Krieger say the strife was wearing on the founders too. According to someone who heard the conversation, Systrom openly wondered whether Zuckerberg was treating him the way Donald Trump was treating Jeff Sessions: making life miserable in hopes that he’d quit without having to be fired. Instagram’s managers also believed that Facebook was being miserly about their budget. In past years they had been able to almost double their number of engineers. In the summer of 2018 they were told that their growth rate would drop to less than half of that.
When it was time for Systrom to return from paternity leave, the two founders decided to make the leave permanent. They made the decision quickly, but it was far from impulsive. According to someone familiar with their thinking, their unhappiness with Facebook stemmed from tensions that had brewed over many years and had boiled over in the past six months.
And so, on a Monday morning, Systrom and Krieger went into Chris Cox’s office and told him the news. Systrom and Krieger then notified their team about the decision. Somehow the information reached Mike Isaac, a reporter at The New York Times, before it reached the communications teams for either Facebook or Instagram. The story appeared online a few hours later, as Instagram’s head of communications was on a flight circling above New York City.
After the announcement, Systrom and Krieger decided to play nice. Soon there was a lovely photograph of the two founders smiling next to Mosseri, the obvious choice to replace them. And then they headed off into the unknown to take time off, decompress, and figure out what comes next. Systrom and Krieger told friends they both wanted to get back into coding after so many years away from it. If you need a new job, it’s good to learn how to code.
Just a few days after Systrom and Krieger quit, Joel Kaplan roared into the news. His dear friend Brett Kavanaugh was now not just a conservative appellate judge with Federalist Society views on Roe v. Wade; he had become an alleged sexual assailant, purported gang rapist, and national symbol of toxic masculinity to somewhere between 49 and 51 percent of the country. As the charges multiplied, Kaplan’s wife, Laura Cox Kaplan, became one of the most prominent women defending him: She appeared on Fox News and asked, “What does it mean for men in the future? It’s very serious and very troubling.” She also spoke at an #IStandWithBrett press conference that was livestreamed on Breitbart.
On September 27, Kavanaugh appeared before the Senate Judiciary Committee after four hours of wrenching recollections by his primary accuser, Christine Blasey Ford. Laura Cox Kaplan sat right behind him as the hearing descended into rage and recrimination. Joel Kaplan sat one row back, stoic and thoughtful, directly in view of the cameras broadcasting the scene to the world.
Kaplan isn’t widely known outside of Facebook. But he’s not anonymous, and he wasn’t wearing a fake mustache. As Kavanaugh testified, journalists started tweeting a screenshot of the tableau. At a meeting in Menlo Park, executives passed around a phone showing one of these tweets and stared, mouths agape. None of them knew Kaplan was going to be there. The man who was supposed to smooth over Facebook’s political dramas had inserted the company right into the middle of one.
Kaplan had long been friends with Sandberg; they’d even dated as undergraduates at Harvard. But despite rumors to the contrary, he had told neither her nor Zuckerberg that he would be at the hearing, much less that he would be sitting in the gallery of supporters behind the star witness. “He’s too smart to do that,” one executive who works with him says. “That way, Joel gets to go. Facebook gets to remind people that it employs Republicans. Sheryl gets to be shocked. And Mark gets to denounce it.”
If that was the plan, it worked to perfection. Soon Facebook’s internal message boards were lighting up with employees mortified at what Kaplan had done. Management’s initial response was limp and lame: A communications officer told the staff that Kaplan attended the hearing as part of a planned day off in his personal capacity. That wasn’t a good move. Someone visited the human resources portal and noted that he hadn’t filed to take the day off.
What Facebook Fears
In some ways, the world’s largest social network is stronger than ever, with record revenue of $55.8 billion in 2018. But Facebook has also never been more threatened. Here are some dangers that could knock it down.
US Antitrust Regulation
In March, Democratic presidential candidate Elizabeth Warren proposed severing Instagram and WhatsApp from Facebook, joining the growing chorus of people who want to chop the company down to size. Even US attorney general William Barr has hinted at probing tech’s “huge behemoths.” But for now, antitrust talk remains talk—much of it posted to Facebook.
Federal Privacy Crackdowns
Facebook and the Federal Trade Commission are negotiating a settlement over whether the company’s conduct, including with Cambridge Analytica, violated a 2011 consent decree regarding user privacy. According to The New York Times, federal prosecutors have also begun a criminal investigation into Facebook’s data-sharing deals with other technology companies.
While America debates whether to take aim at Facebook, Europe swings axes. In 2018, the EU’s General Data Protection Regulation forced Facebook to allow users to access and delete more of their data. Then this February, Germany ordered the company to stop harvesting web-browsing data without users’ consent, effectively outlawing much of the company’s ad business.
Although a fifth of the globe uses Facebook every day, the number of adult users in the US has largely stagnated. The decline is even more precipitous among teenagers. (Granted, many of them are switching to Instagram.) But network effects are powerful things: People swarmed to Facebook because everyone else was there; they might also swarm for the exits.
The hearings were on a Thursday. A week and a day later, Facebook called an all-hands to discuss what had happened. The giant cafeteria in Facebook’s headquarters was cleared to create space for a town hall. Hundreds of chairs were arranged with three aisles to accommodate people with questions and comments. Most of them were from women who came forward to recount their own experiences of sexual assault, harassment, and abuse.
Zuckerberg, Sandberg, and other members of management were standing on the right side of the stage, facing the audience and the moderator. Whenever a question was asked of one of them, they would stand up and take the mic. Kaplan appeared via video conference looking, according to one viewer, like a hostage trying to smile while his captors stood just offscreen. Another participant described him as “looking like someone had just shot his dog in the face.” This participant added, “I don’t think there was a single male participant, except for Zuckerberg looking down and sad onstage and Kaplan looking dumbfounded on the screen.”
Employees who watched expressed different emotions. Some felt empowered and moved by the voices of women in a company where top management is overwhelmingly male. Another said, “My eyes rolled to the back of my head” watching people make specific personnel demands of Zuckerberg, including that Kaplan undergo sensitivity training. For much of the staff, it was cathartic. Facebook was finally reckoning, in a way, with the #MeToo movement and the profound bias toward men in Silicon Valley. For others it all seemed ludicrous, narcissistic, and emblematic of the liberal, politically correct bubble that the company occupies. A guy had sat in silence to support his best friend who had been nominated to the Supreme Court; as a consequence, he needed to be publicly flogged?
In the days after the hearings, Facebook organized small group discussions, led by managers, in which 10 or so people got together to discuss the issue. There were tears, grievances, emotions, debate. “It was a really bizarre confluence of a lot of issues that were popped in the zit that was the SCOTUS hearing,” one participant says. Kaplan, though, seemed to have moved on. The day after his appearance on the conference call, he hosted a party to celebrate Kavanaugh’s lifetime appointment. Some colleagues were aghast. According to one who had taken his side during the town hall, this was a step too far. That was “just spiking the football,” they said. Sandberg was more forgiving. “It’s his house,” she told WIRED. “That is a very different decision than sitting at a public hearing.”
In a year during which Facebook made endless errors, Kaplan’s insertion of the company into a political maelstrom seemed like one of the clumsiest. But in retrospect, Facebook executives aren’t sure that Kaplan did lasting harm. His blunder opened up a series of useful conversations in a workplace that had long focused more on coding than inclusion. Also, according to another executive, the episode and the press that followed surely helped appease the company’s would-be regulators. It’s useful to remind the Republicans who run most of Washington that Facebook isn’t staffed entirely by snowflakes and libs.
That summer and early fall weren’t kind to the team at Facebook charged with managing the company’s relationship with the news industry. At least two product managers on the team quit, telling colleagues they had done so because of the company’s cavalier attitude toward the media. In August, a jet-lagged Campbell Brown gave a presentation to publishers in Australia in which she declared that they could either work together to create new digital business models or not. If they didn’t, well, she’d be unfortunately holding hands with their dying business, like in a hospice. Her off-the-record comments were put on the record by The Australian, a publication owned by Rupert Murdoch, a canny and persistent antagonist of Facebook.
In September, however, the news team managed to convince Zuckerberg to start administering ice water to the parched executives of the news industry. That month, Tom Alison, one of the team’s leaders, circulated a document to most of Facebook’s senior managers; it began by proclaiming that, on news, “we lack clear strategy and alignment.”
Then, at a meeting of the company’s leaders, Alison made a series of recommendations, including that Facebook should expand its definition of news—and its algorithmic boosts—beyond just the category of “politics, crime, or tragedy.” Stories about politics were bound to do well in the Trump era, no matter how Facebook tweaked its algorithm. But the company could tell that the changes it had introduced at the beginning of the year hadn’t had the intended effect of slowing the political venom pulsing through the platform. In fact, by giving a slight tailwind to politics, tragedy, and crime, Facebook had helped build a news ecosystem that resembled the front pages of a tempestuous tabloid. Or, for that matter, the front page of FoxNews.com. That fall, Fox was netting more engagement on Facebook than any other English-language publisher; its list of most-shared stories was a goulash of politics, crime, and tragedy. (The network’s three most-shared posts that month were an article alleging that China was burning bibles, another about a Bill Clinton rape accuser, and a third that featured Laura Cox Kaplan and #IStandWithBrett.)
Politics, Crime, or Tragedy?
In early 2018, Facebook’s algorithm started demoting posts shared by businesses and publishers. But because of an obscure choice by Facebook engineers, stories involving “politics, crime, or tragedy” were shielded somewhat from the blow—which had a big effect on the news ecosystem inside the social network.
That September meeting was a moment when Facebook decided to start paying indulgences to make up for some of its sins against journalism. It decided to put hundreds of millions of dollars toward supporting local news, the sector of the industry most disrupted by Silicon Valley; Brown would lead the effort, which would involve helping to find sustainable new business models for journalism. Alison proposed that the company move ahead with the plan hatched in June to create an entirely new section on the Facebook app for news. And, crucially, the company committed to developing new classifiers that would expand the definition of news beyond “politics, crime, or tragedy.”
Zuckerberg didn’t sign off on everything all at once. But people left the room feeling like he had subscribed. Facebook had spent much of the year holding the media industry upside down by the feet. Now Facebook was setting it down and handing it a wad of cash.
As Facebook veered from crisis to crisis, something else was starting to happen: The tools the company had built were beginning to work. The three biggest initiatives for the year had been integrating WhatsApp, Instagram, and the Blue App into a more seamless entity; eliminating toxic content; and refocusing News Feed on meaningful social interactions. The company was making progress on all fronts. The apps were becoming a family, partly through divorce and arranged marriage but a family nonetheless. Toxic content was indeed disappearing from the platform. In September, economists at Stanford and New York University revealed research estimating that user interactions with fake news on the platform had declined by 65 percent from their peak in December 2016 to the summer of 2018. On Twitter, meanwhile, the number had climbed.
There wasn’t much time, however, for anyone to absorb the good news. Right after the Kavanaugh hearings, the company announced that, for the first time, it had been badly breached. In an Ocean’s 11–style heist, hackers had figured out an ingenious way to take control of user accounts through a quirk in a feature that makes it easier for people to play Happy Birthday videos for their friends. The breach was both serious and absurd, and it pointed to a deep problem with Facebook. By adding so many features to boost engagement, it had created vectors for intrusion. One virtue of simple products is that they are simpler to defend.
Given the sheer number of people who accused Facebook of breaking democracy in 2016, the company approached the November 2018 US midterm elections with trepidation. It worried that the tools of the platform made it easier for candidates to suppress votes than get them out. And it knew that Russian operatives were studying AI as closely as the engineers on Mike Schroepfer’s team.
So in preparation for Brazil’s October 28 presidential election and the US midterms nine days later, the company created what it called “election war rooms”—a term despised by at least some of the actual combat veterans at the company. The rooms were partly a media prop, but still, three dozen people worked nearly around the clock inside of them to minimize false news and other integrity issues across the platform. Ultimately the elections passed with little incident, perhaps because Facebook did a good job, perhaps because a US Cyber Command operation temporarily knocked Russia’s primary troll farm offline.
Facebook got a boost of good press from the effort, but the company in 2018 was like a football team that follows every hard-fought victory with a butt fumble and a 30-point loss. In mid-November, The New York Times published an impressively reported stem-winder about trouble at the company. The most damning revelation was that Facebook had hired an opposition research firm called Definers to investigate, among other things, whether George Soros was funding groups critical of the company. Definers was also directly connected to a dubious news operation whose stories were often picked up by Breitbart.
After the story broke, Zuckerberg plausibly declared that he knew nothing about Definers. Sandberg, less plausibly, did the same. Numerous people inside the company were convinced that she entirely understood what Definers did, though she strongly maintains that she did not. Meanwhile, Schrage, who had announced his resignation but never actually left, decided to take the fall. He declared that the Definers project was his fault; it was his communications department that had hired the firm, he said. But several Facebook employees who spoke with WIRED believe that Schrage’s assumption of responsibility was just a way to gain favor with Sandberg.
Inside Facebook, people were furious at Sandberg, believing she had asked them to dissemble on her behalf with her Definers denials. Sandberg, like everyone, is human. She’s brilliant, inspirational, and more organized than Marie Kondo. Once, on a cross-country plane ride back from a conference, a former Facebook executive watched her quietly spend five hours sending thank-you notes to everyone she’d met at the event—while everyone else was chatting and drinking. But Sandberg also has a temper, an ego, and a detailed memory for subordinates she thinks have made mistakes. For years, no one had a negative word to say about her. She was a highly successful feminist icon, the best-selling author of Lean In, running operations at one of the most powerful companies in the world. And she had done so under immense personal strain since her husband died in 2015.
But resentment had been building for years, and after the Definers mess the dam collapsed. She was pummeled in the Times, in The Washington Post, on Breitbart, and in WIRED. Former employees who had refrained from criticizing her in interviews conducted with WIRED in 2017 relayed anecdotes about her intimidation tactics and penchant for retribution in 2018. She was slammed after a speech in Munich. She even got dinged by Michelle Obama, who told a sold-out crowd at the Barclays Center in Brooklyn on December 1, “It’s not always enough to lean in, because that shit doesn’t work all the time.”
Everywhere, in fact, it was becoming harder to be a Facebook employee. Attrition increased from 2017, though Facebook says it was still below the industry norm, and people stopped broadcasting their place of employment. The company’s head of cybersecurity policy was swatted in his Palo Alto home. “When I joined Facebook in 2016, my mom was so proud of me, and I could walk around with my Facebook backpack all over the world and people would stop and say, ‘It’s so cool that you worked for Facebook.’ That’s not the case anymore,” a former product manager says. “It made it hard to go home for Thanksgiving.”
By the holidays in 2018, Facebook was beginning to seem like Monty Python’s Black Knight: hacked down to a torso hopping on one leg but still filled with confidence. The Alex Jones, Holocaust, Kaplan, hack, and Definers scandals had all happened in four months. The heads of WhatsApp and Instagram had quit. The stock price was at its lowest level in nearly two years. In the middle of that, Facebook chose to launch a video chat service called Portal. Reviewers thought it was great, except for the fact that Facebook had designed it, which made them fear it was essentially a spycam for people’s houses. Even internal tests at Facebook had shown that people responded to a description of the product better when they didn’t know who had made it.
Two weeks later, the Black Knight lost his other leg. A British member of parliament named Damian Collins had obtained hundreds of pages of internal Facebook emails from 2012 through 2015. Ironically, his committee had gotten them from a sleazy company that helped people search for photos of Facebook users in bikinis. But one of Facebook’s superpowers in 2018 was the ability to turn any critic, no matter how absurd, into a media hero. And so, without much warning, Collins released them to the world.
The emails, many of them between Zuckerberg and top executives, lent a brutally concrete validation to the idea that Facebook promoted growth at the expense of almost any other value. In one message from 2015, an employee acknowledged that collecting the call logs of Android users is a “pretty high-risk thing to do from a PR perspective.” He said he could imagine the news stories about Facebook invading people’s private lives “in ever more terrifying ways.” But, he added, “it appears that the growth team will charge ahead and do it.” (It did.)
Perhaps the most telling email is a message from a then executive named Sam Lessin to Zuckerberg that epitomizes Facebook’s penchant for self-justification. The company, Lessin wrote, could be ruthless and committed to social good at the same time, because they are essentially the same thing: “Our mission is to make the world more open and connected and the only way we can do that is with the best people and the best infrastructure, which requires that we make a lot of money / be very profitable.”
The message also highlighted another of the company’s original sins: its assertion that if you just give people better tools for sharing, the world will be a better place. That’s just false. Sometimes Facebook makes the world more open and connected; sometimes it makes it more closed and disaffected. Despots and demagogues have proven to be just as adept at using Facebook as democrats and dreamers. Like the communications innovations before it—the printing press, the telephone, the internet itself—Facebook is a revolutionary tool. But human nature has stayed the same.
Perhaps the oddest single day in Facebook’s recent history came on January 30, 2019. A story had just appeared on TechCrunch reporting yet another apparent sin against privacy: For two years, Facebook had been conducting market research with an app that paid you in return for sucking private data from your phone. Facebook could read your social media posts, your emoji sexts, and your browser history. Your soul, or at least whatever part of it you put into your phone, was worth up to $20 a month.
Other big tech companies do research of this sort as well. But the program sounded creepy, particularly with the revelation that people as young as 13 could join with a parent’s permission. Worse, Facebook seemed to have deployed the app while wearing a ski mask and gloves to hide its fingerprints. Apple had banned such research apps from its main App Store, but Facebook had fashioned a workaround: Apple allows companies to develop their own in-house iPhone apps for use solely by employees—for booking conference rooms, testing beta versions of products, and the like. Facebook used one of these internal apps to disseminate its market research tool to the public.
Apple cares a lot about privacy, and it cares that you know it cares about privacy. It also likes to ensure that people honor its rules. So shortly after the story was published, Apple responded by shutting down all of Facebook’s in-house iPhone apps. By the middle of that Wednesday afternoon, parts of Facebook’s campus stopped functioning. Applications that enabled employees to book meetings, see cafeteria menus, and catch the right shuttle bus flickered out. Employees around the world suddenly couldn’t communicate via messenger with each other on their phones. The mood internally shifted between outraged and amused—with employees joking that they had missed their meetings because of Tim Cook. Facebook’s cavalier approach to privacy had now poltergeisted itself on the company’s own lunch menus.
But then something else happened. A few hours after Facebook’s engineers wandered back from their mystery meals, Facebook held an earnings call. Profits, after a months-long slump, had hit a new record. The number of daily users in Canada and the US, after stagnating for three quarters, had risen slightly. The stock surged, and suddenly all seemed well in the world. Inside a conference room called Relativity, Zuckerberg smiled and told research analysts about all the company’s success. At the same table sat Caryn Marooney, the company’s head of communications. “It felt like the old Mark,” she said. “This sense of ‘We’re going to fix a lot of things and build a lot of things.’ ” Employees couldn’t get their shuttle bus schedules, but within 24 hours the company was worth about $50 billion more than it had been worth the day before.
Less than a week after the boffo earnings call, the company gathered for another all-hands. The heads of security and ads spoke about their work and the pride they take in it. Nick Clegg told everyone that they had to start seeing themselves the way the world sees them, not the way they would like to be perceived. It seemed to observers as though management actually had its act together after a long time of looking like a man in lead boots trying to cross a lightly frozen lake. “It was a combination of realistic and optimistic that we hadn’t gotten right in two years,” one executive says.
Soon it was back to bedlam, though. Shortly after the all-hands, a parliamentary committee in the UK published a report calling the company a bunch of “digital gangsters.” A German regulatory authority cracked down on a significant portion of the company’s ad business. And news broke that the FTC in Washington was negotiating with the company and reportedly considering a multibillion-dollar fine due in part to Cambridge Analytica. Later, Democratic presidential hopeful Elizabeth Warren published a proposal to break Facebook apart. She promoted her idea with ads on Facebook, using a modified version of the company’s logo—an act specifically banned by Facebook’s terms of service. Naturally, the company spotted the violation and took the ads down. Warren quickly denounced the move as censorship, even as Facebook restored the ads.
It was the perfect Facebook moment for a new year. By enforcing its own rules, the company had created an outrage cycle about Facebook—inside of a larger outrage cycle about Facebook.
This January, George Soros gave another speech on a freezing night in Davos. This time he described a different menace to the world: China. The most populous country on earth, he said, is building AI systems that could become tools for totalitarian control. “For open societies,” he said, “they pose a mortal threat.” He described the world as in the midst of a cold war. Afterward, one of the authors of this article asked him which side Facebook and Google are on. “Facebook and the others are on the side of their own profits,” the financier answered.
The response epitomized one of the most common critiques of the company now: Everything it does is based on its own interests and enrichment. The massive efforts at reform are cynical and deceptive. Yes, the company’s privacy settings are much clearer now than a year ago, and certain advertisers can no longer target users based on their age, gender, or race, but those changes were made at gunpoint. The company’s AI filters help, sure, but they exist to placate advertisers who don’t want their detergent ads next to jihadist videos. The company says it has abandoned “Move fast and break things” as its motto, but the guest Wi-Fi password at headquarters remains “M0vefast.” Sandberg and Zuckerberg continue to apologize, but the apologies seem practiced and insincere.
At a deeper level, critics note that Facebook continues to pay for its original sin of ignoring privacy and fixating on growth. And then there’s the existential question of whether the company’s business model is even compatible with its stated mission: The idea of Facebook is to bring people together, but the business model only works by slicing and dicing users into small groups for the sake of ad targeting. Is it possible to have those two things work simultaneously?
To its credit, though, Facebook has addressed some of its deepest issues. For years, smart critics have bemoaned the perverse incentives created by Facebook’s annual bonus program, which pays people in large part based on the company hitting growth targets. In February, that policy was changed. Everyone is now given bonuses based on how well the company achieves its goals on a metric of social good.
Another deep critique is that Facebook simply sped up the flow of information to a point where society couldn’t handle it. Now the company has started to slow it down. The company’s fake-news fighters focus on information that’s going viral. WhatsApp has been reengineered to limit the number of people with whom any message can be shared. And internally, according to several employees, people communicate better than they did a year ago. The world might not be getting more open and connected, but at least Facebook’s internal operations are.
In early March, Zuckerberg announced that Facebook would, from then on, follow an entirely different philosophy. He published a 3,200-word treatise explaining that the company that had spent more than a decade playing fast and loose with privacy would now prioritize it. Messages would be encrypted end to end. Servers would not be located in authoritarian countries. And much of this would happen with a further integration of Facebook, WhatsApp, and Instagram. Rather than WhatsApp becoming more like Facebook, it sounded like Facebook was going to become more like WhatsApp. When asked by WIRED how hard it would be to reorganize the company around the new vision, Zuckerberg said, “You have no idea how hard it is.”
Just how hard it was became clear the next week. As Facebook knows well, every choice involves a trade-off, and every trade-off involves a cost. The decision to prioritize encryption and interoperability meant, in some ways, a decision to deprioritize safety and civility. According to people involved in the decision, Chris Cox, long Zuckerberg’s most trusted lieutenant, disagreed with the direction. The company was finally figuring out how to combat hate speech and false news; it was breaking bread with the media after years of hostility. Now Facebook was setting itself up to both solve and create all kinds of new problems. And so in the middle of March, Cox announced that he was leaving. A few hours after the news broke, a shooter in New Zealand livestreamed on Facebook his murderous attack on a mosque.
Sandberg says that much of her job these days involves harm prevention; she’s also overseeing the various audits and investigations of the company’s missteps. “It’s going to take real time to go backwards,” she told WIRED, “and figure out everything that could have happened.”
Zuckerberg, meanwhile, remains obsessed with moving forward. In a note to his followers to start the year, he said one of his goals was to host a series of conversations about technology: “I’m going to put myself out there more.” The first such event, a conversation with the internet law scholar Jonathan Zittrain, took place at Harvard Law School in late winter. Near the end of their exchange, Zittrain asked Zuckerberg what Facebook might look like 10 or so years from now. The CEO mused about developing a device that would allow humans to type by thinking. It sounded incredibly cool at first. But by the time he was done, it sounded like he was describing a tool that would allow Facebook to read people’s minds. Zittrain cut in dryly: “The Fifth Amendment implications are staggering.” Zuckerberg suddenly appeared to understand that perhaps mind-reading technology is the last thing the CEO of Facebook should be talking about right now. “Presumably this would be something someone would choose to use,” he said, before adding, “I don’t know how we got onto this.”
This article appears in the May issue.