Applebaum’s article: Regulate social media now. The future of democracy is at stake.

A few days ago, ProPublica, an independent, nonprofit newsroom, discovered that a tool it was using to track political advertising on Facebook had been quietly disabled — by Facebook. For the past year, the company had accepted corrections from ProPublica — until one day it decided it didn’t want them anymore. It also seems like “they don’t wish for there to be information about the targeting of political advertising,” an editor at ProPublica told me. Facebook also made news in recent days for another tool: an app, this time its own, designed to give the company access to extensive information about how consumers were using their telephones. Sheryl Sandberg, the company’s chief operating officer, has defended the project vigorously, on the grounds that those who signed up to use this research app knew what they were doing — and were paid $20 a month. Unamused, Apple decided to intervene — and has now banned the app from its phones.

We don’t get to decide how information companies collect data, and we don’t get to decide how transparent they should be. The tech companies do that all by themselves.

Why does it matter? Because this is the information network that now brings most people their news and opinions about politics, about medicine, about the economy. This is also the information network that is fueling polarization, that favors sensational news over constructive news and that has destroyed the business model of local and investigative journalism.

These companies also operate according to their own rules and algorithms. They decide how data gets collected and who sees it. They decide how political and commercial advertising is regulated and monitored. They even decide what gets censored. The public sphere is shaped by these decisions, but the public has no say.

There is a precedent for this historical moment. In the 1920s and 1930s, democratic governments suddenly found themselves challenged by radio, the new information technology of its time. Radio’s early stars included Adolf Hitler and Joseph Stalin.

Solutions: We can, for example, regulate Internet advertising, just as we regulate broadcast advertising, insisting that people know when and why they are being shown political ads or, indeed, any ads. We can curb the anonymity of the Internet — recent research shows that the number of fake accounts on Facebook may be far higher than what the company has stated in public — because we have a right to know whether we are interacting with real people or bots. In the longer term, there may be even more profound solutions. What would a public-interest algorithm look like, for example, or a form of social media that favored constructive conversations over polarization?

Urgency: If we don’t do it — if we don’t even try — we will not be able to ensure the integrity of elections or the decency of the public sphere. If we don’t do it, in the long term there won’t even be a public sphere, and there won’t be functional democracies anymore, either.

Lee Rainie: Americans’ complicated feelings about social media in an era of privacy concerns

Pew Research Center has studied the spread and impact of social media since 2005, when just 5% of American adults used the platforms. The trends tracked by our data tell a complex story that is full of conflicting pressures. On one hand, the rapid growth of the platforms is testimony to their appeal to online Americans.
On the other, this widespread use has been accompanied by rising user concerns about privacy and about social media firms’ capacity to protect their data. While there is evidence that social media works in some important ways for people, Pew Research Center studies have shown that people are anxious about all the personal information that is collected and shared and about the security of their data. Overall, a 2014 survey found that 91% of Americans “agree” or “strongly agree” that people have lost control over how personal information is collected and used by all kinds of entities. Some 80% of social media users said they were concerned about advertisers and businesses accessing the data they share on social media platforms, and 64% said the government should do more to regulate advertisers. Six-in-ten Americans (61%) have said they would like to do more to protect their privacy. Additionally, two-thirds have said current laws are not good enough at protecting people’s privacy.

People’s issues with the social media experience go beyond privacy. Near the end of the 2016 election campaign, 37% of social media users said they were worn out by the political content they encountered, and large shares said social media interactions with those opposed to their views were stressful and frustrating. Large shares also said that social media interactions related to politics were less respectful, less conclusive, less civil and less informative than offline interactions.

By Jennifer Senior: You’re Not Alone When You’re on Google

We click “I agree” when downloading our apps, knowing full well that those apps are talking to other apps, telling them how much we eat and what music we listen to and when we ovulate. And has our behavior changed now that we’re alerted to the use of cookies on websites, thanks to that recent rule issued by the European Union? Speaking for myself, I would say it has not. Those alerts make me feel worse, because they reveal my impatience, my recklessness, my everyday failures of self-regulation. I seem to be forever surrendering my privacy in exchange for some short-term gain, rather than dutifully slogging through the decision tree of the cookie opt-out. So there you have one explanation for this so-called paradox: To fully apprehend our vulnerabilities as digital creatures would require far too much time and energy. More than that: It would require an entirely new set of instincts, a radically different cognitive framework from the one we now possess. (Companies get us to agree to policies and terms we never actually read; they make them so confusing and time-consuming that we don’t.) We think we’re alone while we’re buzzing through the mists of cyberspace — that a Google search is akin to thumbing through the Yellow Pages, because it feels just as solitary. But it isn’t. We are being watched and tracked; we simply don’t realize it, because we can’t see it or feel it. What many of us don’t realize, when we’re online, is how very much the technologies we’re using are reshaping our ideas about privacy without our notice. Which in turn reshapes our behaviors.

Aram Sinnreich and Barbara Romzek: Social Media Should Serve a Free Society, Not Mine Data (on the risks data collection poses to civic institutions, public discourse and individual privacy)

Cambridge Analytica, the U.K.-based political consulting firm, didn’t just collect personal data from the 270,000 people who used researcher Aleksandr Kogan’s online personality quiz — nor was the damage limited to 87 million of their friends.
As scholars of public accountability and digital media systems, we know that the business of social media is based on extracting user data and offering it for sale. Furthermore, this problem is not specific to Facebook. Other companies, including Google and Amazon, also gather and exploit extensive personal data, and they are locked in a digital arms race that we believe threatens to destroy privacy altogether. Governments need to be better guardians of public welfare — including privacy. Many companies using various aspects of technology in new ways have so far avoided regulation by stoking fears that rules might stifle innovation. Facebook and others have often claimed that they’re better at regulating themselves in an ever-changing environment than a slow-moving legislative process could be.

Tufekci: Zuckerberg’s So-Called Shift Toward Privacy

Here are four pressing questions about privacy that Mr. Zuckerberg conspicuously did not address: Will Facebook stop collecting data about people’s browsing behavior, which it does extensively? Will it stop purchasing information from data brokers who collect or “scrape” vast amounts of data about billions of people, often including information related to our health and finances? Will it stop creating “shadow profiles” — collections of data about people who aren’t even on Facebook? And most important: Will it change its fundamental business model, which is based on charging advertisers to take advantage of this widespread surveillance to “micro-target” consumers? Until Mr. Zuckerberg gives us satisfying answers to those questions, any effort to make Facebook truly “privacy-focused” is sure to disappoint.

At the moment, critics can hold (and have held) Facebook accountable for its failure to adequately moderate the content it disseminates — allowing for hate speech, vaccine misinformation, fake news and so on. Once end-to-end encryption is put in place, Facebook can wash its hands of the content. We don’t want to end up with all the same problems we now have with viral content online — only with less visibility and nobody to hold responsible for it. Sheryl Sandberg, Facebook’s chief operating officer, likes to say that the company’s problem is that it has been “way too idealistic.” I think the problem is the invasive way it makes its money and its lack of meaningful oversight. Until those things change, I don’t expect any shift by the company toward privacy to matter much.