>> Amy Draves: My name is Amy Draves, and I'm here to introduce Rebecca MacKinnon, who is joining us as part of the Microsoft Research Visiting Speakers Series. Rebecca is here today to discuss her book, Consent of the Networked: The Worldwide Struggle for Internet Freedom. She has been at the center of evolving debates about how the Internet will affect democracy, privacy, individual liberties, and the other values free societies want to defend, and she provides a way to think about the future of citizenship in the digital age. Rebecca MacKinnon is a Bernard L. Schwartz Senior Fellow at the New America Foundation, where she conducts research, writing, and advocacy on global Internet policy and the impact of digital technologies on human rights. Please join me in giving her a very warm welcome. [applause]

>> Rebecca MacKinnon: Thank you so much for having me here today and giving me the opportunity to talk and, hopefully, to have a bit of a conversation afterwards about some of these issues. I didn't plan it this way when I first started to work on my book, but it was quite amazing that, as I was in the middle of writing it, the Arab Spring happened last year, and that of course resulted in my revising it somewhat and refocusing it. One of the interesting things about the Arab Spring has been the debate about to what extent the Internet caused it: to what extent did certain social networking platforms play a critical role, would the Arab Spring have happened without the Internet, to what extent did it have to do with the anti-corruption and anti-torture movements that had been going on in Tunisia and Egypt for some time, and to what extent did the ability of people to network together using new digital technologies really make a difference? I think those debates are going to continue for some time as social scientists really research what happened and look at all the causes and effects. But what my book really deals with is not so much the question of whether the net is helping the good guys more than the bad guys, or helping authoritarian governments more than it's helping citizens, but how do we ensure that the Internet evolves in a manner that's ultimately compatible with democracy? I don't think the answer is inevitable one way or the other. I think it depends on what we all do with it, on it, and about it to ensure that the Internet really can support the type of society we'd like to have in our own country and that people all around the world aspire to.

One of the really interesting things that happened this fall, actually after my book went to the publisher and was finalized, is that the man who runs the Tunisian Internet Agency, Moez Chakchouk, started to give speeches about how his agency, during the Ben Ali dictatorship, had been beta testing censorware and surveillance technologies provided by Western companies on Tunisia's networks, and that these technologies were then being marketed around the region. But the story of censorship and surveillance, and the concerns and debates around them, has not ended just because Tunisia had a revolution and just held a fairly successful democratic election for a constituent assembly that's now trying to put together a constitution and figure out how to create the institutions needed to support a democratic society in a Muslim country.
And as it happens, several months after Ben Ali stepped down, during the transitional period, the government started censoring again, based on the demands of a number of constituencies, primarily people concerned about family values in a Muslim society who felt that certain content was simply inappropriate. So a number of Facebook pages have been blocked since May of last year, along with a number of other websites considered inflammatory or obscene according to the standards of the people currently in charge of the country. This has sparked a big debate about what to do, and Tunisia's high court is actually going to rule later this month on whether or not this censorship should stand. There are quite a number of people in the democratically elected assembly who want the censorship to happen because, as they see it, this is a conservative Muslim society and certain content is inappropriate. So this demand is coming from democratically elected representatives.

Moez Chakchouk, the Tunisian Internet Agency head, is of course on Twitter, along with so many other people, and somebody asked him on Twitter what's going on with the censorship controversy, and he responded, well, it's not decided. And how it plays out really depends on how non-governmental organizations and activists respond and whether civil society keeps pushing for a free and open Internet. Chakchouk is running this agency, and he's actually made statements about how he wants to figure out how Tunisia's networks should be governed and structured to support a democratic society: so that there can be dissent, so that unpopular minorities still have the ability to speak out, so that there's still room for controversial speech and speech that makes some people, or the majority, uncomfortable or that the majority doesn't want to hear. How do you ensure that those who have been empowered by democratic voters, and who choose to censor certain content in the name of public mores, don't abuse that power and start censoring more and more things? How do you hold them accountable and prevent mission creep? And when it comes to law enforcement, a certain amount of surveillance on networks needs to happen just to catch criminals in any country that has meaningful amounts of Internet use. So what do you do to ensure that surveillance in the name of law enforcement is not abused against minorities, or against people who may be saying critical things about the ruling government, even if that government has been democratically elected? The interesting thing is that this is an argument we're now having all over the world: how do democratic societies handle surveillance and handle content control, management, censorship, whatever you want to call it, and prevent the abuse of power within this context?

There's also the question of what choices companies are making. You can have activists using platforms like Facebook to challenge the legitimacy and sovereignty of their governments. And this is an example of a page called We Are All Khaled Said, which was set up in honor of a young man who was beaten to death by police in the summer of 2010. He became the cause célèbre of an anti-torture movement that had actually been building for some years.
They organized several demonstrations in the summer and fall of 2010 and then organized another big demonstration on January 25th of last year, which ended up snowballing and turning into the Tahrir Square demonstrations that brought down the regime. But what's very interesting is that the people who set up this page originally did not feel safe using their real names in connection with their Facebook accounts, because they didn't want to end up like Khaled Said. Facebook, however, has a real-name policy. So if your account is being used with a pseudonym, and that gets reported -- and you're more likely to get reported if your page is active and controversial like this one was -- then the administrators will disable your account. And so on the eve of a major demonstration they were planning, right around Thanksgiving of 2010, the page went down. Just at the critical time when they were trying to organize people, it went down, because it had been brought to the administrators' attention in Palo Alto that the administrators of this page were violating the terms of service. So then Wael Ghonim, the Google employee who was pseudonymously running this page along with some other people, got in touch with people he knew in the US, and some of the other people connected with the page got in touch with some human rights groups I know, who got in touch with Facebook and got the page reinstated under the account of an Egyptian woman in Washington, DC, who was willing to use her real name, so that it was technically no longer violating the terms of service.

This is just one of those examples where, yes, the We Are All Khaled Said movement and the anti-torture movement, and the use of social media around them on both Twitter and Facebook, were a critical element of bringing the regime down. But at the same time, these networks, and the choices they're making about how they handle the rules, the terms of service, the conditions under which people can use their networks, sometimes have consequences for activists that can be troubling -- and not just people's accounts being deactivated. When you have sudden changes in privacy policies, which happens on Facebook, it seems, with some regularity, people's friend networks suddenly get exposed. Mark Zuckerberg may say the world would be a better place if we were all transparent. But if you're being tortured for your Facebook password by Iranian police and your friends are all on the network using their real names, that's kind of unfortunate. Or even if nobody has hacked into your account, but you thought your friend network was private and then suddenly it's revealed publicly, there can be serious physical consequences for human beings. So there's a question about the choices that companies are making.

And this is a screenshot. You've probably heard of Twitter's recent decision that, as they roll out offices in more countries, they're going to have to comply with local laws and block tweets in certain jurisdictions where they receive legally binding orders. So far they haven't implemented the policy.
The people who run Twitter say they're not going to be responding to Chinese government demands or Syrian government demands; this is mainly about places where they're going to have offices in the near future, and it really deals with their response to democratic governments' requests, so that they don't expose their employees to criminal liability in countries like Brazil, the UK, Germany, Italy, or France. But again, we're going to have to see how that plays out. There are a lot of activists in a lot of countries who are wondering whether they're going to still be able to use this tool for activism as they once did.

So in this new world that we've entered as citizens, no matter where we live -- assuming we're in a country with a critical mass of people using the Internet and mobile phones, maybe not the majority but at least the majority of the elites -- our relationship with our government is increasingly mediated through digital platforms, devices, and services. The question, then, is how we ensure that as this whole ecosystem evolves, it evolves in a way that maximizes the empowerment of the citizen, and doesn't end up turning this layer into an opaque extension of state power, or into something that maximizes the interests of various corporations without thinking too hard or too consistently about how their decisions affect citizens and citizens' relationship with government.

And of course there are plenty of arguments going on in this country. This is a screenshot of Wikipedia a couple of weeks ago, when Wikipedia decided to join the protest against the Stop Online Piracy Act, a bill in Congress that seems to have effectively been killed, along with its sister bill in the Senate. It would have instituted a system of DNS filtering -- blocking of overseas websites deemed to be infringing copyright -- but it would also have imposed liability on intermediary websites and Internet services: basically, if they were found to be facilitating or hosting infringing content, they could be held liable, in some cases even criminally. And this would affect everything from Google to Twitter to Facebook to nonprofit groups like Wikipedia and activist groups like Global Voices, a bloggers' network I co-founded. If we're going to be held liable for everything our users post, that makes it much harder to run a citizen media community, let alone a company. And there are serious implications for free speech even in this country. So we, as well, are having debates about how you resolve legitimate problems. Piracy is a legitimate problem. So are child exploitation and the need to protect children online, and the need to fight crime and terror and to extend that fight to our digital networks. But how do you ensure that in trying to solve these real problems you're not eroding civil liberties and eroding the ways in which power, and the abuse of power, is held accountable? And we don't have a lot of good answers to that.
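As a purely illustrative aside, not part of the talk itself: the DNS-filtering mechanism mentioned above can be sketched in a few lines of Python. The idea is simply that a censoring resolver refuses to answer truthfully for blocklisted domain names, so the site still exists but users of that resolver can't reach it by name. All of the domain names, addresses, and the helper function below are hypothetical.

```python
# Purely illustrative sketch: how resolver-level DNS blocking works in principle.
# A censoring resolver intercepts lookups for blocklisted domains and either
# returns no answer (simulating NXDOMAIN) or points them at a "sinkhole" address.
# All domain names and IP addresses below are hypothetical examples.

BLOCKLIST = {"infringing-site.example", "blocked-forum.example"}
SINKHOLE_IP = "203.0.113.1"  # documentation-range address, not a real host

# The records the resolver would normally return if it answered honestly.
REAL_RECORDS = {
    "infringing-site.example": "198.51.100.7",
    "ordinary-site.example": "198.51.100.42",
}

def censored_resolve(domain: str) -> str | None:
    """Return an IP for `domain`, or a sinkhole address if it is blocklisted."""
    if domain in BLOCKLIST:
        return SINKHOLE_IP  # or return None to simulate an NXDOMAIN answer
    return REAL_RECORDS.get(domain)

if __name__ == "__main__":
    for name in ("ordinary-site.example", "infringing-site.example"):
        print(name, "->", censored_resolve(name))
```

Because the target site's own servers are untouched by this kind of block, it can usually be routed around with a different resolver or a VPN -- which is one reason the platform-level takedowns described later in the talk are a different matter.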
But one of the, I think, hopeful results of this kind of online activism against SOPA is that, at least in this country but I think also globally, people are starting to pay more attention to how laws and regulations, and also corporate actions, affect our civil rights and our political rights in the physical world. Now, in the debates about SOPA, I actually wrote an op-ed titled Stop the Great Firewall of America. And of course I don't think the United States is going to turn into China anytime soon. But one of the things that did trouble me about the Stop Online Piracy Act was that, in pursuit of solutions to legitimate problems, it was reaching for mechanisms that are technically, and in many ways legally, very similar to what China does to censor the Internet.

And this is the Great Firewall of China in action. That's a screenshot, using a Chrome browser, of what you see when you try to access Facebook from a Chinese Internet connection; if you're using a Chinese browser, of course, it will come up in Chinese. But that's really only the first layer of Chinese Internet censorship. The second layer is what lawyers in this country might call intermediary liability. The Chinese call it self-discipline. What this means is that any Internet company running a social media platform, search engine, ISP, or mobile service -- anything involved with transmitting user or customer content through its network or with publishing user content -- is held responsible for what its users do. So if your social network, or your e-mail service for that matter, or your instant messaging service is being used in ways that the government finds threatening to state power, you can have your business license revoked very easily.

And this is an award ceremony I attended in China in 2009. I sat way in the back, so I wasn't able to get a good picture, so I used this photo from the event's website instead. They gave out an award called the Self-Discipline Award to the top 20 companies and provincial-level websites that did the best job of policing their users' content, keeping off harmful content, and making the Internet a safe and harmonious place. What's really interesting is that the language they use sounds very similar to the child-protection and crime-fighting language we use when talking about keeping the Internet safe, except their definition is much broader. And, by the way, one of the recipients that year was Robin Li, the CEO of Baidu, China's main search engine. So Chinese companies are actually doing most of the handiwork in terms of deleting controversial content and helping to monitor users. I know people who work for Chinese Internet companies and have sat there at lunch while they're receiving texts and e-mails and messages from several different government departments telling them what they need to delete -- this account is getting too much traction on that topic and you need to freeze it, and so on. So it's very actively managed. And this is an example I thought it would be fun to show. Baidu has a blogging service -- there are several hundred different blogging services in China, and this is one of them -- where, if you try to post certain content, you can't even publish it.
So this is an article that I tried to publish on Baidu's blogging platform about the Nobel Peace Prize winner Liu Xiaobo, the democracy petition that he had written, and his jailing, and I couldn't even publish it. I got this message saying, sorry, your article has failed to be published; it contains inappropriate content; please check it. So this is one way in which -- you know, it's not just about blocking content that smart people can access through the use of the proper VPNs and circumvention tools; some content is just kept off the Chinese Internet with some thoroughness.

And this is a fun example I want to show. There's been a lot of coverage of what's called Weibo, which is the Chinese version of Twitter, run by a Chinese company inside China. A lot of the international media organizations use Chinese social media to reach Chinese audiences, because the foreign social networks -- Facebook and Twitter -- are all blocked. So if you want to reach the Chinese public, you use Chinese social media. Deutsche Welle, a German broadcaster with a Chinese service, set up an account on Weibo to post little items about various stories they were doing. And their account got deleted, because they posted something that the Weibo administrators felt was going to get them in trouble. So they reestablished a new account and wrote this thing saying: we've reestablished our account; we've never done anything against Chinese law, and we never touched any taboos that Sina has specifically outlined in its terms of service. And then they got this little popup warning: your account will be deleted if you create more trouble. And sure enough, in short order, they got deleted. This is an error message that appeared soon after their account was deactivated for the second time. I don't have a translation here, but it says, we're sorry, you have accessed a page that no longer exists or you've accessed this page in error -- it's just gone, and there's this cute little cartoon. People who use the Chinese Internet run across these types of messages all the time. It's not the network 404 browser error; it's an error page put up by the company saying this content no longer exists, it's been taken down. You see that all over the Internet in China. And you can't circumvent your way around it, because the content has been deleted.

And so China in a lot of ways -- and I discuss this in the book, in a full chapter -- is exhibit A for what happens when you take an unaccountable government and pair it with corporations that just do what the government wants: you get China. And this is one of the reasons why the Chinese government has been able to let people feel freer than they've ever felt before -- people are able to talk about more things on social media than they could in traditional media, they can report on local malfeasance or train crashes and that kind of thing -- yet when it comes to trying to organize an opposition party or circulate a petition for multi-party democracy, people doing that will go to jail as usual. They've managed to exercise enough control over Chinese Internet companies -- not total control, but enough -- to prevent people from using social media in the way that people did in the Middle East and North Africa. Whoops. I just went backward.
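As another purely illustrative aside, not part of the talk: the distinction just described -- network-level filtering versus platform-level deletion -- can be sketched in Python. A network-level block typically shows up as a connection that never succeeds and can be bypassed with a proxy or VPN, while a platform-level deletion returns a perfectly normal page from the company saying the content is gone, so rerouting the request doesn't bring anything back. The data structure, function, and marker strings below are hypothetical.

```python
# Purely illustrative sketch: telling the two layers of censorship apart.
# A network-level block usually looks like a connection that never succeeds
# and can be bypassed with a proxy or VPN; a platform-level deletion returns
# a normal HTTP 200 page from the company saying the content is gone, so
# rerouting the request does not bring the content back.
# The fields, function, and marker strings are hypothetical examples.

from dataclasses import dataclass

@dataclass
class FetchResult:
    connected: bool       # did the connection to the site succeed?
    status: int | None    # HTTP status code, if a response came back
    body: str = ""        # response body, if any

def classify_block(direct: FetchResult, via_proxy: FetchResult) -> str:
    deletion_markers = ("no longer exists", "has been deleted")
    if not direct.connected and via_proxy.connected:
        return "network-level filtering (circumventable with a proxy or VPN)"
    if direct.connected and direct.status == 200 and any(
            marker in direct.body.lower() for marker in deletion_markers):
        return "platform-level deletion (content removed at the source)"
    return "not obviously blocked"

# Hypothetical observations resembling the Deutsche Welle example above:
takedown_page = FetchResult(connected=True, status=200,
                            body="Sorry, the page you requested no longer exists.")
print(classify_block(direct=takedown_page, via_proxy=takedown_page))
```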
So one of the things I talk about in the book is how sovereignty is getting more confusing in the digital age. This is the traditional map of the world with countries. In democracies, at least, we vote for our Congress, and then our Congress passes laws, and those laws govern what companies can do in this country. But the problem is that you have a situation where Congress can pass a law that affects American Internet companies, which have customers all over the world, and it will affect people everywhere else who did not vote for Congress. So you can have a situation, as we did essentially with SOPA and PIPA, where a law was going to benefit some key constituencies in the United States but was going to have a huge impact on Internet users around the world, and those Internet users had no say in the matter, no way of holding accountable the nationally elected representatives who were democratically passing this law. And we can get into arguments about lobbying and the way laws get made and passed in this country, which is another interesting question. But the fact of the matter is that because the Internet is a globally interconnected network, yet you have nation states passing laws that affect this global network, it ends up being a mismatch when it comes to political accountability.

You also have social networks, and the Internet companies, technology companies. This is the world map of social networks. Facebook is the most popular network in much of the world these days, except in China, where it's blocked, and in Russian-speaking countries, where for a complicated set of reasons Russian-language services are still dominant. Sometimes, quite often, people are able to use these networks to challenge the sovereignty and legitimacy of the government they're living under physically. But other times these services and companies are making decisions that might affect you negatively in terms of your ability to speak truth to power, to understand what's going on in the world, and to organize. So in the book there's a section on what I call the Sovereigns of Cyberspace, and this issue where you have companies -- and companies aren't supposed to be democracies; you don't want to hold an election amongst all your customers and users before you figure out how to develop a product; you can't really run a company that way or innovate or develop software or do anything. But at the same time, the companies that are running online platforms, online services, and even a lot of devices that are globally networked together are playing a governance role. They're determining what we can and cannot do in our digital lives, which ultimately has an impact on what we can do economically and politically in our physical world. And the question is to what extent the sovereigns of cyberspace are considering these issues, to what extent they should, and to what extent they should be held accountable for their decisions in terms of how those decisions affect us politically. So to put it another way, just picture the mismatch that's going on: technology companies are mediating our relationship with government.
And under a democratic nation state, the idea is that citizens vote for their political representatives, and then if companies are doing something that harms citizens' rights in some way, or makes them sick, or pollutes the air, or whatever it is, then government regulates the companies. That's the chain of accountability, right? But then you have the problem where companies are increasingly considered people, legally, in this country, and have direct influence on the government for that reason, so increasingly you have laws that get passed that are basically written by certain private-sector interests. So that's a question, and you've got a bit of a problem with that accountability loop. And companies, because they're shaping how we can organize online, how our identity is configured, what other people know about us and what they don't, and what the government might know about us and what it might not -- through law enforcement requests and so on -- are also having an impact on how our relationship with government is shaped.

So one of the things I raise in the book is the need for citizens to engage more directly with companies, and to organize more directly, to raise concerns and perhaps even bargain and negotiate a bit: I'm a user of this particular product, I like it a lot, I want to keep using it, but here are some real concerns I have about the implications for my privacy or my rights that I would really like you to address. And we're starting to see that with some companies -- people starting to organize in a more systematic way to try to get companies to change certain practices. I talk more specifically about that in the book. But then, like I said, you've got the problem that this is not a nation-state-by-nation-state situation. What a company based in one country does is of course affecting people all over the world, and many governments -- particularly the US government, but also others -- that regulate companies with global user bases are making decisions and regulatory moves that end up affecting Internet users all over the world who can't hold them accountable. So then there's the whole issue that if somebody in India is being affected by a US law that affects the social networks they use, they can't really take it to the US Congress, because the US Congress doesn't care what that person in India thinks -- they're not a constituent. So the person in India really does need the companies to listen to them and to speak on their behalf if that bad law is going to be addressed. And that provides yet another layer of complexity here.

And because I just showed a really excessively complex graphic, I'm going to now show a funny video. This is my favorite clip from Monty Python and the Holy Grail. How many of you have seen that movie? Yeah. So you'll all remember this. I was one of those people in high school who had it memorized. And this is my favorite clip. And I think it really highlights one of the points I try to make in the book, which is [inaudible].

>>: [inaudible] majority in the case --
>>: Be quiet. I order you to be quiet.
>>: Order, eh? Who does he think he is?
>>: I'm your king.
>>: Well, I didn't vote for you.
>>: You don't vote for kings.
>>: Well, how do you become king, then?
>>: The Lady of the Lake, her arm clad in the purest shimmering samite, held aloft Excalibur from the bosom of the water, signifying by divine providence that I, Arthur, was to carry Excalibur. That is why I am your king.
>>: Listen. Strange women lying in ponds distributing swords is no basis for a system of government. Supreme executive power derives from a mandate from the masses, not from some farcical aquatic ceremony.
>>: Be quiet.
>>: You can't expect to wield supreme executive power just because some watery tart threw a sword at you.
>>: Shut up.

>> Rebecca MacKinnon: I love this clip for a lot of reasons, but the reason it's relevant to the book is that it juxtaposes two completely different ways of thinking about power and governance. The first is the divine right of kings, which up until a certain point in history everyone assumed was the only way you could run a kingdom -- it wasn't even called a state then. Naturally you had this king whom God had anointed and given power, and everybody had to do what he said. Then eventually we evolved politically -- we innovated politically -- and over about 800 years arrived not only at the notion of the consent of the governed but at the American Revolution, which was the first successful attempt at implementing consent of the governed and the notion that the divine right of kings doesn't work so well for most people, and that government should only be considered legitimate if it derives from the consent of the people it governs.

One of the problems is that while that's worked for us -- and even authoritarian countries today still try to make the case that they govern with the consent of their populations, by holding fake elections or whatever it is they do to justify that argument -- with consent of the governed based on the nation state, we've hit yet another point in history where our assumptions about the way things should be done are no longer working so well for us, for all the reasons I talked about: the nation state trying to govern these global networks is resulting in unaccountable situations, and you have private power governing digital spaces. So how do we move from consent of the governed to consent of the networked? That's what we need to figure out. And we don't have a lot of answers. I think at this moment we're at the Magna Carta moment, not the American Revolution moment, in that we're recognizing that the old way of organizing power and organizing government isn't working so well for the majority of Internet users on the planet, but we're not quite sure how to get to where we want to go.

Another way to think about it: I like to quote a professor at the University of Texas at Austin who talks about the pre-Internet world as a desert. It was an informational and communications desert -- an economy and governance system of scarcity, basically. And then suddenly the rain came, and we now have a rain forest in which there are all these new organisms. And the way you organize an economy, the way you organize a civilization -- or a system of governance, or security systems, and so on -- in a rain forest is totally different from the way you would organize one in a desert. And so we're having to rethink everything.
And we haven't quite worked out how to create a sustainable, safe, and civilized society in the rain forest, one that really serves the interests and rights of everybody who inhabits it. One of the other things -- and I'm almost done, and then we can get to questions -- one of the other very important elements I talk about in the book is what I call the digital commons. It's a combination of a technical commons -- the shared, non-proprietary standards that make the Internet possible, and the open-source software that civil society and activist groups in developing countries in particular really rely upon to run their organizations because they can't afford a lot of licensed products -- and also a commons of content: people creating written content or visual content, or creating platforms for organizing, based on motivations that are not commercial but social, cultural, or otherwise. And this is really the glue. I argue in the book that, yes, we need companies; yes, we need copyright of some kind; property rights are important. But it's also really important to have a robust and healthy digital commons if we want robust and healthy democracy going forward.

And just really quickly, as we're thinking about this -- I know Mary's here, and she's been working on Internet governance issues -- there are experiments going on with new ways to coordinate and govern resources across the Internet, and efforts to create more forms of what's known as multistakeholder governance, so that it's not just governments making decisions, or groups of engineers making decisions, but bringing together civil society groups and technical communities and governments -- everybody who has a stake in how these resources are managed and how regulations are shaped -- to negotiate how to go forward. And I talk about that. You're also starting to see a number of declarations of rights: people basically taking the Universal Declaration of Human Rights, which has stood us in pretty good stead for a long time and has turned out to be quite technology-neutral in its language, and saying, here's how it applies to the Internet, and here's how governments and companies should think about protecting fundamental human rights while they're either regulating the Internet or building technologies, platforms, and services on it. People are also, as I mentioned, starting to get more and more political in terms of policy, and this is one of many activist websites around SOPA. And there are also initiatives -- one that Microsoft is part of is the Global Network Initiative -- where the few companies that have had the guts to sign on commit to core principles on free expression and privacy, agree to be evaluated independently on the extent to which they're actually living up to those principles, and agree to work with civil society groups, human rights activists, and academic researchers to figure out how to anticipate difficult human rights problems around the world and to resolve the ones that come up.
So Microsoft, as a member of the Global Network Initiative, actually worked very closely with some human rights groups that were part of the initiative when you ran into problems with the Russian government using pirated Microsoft software as an excuse to crack down on activist groups. Some of the human rights groups worked together with Microsoft to figure out how to respond, how to help the groups that had been inadvertently hurt, and how to move forward in a way that was win-win for everybody. So what this is, in many ways, is a recognition -- by a few companies, anyway -- that there are responsibilities. Just as, if you're running a factory making tennis shoes, your primary goal is to sell a lot of shoes and make a lot of money, but if you're hiring 10-year-olds and polluting the air and water, then you lose your social license to operate; you're creating so much negative value for society that the question becomes how you fit into the picture. And so similarly with privacy, with freedom of expression, with the kinds of things we need to make democracy healthy and functional going forward: what are companies' responsibilities, and how can they contribute to that -- how do you do well and do good at the same time? That's the kind of issue that the Global Network Initiative, which I'm part of, is trying to work with companies to figure out. And, as I mentioned before, we're starting to see more company-targeted activist groups that are trying to raise awareness about certain things companies are doing and to get them to change, and I think we're going to see more and more of that in a more organized fashion.

And just to sum up: this is my favorite paragraph from the Occupy Wall Street movement, which I think applies to kind of everything. We're at a point in time where a lot of things are not working, both in the way corporate governance norms have functioned in a lot of sectors and in the way government has functioned in a lot of sectors globally, and a lot of people are demanding new ways of thinking about things and new solutions. I tend to believe in capitalism. I spent too long in communist countries to think that innovative companies aren't a good thing. But we need to work together to figure this out and to deal with what I think are some of the failures of our current system -- of politics, geopolitics, corporate governance, and business norms -- to constrain power and to support the kind of values that we hold dear in this country and that people around the world aspire to. But with that, I've spoken a little longer than I had planned. I'll stop and welcome your thoughts or comments or questions or whatever. Yes?

>>: Do you think we need new institutions or some kind of new laws to make sure the Internet stays free?

>> Rebecca MacKinnon: Yeah. Well, I think that's a really good question. And I think the danger with new institutions is how you build them. There are some people who are calling for more global government to deal with the mismatch. But then the question is who runs it.
So there's a big argument going on about whether the United Nations should be more involved with governing the Internet, or whether it should be left more to decentralized, multistakeholder bodies. And the problem with the United Nations is that a lot of the governments in it are not very accountable to their people and, one could argue, don't have their people's best interests at heart. So I think at the moment it's probably less about building new institutions and more about getting everybody more involved in existing processes. Part of the reason we have so much of this governance mismatch is that people who use the Internet tend to think of themselves as users rather than as citizens of these spaces, and so they aren't pushing back when they're unhappy about how things are being run. People need to get more active in that regard, and also pay more attention when their legislature is passing stupid laws. And one of the problems is that the press doesn't cover this stuff very much, so we need, as a society, to get more concerned and to recognize that Internet law and regulation is really central to our liberties and our futures going forward, and we need to pay as much attention to it as we pay to other kinds of regulation. Yes?

>>: You gave a great shout-out to GNI, and I know you [inaudible] in helping set it up. And you just talked about the need to get more people and companies involved in the process. I think the GNI is great; it's multistakeholder and it's voluntary. Why hasn't it gained more traction?

>> Rebecca MacKinnon: Yeah.

>>: What's the problem?

>> Rebecca MacKinnon: Well, I think part of it is that it takes a while for any new industry to recognize that it's not God's gift to humanity in all ways. It took decades for other industries to recognize that they needed to be accountable for certain things other than just making money. And because a lot of technology and Internet companies do have aspects to them that are very empowering and that are being used by human rights activists in really positive ways, I think that leads to arrogance, and to some companies just saying, we don't need help, we're already a force for liberation and you're just dragging us down. Facebook's attitude is, oh, well, we're going to have one billion users by the summer anyway; obviously people are fine with us or we wouldn't be growing; so screw you, why do I want a bunch of human rights activists telling me what to do? And I think some companies are worried that if they come out and say, we accept that we have responsibilities, they're painting a target on their back and opening themselves up for criticism. And I'm hoping that over time that will change, and that companies will be more proactive and recognize that if they make proactive commitments to their users' and customers' rights, that will build trust, which will be good for their business.
And if they admit publicly that they're not perfect, that they can't figure out everything by themselves, that they're not God, and if they reach out to others for help and cooperation in working through difficult problems, that's going to make them better and build their trustworthiness over time. That takes a certain maturity that I think a lot of Internet companies, both literally and figuratively, do not have at the moment. But I'm hoping that over time, for the same reasons that a growing number of companies -- never without problems -- are recognizing that sustainability is actually in their long-term business interests, more companies will see this as being in their long-term interest as well.

>>: [inaudible].

>> Rebecca MacKinnon: Yeah, I hope so too. Yeah?

>>: I agree with what you're saying. But the choices the companies make -- wouldn't that prevent them from being able to have a presence in certain countries?

>> Rebecca MacKinnon: Yeah.

>>: So it's a little bit of a catch-22, because their technology might be being used within a country in a way that benefits --

>> Rebecca MacKinnon: Yeah.

>>: -- benefits human rights, but if they come out publicly with that agenda, it's very possible that --

>> Rebecca MacKinnon: Yeah.

>>: -- they're going to be blocked in the countries where they're actually being useful now. So it's a little bit of a catch-22.

>> Rebecca MacKinnon: Yeah.

>>: It seems naive to think that we're ever going to have consensus amongst all the countries -- everybody buying in and saying, yes, on the Internet anyone can post anything they want and we'll let everyone see it -- because they're scared.

>> Rebecca MacKinnon: Right.

>>: The US government is scared, forget these totalitarian governments.

>> Rebecca MacKinnon: Yeah.

>>: So it really, I think, puts companies in a difficult place.

>> Rebecca MacKinnon: Right. Yeah.

>>: It's not necessarily just because of a financial incentive.

>> Rebecca MacKinnon: Yeah. No, I totally agree with you. And actually in the book I talk about this, and it's one reason why, with the Global Network Initiative, we're not saying that companies should pull out of China. Google made that decision based on their own business and their own set of criteria. But Microsoft is staying in China. And where we end up coming down is that companies need to be transparent about what they're doing. Ultimately, there are very few countries in the world where governments aren't making demands on companies to either take content down or hand data over -- there's basically nowhere, right? So if you're going to keep the Internet totally free of all controls, you're not going to do business anywhere. The question is whether you're doing this in a way that users understand what's going on, so that if it's being abused they know who to hold accountable and responsible. And that's one of the issues in this country as well, let alone China. So for instance with Microsoft in China, what makes Bing different from Baidu is that Bing only responds to legally binding requests, and in writing, from what I understand, whereas Baidu will take stuff down every time they get a phone call or text message from the authorities. So it's an acceptance that we do need to be realistic.
And I agree it's better that Microsoft's in China than not. But it's not about engage or disengage; it's about how you engage. And you do have to make difficult choices. Microsoft chose not to do Hotmail in China. Why? Because they didn't want to end up like Yahoo, handing dissidents over to the government. So while Microsoft is in China, it's also made some decisions; it's thought through the way in which it's going to be in. So I think a lot of it is just thinking through what's going to happen in certain scenarios before you go in. I think that's really important too. So, yeah. You had your hand up first, I think.

>>: In several places you seem to equate the Internet with the corporate Internet. I think your first slide asked how to ensure that the Internet evolves in a way compatible with democracy, and your final slide was how to ensure that capitalism evolves in a way compatible with democracy. There was also the slide where citizens-Internet-government changed into citizens-companies-government. I'm wondering if there's any use or hope for the non-corporate Internet, like Usenet, to fulfill the goals you're talking about.

>> Rebecca MacKinnon: Well, I think there are very important non-commercial spaces that have been operating from the beginning and that have sizeable communities. I think it may be unrealistic to expect that they will be used by the majority of Internet users on the planet, especially given that, in the news just the other day, smartphone sales outpaced PC sales. You now have a situation where the next billion people going online are going to be doing so almost exclusively through their phones. And, to use a phrase from another author who writes about the Internet, for the people who are coming onto the Internet now -- the new users of the Internet -- there's less and less ability to control what tools they're using, or the context in which they're using them, or to choose a non-commercial service over a commercial one. And that's quite troubling too. I don't think it necessarily should be that way, but that's where it's going. So realistically, while I write about some of these projects -- Tor and others, and efforts to create non-commercial spaces -- and about the need for those spaces to exist for people who have the time and ability to use them, because they're absolutely vital and critical, the majority of the world's Internet users are going to be using the Internet through almost entirely commercial spaces, with the exception of maybe somewhere down the stack there's a Linux server or something. So without the corporate accountability piece -- there are a number of people who have been writing that the solution now is to create a non-government, non-corporate parallel new Internet. And I think that's very nice and utopian, but it's going to work just about as well as the hippie communes did in the '60s, I'm afraid, for most people. There are some people who really need to go off the grid -- I'm making a slightly exaggerated equation, or analogy.
But the fact of the matter is that most people's digital lives are going to be mediated through corporate services somewhere along the way, and it's very, very hard, from the ISP on up to everything else, to be completely commercial-free. So without accountable companies -- unless we push for that very hard -- I think we're being unrealistic if we say, okay, we're going to build a new Internet over here. You had a question.

>>: You spoke about this already, but I just want to perhaps [inaudible]. What does responsible business look like now? You know, at Microsoft our performance groups are very into stack rankings, so [inaudible] graded on a curve. And you don't have to answer the question, but I'm curious what the curve looks like for you right now, in terms of which companies are wrestling with these issues thoughtfully and which aren't.

>> Rebecca MacKinnon: Yeah. Well, definitely I would say that the companies that have joined the Global Network Initiative -- Microsoft, Yahoo, and Google -- nobody's getting everything right, for sure, but they are at least trying to think through these issues and are reaching out to experts who can help them think through these issues as they're making their decisions. I think Twitter has decided that they're going to go it alone more, but they're actually doing many of the things that GNI advocates that companies do. They're sometimes having trouble communicating what they're doing in a way that helps people understand their intent, and we'll see how they evolve over time. To be honest, I've seen a great deal more arrogance coming from Facebook -- sort of an ideology of its founder that troubles me -- although they, too, as they mature as a company, are starting to reach out more to people in the human rights community for help on certain cases and so on. So it's hard to create grades on a curve. Cisco is really interesting, because they're not as bad as the media would paint them to be. Yes, their routers are used for filtering, but any router can be used for filtering, and I think there's been some misreporting about the configuration of some special router and so on. But a lot of their marketing in China, and the extent to which they market products to law enforcement, can be troubling. And they've been terrible about the way they communicate about these issues with the public, about their refusal to have a dialog even with their shareholders about human rights questions that have come up in the media. There have been socially responsible investment firms that have dropped Cisco's stock because they've gotten so frustrated with Cisco's refusal to just have a conversation about these issues. Meanwhile, I actually sat in on an IETF meeting where there were arguments going on about privacy standards, and the Cisco engineers were totally on the right side of the issue. So actually I think Cisco, if it were willing to have more of a conversation about what it's doing, would probably benefit. But for whatever reason, they aren't doing that.
So that's one of the things, too: companies being willing to have an honest dialog with their various stakeholders, and shareholders even, about the challenges they face and the decisions they're trying to make. One could go through more companies -- RIM has been pretty disappointing in how they've been handling government demands, and a range of companies are very opaque, and it's really not clear what the heck's going on. I'm not a BlackBerry user, but based on how they've handled surveillance demands from a range of governments, and the way they've refused to clarify what's going on -- it's as much that as anything else -- I feel less inclined to ever trust their product in the future, should I be given a free BlackBerry by somebody. So a lot of it, again, is just engaging with people who are concerned. And when you engage, people recognize that, as she pointed out, these are complex issues. A lot of times it's trade-offs and shades of gray. And a lot of it is about users understanding what's an intelligent use of your product if they happen to be at risk, and what's not an intelligent use of your product if you don't want to get jailed for something controversial -- just what certain products are for and what they're not for, and the need to communicate that better so that people don't feel victimized in ways that perhaps are not fair to all sides. So, yeah, that's a slightly indirect way of answering your question. But yeah.

>> Amy Draves: [inaudible] sign books.

>> Rebecca MacKinnon: Okay. We can chat over there while I'm signing books about other stuff.

>> Amy Draves: Thank you so much.

>> Rebecca MacKinnon: Thank you. [applause]