TRUSTED COMPUTING AND DIGITAL RIGHTS MANAGEMENT – A CASE STUDY IN NEW TECHNOLOGIES AND PRIVACY

Lindy Siegert

Introduction

Privacy is recognised explicitly in the International Covenant on Civil and Political Rights 1 but is also seen as integral to other rights such as security of the person and freedom of expression. The 2005 annual report on the state of privacy, published jointly by the Electronic Privacy Information Center and Privacy International, goes so far as to claim that privacy "underpins human dignity and other values such as freedom of association and freedom of speech. It has become one of the most important human rights of the modern age." 2

The New Zealand Privacy Commissioner, along with most data protection authorities, has a vital interest in new technologies such as trusted computing (TC) and digital rights management (DRM) and their potential to affect personal privacy. Some new technologies may protect privacy, such as one that camouflages closed circuit television recordings, preventing identification unless it is needed. Others can intrude on personal privacy by increasing the potential for surveillance or the exposure of personal information.

1 International Covenant on Civil and Political Rights (19 December 1966) 999 UNTS 171, art 17.
2 Marc Rotenberg, Cédric Laurant and others Privacy and Human Rights 2005: An International Survey of Privacy Laws and Developments (Electronic Privacy Information Center and Privacy International, Washington DC, 2006). Also available through the Privacy International website at <www.privacyinternational.org> (last accessed 10 July 2007).
Human Rights Research

As government is a key collector of personal information, its adoption of new technologies can dramatically affect privacy. The Privacy Act 1993 regulates government data matching for exactly that reason. TC and DRM technologies have been on the radar for privacy advocates for some years. They promise enhanced security and protection for information but have raised questions too. 3

This paper provides a case study of how government can develop a response to new technologies that respects and protects human rights while capturing the benefits.

The Human Rights Framework and New Technologies

New Zealand is fortunate that its Human Rights Commission can conclude that the country "meets international human rights standards in many respects, and often surpasses them", while acknowledging that there are still areas for improvement. 4 Challenges that the report identifies range from the eradication of child poverty to the human rights challenges raised by advances in biotechnologies such as DNA mapping and sequencing. Subtle challenges may, however, be more easily overlooked than the dramatic ones that attract media attention. In the case of new information and communication technologies (ICT), a major challenge is to recognise potential human rights issues early enough to apply a human-rights-based approach to the policy development around their possible adoption and implementation.

Given the ubiquity of personal information in ICT data stores and log records, identifying the privacy impacts of new technologies is a useful "first cut" indicator of possible human rights issues. As privacy protective legislation generally includes a right of access and correction to personal information held by an agency, privacy impact analysis will often highlight other issues around transparency and accountability.

3 The 2006 Big Brother Awards in Italy saw Trusted Computing awarded "Most Invasive Technology" and the Trusted Computing Group receive "Worst Private Group": see <www.edri.org/edrigram/number4.10/bbaitaly> (last accessed 10 July 2007).
4 New Zealand Human Rights Commission Human Rights in New Zealand Today/Ngā Tika Tangata O Te Motu (Human Rights Commission, Wellington, 2005) Executive Summary. Available in hard copy through the New Zealand Human Rights Commission (email nzaphr@hrc.co.nz, phone (04) 471 6754 or 0800 4 Your Rights, or by writing to the Human Rights Commission, PO Box 12-411, Wellington). Also available at <www.hrc.co.nz> (last accessed 10 July 2007).

Privacy and new technologies

Various international conventions and agreements clearly identify and accept privacy as a basic human right, albeit one balanced by other rights such as freedom of expression. 5 The close ties between privacy and the collection and handling of personal information mean that privacy advocates have long had an interest in how ICTs can affect privacy. Two recent publications that grapple with the human rights issues raised by new technologies 6 focus strongly on the privacy issues as exemplifying the tensions that new technologies bring for human rights. The UNESCO paper says of sensor technologies: "In terms of human rights, one of the most glaring issues is that of privacy." 7

Often, new technologies do not raise new privacy issues so much as provide new environments where old issues appear again, often with their intrusiveness magnified. Closed circuit television monitoring of public and private spaces has been around for a long time. Early privacy responses spoke to human monitoring of camera images displayed on screens and the proper handling of the films that recorded what the cameras saw. Now filming and recording take place digitally. That has changed the economics of storing and replaying recordings, making long-term capture of data more attractive and raising concerns about the need for such long-term storage. On the privacy-plus side, however, it has also brought the potential for digitally masking the identifying features of individuals unless and until their identification is required.

5 International Covenant on Civil and Political Rights, above n 1.
6 Mary Rundle and Chris Conley Ethical Implications of Emerging Technologies: A Survey (UNESCO, Paris, 2007), available at <http://unesdoc.unesco.org/images/0014/001499/149992E.pdf> (last accessed 10 July 2007). See also Dilemmas of Privacy and Surveillance: Challenges of Technological Change (The Royal Academy of Engineering, London, 2007), available at <www.raeng.org.uk/policy/reports/pdf/dilemmas_of_privacy_and_surveillance_report.pdf> (last accessed 10 July 2007).
7 Rundle and Conley, above n 6, 53.
The UNESCO paper sums it up for radio-frequency identification (RFID) technology this way:

Thus, in some ways, RFID represents a microcosm of the possibilities and threats of ICTs. If carefully deployed and regulated, RFIDs present an opportunity to improve many facets of life; if used without adequate safeguards for security, privacy, and other liberties, they threaten to bring nightmarish consequences. 8

An industry response to the concerns about these "nightmarish consequences" can be seen in a Royal Academy of Engineering publication, which calls for information technology services designed for privacy, rigorous public debate on how personal data can be protected, and consideration of how reciprocity between data subjects and data collectors can be built into business processes and systems.

Further substantial industry responses can be seen in the emergence of privacy- and other rights-oriented working groups within the ISO (International Organisation for Standardisation) standards development framework. 9 The Financial Services Technical Committee sub-committee on core banking services is developing a standard for Privacy Impact Assessment (ISO/CD 22307) for its industry. Other work is underway on cross-jurisdictional and ethical guidance for new technology implementations in biometrics (ISO/IEC JTC-1 SC-37) and on privacy and identity management in security techniques (ISO/IEC JTC-1 SC-27). At the national level, GS1 New Zealand has developed a voluntary standard for the responsible use of RFID in retail commerce 10 and the Biometrics Institute has developed a voluntary Code of Practice 11 for the use of biometrics that is approved by the Australian Privacy Commissioner.

8 Rundle and Conley, above n 6, 50.
9 For more information about the International Organisation for Standardisation (ISO), its committees, standards, standard development processes, and how it operates, see <www.iso.org> (last accessed 10 July 2007).
10 The EPC/RFID Retail Consumer Code of Practice is available from GS1 New Zealand at <www.gs1nz.org/EPCglobal/DownloadEPCRFIDCodeofPractice.aspx> (last accessed 10 July 2007).
11 The Biometrics Institute Privacy Code is available from the Biometrics Institute at <www.biometricsinstitute.org> (last accessed 10 July 2007).
Cyberspace and the human-rights-based approach

A fundamental concept of the human-rights-based approach is that the state, when developing policy, has a duty to do more than merely vet against legislation such as the Privacy Act 1993 and the New Zealand Bill of Rights Act 1990. Agnes Callamard reiterated this in a statement on human rights in cyberspace, where she said: "Governments must not only respect people's right to communicate. They must also protect and fulfill this right." 12 A human-rights-based approach to policy development should include participation and accountability, be non-discriminatory, empower individuals and groups, and be tied to the international human rights framework.

However, cyberspace exists over privately-owned, if regulated, networks. Dr Callamard asserted that the private sector has responsibilities towards freedom of expression in cyberspace, criticising the ability of internet service providers to make decisions regarding content and its removal without due process, transparency or a legal framework. Similarly, the technologies themselves are developed by the private sector. Companies are bound by the legal frameworks of the countries in which they do business, but they are not parties to the treaties and agreements which form the international human rights framework. The United Nations established the World Summit on the Information Society 13 to provide a forum in which to begin to tackle some of the issues in this complex environment.

In New Zealand, specific laws already exist to protect personal privacy in both the private and public sectors and to assist government accountability by ensuring public access to government documents. 14 The statutory officer responsible for the operation of the Privacy Act is the Privacy Commissioner.
12 Agnes Callamard "Strengthening the Human Rights Framework in and for Cyberspaces" (Paper presented at the UNESCO International Conference on Freedom of Expression in Cyberspace, Paris, France, 3-4 February 2005), available at <http://portal.unesco.org/ci/en/files/18245/11081314251statement_Callamard.doc/statement_Callamard.doc> (last accessed 10 July 2007).
13 World Summit on the Information Society <www.itu.int/wsis/index.html> (last accessed 10 July 2007).
14 Such laws as the Privacy Act 1993, the Human Rights Act 1993, and the Archives, Culture, and Heritage Reform Act 2000 ("the Archives Act").
The Privacy Commissioner and new technologies

The Privacy Commissioner has a range of options available for responding to the challenges that new ICTs raise for privacy. These include taking a "wait and see" position, developing educational material, or working with other privacy authorities to raise awareness of privacy-sensitive approaches to new technologies. Two years ago, the Commissioner established a Technology Team targeting three areas of activity:
· assessment and compliance monitoring of government authorised information matching programmes
· responding to the privacy implications of e-government initiatives (such as the all-of-government authentication project)
· developing the capability to assist the Commissioner in understanding and responding to the privacy impacts of new technologies (such as the surveillance potential of geographic location information provided by cell phones).

An example of the Office's involvement in these areas can be seen in the growth of authorised government information matching from 16 active programmes in June 2002 to 40 in June 2006. In that same period, the number of records about individuals disclosed in these programmes doubled. The all-of-government authentication project has been the subject of five privacy impact assessments so far, with more expected as the project progresses. The Technology Team has also produced consumer guidance on topics such as websites and personal information.

These approaches tend to be reactive and require both a fairly good understanding of what the technologies do and of the facts involved in particular cases. In the case of TC and DRM, the involvement of the Office was unusual. Here, the role we accepted and the nature of government's response to the technologies were an attempt to influence the evolution of these technologies rather than to respond to them.
TC/DRM Principles and Policies Project

In November 2003, the State Services Commission issued formal advice 15 to all government agencies that they should not permit the operation of the DRM features, called "information rights management" (IRM), of the then recently released Microsoft Server 2003 and Office 2003. That advice stated:

Using IRM may negatively impact on agencies' ability to meet their obligations under legislation such as the Official Information Act, the Archives Act, the Privacy Act, and impede the working of other legislation such as the Protected Disclosures Act and the National Library of New Zealand Act.

A year later, the Commission reiterated that advice in a briefing paper on TC 16 and published a comprehensive paper on internet security that included the issues raised by DRM technologies. 17 In that paper, the Commission stated government's responsibilities with respect to information as follows: "It needs to protect information which should be kept private, preserve information which must be archived, and make available information which should be published." 18 Those responsibilities were the focus of the work that followed.

The Commission's formal advice was unusual in the hard line it took against technologies that were, in 2003, only just barely commercially available. However, the concern raised about potential difficulties in meeting legislated obligations was only one of five reasons for issuing the advice. The others were the possibility that information could become inaccessible or unusable, uncertainties about backwards and cross-compatibility, the unknown implications of this new form of security, and concern about the long-term implications for government as a whole. Today, pundits predict that TC chips will become ubiquitous; they are already shipping in more than half of portable computers and in increasing numbers of desktop units. 19 Finding a sustainable response to the perceived threats is clearly of pressing concern.

So, what are these technologies? DRM technologies are a response to the threat of "perfect copies" of digitised media. In the traditional analogue world, a copy of an original is of inferior quality to the original. For example, a photocopy of a printed book page is harder to read, even if it is perfectly serviceable for most purposes. If a copy is then made of the copy, the quality deteriorates again, and the second generation copy is usually clearly less usable and attractive than the original. This is not necessarily true of digital media. All the information required to perfectly reproduce the item is contained in its digital form.

TC is a response to the very real threats of doing business in an insecure environment like the internet. 20 The Trusted Computing Group is an industry association, although not the only one, 21 that includes Microsoft, IBM, Hewlett Packard, AMD, Intel and other major players in hardware and software development. It has developed open specifications for TC building blocks, including a Trusted Platform Module (TPM), which is a microcontroller (chip) that provides core security services for other hardware and software applications to use. The chip stores the keys, passwords and digital certificates used to protect both the basic operating environment and the applications operating in that environment.

15 E-government Unit, State Services Commission "Advice on Digital Rights Management" (3 November 2003) News Release, available at <www.e.govt.nz> (last accessed 10 July 2007).
16 E-government Unit, State Services Commission "Trusted Computing Technologies: Briefing" (October 2004) News Release, 5 ["Trusted Computing Technologies: Briefing"], available at <www.e.govt.nz/policy/trust-security/trusted-200410/drm-200410.pdf> (last accessed 10 July 2007).
17 E-government Unit, State Services Commission Trust and Security on the Internet: Keeping the Internet Safe for E-government in New Zealand (State Services Commission, Wellington, 24 November 2004) 12 [Trust and Security on the Internet], available at <www.e.govt.nz/policy/trust-security/trust-security-2004/trust-security.pdf> (last accessed 10 July 2007).
18 Trust and Security on the Internet, above n 17, 12.
19 Roger Kay This Ain't Your Father's Internet: How Hardware Security Will Become Nearly Ubiquitous as a Rock Solid Solution to Safeguarding Connected Computing (Technology Pundits, 2005), available at <www.technologypundits.com/index.php?article_id=215> (last accessed 10 July 2007).
20 "Trusted Computing" in Wikipedia <http://en.wikipedia.org/wiki/Trusted_Computing> (last accessed 10 July 2007).
21 See the Open Trusted Computing Consortium <www.opentc.net> (last accessed 10 July 2007). AMD, IBM and HP also participate in the Open Trusted Computing Consortium.
Summarising some of the information from the Electronic Privacy Information Center's trove of material about DRM 22 gives a lay perspective on what these capabilities mean: 23

In the physical world, anonymous use (if not the acquisition) of media and its intellectual content is the norm; even the purchase of the content can be anonymous if cash is used:
· no-one is aware of the pages you read in the paper, unless they are reading over your shoulder in the train
· no-one is aware of the pages you re-read in a book, nor do they care if you do so 10 times because the material was complex or you loved the poem
· no-one is aware of the exact place where you paused the video to answer the phone, how long it stayed paused, or where you rewound to once off the phone.

All those actions and more can be tracked by DRM software, which as currently available is capable of:
· tracking use of the software
· reporting back to the copyright owner, seller or other party on that use, or even on other totally unrelated activity on your computer
· limiting the number of times you can read that page or view that video.

While TC and DRM are different technologies, when implemented in concert they can reinforce each other.

22 Electronic Privacy Information Center <www.epic.org/privacy/drm> (last accessed 10 July 2007).
23 What follows in these two paragraphs comes from Lindy Siegert "Digital Rights Management (DRM) Technology And Privacy: Background to Comments Made to an Agencies Leaders' Forum Meeting at the State Services Commission, 23 March 2005" <www.privacy.org.nz/library/digital-rights-management-drm-technology-and-privacy> (last accessed 10 July 2007).
State Services Commission response

The SSC established a Trusted Computing Steering Group to guide the project identified in the briefing paper, inviting the Privacy Commissioner to nominate a representative to the Group. The goal of the project was to develop principles and policies to guide government agencies' responses to TC/DRM. The SSC chose to address these technologies together because of the previously mentioned potential for them to interact and reinforce the capabilities each brings to the computing environment. Given the focus on principles and policies, the Commission asked specifically for representatives with a policy background rather than technology expertise.

SSC's early investigations into TC/DRM, as reported in its briefing paper, identified five potential concerns in the adoption of TC/DRM by government agencies: access to government information by all entitled to access it; the privacy of recorded personal information on government computers and the privacy of users of TC/DRM-protected computers and computer systems, including individuals logging into a government website; the long-term management of government information; the permanence of government records; and the legal obligations of government agencies under, for example, the Official Information Act 1982 and the Evidence Act 2006.

The following example 24 shows how these concerns could play out in a concrete case. Consider an electronically submitted vendor response to a tender call or request for proposals for a government project. That response as a whole would be commercially sensitive but would likely also contain proprietary material, information covered by patents held by the company, and details about individuals proposed for the project, such as their work histories and charge-out rates. A sensible vendor might well wish to provide some protection for that information by attaching digital rights controls to the document.
That protection raises privacy issues, but it also raises general concerns about the long-term integrity of government-held information. The government agency that received the document would have manifold responsibilities with respect to it. Any personal information would require treatment in compliance with the Privacy Act, including having a mechanism for processing requests for access by an individual to their own information. 25 Other access to the information would be subject to the Official Information Act. The long-term disposition of the document would be subject to the Archives Act. General requirements for the openness and accountability of government would influence how the document should be stored and access managed. The agency's own establishing or regulating legislation might be relevant, as would government's procurement policies and standards. All these responsibilities could be complicated, at the very least, by DRM controls.

24 Other examples of how these technologies could cause difficulty for government are outlined in "Trusted Computing Technologies: Briefing", above n 16, Annex 1.

TC/DRM controls could complicate government management of its information in many ways. Government would need to be able to provide access to a document where someone else possibly controlled the access rights. How could that be managed? What if that entity closed down access, deliberately or by accident, or simply vanished? Government would need to be able to protect or account for its stewardship of information stored on a computer that might have TC/DRM installed and be "phoning home". What might be in those encrypted messages? Who would have access to them? Could a third party get information about government activities or plans by tracking an employee's use of their computer through the "phone home" feature? What about long-term access as hardware and software move through their life-cycles to obsolescence? Government's need for access to its information will often outlast the equipment it lives on.

The Working Group's brief 26 was to develop and document those principles while taking into account the following:
· issues related to storage and retrieval of information in the context of any legislation, policies or requirements under which government agencies operate
· instances for which use of TC and DRM might enhance agencies' operations

25 However, one exception to the general right of access under Principle 6 is found in section 28 of the Privacy Act 1993, where disclosure can be refused if it would disclose a trade secret.
26 Trusted Computing Working Group <www.e.govt.nz/policy/trust-security/working-group> (last accessed 10 July 2007).
· practices that should be adopted to ensure that the risks of using these technologies would be mitigated
· the practicability and acceptability of such principles in the vendor community
· the process and means by which such principles could be communicated to agencies, and the challenges of having them accepted and adopted
· the international context of such principles.

Clearly, the first and last bullet points cover both local legislation related to human rights and any obligations New Zealand has under international treaties and agreements. Others directly or implicitly required a consultative approach.

It quickly became apparent that principles alone would not be sufficient. Government agencies would require help at a practical level. Consequently, the Working Group decided, and the Steering Group concurred, that they needed to develop both principles and policies on which to base standards and other implementation-level guidance. In other words, a formal framework was required, the foundation of which would be the principles.

As the goal of the project was a consensus proposal to go forward, agreeing on a definition of principles, policies and standards was an important first step. The group defined 27 principles, for the purpose of the work, as fundamental truths, laws or requirements that form the basis for reasoning or action. Principles should articulate the basis ("the why") of the policies and reflect concerns, risks, issues and emphases, and adherence to them should be qualitatively assessable. Human rights obligations might not have been explicit in this definition but were certainly a significant component of those "fundamental truths" the Working Group tried to identify and address, especially as expressed in local legislation relating to privacy and access.

Policies are high-level courses of action that deal with "the what" necessary to implement the principles and, like the principles, should be qualitatively assessable. Policies would need to include scope and interpretation statements and their rationale for inclusion. The standards definition stipulated the setting out of specific, measurable or observable goals to support the policies. Standards should address "the how" of implementation, and adherence to a standard should be measurable or observable.

27 State Services Commission Trusted Computing and Digital Rights Management Principles & Policies (State Services Commission, Wellington, 5 September 2006) ["Trusted Computing and Digital Rights Management Principles & Policies"]. Also available at <www.e.govt.nz/policy/tc-and-drm/principles-policies-06> (last accessed 10 July 2007).

The Working Group reported their proposed principles and policies to the Steering Group, who recommended them to the e-GIF Management Committee. The principles and policies were adopted as formal government policy on 5 September 2006. Adoption has risks, as implementation may be difficult if New Zealand attempts it without support from other governments to influence the development of the technologies. Conversely, taking a principled stand while a technology is in its early development offers the opportunity to influence that development to accommodate commercial and government drivers for success. It also provides a small window of opportunity for New Zealand to develop support for its position; perhaps the cumulative purchasing power of many governments can influence a privacy-friendly development of these technologies.

Outcomes and Privacy

The over-arching concern of the principles is maintaining the integrity of government-held information over time. While protecting personal information is a primary expression of that concern, another major focus of the principles and policies is accessibility of government-held information.

The process had one apparent omission: the lack of direct representation and involvement of consumers. Despite the abundance of media attention on consumer concerns about DRM, 28 this was a deliberate choice. The Steering Group felt that expanding the coverage beyond those aspects over which government had non-legislative control would compromise a timely and effective response.
Timeliness was the key to providing a coherent picture of government concerns to the developers and in that way influencing the development of the technologies themselves. This would directly protect consumers' rights with respect to the information that government holds about them, and they should also benefit more generally from the incorporation of government's concerns into the development strategies of the major vendors. This is in part because of the emphasis on the protection of individual rights to privacy and access in the policies and principles. 29 Individuals are entitled under the Privacy Act to request correction or deletion of information held about them. In addition, without access to the documentation of government decisions and audit trails of activities, government openness disappears. The principles make clear that DRM-protected documents should not compromise privacy rights, or protect against whistle-blowing and formal investigations into government actions. The tools government uses should not compromise its compliance with its own laws and regulations.

Government accountability is also the driver behind the requirement for government's awareness of and explicit consent to information entering or leaving its systems and to any amendments to information. It speaks to the need for human judgement when making decisions that may affect individuals. It reflects the position that automated decision making alone is not an acceptable option for a government that respects individuals' rights to privacy, and to public and private security.

The wording of the principles and policies is intentionally generic and not specific to New Zealand laws or society. This is not a necessary result of taking a principles-based approach, but it could have been impossible with a narrower focus. The generic wording should make the principles and policies acceptable to and adoptable by all governments subscribing to the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. 30 They should also be adaptable wherever there is legislation providing for public access to government information, whether called sunshine laws, freedom of information laws, or the Official Information Act.

28 See, for example, Electronic Frontier Foundation "Are you Infected by Sony-BMG's Rootkit?" (9 November 2005) Press Release. Also available at <www.eff.org/deeplinks/archives/004144.php> (last accessed 10 July 2007).
This drafting goal also contributes to the aim to consult as widely as possible with other governments, to share the results of the work in international forums, and to build support for the widespread adoption of the principles and policies.

29 Trusted Computing and Digital Rights Management Principles & Policies, above n 27, 5-7.
30 OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (23 September 1980), available at <www.oecd.org/document/18/0,2340,en_2649_201185_1815186_1_1_1_1,00.html> (last accessed 10 July 2007).

Conclusion

Did the project take a conscious human-rights-based approach to the challenge of these new technologies? Clearly, the answer is "no". Did the "principles-based approach" used resonate with a human-rights-based approach? I believe the answer is "yes". The project started with the identification of government's relevant responsibilities under local statutes and international covenants, and of any rights that might be affected by these technologies, including privacy. The Working Group took a broadly consultative approach that included voices for most interested parties, including vendors and representatives with concern for individual rights. It emphasised government accountability in its own processes through formal reporting to the Steering Group and the e-GIF Management Committee. The recommendations and products of the work aim to ensure that neither government accountability nor the privacy and access rights already enjoyed by New Zealanders will be compromised by government adoption of these technologies. While direct empowerment of individuals in their private interactions with these technologies may not have been possible, the work has drawn vendors' attention to the potential impact of these technologies on individual rights and encouraged the incorporation of those concerns into future development.

So, while the TC/DRM work led by the State Services Commission may not appear on the surface to be a privacy- or human-rights-focused project, in fact the process and its outcomes would seem to provide a basis for tackling the privacy and human rights implications for government of new ICTs.
As the growing interest in, and responses to, human rights issues around new technologies indicate, a tried and tested model for government fulfilment of its human rights responsibilities when adopting new technologies is an important development.