The Privacy & Security Research Paper Series
Issue #9

Main Factors Affecting the Definition and Design of the PACT Privacy Reference Framework for Security Technology

Edited by Centre for Science, Society & Citizenship
Co-edited by University of Westminster – Communication and Media Research Institute
ISSN 2279-7467
Authors:
Israel Rodríguez Fernández, Senior Researcher, Atos Secure Identity Technologies Lab.
Alberto Crespo Garcia, Head of Atos Secure Identity Technologies Lab.
Pedro Soria Rodríguez, Atos Financial Services Market Manager.
M. Nuria Ituarte Aranda, Senior Researcher, Atos Secure Identity Technologies Lab.
Research Paper Number #9
Date of Publication: January 8, 2014
Acknowledgement: The research presented in this paper was conducted in
the project “PACT – Public Perception of Security and Privacy: Assessing
Knowledge, Collecting Evidence, Translating Research into Action”, funded by
EU FP7 SECURITY, grant agreement no. 285635
This paper is an output of PACT’s Work Package 7 “Dissemination and
Stakeholder Involvement”
All rights reserved.
No part of this publication may be reproduced, distributed or utilized in any
form or by any means, electronic, mechanical, or otherwise, without the prior
permission in writing from the Centre for Science, Society and Citizenship.
Download and print of the electronic edition for non-commercial teaching or
research use is permitted on fair use grounds. Each copy should include the
notice of copyright.
Source should be acknowledged.
© 2013 PACT
http://www.projectpact.eu
Main Factors Affecting the Definition and Design of the PACT Privacy Reference Framework for Security Technology
Abstract: This paper discusses the main factors affecting the definition and design of PACT's Privacy Reference Framework for Security Technology (PRFST). These factors are considered either "static" or "dynamic", based on the ability of the different PRFST users (decision makers) to influence them. The dynamic factors are grouped under the Needs category and include factors such as the starting point or goal to achieve, the framing conditions and the assets to protect. The static factors are divided into two groups: the Norms category (which includes ethical and societal principles, the influence of technology and non-technology standards on privacy, the laws and regulations focused on privacy, and the connections with other EU research projects), and the Preferences category, which includes the relevant results from the analysis of PACT's empirical survey across 27 EU Member States. Finally, the paper outlines a procedure for combining these factors in the framework to produce guidance for PRFST users.

Keywords: PACT, privacy, framework, factors, theoretical, law, regulation, standards, ethical, societal, survey, empirical, needs, goals to achieve, framing conditions, actors, stakeholders, systems, privacy domains, key choice points, assets to protect, privacy threat index, privacy controls, mitigate, FP7, Europe
1. Introduction
One of the expected outcomes of the PACT research project (see http://www.projectpact.eu/) is the development and validation of a framework: the PACT Privacy Reference Framework for Security Technology (PRFST). This framework is a readable document that provides a set of steps to support, guide and help decision-makers to make security technology decisions in a transparent and rational way (while taking account of citizens' perceptions and judgements, e.g. on the relationship between privacy and security). In this paper, the main factors affecting the definition and design of this framework are analysed. A progressive approach to identifying the audiences of the PRFST (its end-users) and the relevant factors is presented. Finally, the document concludes with the proposed procedure for combining these factors to provide privacy awareness and guidance in support of the decision process.

2. PRFST Audiences
The PRFST audiences can be grouped into three main segments, the first one being the key target end-user group for the PRFST:

Decision-makers, at different levels within private and public organisations (from higher governance and managerial responsibilities to IT technical staff, including architects, designers and developers), need guidance on technology purchasing choices and on the different design and development paths that can be followed depending on a number of goals and constraints. These are becoming harder to balance appropriately in the face of legal compliance requirements, cost-effective management of resources of different kinds and the fulfilment of expectations from different stakeholders (customers, end-users, civil rights NGOs, etc.). In particular, the complexity around technology as it converges with other technologies in the area of security makes it difficult, even for technologists, to understand adequately all the potential privacy threats that can be involved and how governance and technical measures or controls should be applied to mitigate them.

Policy makers are involved in the European conversation on privacy, fundamental rights and security, and have the difficult role and high responsibility of translating into effective policies (at European, national or regional and local levels) the dynamic security and privacy priorities, considering the respect for the fundamental rights of citizens (dignity, freedom, equality, solidarity, etc.) and for the democratic values (including accountability, transparency and openness) which are the political basis of our European societies.

Society at large, including EU citizens, social minorities and third-sector organisations representing their respective interests, is increasingly affected by the goals of security-related policies and systems based on security technology; potentially, citizens can become the victims of errors in their interpretation or application, or suffer the consequences of flaws or omissions in their design. Therefore, and as established in the Internal Security Strategy for the European Union (European Council, 2010, page 19), it should be ensured that the policies in this area "take account of their concerns and opinions" (of citizens), as they should be not only protected but served by such regulatory instruments and by systems that have to comply with such instruments. In this respect, the Stockholm Programme mentions the need to mobilise the "necessary technological tools", including pan-European information exchange models, as an integral component of the EU's internal policies, but it also calls for "improving our citizens' security within a clear framework that also protects their privacy" (European Council, 2010, page 65). In this respect, if better decisions are made by policy makers and by the actors responsible for security technology investments and/or design, there will be a direct benefit for the citizens who are meant to be protected and served by policies and systems based on security technology, for instance in terms of increased acceptance of such rules and technologies and of trust in the stakeholders responsible for shaping and applying them.

3. PRFST Design Factors
The objectives of the PACT Privacy Reference Framework for Security Technology are to identify and describe, in a readable and comprehensive document: i) the main cultural, social and ethical factors to be taken into consideration during the assessment of the security and privacy implications of a given security technology; ii) the trade-off and non-trade-off elements that affect public perception of security investments; and iii) the role played by trust and concern in shaping public opinion in this policy area. Another important PRFST objective is to translate research coming from the comparative analysis of empirical evidence (qualitative and quantitative data collected across Europe) highlighting national and cultural differences in how individuals perceive the different factors associated with the relationship between privacy and security. In particular, findings from the empirical approach are relevant to feed the PRFST with an adequate understanding of how real citizens react when confronted with the need to express their preferences in discrete choice experiments, contextualised in realistic scenarios in which security technologies and relevant policy measures can affect their privacy and fundamental rights. This proper understanding of the factors driving citizens' choices is of paramount importance for predicting, for example, public acceptance or rejection of different possible policy options or of a proposed deployment or improvement of the security technology systems supporting them.

Moreover, the PRFST will investigate how to incorporate methodologically, from the analysis of the PACT survey, socio-demographic descriptors as well as other key findings relevant for assisting decision and policy makers with a better understanding of the extent to which the factors that shape citizens' perception of security and privacy may be specific to each technology (and whether technology can alleviate them), or whether they chiefly depend on general variables (e.g., core beliefs, level of trust in politicians, economic and social concerns, feelings of community belonging, national ethos, etc.).
The PRFST approach is consistent with PACT's logical structure and concept, depicted in Figure 1 below:

Figure 1: PACT's Logical Structure and Concept

Having in mind the complexity of trying to join the theoretical and the empirical work of PACT, the guidance delivered by the PRFST aims to facilitate "collective agreement" in all phases of a system's life cycle; it does so by providing a method for analysing the relevant elements of security and privacy in contextual settings, e.g. the PACT scenarios. These factors then need to be grouped according to a criterion. After analysing several options, the resulting PRFST factor-grouping criterion is based on three categories of factors and their influence on the decision-making process. Thus, within the PRFST the following can be distinguished:

Norms: this category compiles all factors that are more static in nature and exert a stable influence, such as legislation (laws, regulations), ethical principles and values, societal aspects, and standards devoted to privacy. These "static" factors remain the same within each context, and their consideration is mandatory.

Preferences: this category considers empirical results based on the analysis of a survey of the general public across 27 EU Member States. The survey corresponds to real-life contexts in which issues of privacy, security and trust are involved. The survey responses are complemented with socio-demographic information about respondents and with questions aimed at capturing respondents' beliefs, norms and attitudes towards security, privacy and trust, and their familiarity with the processes above. The survey analysis reveals, in a systematic and comprehensive manner, real users' preferences in relation to, for instance, surveillance technologies and policy measures and their assessment of the implications for privacy and security; the relative importance of factors for trust/distrust, including the relevance of cultural and contextual aspects (e.g. "privatization of public services", "religious extremism", "insecurity of EU borders"); specific elements or contexts of concern; how respondents address risk-taking; plus attitudinal, lifestyle, socio-economic and demographic indicators.

Needs: under this category, the specific factors related to the context of the PRFST user's needs are identified and grouped by performing a contextual analysis. This contextual analysis of the broad scenarios involved and/or specific use cases allows proper scoping of the application or business service that will manage personal information, in the sense of providing the environment in which data protection and privacy requirements apply. The extent of this analysis and scoping is to be decided by the entity making use of the PRFST. The analysis can, for example, be iterated in two cycles: a first cycle can define a High Level Use Case which guides the subsequent analysis; a second cycle can make a detailed analysis, refining the initial overall use case into finer-grained use cases and focusing on a more detailed description of the analytical elements below.
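To make the grouping concrete, the short sketch below (purely illustrative; names such as FactorCategory and DecisionContext are ours, not PRFST terminology) shows one way a tool supporting the framework might record the three categories and their static/dynamic character:

from dataclasses import dataclass, field
from enum import Enum

class FactorCategory(Enum):
    """The three PRFST factor categories described above."""
    NORMS = "norms"              # static: laws, ethics, societal values, standards
    PREFERENCES = "preferences"  # static: empirical survey results (27 EU Member States)
    NEEDS = "needs"              # dynamic: goals, framing conditions, assets to protect

@dataclass
class Factor:
    name: str
    category: FactorCategory
    mandatory: bool  # Norms are mandatory in every context

@dataclass
class DecisionContext:
    """Collects the factors relevant to one security-technology decision scenario."""
    scenario: str
    factors: list[Factor] = field(default_factory=list)

    def static_factors(self) -> list[Factor]:
        return [f for f in self.factors
                if f.category in (FactorCategory.NORMS, FactorCategory.PREFERENCES)]

    def dynamic_factors(self) -> list[Factor]:
        return [f for f in self.factors if f.category is FactorCategory.NEEDS]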
4. Norms
As mentioned above, this category incorporates as PRFST inputs the outcomes of the root and branch review carried out in the first stage of the PACT project, as well as theoretical results from previous projects in the EU Security Research Programme that have been identified as relevant to the PRFST process. The aim has also been to incorporate the key aspects of law and regulation needed by decision makers in order for them to respect the privacy principles explicitly or implicitly included in law and regulation.

A number of ongoing standardisation initiatives are increasingly focusing on and supporting effective approaches to privacy-by-design principles; these have in turn been considered useful for identifying rules, guidelines, specifications, characteristics and other precise criteria relevant to some aspects of the PRFST methodological approach.

The PRFST incorporates in this category the ethical principles and fundamental rights that might be adversely affected by decisions, including dignity, the right to life, the right to liberty and safety, the right to respect for one's private and family life, home and communications, the right to the protection of personal data, freedom of assembly and of association, and freedom of movement (Mordini, 2012).

The Societal Impact Assessment sets forth those societal values that need to be considered and assessed, including inter alia: "citizen rights; research ethics; societal relevance; security technology and civil liberties inside and outside the EU" (McCarthy, 2012). These and other societal values are part of the societal impact checklist contained in the Societal Expert Working Group Report, which establishes a baseline societal standard that should be met by security development and technology.

4.1. Theoretical background
4.1.1. PACT Root and Branch review
A comprehensive assessment of the theoretical framework explaining the relationship between privacy and security, and also between trust and concern, was provided (Amicelle, 2012) at the beginning of the PACT project. The analyses applied to the relation between trust and technology offer the conclusion that trust can be strengthened if systems include the aim of achieving trustworthiness through privacy-enhancing controls.

The relation between trust, security and privacy was also analysed, concluding with essential recommendations. For instance, Human Security approaches were discussed in order to give appropriate attention to human rights and to avoid perspectives too inclined towards "securitisation", concluding that approaches that include contextual and pluralistic characteristics are the most desirable. Privacy does not have a unitary value; it is a concept derived from other concepts such as property, liberty, or multifaceted notions. These derivatives are values that can be placed at different levels: individual, social or societal.

Furthermore, in the PACT project an extensive classification was proposed to categorise current and future technologies used in the context of surveillance and security systems (Crespo, 2012). This taxonomical classification also takes into account the different forms of surveillance, grouping them into seven families containing a total of 41 technology groups. The top-level families are as follows:

• Visual surveillance
• Dataveillance
• Communications surveillance
• Biometrics and identification
• Sensors
• Location determination
• Emergent and futuristic technologies

This analysis also included a mapping of the different technology families to privacy issues, considering the privacy principles underlying the Data Protection Directive of 1995 (European Parliament and Council, 1995). This is relevant for selecting specific technologies to be applied in security and surveillance systems. Attention must also be paid to potential conflicting interests when security and privacy are seen as dimensions that can (or should) be traded off. This is in contrast to the holistic approaches preferred by PACT, where privacy is viewed as a positive factor that complements security (or other intended beneficial goals like functionality, convenience for the user, etc.).

4.1.2. EU-related projects
The outcomes of related EU projects, inasmuch as they could provide valuable insight into key design aspects of the PRFST and/or dimensions to be addressed in the next phase of PRFST development, have been taken into consideration. This includes the following projects:

PRISE (Privacy enhancing shaping of security research and technology) is particularly centred on providing criteria (for performing a Privacy Impact Assessment, PIA) and guidelines for security technologies and measures in line with human rights in general and with the protection of privacy.

IRISS (Increasing Resilience in Surveillance Societies), which aims to investigate the societal effects of different surveillance practices from multi-disciplinary social science and legal perspectives, has published a public report/deliverable "addressing and analysing the factors underpinning the development and use of surveillance systems and technologies by both public authorities and private actors, and their implications in fighting crime and terrorism, social and economic costs, protection or infringement of civil liberties, fundamental rights and ethical aspects" (Wright, 2012).

DESSI (Decision Support in Security Investments) launched its decision support system in 2013 (allowing users to compare different ways of counteracting different threats and providing support to rational decision-making). It focuses on the societal implications or dimensions[1] (DESSI D2.6, 2013) of the decision-making processes of security investments, providing a versatile assessment process.

PRISMS (The PRIvacy and Security MirrorS: Towards a European framework for integrated decision making) provides a discussion paper on legal approaches to security, privacy and personal data protection (PRISMS D5.1, 2013) which clearly describes the difference between "privacy" and "data protection", and their intersection with "security".

PRESCIENT (Privacy and emerging fields of science and technology: Towards a common framework for privacy and ethical assessment) aims to provide an extended understanding of privacy by embracing different approaches, and also provides policy-makers and researchers with a tool by means of which not only privacy and data protection risks, but also ethical issues related to the design, development and use of emerging technologies, can be identified and addressed early on (Venier, 2013). The project provides a Privacy and Ethical Impact Assessment Framework for Emerging Sciences and Technologies, which contains two chapters (4 and 5) focusing respectively on the provisions on PIA in the new Data Protection Reform and on identifying the limits and challenges of the PIA approach when trying to incorporate ethical values.

SAPIENT (Supporting fundamentAl rights, PrIvacy and Ethics in surveillaNce Technologies) has recently been working on the development of a PIA for smart surveillance technologies, the outcome of which is of interest for PACT when considering PIA-like approaches in the PRFST model. In addition, it provides "a fundamental rights analysis of smart surveillance" (Friedewald, 2013), a relevant source for the PRFST Legal Requirements and Principles step inasmuch as it "addresses smart surveillance from a legal point of view" by reviewing "existing laws and principles that are relevant to the use of surveillance technologies in general, focusing in particular on the right to privacy and data protection".

Valuesec (Cost benefit analysis of current and future security measures in Europe) is focused on making available a methodological framework (tool box) for a comprehensive cost-benefit analysis and a decision support tool for the consequential evaluation of ICT security measures and investments intended to avert or mitigate security events. It does this "as a means to make more transparent and systematic assessments of the wide array of costs incurred by security decisions as well as the potential and expected benefits linked to such decisions. It will thus facilitate better and more effective decision making" (Valuesec flyer, 2013).

[1] "Dimensions are parameters that decision-makers think or should think of implicitly or explicitly. The systematic approach taken by DESSI [ensures] that none of these dimensions is excluded in decision-making beforehand, that is, before it is checked whether a certain security investment decision concerns these dimensions. These dimensions concentrate on different aspects, representing specific classes of features of the security investments under consideration and specific impacts on other spheres of society, economy, law and policy which DESSI aims to take into consideration. These dimensions are organized according to the areas of social life which are actually or potentially impacted and regarded as crucial in assessing the relative success of a security investment." (DESSI D2.6, 2013)

4.2. Law and regulation
When referring to the legal background, we should consider that legal requirements reflect or incorporate (at least generic) choices already made by legislators, as policy makers, at national, transnational or international level. There is a dialectical relationship between legal requirements and the "privacy, ethical, and social considerations" that should be incorporated into security policies: legal requirements in force (should) incorporate privacy, ethical and social considerations, but at the same time they influence these considerations. Law and regulation provide a critical variable because, on the one hand, policy makers can define new legislation to modify them while, on the other hand, decision makers need to respect the laws. In any case, all levels must respect the privacy principles explicitly or implicitly included in law and regulation.

The European understanding of privacy as reflected in the regulatory framework is strongly influenced by the so-called dignity approach, which is related to the moral autonomy of persons. Dignity has become a universal, fundamental and inescapable term of reference, even though it should always be seen against the specific cultural and historical background. The EU's Charter of Fundamental Rights has brought about the constitutionalisation of the person, starting from personal freedom and dignity. Dignity and inalienable rights residing with the individual are the hallmarks of the European regulatory approach.

At the European level, the legal content of privacy, and consequently the normative element of a Privacy Reference Framework, can be securely derived from the pertinent case law of the European Court of Human Rights in Strasbourg (ECtHR) in relation to the right to private life (Art. 8 ECHR), with which the European Signatory Parties have to comply by taking all necessary and appropriate legal measures at national level. The Court did not consider it possible or necessary to attempt an exhaustive definition of the notion of 'private life'.
It has rather opted for a case-by-case approach: thus the notion covers the physical and psychological integrity of a person (X and Y v. the Netherlands), and it can sometimes embrace aspects of an individual's physical and social identity (Mikulic v. Croatia). Gender identification, name and sexual orientation also fall within the sphere of private life.

Concerning the relation of privacy and data protection, the Strasbourg Court has provided a very flexible jurisprudence, adapting the traditional private-life rule to the challenges and risks of data processing. The Court has effectively considered data protection cases through the prism of privacy (Art. 8 ECHR) and has developed criteria (nature of the data, context and extent of processing, risks and harms for the individuals) to assess whether an issue of data protection touches upon the right to privacy.

Focusing on the European jurisprudential perception of privacy is regarded as important in relation to the assessment and justification of interferences and restrictions based on the claim for security (policy). In the system of the ECHR (and that of all human rights instruments), the right to privacy is not absolute. Interferences with this right are legitimate as long as they meet the conditions laid down in Article 8 § 2 ECHR. These conditions, applied both by the European Court of Human Rights and the European Court of Justice in order to regard restrictions of privacy as legitimate, are crucial because they imply an ultimate balancing of interests when making judgments and choices concerning privacy and security. Anchoring the rights to privacy and data protection in the fundamental ethical and political values from which they derive their normative force, and which they are meant to advance, has become crucial for developing a privacy reference framework.

Finally, a regulation related to privacy, the Data Protection Directive 95/46/EC (European Parliament and Council, 1995), defines personal data as "any information relating to an identified or identifiable natural person ('data subject'); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity" (Art. 2(a)).

4.3. Standards
As part of the norms to consider in the PRFST, it is necessary to incorporate the progress in ongoing standardisation efforts in the area of privacy and data protection, in order to identify rules, guidelines, specifications, characteristics or other precise criteria that could usefully be considered for some aspects of the PRFST. Authoritative standardisation texts (in either stable or draft form), mainly from OASIS and from ISO/IEC JTC1 (Information Technology) / SC27 (IT Security Techniques Subcommittee) WG5 (Identity Management and Privacy Technologies Working Group), with 57 member countries, have been considered, in particular:

The ISO/IEC 29100 Privacy Framework defines the requirements for safeguarding Personally Identifiable Information (PII) processed by any ICT system, across any jurisdiction. It is internationally applicable and general in nature, in the sense that it takes into account organizational, technical, procedural and regulatory matters. Specifically, it sets common privacy terminology and principles. It also lists privacy features to be upheld in conjunction with security guidelines.
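For reference, the eleven privacy principles enumerated by ISO/IEC 29100 can be kept at hand as a simple checklist; the representation below is ours, but the principle names follow the standard:

# The eleven ISO/IEC 29100 privacy principles, as a plain tuple for checklisting.
ISO_29100_PRINCIPLES = (
    "Consent and choice",
    "Purpose legitimacy and specification",
    "Collection limitation",
    "Data minimization",
    "Use, retention and disclosure limitation",
    "Accuracy and quality",
    "Openness, transparency and notice",
    "Individual participation and access",
    "Accountability",
    "Information security",
    "Privacy compliance",
)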
The ISO/IEC 29101 Privacy Reference Architecture aims to provide guidelines for maintaining the effectiveness and consistency of any technical implementation of privacy safeguarding mechanisms within ICT systems. It sets out to suggest a privacy-enhanced system architecture that enables the creation of cohesive privacy protection mechanisms within and across ICT platforms. This reference architecture comprises a series of main elements (centred around Business, Information and Technology), i.e. risk management, privacy requirements, control structure, PII classification, privacy safeguarding controls, data processing flows, privacy services, monitoring and reporting, etc. It covers the various phases of the life-cycle management of data, classifying data and addressing the respective responsibilities and roles of the various stakeholders in each case, in order to satisfy the necessary information privacy functionalities.

The ISO/IEC SD 2 Official Privacy Documents References List provides guidance on privacy-related references to assist individuals, organizations, enterprises and regulatory authorities in identifying the adequate documentation and/or contact information for privacy issues, initiatives and risks, improving the understanding of specifications and guidelines for developing privacy policies and practices, and introducing the implications of privacy-related laws and regulations.

ISO/IEC 24760 defines a framework for identity management, focusing on defining what constitutes identity information and its attributes, how to manage the data life cycle, identity management requirements and implementation, control objectives, and information access management (policies, privileges, authorization, authentication, etc.). Identity is defined as a set of characteristics or attributes representing an acting entity, while a partial identity is defined as a subset of those characteristics.

The ISO/IEC 27000 family of standards focuses on information security management. ISO/IEC 27001 provides the requirements, while ISO/IEC 27002 includes the code of practice for information security management. The ISO/IEC 27005 standard provides guidelines for information security risk management in an organization, supporting in particular the requirements listed in ISO/IEC 27001.

The OASIS Privacy Management Reference Model (PMRM) takes as its basis the framework developed by the International Security, Trust, and Privacy Alliance (ISTPA), which has been refined and developed into a Privacy Management Reference Model with the aim "to provide a standards-based framework that will help business process engineers, IT analysts, architects, and developers implement privacy and security policies in their operations" (OASIS PMRM, 2012). The driving purpose behind this standard (in Committee Specification Draft status since March 2012) is to fill the existing gap whereby broad privacy policies describe principles and fair information practices but do not offer the implementation insight needed to actually develop the operational solutions required to address privacy issues.

The Privacy Capability Maturity Model (Working Draft 29190) offers a privacy capability maturity model providing guidance to organizations (or to third parties) for assessing their level of maturity with respect to processes for collecting, using, disclosing, retaining and disposing of personal information.

Other working groups in SC 27 maintain a number of projects with a relevant relationship to privacy; for example, ISO/IEC JTC 1/SC 27/WG 3 (Security evaluation criteria) is responsible for IS 15408 "Evaluation Criteria for IT Security" and its "Privacy" class, which addresses anonymity, pseudonymity, unlinkability and unobservability.
4.4. Ethical principles and fundamental rights
The aim of the "ethical impact assessment" is: a) to identify the key ethical principles impacted by security technologies in a given context; b) to propose a list of questions to help identify the most critical aspects; and, finally, c) to provide guidelines for developing ethically sound policies and technologies.

The objectives described in the previous section should of course be taken into consideration while developing an ethical analysis. The main effort here, however, is to propose a concrete strategy for implementing ethical decision making in the security context.

As per points a) and b) above, the proposed model for building up the PRFST ethical impact assessment is that of a holistic Privacy Impact Assessment. In order to overcome the criticism that conventional PIAs are usually focused too narrowly on informational privacy alone, Paul de Hert, in his contribution on "A human rights perspective on Privacy and Data Protection Impact Assessment", suggests that a privacy impact assessment can be seen as an assessment of a new technology's compatibility with human rights. This can be achieved through seven basic tests, i.e. by addressing the following questions:

• Is the technology used in accordance with and as provided by the law?
• Is the technology serving a legitimate aim?
• Does the technology respect the inviolability of the essence of all human rights?
• Is the technology necessary in a democratic society?
• Does the technology provide no unfettered discretion?
• Is the technology proportionate, appropriate and the least intrusive means?
• Besides the respect for privacy, is the technology consistent with other human rights?

The reference frameworks for implementing these tests are the European Charter of Human Rights, the ECHR and the case law of the European Court of Human Rights (ECtHR). According to the author, it is also clear that "a right to have technology assessed before their launch is emerging as a human right" (de Hert, 2012).
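Purely as an illustration of how these tests could be operationalised in a PRFST support tool (the function below is ours, not part of de Hert's contribution), the seven questions can be applied as a checklist in which any failed test flags the technology for deeper review:

# The seven de Hert tests, phrased as yes/no questions.
DE_HERT_TESTS = (
    "Used in accordance with and as provided by the law?",
    "Serving a legitimate aim?",
    "Respects the inviolability of the essence of all human rights?",
    "Necessary in a democratic society?",
    "Provides no unfettered discretion?",
    "Proportionate, appropriate and the least intrusive means?",
    "Consistent with other human rights besides privacy?",
)

def failed_tests(answers: list[bool]) -> list[str]:
    """Return the questions answered 'no'; an empty list means all tests passed."""
    assert len(answers) == len(DE_HERT_TESTS)
    return [q for q, ok in zip(DE_HERT_TESTS, answers) if not ok]

# Example: a hypothetical deployment failing only the proportionality test.
review_items = failed_tests([True, True, True, True, True, False, True])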
4.5. Societal Impact
The societal impact analysis is likewise based on the ethical principles and fundamental rights that might be affected, which include dignity, the right to life (with attention to its relation to safety), the right to liberty and safety, the right to respect for one's private and family life, home and communications, the right to the protection of personal data, freedom of assembly and of association, and freedom of movement (Mordini, 2012).

Privacy is a right which is protected and enshrined in the European Convention on Human Rights (Article 8: "Everyone has the right to respect for his private and family life, his home and his correspondence") (Council of Europe, 1950). Data protection, on the other hand, receives a place in the EU Charter of Fundamental Rights (2000; Article 8) (European Union, 2000). Furthermore, the Data Protection Directive (DPD) of 1995 set out EU-wide guidelines on the gathering, storing and processing of personal information, designed to empower and protect data subjects (i.e. citizens) and their information (European Parliament and Council, 1995).

While data processors are obliged to comply with the Charter and the DPD, the protection of privacy, as a violable right, permits access to information under specific circumstances. One such limitation can be seen in legislation that compels private bodies to collect, retain and share data with security agencies in the 'interests of national security', thereby violating "consent and retention" principles in the process.

Colin Bennett describes privacy as a "deeply contested" concept, one which "frames not one but a series of interrelated social and policy issues". He adds, "The concept and the discourse can be, and are, moulded to suit varying interests and agendas" (Bennett, 2012).

Daniel Solove offers some examples that fall under the "sweeping concept of privacy": "freedom of thought, control over one's body, solitude in one's home, control over information about oneself, freedom from surveillance, protection of one's reputation, and protection from searches and interrogations". As a result, those who debate privacy – "philosophers, legal theorists and jurists" – face problems in "reaching a satisfying conception of privacy" (Solove, 2002).

Recent attempts to conceptualise privacy have focused on the role that context may play in shaping privacy (Solove, 2002; Nissenbaum, 2004). Solove believes "privacy should be conceptualized contextually as it is implicated in particular problems". Data protection, on the other hand, rests within one of the (many) "social and policy" issues to which Colin Bennett referred, and represents one of the many battlegrounds upon which the protection of privacy (from individuals, corporations or states) is fought (Bennett, 2012).

5. Preferences
The empirical category is based on an empirical survey of the general public across 27 EU Member States. At the core of the survey there are three stated preference discrete choice experiments (SPDCE), each corresponding to a real-life context in which issues of privacy, security and trust are involved. The questionnaire collects a set of demographic information about respondents, including age, gender, education, working status, relationship status, presence of children in the household and location (urban/rural, etc.). Further, the questionnaire also elicits responses related to attitudes to surveillance, security, privacy, trust and risk taking. The three survey scenarios (Internet use and surveillance; security measures in public transportation such as trains or the subway; electronic devices containing health data and its sharing in emergencies) feature aspects of technologies and related policy measures that allow common criteria relevant for decision making to be assessed, such as:

⁃ perceived/actual tension between data protection/privacy and security;
⁃ accountability and transparency;
⁃ trust/distrust.

Each of these sectors presents different privacy implications and shares some common features related to the collection, storage and sharing of private data.
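The econometrics behind the SPDCE analysis are not reproduced in this paper; for orientation only, a textbook random-utility (conditional logit) specification of the kind commonly used for such experiments is sketched below, where x_ij denotes the attribute vector (e.g. extent of data stored, storage duration, extent of sharing) of alternative j shown to respondent i:

U_{ij} = \beta^{\top} x_{ij} + \varepsilon_{ij},
\qquad
\Pr(i \text{ chooses } j) = \frac{\exp(\beta^{\top} x_{ij})}{\sum_{k \in C_i} \exp(\beta^{\top} x_{ik})}

Here C_i is the choice set presented to respondent i, and the error terms are assumed i.i.d. type-I extreme value, which yields the logit choice probabilities; the estimated coefficients \beta quantify the relative weight respondents place on each privacy or security attribute. This is a standard formulation, not necessarily the exact model used by PACT.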
As concluded in the PACT root and branch review, privacy and security preferences can very much depend on the context, and these preferences can also vary across different groups of individuals and within the same context (sector). Furthermore, the trade-off model – according to which privacy and security are inherently in tension with one another, to the extent that, at a certain point, to increase one is unavoidably to decrease the other – was found to be highly questionable.

Beyond the direct inference of the survey replies, the PRFST will evaluate which of these findings can be extrapolated. It is possible to make a general inference from the survey findings if such inferences rest on an analysis of the principles to which the attributes relate (for example, consent).

There are some common security and privacy aspects in each of the analysed contexts. For example, all the contexts present the scenarios with choices based on the extent of data stored, the duration of storage of information and the extent of sharing of personal information. These aspects of security and privacy are inherent in a wide range of situations, and the findings from the PACT survey can provide a useful reference for the design and consideration of future security and surveillance technologies. Further, the advantage of using an approach based on real-life choice scenarios is that the findings can capture the real-life choices people make when presented with situations involving complex multi-dimensional factors that must be considered in the round.

In terms of utility to decision-makers, the use of the SPDCE permits a range of factors that may affect investments in security infrastructure to be considered within a common framework, thus eliminating the opportunity to ignore or play down privacy concerns because they are frequently expressed in different terms from the other inputs (cost, predicted effectiveness of a system, throughput) that can be used to characterise the security infrastructure. Thus, PACT aims to improve the treatment and inclusion of these issues in decision-making frameworks, the output being a more ethical and privacy-conscious assessment of such factors.

6. Needs
According to the previous definitions, the Norms comprise static factors because they experience only minor changes over time; the Preferences in the section above are also static from the PRFST point of view, because this information will already have been collected and does not change as a function of the PRFST user's needs (though it can be relevant for the user to consider, e.g. to avoid public rejection of surveillance systems). Therefore, the Needs category is considered the base starting point for characterising the system scenario. This category provides the objective description of the security system in scope, and the sought advantages (the justification for introducing the system) must be given. Using the PRFST, such goals will relate not only to security goals (i.e. what the main and secondary desired security outcomes are) but also to privacy goals (i.e. what the main privacy concerns are that the system is aiming to address) and other benefits for the end-user organisation or society. The goals act as starting points and will normally be related to identified threats (at local, regional, national or EU level) which compel the decision makers to deploy certain measures (i.e. make certain security technology investments).

These goals to achieve will be conditioned by specific framing conditions such as those below (a data-model sketch of these elements follows the list):

• The actors and stakeholders (a data subject, or a human or non-human agent interacting with personal data managed by a system) having operational privacy responsibilities, and third parties (other stakeholders) which are important to consider in the analysis (e.g. data protection authorities, civil society organisations). Any individual or position may carry multiple roles and responsibilities, and these need to be distinguishable, particularly as many functions involved in the processing of personal information are assigned to a person or another actor according to explicit roles and authority to act, rather than to a person or actor as such.

• The systems[2] involved, and the interfaces between them, where personal information is collected, communicated, processed, stored or disposed of within a privacy domain[3].

• The privacy domains in scope and the types of personal information affected, covering both physical areas (such as a customer site or home) and logical areas (such as a wide-area network or cloud computing environment) that are subject to the control of a particular domain owner. Domain owners are the entities responsible for ensuring that privacy controls and services are managed in business processes and technical systems within a given domain. Privacy domains may be under the control of individuals or data subjects; data controllers; capability providers; data processors; and other distinct entities having defined operational privacy management responsibilities.

• The key choice points in flows of events and data, which should identify the different moments during the life cycle that lead to compliance with provisions contained in a legal text or an administrative procedure, or moments during the design of an ICT security solution, where important decisions need to be made in relation to security and privacy and in accordance with ethical and societal principles, incorporating as well the knowledge extracted from citizens' perceptions about these topics and having identified and understood potential sources of trust and concern. The decisions can start as early in the process as the moment when the main purpose or goal of the regulation or ICT solution is determined by the policy or decision maker (as this provides the overall finality that is intended to be achieved, and different threats to privacy can start to be associated with the goal). In those early stages, choices that allow identifying applicable privacy and data protection regulations will also be relevant.

[2] A system is a collection of components organized to accomplish a specific function or set of functions having a relationship to operational privacy management.
[3] A privacy domain is a physical or logical area within the use case subject to control by Domain Owner(s).
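As announced above, the contextual elements just listed (actors, systems, privacy domains and key choice points) can be captured in a small data model; the type names below are ours and purely illustrative of what a PRFST-supporting tool might use to record the scoping exercise:

from dataclasses import dataclass, field

@dataclass
class Actor:
    """Data subject, human or non-human agent, or third party (e.g. a DPA)."""
    name: str
    roles: list[str] = field(default_factory=list)  # one actor may carry several roles

@dataclass
class System:
    """A component where personal information is collected, processed or stored."""
    name: str
    interfaces_to: list[str] = field(default_factory=list)  # names of connected systems

@dataclass
class PrivacyDomain:
    """Physical or logical area under the control of a domain owner."""
    name: str
    owner: str        # entity responsible for privacy controls within the domain
    physical: bool    # e.g. a customer site (True) vs. a cloud environment (False)
    systems: list[System] = field(default_factory=list)
    data_types: list[str] = field(default_factory=list)  # personal information affected

@dataclass
class KeyChoicePoint:
    """A life-cycle moment where a privacy-relevant decision must be made."""
    lifecycle_stage: str  # e.g. "goal definition", "design", "deployment"
    description: str
    applicable_regulations: list[str] = field(default_factory=list)

@dataclass
class FramingConditions:
    """The scoping context for the goals a PRFST user wants to achieve."""
    actors: list[Actor] = field(default_factory=list)
    domains: list[PrivacyDomain] = field(default_factory=list)
    choice_points: list[KeyChoicePoint] = field(default_factory=list)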
An adequate initial ethical, societal and legal analysis is therefore instrumental to establishing with confidence the obligations and exemptions of the different parties involved with respect to the handling of personal data, and the overall definition of the system with respect to other forms of privacy and to fundamental rights. This cannot be separated from the need to establish, in the earliest stages, the assets that require protection, at least in terms of ensuring legal compliance; these can also be ranked in value, and threats to them can start to be identified in parallel. In general, all personal data should be considered an asset to protect, considering the privacy targets (contained in the Data Protection Directive) as follows (a checklist sketch of these targets is given after the list):

• Safeguarding the quality of personal data (in consideration of data avoidance and minimisation, purpose specification and limitation, the quality of data, and transparency).
• Legitimacy of processing personal data (by basing data processing on consent, contract, legal obligation, etc.).
• Compliance with the data subject's right to be informed in a timely manner.
• Compliance with the data subject's right of access, correction and deletion of data.
• Compliance with the data subject's right to object.
• Safeguarding confidentiality and security of processing (preventing unauthorised access, logging of data processing, network and transport security, and preventing accidental loss of data).
• Compliance with notification requirements concerning data processing, prior compliance checking and documentation.
• Compliance with data retention requirements (retention of data should be for the minimum period of time consistent with the purpose of the retention or other legal requirement).
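Continuing the illustrative data model above, these privacy targets can be encoded as an enumeration against which each informational asset is checked; this is a sketch of one possible representation, not a normative PRFST artefact:

from dataclasses import dataclass, field
from enum import Enum, auto

class PrivacyTarget(Enum):
    """Privacy targets drawn from the Data Protection Directive, as listed above."""
    DATA_QUALITY = auto()                     # avoidance, minimisation, purpose limitation
    LEGITIMACY_OF_PROCESSING = auto()         # consent, contract, legal obligation, ...
    RIGHT_TO_BE_INFORMED = auto()
    RIGHT_OF_ACCESS_CORRECTION_DELETION = auto()
    RIGHT_TO_OBJECT = auto()
    CONFIDENTIALITY_AND_SECURITY = auto()     # access control, logging, transport security
    NOTIFICATION_AND_DOCUMENTATION = auto()
    RETENTION_LIMITATION = auto()             # keep data no longer than the purpose requires

@dataclass
class InformationalAsset:
    """A piece of personal data to protect, with the targets still unmet."""
    name: str
    sensitivity: int  # e.g. 1 (low) to 3 (high), to support ranking assets by value
    unmet_targets: set[PrivacyTarget] = field(default_factory=set)

    def compliant(self) -> bool:
        return not self.unmet_targets

# Hypothetical example: a CCTV system storing facial images.
faces = InformationalAsset(
    "facial images", sensitivity=3,
    unmet_targets={PrivacyTarget.RETENTION_LIMITATION, PrivacyTarget.RIGHT_TO_BE_INFORMED},
)
assert not faces.compliant()  # retention and notice still need to be addressed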
Thus, combining views from a data protection perspective[4] with an overall understanding of the system architecture will allow decision makers to iteratively refine details related to data collection, storage, processing and transfer capabilities. In this respect, an early identification of such informational assets to protect can help to better establish the relationship between personal information and the associated privacy targets, in sufficient granularity to enable the assignment, in later stages, of privacy management functionalities and supporting mechanisms throughout the life cycle of information collection, storage, processing and sharing in a given security technology-based system.

However, the protection of assets must also be framed in the perspective of fundamental rights assurance (European Union, 2000), in order to ensure that the aspects that go beyond informational privacy and relate to such rights are also covered:

• Human dignity (Art. 1 of the European Charter of Human Rights), which is one of the indivisible and universal values upon which the European Union is founded (together with freedom, equality and solidarity).
• Right to integrity (including physical and psychological) and right to liberty and security of person (Art. 8 of the European Charter of Human Rights).
• Respect for private and family life, home and communications (Art. 7 of the European Charter of Human Rights).
• Freedom of movement (Art. 45 of the European Charter of Human Rights) and right to property (Art. 17 of the European Charter of Human Rights).
• Freedom of thought, conscience and religion (Art. 10 of the European Charter of Human Rights).
• Non-discrimination (Art. 21 of the European Charter of Human Rights).
• Integration of persons with disabilities (Art. 26 of the European Charter of Human Rights).
• Respect for the rights of children and the elderly (Arts. 24 & 25 of the European Charter of Human Rights).

[4] A data protection perspective conceives assets to protect in terms of informational privacy and focuses on the protection of personally identifiable information classified according to its degree of sensitivity.
7. Combining PRFST factors
It is not possible to conclude this paper only with the identification of the factors grouped into the three categories described in the sections above. The interesting part of the PRFST is its ability to integrate the dynamic factors (Needs) with the "static" factors (Norms and Preferences) and, as a result of this combination, to provide the guidance intended to help the PRFST user reach the best decision based on their needs for the scenarios considered. Therefore, the following pages provide, at a high level, the procedure for achieving the PRFST objective of producing this guidance.

Figure 2: Joint factors to produce the PRFST

Usually, in classic technology development methodologies, the life cycle follows a classic cascade pattern (from requirements analysis and functional requirements identification to technical design, implementation and testing), and the choices made become more specific and operational (i.e. technical choices between design patterns or security and privacy controls). Of course, not only technical choices apply, as there will always be non-technical procedures that determine how a solution is deployed and operated and that require choice-making to determine, for instance, organisational approaches that are consistent with (or represent modifications to) pre-existing internal rules. The overall choices made along the different phases of the life cycle of an ICT or security system will shape the overall outcome of the solution that is to be adopted (or re-engineered), which may or may not create threats of its own; i.e. because of the way a certain system is finally implemented, certain threats may follow which are not necessarily a consequence of the intended goal of such a system.

In line with this general idea, the starting point in the development of a solution is defining the goals to achieve. To address the main goal and all the secondary goals, it is necessary to consider the framing conditions that could constrain them (such as the economic cost, the time frame, the actors and systems involved in the solution, the managed information and the assets to protect). This leads to the consideration of different technology choices, where different factors will come into play (i.e. how adequate the chosen technologies are to the sought goals and the existing constraints; what threats the technologies, or rather the way they are to be used, entail, especially for privacy understood in a broad sense; what the different areas are where the introduction of the ICT solution can create ethical and societal impacts; and what the perceptions of citizens and their potential reactions of trust and concern about these technologies are).
An important choice point relates to the identification and classification of potential privacy threats, given limiting constraints (economic, time frame, etc.) which may not allow putting in place all the measures needed to mitigate all risks. Therefore, the decision maker will need to choose which of the identified threats it is clearly necessary to mitigate.

The last group of choices relates to the actual safeguards (technical and organisational) that can be deployed in order to mitigate the threats. This requires consideration of different possibilities and their consequences in terms of effort, time, maintainability, etc. (i.e. COTS solutions if they exist, open-source solutions, ad-hoc development, etc.).

Once these key choice points have been identified, it is time to analyse how to place in them aspects of the solution that can address the different identified privacy threats. For this purpose it is necessary to use the static tools, starting with the Norms category. In the fully developed PRFST, a set of tools will be defined to analyse, validate, compile and evaluate the goals to achieve and to manage the limiting constraints in order to identify the potential privacy threats. After this, the Preferences tools will be used to identify citizens' perceptions about the relevance of the potential threats and their preferred system configurations, keeping in mind that this is a complementary information tool, because such opinions cannot lead to infringement of the Norms (ethical and societal principles), although they could influence policy makers (when developing legislation and regulations). In the PRFST, a Privacy Threat Index (PTI) will be produced to categorise the potential privacy threats; based on this categorisation, the PRFST can provide guidance and privacy controls to identify and, if possible, mitigate the privacy threats previously detected.
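The paper does not fix a formula for the PTI; purely as an illustration of the intended use (categorising threats so that guidance and controls can be prioritised under constraints), the sketch below scores each threat by likelihood and impact and selects controls for the highest-ranked ones. All names (PrivacyThreat, privacy_threat_index, etc.) are hypothetical, and the scoring is a simple placeholder, not the PRFST's method:

from dataclasses import dataclass

@dataclass
class PrivacyThreat:
    name: str
    likelihood: int      # 1 (rare) .. 5 (almost certain)
    impact: int          # 1 (negligible) .. 5 (severe), e.g. for fundamental rights
    controls: list[str]  # candidate technical/procedural mitigations

def privacy_threat_index(threat: PrivacyThreat) -> int:
    """Toy PTI: a likelihood x impact score (the real PRFST could also weight
    in Norms and survey-based Preferences)."""
    return threat.likelihood * threat.impact

def plan_mitigations(threats: list[PrivacyThreat], budget: int) -> list[tuple[str, list[str]]]:
    """Rank threats by PTI and propose controls for as many as an
    (illustrative) budget of control deployments allows."""
    ranked = sorted(threats, key=privacy_threat_index, reverse=True)
    plan, spent = [], 0
    for t in ranked:
        if spent + len(t.controls) > budget:
            continue  # constraints (cost, time) prevent mitigating this threat now
        plan.append((t.name, t.controls))
        spent += len(t.controls)
    return plan

threats = [
    PrivacyThreat("function creep of CCTV footage", 4, 4, ["retention limits", "access logging"]),
    PrivacyThreat("re-identification of 'anonymous' data", 3, 5, ["k-anonymity checks"]),
    PrivacyThreat("operator browsing of records", 2, 3, ["role-based access control"]),
]
for name, controls in plan_mitigations(threats, budget=3):
    print(name, "->", controls)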
The privacy controls will be specified in terms of technical and procedural approaches to mitigating privacy threats. Privacy controls are often presented in the form of policy declarations or requirements, and not in a way that is immediately actionable or implementable. The privacy risks are managed with the aim of mitigating them by putting in place adequate privacy controls. Specific controls will be proposed in relation to the PACT scenarios, and guidelines will be given on how to choose controls for other situations. The goal is also to maximise trust and minimise concern by choosing adequate controls that help to ensure, by default, sufficient levels of technical trustworthiness, user awareness of the relevant aspects of the system, openness/transparency/auditability, and liability of the actors owning/operating the system.
References
Amicelle, A., J. Bus, T. El-Baba, C. Fuchs, E. Mordini, A. Rebera, N. Robison, D. Trottier, S. Venier and S. Wright, D1.1 Report on Current Theoretical Framework, PACT Project, 2012, http://www.projectpact.eu/deliverables/wp1-root-branch-review/d1.1-report-on-current-theoretical-framework

Bennett, C., The Privacy Advocates: Resisting the Spread of Surveillance, MIT Press, Cambridge, MA, 2012.

Council of Europe, European Convention for the Protection of Human Rights and Fundamental Freedoms, as amended by Protocols Nos. 11 and 14, 4 November 1950.

Crespo, A., N. Ituarte, P. Tsakonas, V. Tsoulkas, D. Kostopoulos, A. Domb, J. Levinson, D. Kyriazanos, O. Segou, L. Malatesta, C. Liatas, N. Argyreas and S. Thomopoulos, D1.3 Report on Technology Taxonomy and Mapping, PACT Project, 2012, http://www.projectpact.eu/deliverables/wp1-root-branch-review/d1.3-report-on-technology-taxonomy-and-mapping

de Hert, Paul, "A human rights perspective on Privacy and Data Protection Impact Assessment", in David Wright and Paul de Hert (eds.), Privacy Impact Assessment, Springer, 2012.

DESSI, D2.6 Dimensions in Security Investment, p. 7, http://securitydecisions.org/about-dessi/publications/, accessed May 2013.

European Council, Internal Security Strategy for the European Union, 2010, p. 19.

European Council, The Stockholm Programme: An Open and Secure Europe Serving and Protecting Citizens, 5731/10, 3 March 2010, p. 65.

European Parliament and Council, Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, OJ L 281, 23 November 1995, http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:en:HTML, accessed July 2013.

European Union, Charter of Fundamental Rights of the European Union, 7 December 2000, Official Journal of the European Communities, 18 December 2000, http://www.europarl.europa.eu/charter/pdf/text_en.pdf, accessed July 2012.

Friedewald, Michael, Rocco Bellanova, Matthias Vermeulen, Serge Gutwirth, Rachel Finn, Paul McCarthy, David Wright, Kush Wadhwa, Dara Hallinan, Marc Langheinrich, Vlad Coroama, Julien Jeandesboz, Didier Bigo, Mervyn Frost and Silvia Venier, D1, A Fundamental Rights Analysis of Smart Surveillance, SAPIENT Project, p. 2, http://www.sapientproject.eu/docs/D1.1-State-of-the-Art-submitted-21-January-2012.pdf, accessed May 2013.

McCarthy, Sadhbh, Report of the Societal Impact Expert Working Group, 2012, http://www.bioenv.gu.se/digitalAssets/1363/1363359_report-of-the-societal-impact-expert-working-group-2012.pdf

Mordini, E. and D. Wright, Privacy and Ethical Impact Assessment Paper, PRESCIENT Deliverable D4, Chs. 4 & 5, 2012.

Nissenbaum, H., "Privacy as Contextual Integrity", Washington Law Review, Vol. 79, No. 1, February 2004, pp. 119-158.

OASIS Privacy Management Reference Model (PMRM) TC, http://www.oasis-open.org/committees/pmrm

OASIS, Privacy Management Reference Model and Methodology (PMRM) Version 1.0, Committee Specification Draft 01, 26 March 2012, http://docs.oasis-open.org/pmrm/PMRM/v1.0/csd01/PMRM-v1.0-csd01.pdf

PRISMS, D5.1 Discussion Paper on Legal Approaches to Security, Privacy and Personal Data Protection, http://prismsproject.eu/wp-content/uploads/2012/06/PRISMS-D5-1-Legal-approaches.pdf, accessed May 2013.

Solove, D., "Conceptualizing Privacy", California Law Review, Vol. 90, 2002, p. 1087.

Valuesec, project flyer, http://www.valuesec.eu/sites/default/files/valuesec_flyer_v3_2.pdf

Venier, Silvia, Emilio Mordini, Michael Friedewald, Philip Schütz, Dara Hallinan, David Wright, Rachel L. Finn, Serge Gutwirth, Raphaël Gellert and Bruno Turnheim, "Final Report – A Privacy and Ethical Impact Assessment Framework for Emerging Sciences and Technologies", Deliverable 4, PRESCIENT Project, 25 March 2013, p. 5, http://www.prescient-project.eu/prescient/inhalte/download/PRESCIENT_deliverable_4_final.pdf

Wright, David, IRISS D1.1 Surveillance, Fighting Crime and Violence, 2012, http://irissproject.eu/wp-content/uploads/2012/02/IRISS_D1_MASTER_DOCUMENT_17Dec20121.pdf

Towards a New EU Legal Framework for Data Protection and Privacy: Challenges, Principles and the Role of the European Parliament, Study, PE 453.216, 2011, http://www.europarl.europa.eu/committees/en/studiesdownload.html?languageDocument=EN&file=54351