Modeling Privacy Requirements for Quality Manipulation of Information on Social Networking Sites

Shan Chen and Mary-Anne Williams
Innovation and Enterprise Research Laboratory
Centre for Quantum Computation and Intelligent Systems
Faculty of Engineering and Information Technology
University of Technology, Sydney
{shanc|mary-anne}@it.uts.edu.au

Abstract

The volume and diversity of information shared and exchanged within and across social networking sites is increasing. As a result, new and challenging requirements arise for the quality manipulation of this information. An important requirement is information usability with privacy dimensions. Existing social networking sites do not provide adequate functionality to fulfill the privacy requirements of information use. This is largely due to the lack of a privacy-by-design approach that conducts an effective privacy requirements analysis as a means to develop suitable models for social networking that protect privacy. To bridge this gap, this paper analyses and models privacy requirements for a recommendation service in social networking sites.

Introduction

Social networking has been an important activity within human societies for millennia. Social networking systems (SNS) provide unprecedented opportunities for individuals and organizations to share information. Enabled by Internet infrastructure, socialization using SNS is free of many spatiotemporal restrictions. These advantages over off-line social networking have attracted an enormous number of users and rapidly increased SNS in number, size, nature and scope. As a result, information in SNS has grown dramatically.

The growth of SNS has significantly facilitated communication and the exchange and sharing of information among users and third parties. However, the increasing number of privacy breaches reported in the media demonstrates that, from the users' perspective, there is insufficient support for privacy-preserving storage and manipulation of their personal information. Examples of privacy breaches with negative impacts on home and job security can be found in Moses (2009) and France-Presse (2007). Poor quality manipulation of information can create barriers among users, reduce their confidence in sharing and exchanging information, and in turn impede SNS development. To promote SNS in society and to continue to develop innovative services, there is an urgent need for more effective privacy management. In recognition of privacy-by-design approaches, requirements modeling is the first important step in privacy-aware system development (Chen and Williams 2010).

Using social recommendation services as our problem domain, in the next section we present a motivating example that helps to elicit important privacy issues and requirements. A study of two relevant OECD privacy principles (2009) follows, with a focus on the semantics relevant to the problems identified. Privacy requirements are then modeled based on the semantics of the two principles. Finally, discussion and future work are presented.

Problem Domain

The fundamental privacy problem is identified as a problem of rights, with three types of rights in focus, namely choice, consent and control (3CR) (Chen and Williams 2010). In each right space, social problems dominate the exploitation of rights (where they can be obtained sufficiently) when privacy carries social implications. This paper considers the implication of social relations, i.e., relationships. The following example of a social recommendation service called "People you may know" (PYMK) shows the interplay between the rights of the 3CR.

Example

Mary received a PYMK list from her social network on the MySN site, allowing her to send a message to people on the list. She was surprised to find many school friends she had lost contact with on the list. However, she also felt compromised because people who share her profession were on the list. She was disappointed that she could not keep her professional information disjoint from her personal social network, nor keep away from professionals she did not want to network with, because she believed that if she could see others' information they could also see hers. Her concern was confirmed by a friending request from her boss the next day. Mary joined another social network on RealSN, which allowed her to choose who could know of her existence in the network (i.e., she could choose who she would not be recommended to), and she believed that she could now stay away from her boss. However, she did not know that one of her best RealSN friends was her boss's daughter, who shared her online experience with her father. As a result, Mary received a friending request from her boss on RealSN.

These scenarios highlight the interrelationship between the three rights: choice provides a base for consent, and consent establishes a platform for control execution. The simplest binary choice option is to set the default: to-be-recommended or not-to-be-recommended. The potential privacy implications are: i) the not-to-be-recommended option provides no basis for consent, thus consent is not provided; and ii) the to-be-recommended option opens a platform for setting consent. The second option raises an underlying problem of social relations, i.e., who can be considered privacy-friendly network candidates that a user can use to help determine appropriate connections in the social network.

The 3CR model emphasizes users' control over the information for which they provide access consent. However, in a network environment, users' expectations can exceed their availability (i.e., their actual ability to control their own information), due to a lack of awareness of their situation and uncertainty regarding the dynamics of the network. Thus, the level of control a user has depends on the kind and degree of consent they establish and on their connections in the network. Situated in a dynamic and distributed network (i.e., an online environment), users are unlikely to have complete relevant knowledge of their place in a social network. This calls for services that can identify users' information based on their state of consent, that can assist users to identify conflicts between their expectations and availabilities, and that can inform them of the privacy implications of their consent settings. To this end, a comprehensive set of requirements is essential. With inadequate requirements, users' expectations remain uncertain and inappropriate consent can easily be constructed, leading to abuse of information. Thus, quality manipulation of information that respects privacy in a way that complies with the 3CR is required.
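As an illustration only, the following Python sketch renders this binary choice default and the way it does or does not open a platform for consent and control. All names (Choice, UserPrivacyState, etc.) are hypothetical and not part of the 3CR model's formal apparatus.

from dataclasses import dataclass, field
from enum import Enum

class Choice(Enum):
    NOT_TO_BE_RECOMMENDED = 0  # provides no basis for consent
    TO_BE_RECOMMENDED = 1      # opens a platform for setting consent

@dataclass
class UserPrivacyState:
    choice: Choice
    consented_to: set = field(default_factory=set)  # entities the user consents to be visible to

    def can_consent(self) -> bool:
        return self.choice is Choice.TO_BE_RECOMMENDED

    def grant_consent(self, entity: str) -> None:
        if not self.can_consent():
            raise ValueError("no basis for consent: user chose not to be recommended")
        self.consented_to.add(entity)

    def may_recommend_to(self, entity: str) -> bool:
        # control executes only on the platform that consent established
        return self.can_consent() and entity in self.consented_to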
Privacy Requirements and OECD Privacy Principles

As an open problem in online communities, various privacy concerns have been raised and international efforts have been undertaken to address them. However, the number of privacy breaches reported in the media almost every day reveals that there is insufficient support for manipulating information with respect to users' privacy. This reflects the lack of support in existing online applications for the OECD Privacy Principles, in particular the principles of Purpose Specification and Use Limitation. The criteria of these two principles comply with the 3CR model as requirements:

• The Principle of Purpose Specification restricts the use of the collected data to the purposes of collection - it concerns the consistency of the purpose of data usage with the purpose for which the data were collected. This principle reflects users' rights of choice to consent in the 3CR model.

• The Principle of Use Limitation enforces that data be used only for the specific purposes for which consent has been given - it concerns the consistency of the data usage with the purpose of usage. This principle reflects users' rights of consent to control in the 3CR model.

In the context of social recommendations, both principles should play a dominating role. Typically, the Principle of Purpose Specification restricts relationships because information can only be collected for the purpose of sending social recommendations. This raises important questions regarding the nature of consenting to recommendations. For example, in consenting to a recommendation, does the user also consent to spin-off recommendations that might include business propositions such as purchasing a product? The Principle of Use Limitation enforces that a social relationship can only exist for the purpose of social interactions, e.g., a religious relationship is for religious interactions and not for trading interactions. Our research is motivated by the need to develop privacy-sensitive recommender systems, and we use both principles as guidelines for developing a set of privacy requirements. Towards this aim, we study the semantics of the two principles, i.e., the meaning of the principles w.r.t. privacy, namely the privacy semantics of the Principle of Purpose Specification and the Principle of Use Limitation.
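Read operationally, the two principles reduce to two consistency checks. The following is a minimal sketch of our own, assuming purposes are represented as plain sets of labels (a simplification; the paper's purpose units are richer structures):

def purpose_specification_ok(usage_purposes: set, collection_purposes: set) -> bool:
    # use must stay within the purposes stated at collection time
    return usage_purposes <= collection_purposes

def use_limitation_ok(actual_use: str, consented_purposes: set) -> bool:
    # each concrete use must match a purpose for which consent was given
    return actual_use in consented_purposes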
Privacy Semantics

Purpose Specification

By restricting the use of the collected data to the purposes of collection, the Principle of Purpose Specification holds properties of data and purpose. The function of restriction further requires rich semantics of the data and the purpose. The following questions help to explore the requirements needed to make restrictions functional and robust.

• Data
  – What is the nature of the data? What is its size, granularity and volume?
  – What conditions are bound to the validity of the data (e.g., location and time, forms of authenticity and source)?

• Purpose (implies usage - i.e., the purpose of collecting data implies the permitted usage of the collected data; thus, purpose is bound to the subjects that will use the data)
  – What permissions comply with the purpose used to gain consent for the collected data? The purpose for collecting data implies that permissions must be in compliance with the purpose presented when gaining consent to use the collected data. This suggests permissions as requirements for the fulfillment of purpose.
  – What subjects will use the collected data? What kinds of subjects can be considered? An issue that is not explicitly included in the Principle of Purpose Specification is crucial here, namely that the same data can (and often does) have different purposes for different subjects. Such subjects can be social entities (e.g., individuals, groups, communities, etc.) or matters (e.g., topics, activities, events, etc.). In the case of social entities (which is considered in this paper), the privacy concern of an individual is extended to his or her network. So an important question involves the determination of constraints that must be observed between the candidate social entities. In a social network, this is determined by the "position" of the candidate within the network and therefore concerns the relationships between social entities. What social entities (of what type and/or position in the network) are bound to data of what kind, to what degree and at what level of detail?
  – What obligations need to be fulfilled to grant the permissions? I.e., purpose specification needs to include obligations, if any.
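One way to read these questions is as a checklist of the metadata a collected data object would need to carry so that the restriction can be checked. The sketch below is our illustration only; all field names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class DataObjectMeta:
    """Illustrative metadata for a collected data object; not a structure
    defined by the paper."""
    nature: str                 # e.g., "relationship", "location"
    granularity: str            # level of detail of the data
    validity_conditions: dict = field(default_factory=dict)   # e.g., {"location": "Ultimo", "time": "2007"}
    purposes_by_subject: dict = field(default_factory=dict)   # same data, different purposes per subject
    obligations: set = field(default_factory=set)             # to fulfill before permissions are granted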
Use Limitation

To enforce data usage to comply with the restrictions of the purpose specification, the Principle of Use Limitation requires an operational understanding of the purpose and the corresponding possible usages of the data. If the use limitation is strongly conditioned, it is necessary to understand deviations - i.e., what usages are considered to deviate from the purpose? In a social network environment, deviations can easily be created by the data flow, because information can be accumulated and propagated along the path of the flow, resulting in deviations of various types and levels of detail at different positions (i.e., nodes as social entities). The following questions explore the requirements for the fulfillment of restrictions: i) What is the purpose of the data usage? ii) What usages deviate from the purpose? iii) What are the possible paths of data flow on which deviations can occur?

Modeling Privacy Requirements

Since privacy requirements are person-dependent, each user has a set of rules to enforce the fulfillment of their requirements. We refer to this set of rules concerning privacy as a user's privacy policy. Using the two principles as guidelines, the privacy policy of a user is scoped to describe rules for manipulating the user's personal information. When a piece of personal information is a data object, a user's privacy policy states that data objects can be manipulated by social entities if and only if the manipulation satisfies the conditions of the data object's existence in relation to the subject. A privacy policy reflects the policyholder's view of the piece of the world in which the user wants to protect their privacy. Research has found that applying the notion of abstraction to the subject matter reveals a hierarchy of compounds and facilitates exploring essences (Arranga and Coyle 1997). We use this finding to construct the policyholder's view of their world using core concepts that describe that world. To this end, we model privacy requirements that can be used to develop frameworks for system implementation. The following sub-sections describe such a privacy requirements model that: i) implements the two principles in compliance with the 3CR model, and ii) optimizes users' rights by using hierarchical structures to enrich the semantics of core concepts, accommodating users' rights at different levels of detail.

Principle of Purpose Specification - from Choice to Consent

In compliance with the 3CR model, the Principle of Purpose Specification necessarily reflects users' rights of choice and consent - i.e., users will have sufficient meaningful choice options to provide their consent. This requires a deep understanding of the complex core concepts supporting the principle.

• Data Object
  – If the data object is composite, then the user should be able to construct consent for any component of the data object.
  – If the validity of the data object relies on certain conditions, then the user should be able to provide consent to any of the conditions.
  – If hierarchical structures are involved - i.e.,
    ∗ the data object has a hierarchy, e.g., relationship information as a data object, where there is already a set of hierarchical relationships and the relationship of interest can be located in the hierarchy, or the relationship is established at a higher abstraction level of agreements, e.g., trading; or
    ∗ the data object can be mapped to a hierarchical object, e.g., the location-related concept "Ultimo Australia 2007" can be mapped to a set of location concepts at different levels of detail - i.e., "Ultimo→Sydney→NSW→Australia";
    then the user should be able to consent at a chosen level of detail in the hierarchy.

• Purpose
  – Complex composite purposes should be constructible for different conditions on the data object, e.g., data object D is valid at times t1, t2 and t3:
    ∗ at time t1, D can only be used for purpose p1, and at time t2 it can be used for p1 and p2 but not p3, where p1 includes p3;
    ∗ at time t1, D can only be used for either purpose p1 or purpose p2 but not both together, and at time t2 it must be used for p2 and p3, where p1 includes p3; or
    ∗ D can only be used for both purpose p1 and purpose p2, or after time t1 for either purpose p1 or purpose p2 but not both together.

To reflect the requirements above, two concepts are defined as follows:

Purpose Unit A purpose unit, denoted by pu, is a finite set of purposes which are conditions of usage for a data object. Purposes, as conditions of the data object's existence, may or may not be related to social entities. If social entities are served, the purpose unit may be augmented with constraints, and social entities may be required to carry obligations in order to be able to manipulate the data object. Such a purpose unit for a data object D is defined as pu = (SE_pu, O_pu, A_pu), where SE_pu is a set of candidate social entities carrying the obligations in O_pu for using D under the constraints in A_pu.
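A minimal rendering of the purpose unit as a data structure follows; string-labelled entities, obligations and constraints are our simplification of the definition above.

from dataclasses import dataclass

@dataclass(frozen=True)
class PurposeUnit:
    """pu = (SE_pu, O_pu, A_pu) for a data object D, together with the
    finite set of purposes the unit holds. A sketch, not a normative type."""
    purposes: frozenset         # the conditions of usage for the data object
    social_entities: frozenset  # SE_pu: candidate social entities
    obligations: frozenset      # O_pu: obligations those entities must carry
    constraints: frozenset      # A_pu: additional constraint axioms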
Data Object The validity of a data object refers to the satisfaction of a purpose unit, or a set of purpose units, on the existence or creation of the data object. Let D denote the data object. The validity of D with respect to a purpose unit pu is denoted by valid(D, pu). Let PU be the set of purpose units for D. If ⋀_{i=1}^{|PU|} valid(D, pu_i) is satisfied, i.e., valid(D, pu_i) holds for every pu_i ∈ PU, then D is valid on PU.
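Validity on PU is thus a conjunction over all purpose units; a direct sketch, where is_satisfied stands in for whatever per-unit check valid(D, pu_i) a concrete system implements:

def valid_on(data_object, purpose_units, is_satisfied) -> bool:
    """D is valid on PU iff valid(D, pu_i) holds for every pu_i in PU."""
    return all(is_satisfied(data_object, pu) for pu in purpose_units)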
Principle of Use Limitation - from Consent to Control

In compliance with the 3CR model, the Principle of Use Limitation necessarily reflects users' rights of consent and control - i.e., users have sufficient power to control personal information based on the consent they have provided. This requires comprehensive attributes holding rich semantics, so that expressive consents can be established for better usage control. To this end, the following concepts are considered:

• Usage implies permissions for use, while limitation holds obligations and conditions. Both permissions and obligations have a type, which might be defined at different levels of detail for the various levels of detail of the data object of interest. Conditions supporting limitations require sufficient attributes to accommodate restrictions, and the flexibility of collaboration between such attributes provides basic support for specifying limitations with rich semantics.

• Permissions and obligations are meaningful only when they are bound to social entities, which are the candidates carrying obligations to use the data object under the permission.

To accommodate rich semantics, hierarchical abstractions are applied to these concepts as they are to the data object - i.e., for each concept, the following scenarios must be accounted for when determining requirements: i) the concept can be located in a hierarchical concept; or ii) the concept can be mapped to a hierarchical concept.

Privacy Policy The privacy policy of a data object D is a 3-tuple (PU, PE, Ref), where
• PU is a set of purpose units for the use of D,
• PE is a set of permissions that can be granted to use D for purposes in PU, and
• Ref is a set of references to purpose units in PU such that Ref: PE → PU.
A permission can be granted for a social entity to use the data object D if and only if the consistency between PU and PE is satisfied. PU is consistent with PE if and only if valid(D, PVDO) is satisfied.

Permission Let H_X denote a set of hierarchical concepts X supported by the system. A permission that can be granted for manipulation of a data object D ∈ H_D, denoted by pe, is defined as a 3-tuple (SE_pe, O_pe, A_pe), where
• SE_pe is a set of candidate social entities for the data object D, such that SE_pe ⊆ H_SE;
• O_pe is a set of obligations, such that O_pe ⊆ H_O; and
• A_pe is a set of axioms that describe additional constraints on se ∈ SE_pe, such that A_pe ⊆ H_A.

Network connections can introduce implications, positive or negative, through information propagation. Thus, granting a social entity permission to manipulate a data object while maintaining the permitter's control over the data object must necessarily consider the social entity's network position - in other words, the connections between the social entity and others in the network, i.e., the network of the target entity and the relationships held between entities in that network. Thus, relationship status can be used to facilitate the specification of purposes, redefined as pu = (RT_pu, O_pu, A_pu), and the construction of permissions, redefined as pe = (RT_pe, O_pe, A_pe), where RT_pu and RT_pe are sets of relationship types such that RT_pu, RT_pe ⊆ H_RT.
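A skeletal rendering (ours) of the policy triple and the grant rule; pvdo_valid is a placeholder for the PVDO check defined in the next subsection:

class PrivacyPolicy:
    """The policy triple (PU, PE, Ref) for a data object; an illustrative
    sketch, not a normative implementation."""
    def __init__(self, purpose_units, permissions, ref):
        self.purpose_units = purpose_units  # PU
        self.permissions = permissions      # PE
        self.ref = ref                      # Ref: PE -> PU

    def may_grant(self, permission, data_object, pvdo_valid) -> bool:
        # grant iff the permission is in PE and PU is consistent with PE,
        # i.e. valid(D, PVDO) holds
        return permission in self.permissions and pvdo_valid(data_object)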
Binding Purpose to Usage

Recall that a data object is valid for a social entity only when the collection/creation purpose is satisfied. To enforce that usage complies with the purpose, the use of a data object can occur only when the data object is valid. This relationship between data object and purpose provides the logical support for binding purpose to usage, for use limitation, and for consistency between the purpose and the usage. In the following we establish a set of definitions and rules as guidelines for binding purpose to usage, namely Privacy Validity, Information Flow and HX Optimization.

Privacy Validity

Privacy Validity of Data Object (PVDO) Given a data object D, let PUC and PUU be the set of purpose units for the collection of D and the set of purpose units for the use of D, respectively. Let PUC = {pu_c1, pu_c2, ..., pu_cm} and PUU = {pu_u1, pu_u2, ..., pu_un}, such that for pu_ci ∈ PUC, pu_ci = (RT_ci, O_ci, A_ci), and for pu_uj ∈ PUU, pu_uj = (RT_uj, O_uj, A_uj). Let valid(D, PVDO) denote the privacy validity of D. Then, valid(D, PVDO) is satisfied iff PUU ⊆ PUC in the sense that

(∀pu_uj ∈ PUU ∃pu_ci ∈ PUC: pu_uj ⊆ pu_ci) ∧ ¬(∀pu_ci ∈ PUC ∃pu_uj ∈ PUU: pu_ci ⊆ pu_uj),

and, for each such containing pu_ci of a pu_uj,

(∀pu_ujy ∈ pu_uj: pu_ujy ∈ pu_ci) ∧ (RT_uj ⊆ RT_ci) ∧ (O_ci ⊆ O_uj) ∧ (A_ci ⊆ A_uj).
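A simplified executable reading of the PVDO condition, ours rather than the paper's: purpose units are triples (rt, o, a) of frozensets, and unit containment pu_u ⊆ pu_c is approximated through the component conditions.

def pvdo_valid(puc, puu) -> bool:
    """Literal reading of PVDO over purpose-unit triples (rt, o, a)."""
    def within(u, c):
        rt_u, o_u, a_u = u
        rt_c, o_c, a_c = c
        # narrower relationship types; at least the collection-time
        # obligations and constraints are kept
        return rt_u <= rt_c and o_c <= o_u and a_c <= a_u

    every_use_covered = all(any(within(u, c) for c in puc) for u in puu)
    # the negated clause: use must not exhaust every collection unit
    strictly_narrower = not all(any(within(c, u) for u in puu) for c in puc)
    return every_use_covered and strictly_narrower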
Information Flow

Relationship by Connectivity Information flow can lead to privacy implications because information is circulated via network connections according to the way entities are connected, i.e., relationships. To calculate the possible paths along which information will be circulated, a relationship is modeled by two attributes: type and connection degree. Let CP be the set of connection paths from se_i to se_j, CP = {cp_1, cp_2, ..., cp_m}, such that each cp_k ∈ CP denotes a sequence of social entities linking se_i to se_j, cp_k = {se_1, se_2, ..., se_n}, with se_i, se_j ∈ cp_k. Let r be the relationship between se_i and se_j, denoted r = (se_i, se_j). Let typeof(r) be the type of relationship r, such that rt = typeof(r) ∈ H_RT, where H_RT is the set of underlying hierarchical relationship types supported by the system. Let cd denote the connection degree between any two entities se_i, se_j ∈ cp_k, j > i. Then the relationship by connectivity (rbc) from se_i to se_j is rbc = (rt, cd), with cd = j − i.

It can be seen that the concept of relationship by connectivity takes a social factor - i.e., social connectivity - into account. In a social network, and particularly in the context of this paper - i.e., social recommendations - such a factor can be used to restrict permissions by social context, which has a high impact on individuals' privacy. To utilize this factor, the definitions of purpose and permission are changed accordingly to pu = (RBC_pu, O_pu, A_pu) and pe = (RBC_pe, O_pe, A_pe), where RBC_pu and RBC_pe are sets of relationships by connectivity for purposes and permissions, respectively.

Deviation on Information Flow For use of a data object D under permission pe = (RBC_pe, O_pe, A_pe), the information flow can be expected to reach the highest connection degree specified in RBC_pe. For example, if RBC = {("friend", 1), ("work", 2), ("friend", 3)}, then D is expected to reach "a friend of a colleague of a friend". However, the actual information flow needs to be measured against the obligations carried by all the entities on the connection path. For example, suppose the connection paths that satisfy permission pe_1 = ({("friend", 1), ("work", 2), ("friend", 3)}, O, A) are "A − B − E − H" (i.e., cp_1 = {A, B, E, H}) and "A − B − D − I" (i.e., cp_2 = {A, B, D, I}). For cp_1, the flow terminates at H, which is a network end-point. For cp_2, I connects to G, where the connection was established (i.e., the relationship as a data object) with the obligation that they (i.e., I and G) will share all friends' information with each other. Thus, while pe_1 only permits the information to reach as far as three degrees away, the actual flow will reach at least four degrees away on path cp_2. Under pe_1, we say cp_2 is a deviated path, and I is a deviation source. When a deviation is detected, the permitter is notified of the deviation source and path. Such a notification allows pe_1 to be revised accordingly, and this process can be repeated until no deviation is detected.
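A sketch, again ours, of deviation detection on a single connection path: walk the path and flag any entity at the permitted boundary whose own relationship obligations (e.g., "share all friends' information") would propagate the flow further. The path encoding and the shares_onward predicate are illustrative assumptions.

def find_deviations(path, permitted_degree, shares_onward):
    """path: ordered list of entities, e.g. ["A", "B", "D", "I"];
    shares_onward(entity) -> bool: True if the entity's obligations
    propagate information to further entities."""
    deviations = []
    for degree, entity in enumerate(path):
        if degree == permitted_degree and shares_onward(entity):
            # the flow would exceed the permitted degree here:
            # entity is a deviation source, path a deviated path
            deviations.append((entity, path))
    return deviations

# Mirroring the paper's example: pe_1 permits 3 degrees; on cp_2 = A-B-D-I,
# I shares friends' information with G, so I is flagged.
cp2 = ["A", "B", "D", "I"]
print(find_deviations(cp2, 3, lambda e: e == "I"))  # [('I', ['A', 'B', 'D', 'I'])]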
Permission Transitivity With privacy concerns, permission transitivity measures the implications of information flow in the following two ways:

• If a permission can be transferred to all the social entities on the path of the information flow, the privacy implication is tolerant.¹ To avoid negative implications arising in complex dynamic networks, privacy implications should not be tolerant - i.e., permissions should not be allowed to be transferred to other entities. Permissions can, however, be transferred to a new relationship that is "upward compatible" - e.g., permissions granted for a "trade" relationship can be transferred to a "buy and sell" relationship if the latter is at a lower abstraction level than the former in the relationship type hierarchy. Formally, given a set of permissions PE = {pe_1, pe_2, ..., pe_n}, for any pe_i ∈ PE with pe_i = (RBC_pei, O_pei, A_pei), (rt_peix, cd_peix) ∈ RBC_pei and rt_peix ∈ H_RT: if layer(rt) > layer(rt_peix), rt ∩ rt_peix ≠ ∅ and rt ∈ H_RT, then (rt_peix, cd_peix) becomes (rt, cd_peix).

• A permission granted for use of a data object D can be transferred onto data objects at lower abstraction levels of H_D, iff D ∈ H_D.

¹ "Tolerant" means the user does not care about any consequence: if the permission can be transferred to all the social entities on the path of the information flow, the user who gives the permission is either unaware of or does not care about the consequences. We assume all users care about the consequences but might not be aware of all of them (and cannot be).

Obligation Inheritability Obligations can be required to be passed on to relevant entities. For example, group members essentially carry the obligations attached to the group. Passed-on obligations need to be made explicit for reasoning about new permissions for an inheriting entity. Given permissions pe = (RBC_pe, O_pe, A_pe) and pe′ = (RBC_pe′, O_pe′, A_pe′), such that (rt_pex, cd_pex) ∈ RBC_pe with rt_pex ∈ H_RT, and (rt_pe′y, cd_pe′y) ∈ RBC_pe′ with rt_pe′y ∈ H_RT: if layer(rt_pex) < layer(rt_pe′y) and rt_pex ∩ rt_pe′y ≠ ∅, then O_pe ⊆ O_pe′.

HX Optimization

The PVDO offers rigorous criteria to enforce consistency; however, it provides little flexibility and few choice options. The set of H_X concepts (described in the previous section) can be used to optimize the criteria, allowing more options while still achieving consistency between purpose and usage. To this end, we introduce the concept of level of privacy validity below.

Level of Privacy Validity (LPV) In compliance with the definitions of purpose (pu) and permission (pe), LPV holds three dimensions: relationship by connectivity (RBC), obligation (O) and constraint (A). Each dimension requires a different rule. To illustrate, the concept H_X is modeled as follows.

Dimensions of LPV Let H_X be a set of hierarchical concepts X with a default concept "Object" at the top of the hierarchy. Let layer(x) denote the layer of x ∈ H_X, with layer(x) ∈ [0, n]; smaller numbers indicate higher abstraction levels in the hierarchy, such that layer("Object") = 0 and the layer number increases by 1 with each subsequent layer down the hierarchy. Given a data object D, let PUC and PUU be the sets of purpose units for the collection of D and for the use of D, respectively. Let PUC = {pu_c1, pu_c2, ..., pu_cm} and PUU = {pu_u1, pu_u2, ..., pu_un}, such that for pu_ci ∈ PUC, pu_ci = (RBC_pu_ci, O_pu_ci, A_pu_ci) with (rt_pu_cix, cd_pu_cix) ∈ RBC_pu_ci, and for pu_uj ∈ PUU, pu_uj = (RBC_pu_uj, O_pu_uj, A_pu_uj) with (rt_pu_ujy, cd_pu_ujy) ∈ RBC_pu_uj. Then the LPV for the three dimensions is:

• Relationship by Connectivity Let layer(rt_pu_cix) and layer(rt_pu_ujy) denote the layers of rt_pu_cix and rt_pu_ujy in H_RT, respectively. For any (rt_pu_cix, cd_pu_cix) ∈ RBC_pu_ci and (rt_pu_ujy, cd_pu_ujy) ∈ RBC_pu_uj, if layer(rt_pu_ujy) ≥ layer(rt_pu_cix), rt_pu_ujy ∩ rt_pu_cix ≠ ∅ and cd_pu_cix = cd_pu_ujy, then PVDO in this dimension is satisfied, denoted valid(D, PVDO_RBC).

• Obligation Let layer(o_pu_cix) and layer(o_pu_ujy) denote the layers of o_pu_cix and o_pu_ujy in H_O, respectively. For any o_pu_cix ∈ O_pu_ci and o_pu_ujy ∈ O_pu_uj, if layer(o_pu_ujy) ≥ layer(o_pu_cix) and o_pu_ujy ∩ o_pu_cix ≠ ∅, then PVDO in this dimension is satisfied, denoted valid(D, PVDO_O).

• Constraint Let layer(a_pu_cix) and layer(a_pu_ujy) denote the layers of a_pu_cix and a_pu_ujy in H_A, respectively. For any a_pu_cix ∈ A_pu_ci and a_pu_ujy ∈ A_pu_uj, if layer(a_pu_ujy) ≥ layer(a_pu_cix) and a_pu_ujy ∩ a_pu_cix ≠ ∅, then PVDO in this dimension is satisfied, denoted valid(D, PVDO_A).

Privacy Validity by Level (PVDO by LPV) If valid(D, PVDO_RBC), valid(D, PVDO_O) and valid(D, PVDO_A) are all satisfied, then valid(D, PVDO) is satisfied.
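A compact sketch of the layer-based test shared by the three dimensions: a use-side concept is acceptable if it sits at the same or a deeper (more specific) layer than the collection-side concept and the two overlap in the hierarchy. Modeling a concept as a (layer, frozenset-of-leaf-labels) pair is an encoding we chose for illustration, not one the paper prescribes.

def lpv_dimension_ok(collection_concept, use_concept,
                     require_equal_degree=False,
                     cd_collection=None, cd_use=None) -> bool:
    layer_c, extent_c = collection_concept
    layer_u, extent_u = use_concept
    ok = layer_u >= layer_c and extent_u & extent_c
    if require_equal_degree:  # the extra condition in the RBC dimension
        ok = ok and cd_collection == cd_use
    return bool(ok)

# e.g., "buy and sell" (layer 2) sits under "trade" (layer 1) and overlaps it:
trade = (1, frozenset({"buy", "sell", "barter"}))
buy_and_sell = (2, frozenset({"buy", "sell"}))
print(lpv_dimension_ok(trade, buy_and_sell))  # True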
Discussion and Future Work

Privacy is one of the main concerns that prevents users from fully enjoying the benefits that online services can provide. Although various privacy-enhancing technologies (PET) have been proposed (PETS 2009; WPES 2009), the number of privacy breaches reported in the media almost every day demonstrates that there is insufficient support for users to control their information while using these services. This fundamental problem has been identified in terms of users' choice, consent and control rights. At the same time, the Internet infrastructure gives users limited ability to obtain knowledge of their situations, due to the dynamics and evolution of networks. Thus, privacy-aware systems that provide users with knowledge about what online systems such as social networks store about them are required.

In compliance with the rights of choice, consent and control, a privacy-aware system requires a privacy-by-design approach, and such an approach starts from requirements development (Chen and Williams 2010). In recognition of the need for the development of privacy requirements, this paper addresses the problem from an information-manipulation perspective, arguing that quality of manipulation is the key to preserving information privacy. With respect to choice, consent and control rights, the Principle of Purpose Specification and the Principle of Use Limitation are used as criteria to measure the quality of the manipulation of information. To maximize users' rights and to facilitate manipulation, hierarchical structures are utilized, i.e., the HX Optimization. A set of novel concepts is developed to implement the two principles: i) privacy validity, for ensuring the user's rights on purpose expectations for a data object - this can be extended to accommodate legislation and regulations if required; ii) relationship by connectivity, to calculate information flow with permissions - the concern of social connectivity strictly restricts the transfer of permissions along connection paths; and iii) deviation on information flow, to predict deviations and discover possible negative privacy implications - a fundamental requirement for privacy-awareness.

With the aim of using the Principle of Purpose Specification and the Principle of Use Limitation in the users' right space to achieve quality manipulation of information for privacy purposes, we have studied the semantics of, and the logical relationship between, these two principles and shown how they can be utilized to enforce privacy policies. Moreover, we have noted that the Principle of Purpose Specification is deficient in accommodating one-to-many relationships between data object and purpose for quality manipulation in compliance with users' rights, which can be an issue in a practical context. The definition of the purpose unit provides an effective representation for manipulating such relations.

The work presented in this paper has exposed significant issues that arise in the privacy protection domain and in its philosophical foundations - i.e., quality manipulation of information by rights. It has opened up new opportunities to develop a transdisciplinary approach naturally spanning diverse disciplines such as computer science, information systems, law, philosophy and sociology. Our plan for the next stage of the research is to incorporate this work into the layered architecture proposed in Chen and Williams (2010) and to develop a rigorous formal model for intelligent privacy-aware systems with practical impact.

References

Arranga, E. C., and Coyle, F. P. 1997. Object-Oriented COBOL. Cambridge University Press.

Chen, S., and Williams, M.-A. 2010. Towards a comprehensive requirements architecture for privacy-aware social recommender systems. In Link, S., and Ghose, A., eds., Proceedings of the Seventh Asia-Pacific Conference on Conceptual Modelling (APCCM 2010), 33-41. Australian Computer Society Inc.

France-Presse, A. 2007. Home trashed in MySpace party. http://www.news.com.au/story/0,23599,21549624-2,00.html

Moses, A. 2009. Social not-working: Facebook snitches cost jobs. http://www.smh.com.au/articles/2009/04/08/1238869963400.html

OECD. 2009. OECD guidelines on the protection of privacy and transborder flows of personal data. http://www.oecd.org/document/18/0,3343,en_2649_34255_1815186_1_1_1_1,00.html

PETS. 2009. Privacy Enhancing Technologies Symposium. http://petsymposium.org/2009

WPES. 2009. Workshop on Privacy in the Electronic Society. http://wpes09.unibg.it/