Embedding e-Science Applications: Designing and Managing for Usability
Marina Jirotka and Sharon Lloyd
Oxford University
NeSC: The Changing Landscape, 16th April 2009

Embedding e-Science Applications: Designing and Managing for Usability
• A 3-year project funded by the EPSRC
• Project staff: Marina Jirotka (PI), Anne Trefethen (CI), Sharon Lloyd (Advisor), Dimitrina Spencer (Researcher), Ralph Schroeder (Researcher) and Grace de la Flor (DPhil student); Andrew Warr (Researcher, Year 1)
• Working closely with the Oxford e-Social Science (OeSS) project, the e-Horizons Institute and the UK e-Science Usability Task Force (UTF)

Objectives
• To develop an online toolkit defining processes and practices for managing collaboration to facilitate usability in e-Science projects
• Through engagement with project collaborators, to investigate approaches, tools and techniques that may enable the development of a shared understanding of e-Science project expectations, management and implementation
• To develop recommendations, guidelines and procedures that facilitate the effective integration of e-Science technologies with existing work practices whilst also allowing potentially new ways of working
• To consider the usability of nationally provided services through a broker, and to draw upon case studies to provide insights into the use of the NGS and large-scale resources like it
• To consider specific tools and technologies that allow user engagement with projects, such as personal Access Grids
• To develop a set of case studies leading to recommendations for managing usability on e-Science projects

Embedding
• Use of large-scale infrastructure: task forces (STC, ETF, UTF), OGF, NGS, HPCx, CSAR, OMII (product development)
• Use of services/applications: research projects dealing with a specific problem - DAME, e-DiaMoND, Integrative Biology, NeuroGrid, CARMEN…
• Use of collaborative tools and approaches: Access Grids, PIGs, commercial solutions

What is Usability?
• As defined by the ISO standard ISO 9241 Part 11, usability can be measured only by taking into account the context of use of the system - who is using the system, what they are using it for, and the environment in which they are using it
• Furthermore, measurements of usability have several different aspects (sketched in code below):
– effectiveness (can users successfully achieve their objectives?)
– efficiency (how much effort and resource is expended in achieving those objectives?)
– satisfaction (was the experience satisfactory?)
• Who is the system built for? Management? End users?
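As a minimal sketch of how these three measures might be operationalised, the code below computes them from hypothetical task-session data. The metric choices (completion rate for effectiveness, time on task for efficiency, a 1-5 rating for satisfaction) are common conventions in usability evaluation, not prescribed by the standard itself, and all names and figures here are illustrative.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskSession:
    """One user's attempt at a task, in a defined context of use (hypothetical data)."""
    completed: bool      # effectiveness input: did the user achieve the objective?
    time_taken_s: float  # efficiency input: effort, approximated here by time on task
    satisfaction: int    # satisfaction input: post-task rating on a 1-5 scale

def usability_summary(sessions: list[TaskSession]) -> dict:
    """Summarise the three ISO 9241-11 aspects over a set of sessions."""
    done = [s for s in sessions if s.completed]
    return {
        "effectiveness": len(done) / len(sessions),  # task completion rate
        "efficiency_s": mean(s.time_taken_s for s in done) if done else None,
        "satisfaction": mean(s.satisfaction for s in sessions),
    }

sessions = [
    TaskSession(completed=True, time_taken_s=210.0, satisfaction=4),
    TaskSession(completed=True, time_taken_s=340.0, satisfaction=3),
    TaskSession(completed=False, time_taken_s=600.0, satisfaction=2),
]
print(usability_summary(sessions))
# -> effectiveness 2/3, mean of 275 s on successful tasks, mean rating 3.0
```

Whatever the exact metrics, the slide's point stands: the numbers are only meaningful relative to a stated context of use and user group.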
Who are the users? Typically…
• Applications
• Portals
• Middleware/core infrastructure development
• Users of standards, products and trends
• Middleware developers

Infrastructure/Services - challenges
• To 'productise' the outputs from e-Science projects/initiatives, and to ensure outputs are developed in a scalable and robust fashion
• How to ensure that what is developed is usable for everyone?
• What do users expect of an infrastructure? Robust and sustainable (e.g. HPCx vs CSAR); user forums very technical
• Less uptake than expected
– Inadequate understanding of the kinds of services
– Insufficient resources to make it happen
• National vs local solution
– Whose responsibility is it to ensure usability?

Infrastructure - lessons learned
• Mode 1: provision of national services and infrastructure. Seamless and sustainable provision requires a different mode of engagement with users: training, handholding, documentation…
• Mode 2: use of standards and services to develop one's own infrastructure for a specific scientific problem. Needs a visible/transparent infrastructure where users can see what it is doing and modify it
• Applications and infrastructure co-evolve
• Gap exercises with users
• Localisation - local staff, system administrators, groups of users
– But many projects involve cross-institutional work
• No large-scale data sets to work across institutions
• No single solution for an institution to investigate
• Researchers have to work out their own mechanisms for long-term collaboration

Users and Applications - challenges identified
• In-depth qualitative studies of several key e-Science projects reveal a lack of impact - applications are not being used beyond the lifetime of the project
• User requirements not clearly understood - little expertise in elicitation or in how requirements fit into the development cycle
• Different types of 'users' - middleware developers, end users…
• Stakeholder requirements often poorly conceptualised - who is a stakeholder?
• Embedding of applications seen as an additional requirement once the system is developed

Users - lessons learned
• Early, strong user/stakeholder buy-in and feedback - engaging people who are very busy; communication across different groups is a translation exercise
• Engage users in the project vision - recalibrate throughout the project lifetime
• Showcase technical possibilities - milestones in the project plan
• Understand current activities, tools and techniques
• Develop a stakeholder analysis to ensure the right partners and people in the community are involved (a minimal sketch follows this list)
• Ensure enough time and resources for engaging users efficiently
• Focus on exploitation and impact - is this research? And who funds it?
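One hypothetical way to make the stakeholder-analysis bullet concrete is a simple register that records each stakeholder's interest and influence and flags gaps in engagement. The fields, categories and thresholds below are illustrative assumptions, not taken from the project's toolkit.

```python
from dataclasses import dataclass

@dataclass
class Stakeholder:
    """One entry in a hypothetical stakeholder register."""
    name: str
    role: str        # e.g. "end user", "middleware developer", "funder"
    interest: int    # 1 (low) to 5 (high): how much the project matters to them
    influence: int   # 1 (low) to 5 (high): how much they can affect the project
    engaged: bool    # do we currently have an active channel to them?

def engagement_gaps(register: list[Stakeholder]) -> list[Stakeholder]:
    """Flag high-interest or high-influence stakeholders we are not yet engaging."""
    return [s for s in register if not s.engaged and max(s.interest, s.influence) >= 4]

register = [
    Stakeholder("clinical end users", "end user", 5, 3, True),
    Stakeholder("hospital IT", "system administrator", 2, 5, False),
    Stakeholder("middleware team", "middleware developer", 4, 4, True),
]
for s in engagement_gaps(register):
    print(f"Engage: {s.name} (interest={s.interest}, influence={s.influence})")
```

Revisiting such a register at each milestone is one cheap way to act on the 'recalibrate throughout the project lifetime' point above.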
Collaboration and Communication - challenges
• Instantaneous methods of communicating are important in dynamic teams: e.g. video conferencing that requires a week's notice is problematic, as is a limited skill set
• Partners may have favourite audio/video services - either initiate change, or ensure interoperability of services; this applies to visualisation tools too
• Methods for recording and recollecting information are vitally important for wider dissemination of project knowledge
• Translating information for communities requires constant consideration of the target audience and may often require external development - good for public engagement
• Support for training people, and institutional buy-in to provide tools such as Access Grid

Collaboration and Communication - lessons learned
• Methods and tools need to be inclusive - so select an approach that interoperates across all platforms; compliant technologies - NeuroGrid; CancerGrid's commercial solution (its 'coffee room' open activities could have been intrusive)
• Use shared repositories and wikis to collate decision-making processes - but who does this, and who maintains these repositories even after the project is over?
• Certain tools such as CREW and Memetic are potentially interesting for recording - but seemingly no active embedding

Project Management - challenges
• Scientific vs operational management
– Imperative that there is clear role definition and identification of who takes the scientific lead and who takes the operational lead - or are they one and the same person? A neutral PM is beneficial
• Project initiation
– Key to ensuring the project progresses with a clear understanding by all project members of what is expected of them
• Communication methods
– Developing and maintaining shared visions and objectives
• Closedown and sustainability
– How do you assess the delivery of a project, and what legacy does it leave behind?

Project Management - lessons learned (1)
• Taught methods not wholly applicable
– Process skills need to be coupled with social/management skills (e.g. NLP)
– 'Executing plans' is not enough if you cannot motivate the team…
• Strategic vs operational management
– Scientific vs operational - is the PI both?
– Different skill sets - may not be needed for the entire duration of the project
• Project initiation
– Defining roles and activities is key to maintaining the project team
– Stakeholder questionnaires: a 'cheap' means of monitoring project team health
• Communication methods - for different purposes
– Teleconferences, wikis, mailing lists, websites, showcases, newsletters
• Development and exploitation plan
– Can be used to communicate how individuals fit into the bigger picture
– Consider sustainable routes - keep users involved to sustain activity
– For every deliverable, consider what can be done to push it beyond the project

Project Management - lessons learned (2)
• Management styles
– Use of language is important: 'we', 'our'
– Empowerment drives problem ownership
– Embarrassment drives delivery!
• Project closedown
– Needs time…
– Must indicate what will happen to outputs
– Delivering the exploitation plan
– Consider what to do with websites, wikis, reports, documents etc., and whether others can benefit from them
– Lessons-learned exercises - useful for all contributors
– Blueprint documents are useful for documenting what could not be achieved; their content often results in new proposals for follow-on projects and enables projects to publish knowledge that is not research

Consider:
• Scientific and operational concerns
• Involving users from project conception to closedown and beyond - strategies:
– Initiation activities: showcasing technical potential
– Closedown activities: exploitation plans, lessons-learned activities, blueprint
• Open, modifiable and transparent infrastructure (not only by the SA)
• Ongoing agreements between users, between users and developers, and with other stakeholders

Avoid:
• 'Build it and they will come'
• Technical decision-making in isolation from users
• Users determining requirements
• Disciplinary silos
• The PM as requirements engineer
• A rigid, inflexible technical vision
• No stakeholder analysis at project inception and/or throughout the project lifetime
• Fixed waterfall development
• No management of user expectations

Suggestions
• Provide training and support in the operational management of large-scale multi-disciplinary projects?
• Can we learn from the EU Network of Excellence approach of bringing together communities of interest?
• Perhaps OMII + open tools + knowledge base + training = a 'toolkit' for communities of interest?