

There’s Data Lurking in Your Labs!

May 16th, 2012

Warren Petrofsky

School of Arts & Sciences

University of Pennsylvania

Copyright Warren Petrofsky, 2012. This work is the intellectual property of the author. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the author. To disseminate otherwise or to republish requires written permission from the author.

Some Quick Logistics

• A few people (my wife, my colleagues, anyone I've ever presented in front of) tell me that I talk too fast.

• We have an hour, and it’s just me up here, so I’m going to do my best to keep this to a nice, leisurely, conversational pace.

• If you start to feel like you're watching the Micro Machines commercial, give me a signal.

• Asking questions throughout will also help keep the pace, so please do ask questions

The Obligatory “About SAS” Slide

• Largest of 12 schools at Penn

• 625 Faculty

• 715 Staff

• 1650 Graduate Students

• 6500 Undergraduate Students

The Challenge: Data is Everywhere

• I could only wish that half of the stuff I find was stored in the cloud

• Instead it is:

On tapes:

On “Servers”

On “Servers”

On …

Not a Surprise

• Anyone shocked by these photos?

• Anyone think this happens only at Penn?

• We knew this data was out there; what we didn't know was how intractable a problem this was

• Next I'll talk a little more about our structure and policy landscape

• Then I'll jump into the fun part: what we tried, what didn't work, and where we went next

Our Policy Landscape

• In March of 2010 Penn released a comprehensive update to our Computer Security Policy

• Combined several earlier policies and addressed new risks

• Included specific requirements for the management of servers holding Confidential University Data

• Most controversial - required that servers with confidential data be managed by full-time IT staff

• It states:

Our Policy Landscape – Cont’d.

“Management and Support

A regular, full-time University staff member with an IT position designation must serve as the system administrator. The system administrator must be identified, and registered in ISC's Assignments Service. The system administrator must have attended Penn security training (if offered) or equivalent for the relevant operating system within the past three years. At times, system administration may be delegated to a third party or University member who doesn't meet the foregoing criteria. In such situations, the School or Center must designate a regular, full-time University employee to oversee the system administrator. The appointed individual providing oversight shall be fully accountable for compliance with this policy, and specifically must ensure that:

1. he or she is registered as a contact in ISC's Assignments Service;

2. third parties or designated University members have the necessary skills and training to protect University systems and data; and

3. third parties or designated University members are clearly informed of their obligations to comply with this policy.

For guidance on these duties, contact the local Security Liaison.”

So We Have Good Policy

• So, now we had a well-written policy

• Only one problem: we had no idea where these servers were

• We were required to register our servers as critical hosts, but we knew that there were servers spread throughout our school, administered by grad students and postdocs

• We weren't worried about the servers we could point to; those were the ones we were already running

• The ones we weren't running were the ones that PIs had purchased with research funds and never mentioned to us

• So how were we to find and secure these servers?

• (Here's where the failing starts…)

What Didn't Work – Getting Faculty Excited About Policy Compliance

We sent emails like this:

Dear SAS Faculty Members,

As you may know, in March of this year the University updated our Computer Security Policy to address the changing risks to our systems and information.

According to this policy, every member of the University Community has a role in helping keep our information safe.

In particular, we would like to call your attention to section VIII.3.1. which requires that servers hosting confidential data be administered by full time IT staff, or at least, have a full time IT staff member as a liaison to the administrator. Many of you may have systems in your labs that are administered by your graduate students or research assistants. We look forward to working with you to identify systems that may have confidential data so that we can help keep these systems secure.

If you have systems with confidential data, please contact your LSP and we will schedule a meeting to help plan next steps.

What Faculty Saw – Actual Feedback

According to my good friend, the chair of the Math department, the well-intentioned email you saw above, when processed by Faculty, looked like this:

Dear SAS Faculty Members,

As you may know, in March of this year the University updated our Computer Security Policy

Lessons Learned

• We had built good policy

• We were properly concerned with actually complying with the policy

• We knew that we had to implement this at the level of individual labs

• We made the mistake of thinking that Faculty are motivated by compliance

• We pretty much sent a demotivational email

• We received zero responses

What Didn't Work – Getting Faculty Excited About Hackers

• I would hear from LSPs about ancient web apps used for critical research processes and I’d try to meet with the research staff who ran the server

• If they agreed to meet with me, I’d go in and talk about input validation and the constant scans for web vulnerabilities we were seeing in our HIDS

• They’d tell me, “We’ve been running this same website for 10 years, and never had a problem.”

• I’d say, “A lab down the hall just had all of the lab procedures on their perl wiki deleted and replaced with spam links to porn sites!”

• They’d laugh and usher me out the door

What Didn't Work – Getting Faculty to Show Up for Interviews

• Security and Privacy Impact Assessments (SPIA) have been great

• Every School and Center annually reviews where they have confidential data, what the current protections are, and what the current risk level is

• This would seem like a natural way to find these servers

• SAS has LSPs conduct SPIAs in their departments

• But, the labs most likely to have the servers we’re concerned about are least likely to respond to appt. requests

• These labs are our blind spot. We’re looking for servers which, by definition, IT staff don’t even know exist. We can’t know what we can’t see.

So, Asking Didn’t Work

• We were pretty sure those servers were out there, but:

• Faculty wouldn’t answer our emails

• SAS does not control the network

• Penn did not have an IDS

• So, how were we going to find these?

• <Shameless Plug> If you'd like details on our solution, my colleague Justin Klein Keane will be presenting at 1 PM today. </Shameless Plug>

• Short answer: Justin built a discovery and asset management tool that includes results from port scanning of SAS' 16,500 potential IPs
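
To make this concrete, here is a minimal sketch of the kind of port probe such a discovery tool is built on. This is not Justin's actual implementation; it is a standard-library Python illustration, and the subnet is a hypothetical placeholder (a real run over 16,500 addresses would also need threading, rate limiting, and your network team's blessing).

    # Minimal sketch only, not the actual SAS tool: probe a subnet for hosts
    # answering on ssh/web ports. Subnet below is a hypothetical placeholder.
    import ipaddress
    import socket

    PORTS = [22, 80, 443]      # ssh and web: a strong hint that "there's a server here"
    SUBNET = "10.0.0.0/24"     # example range, not SAS' real address space

    def open_ports(host, ports, timeout=0.5):
        """Return the ports on this host that accept a TCP connection."""
        found = []
        for port in ports:
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    found.append(port)
            except OSError:
                pass
        return found

    if __name__ == "__main__":
        for ip in ipaddress.ip_network(SUBNET).hosts():
            hits = open_ports(str(ip), PORTS)
            if hits:
                print(f"{ip}: listening on {hits}")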

Did We Have Any Such Servers?

• Yes, yes we did.

Let's Take A Step Back – Why Are We In This Situation?

Economics

How Did We Wind up with This Stuff?

• Grants

• First you get the grant, then you get the money, then you get the HARDWARE

• You write the grant, you get the money, you buy what you want

• If that's the last grant you get for the next 9 years, what you wanted then is what you have now

But Why Didn't the Data Go on a Central Server?

• Funding rules

• Capital equipment costs come out of grant funds before overhead

• By faculty calculations, they can get 60% more for their money if they buy their own equipment (a rough arithmetic sketch follows below)

• (This isn't really true, but given the way they budget currently, it is.)

• If a service is a technology offered standard to all faculty, then it cannot be billed to a grant
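
As a back-of-the-envelope illustration of the faculty math above, here is a small sketch assuming a hypothetical indirect (F&A) rate of 60%; the real negotiated rate and exclusion rules differ, so treat the numbers as illustrative only.

    # Illustrative only: why buying their own >$5,000 equipment looks ~60% cheaper
    # to a lab than buying a central service, assuming a hypothetical 60% F&A rate.
    INDIRECT_RATE = 0.60            # assumed rate for illustration, not Penn's actual rate
    grant_dollars = 10_000.0        # what the lab is willing to spend on storage

    # Central service: in the lab's mental model, overhead is layered on top,
    # so only part of each grant dollar turns into storage.
    service_buying_power = grant_dollars / (1 + INDIRECT_RATE)

    # Their own capital equipment (> $5,000): excluded from the overhead base,
    # so every grant dollar goes to hardware.
    equipment_buying_power = grant_dollars

    print(f"Bought as a service:  ${service_buying_power:,.0f} of storage")
    print(f"Bought as equipment:  ${equipment_buying_power:,.0f} of storage")
    print(f"Apparent advantage:   {equipment_buying_power / service_buying_power - 1:.0%}")

As the slide says, this isn't really the whole story, but it is the arithmetic labs do when they look at their budgets.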

A Twofold Approach

• One approach for existing systems

• Another for new systems and new grant proposals

Moving Forward with Existing Systems

• Identify the biggest risks to Univ. data (confidentiality, integrity, availability)

• <Shameless Plug> If you're interested in how SAS has built continuity plans throughout our Natural Sciences division, my colleague Christine Brisson will be running a fantastic interactive session at 4 PM today that will show you how to run a Tabletop Exercise </Shameless Plug>

• Identify the biggest pain points for PIs

• Find the intersection

• Start there

Start with the Risky and Annoying

How risky is this?

How much of a pain is this for labs to run?

Not All About Confidentiality

• You may have noticed in my last slide that I called out the CIA (Confidentiality, Integrity, and Availability)

• Some of our biggest exposures weren’t to confidential data, but to the research mission of SAS, with millions of dollars of grant-funded research stored on decades-old storage servers

Once More into the Labs

We were concerned about three basic problems:

1. Systems connected to devices that were not patched, but were exposed to the Internet

2. Storage systems/servers that were not patched and were not backed up

3. Servers that were not patched or maintained, that were exposed to the Internet, and were potentially serving homegrown dynamic web services

(Did I mention that Penn has no edge filtering except blocking Windows file-sharing ports?)

We Started with 22, 80, 443

• We were pretty sure that if a lab was running a server with ssh or web services, they were probably running more than that.

• We asked the LSP to make the first move

• We did not bring a one-size-fits-all solution

• We asked faculty and graduate students to talk about any current frustrations with their research technology

Teasing Out the Obstacles

• When we met with faculty about their current systems, there was a strong preference for the status quo

• They rarely had to think about their setup (graduate students insulated them from any pain points)

• And whenever they did, the person they needed to talk to about it was their graduate student, who was bound to do what they said

So What Do Faculty Want?

• A system that “just works”

• Full control

• Zero bureaucracy

• Full value for their grant dollars

Web Was Easy

• Most lab websites were an afterthought: static HTML built ten years ago

• Some long-gone graduate student had configured Apache on their "lab server" and their descendants had been updating pages (but not the OS or Apache) ever since

• When asked if they would like help running this service, most were surprised

• Most frequent response: "We only run this because SAS Computing won't host our website."

• WHAT?

Our Pitch for Static Websites

• We will provide this as a standard (FREE) service

• We’ll duplicate your current site on an SAS-run server

• We’ll back it up, offsite

• We’ll preserve your current hostname

• We’ll create accounts for anyone you tell us to, and we’ll give them the permissions you specify

• We'll have it ready for you to test by the end of the week (if we don't make it quick, all momentum is lost and they rarely get back to us); a quick comparison sketch follows below
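
As a concrete illustration of the "duplicate it, then let them test it" step, here is a rough sketch that compares pages on the old lab box against the new SAS-run copy. The hostnames and paths are made-up examples, and this only makes sense for the static sites described above.

    # Rough sketch: confirm the SAS-run copy of a static site matches the original.
    # Hostnames and paths are hypothetical examples, not real SAS servers.
    import hashlib
    import urllib.request

    OLD_HOST = "http://oldlabserver.example.upenn.edu"   # hypothetical
    NEW_HOST = "http://newlabsite.example.upenn.edu"     # hypothetical
    PATHS = ["/", "/people.html", "/publications.html"]

    def digest(url):
        """Fetch a URL and return a SHA-256 digest of the response body."""
        with urllib.request.urlopen(url, timeout=10) as resp:
            return hashlib.sha256(resp.read()).hexdigest()

    for path in PATHS:
        old, new = digest(OLD_HOST + path), digest(NEW_HOST + path)
        print(f"{path}: {'OK' if old == new else 'DIFFERS'}")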

Our Pitch for Dynamic Websites

• If you are happy with your current solution, we will help you secure it:

  • We'll perform a code review and suggest changes needed to address critical vulnerabilities

  • We'll help evaluate your current patching solution and firewall settings

  • We'll install OSSEC-HIDS and monitor your server for attacks and compromises

• If you are looking to update your site, we have a couple of options:

  • OpenScholar – A Drupal distribution built by Harvard for academic websites

    • OpenScholar is quick, you'll have a new site within days

    • OpenScholar keeps you in control – you can add users and give them editing rights

    • OpenScholar is FREE

  • Drupal – We will set up a basic Drupal site for your lab, but you will need to hire someone to build your dynamic site in Drupal

    • We have performed a code review on almost 100 modules to expand Drupal's capabilities, and will review any additional modules you are interested in using

    • We will review your Drupal site for security issues before it goes live

    • We will maintain and update the Drupal server

Why Was Web Easy?

• We could make it a standard service (FREE)

• It was a hassle for graduate students

• It was not closely linked to the science

• Faculty could retain control over access and editing permissions

Web – Lessons Learned

• Schedule the server retirement in advance

• Offer to take over the hostname

• Check that the server is really off

• Offer secure destruction of the hardware (prevent backsliding)

• OpenScholar is a nice blend of centralized management of the server/architecture, with distributed control of permissions and access control

• Progress can be quick. We retired ten web servers in one department in one semester.

• Word will get around. Once we started migrating websites, graduate students from other labs heard about it and started asking their LSP to take their site off their hands.

• Working one department at a time increases this word of mouth

• While the web service itself was usually not a mission critical service, the poorly administered lab web server was often a critical vulnerability for the rest of the lab data, with shared passwords, access to home volumes, etc.

Storage Was Hard

• Storage was tough because it came in all shapes and sizes:

• Local storage for fast writes coming off of devices (small, medium, large)

• Networked storage of lab research data (small, medium, large, and Woah!)

• Archival backups of lab research data (usually large – Woah!)

Storage Was Hard – Cont’d.

• Varied not only in amount, but in requirements for speed and breadth of access

• Genomics labs are writing 2.5-5 TB/day (Woah!); a rough sizing sketch follows below

• Psychologists recording mouse electrophysiology write about 50 GB/day (large)

• Some labs need to make their entire data archive available to colleagues off-campus
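
To put those daily rates in perspective, here is a quick sizing sketch using the figures from this slide; the annual projections are simple arithmetic, not measurements from any particular lab.

    # Quick sizing arithmetic from the daily write rates quoted on this slide.
    daily_rates_tb = {
        "Genomics (low end)": 2.5,
        "Genomics (high end)": 5.0,
        "Mouse electrophysiology": 0.05,   # 50 GB/day
    }

    for lab, tb_per_day in daily_rates_tb.items():
        per_year_tb = tb_per_day * 365
        # Keeping even a single backup copy doubles this footprint.
        print(f"{lab}: {tb_per_day} TB/day -> ~{per_year_tb:,.0f} TB/year of new primary data")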

Storage Outpaced Backups – An Opportunity

• Most labs had used grant funds to purchase devices (sequencers, NMRs, etc.), and had purchased storage to catch the data at the same time

• Many labs, feeling the crunch of ever-growing data archives, had used smaller subsequent grants to purchase additional primary storage

• These smaller grants were not sufficient to grow backup space

• Labs wound up with single copies of massive data repositories

A Conversation About Backups and Money

• Faculty: “Backups should be like electricity. When we buy a microscope, we assume the University will provide the electricity. That’s why we pay overhead. Same thing with backups. If we buy storage, you should provide backups.”

• Warren: “But, how do I anticipate needs for 2.5TB/day of new backups when faculty decide to ramp up their Genomics research, for example?”

• Faculty: "If I get a million-dollar grant and set up more sequencers, then you'd better plan to have more backup capacity."

We Have to be Involved in the Grant Cycle

• The conversation above is exactly why Computing needs to help faculty write grants.

• It is a misunderstanding to think that indirect costs cut into faculty research funds

• Granting agencies grant the requested budget plus the negotiated indirect funds

• Problems arise when faculty grant applications budget hardware costs as direct >$5,000 equipment purchases, and thereby exclude the linked indirect costs (a worked sketch follows below)
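
Here is a worked sketch of that budgeting difference. It assumes a hypothetical 60% indirect rate applied to non-equipment direct costs, with equipment over $5,000 excluded from the base; the dollar figures and the rate are made up for illustration.

    # Illustrative award math: the same storage dollars budgeted two different ways.
    # Assumes a hypothetical 60% indirect rate on non-equipment direct costs.
    INDIRECT_RATE = 0.60          # assumed for illustration
    other_direct = 200_000.0      # salaries, supplies, etc. (made-up figure)
    storage_cost = 20_000.0       # the storage the lab needs (made-up figure)

    def total_award(direct_in_base, excluded_equipment):
        """Direct costs plus indirect funds on whatever sits in the indirect base."""
        return direct_in_base + excluded_equipment + direct_in_base * INDIRECT_RATE

    as_service   = total_award(other_direct + storage_cost, 0)    # storage billed as a service
    as_equipment = total_award(other_direct, storage_cost)        # storage bought as >$5,000 equipment

    print(f"Budgeted as a service:   ${as_service:,.0f} total award")
    print(f"Budgeted as equipment:   ${as_equipment:,.0f} total award")
    print(f"Indirect funds left behind: ${as_service - as_equipment:,.0f}")

The lab's direct budget is the same either way; what changes is whether the indirect funds that would pay for "electricity-like" services ever arrive.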

Backups Are an Easier Sell than Primary Storage

• As we’ve seen above, faculty have good reason to want primary storage to be local

• Fast writes

• Ease of expansion

• Backups, on the other hand, are a pain for them

• Someone has to make sure they’re working

• They just keep getting bigger

Backups – Today Cheap, Tomorrow Free?

• We currently offer researchers networked backup in the 100GB-1TB range at less than our cost

• (This is still a problem, because they're still paying with after-overhead dollars, unless we purchase hardware > $5000 to accommodate their backups. The subsidized cost is meant to ease this pain.)

• In the > 1TB range we recommend that they purchase a dedicated backup solution and we will administer it for them

• Based on conversations with faculty and academic deans we are looking for ways to provide backup of all but the largest research datasets as part of standard computing service

Backups – Free with Primary Storage?

• Nasuni is a solution we’re just about to start piloting.

• Nasuni provides fast local storage with a bridge to Amazon S3 for both HSM and permanent archiving

• Billed per TB of live storage, snapshots are free

• Can buy a physical box for about $5000.

Why Would That Work?

• Nasuni would let labs that are reluctant to buy into a shared service buy a physical box that they could see

• The service and management would be centralized

• But! Access control could be delegated using standard Windows ACLs and OUs

• Nasuni’s billing lets us charge labs for primary storage

• Backups are “FREE”

• We could do the same without the cloud component

New Ways of Working – Bending the Grant Incentives

• As we’ve discussed, faculty are justifiably reluctant to buy services from us with post-overhead dollars

• From their perspective it should either be covered by overhead, or should come out before overhead

• We are working with our Director of Research Administration to look for opportunities to treat cloud and centralized storage solutions as equipment costs, or at least, to ensure that if they will be using a service the grant application requests the appropriate indirect funds

Our Overall Approach

• Find servers wherever they are

• Begin with “how can we help” not “you must move that”

• Be OK with only reducing, not eliminating, risk

• Meet the non-negotiable demands for local control

• Build constructive relationships by providing services like OSSEC-HIDS monitoring and grad-sysadmin pizza

• Make it easy for faculty to use centralized services with grant funds (change the funding model and provide boilerplate)

Acknowledgements

• Tracy Petrofsky and Christine Brisson provided extremely helpful feedback on this presentation

• Deborah Marshall, SAS Executive Director of Research Administration, offered critical insights into the grant funding process

• Clare Din provided the superb photography

Questions?

Contact Info

• Warren Petrofsky

• petrofsk@sas.upenn.edu

• 215-573-0999
