Experiences from Cyberinfrastructure Development for Multiuser Remote Instrumentation
Prasad Calyam, Ph.D.; David Hudak, Ph.D.; Ashok Krishnamurthy, Ph.D.; Karen Tomko, Ph.D.
Cyberinfrastructure and Software Development Group, OSC
IEEE e-Science Conference, Indianapolis, December 10th 2008

Overview
• Multi-user Remote Instrumentation?
• Application Scenario and Components
• OSC-developed Solutions
• Deployment Case Studies

Remote Instrumentation Overview
• Academia and Industry use scientific instruments
  – E.g., Electron Microscopes, Telescopes, Spectrometers
  – Used for research and for training/classroom purposes
  – Such instruments are expensive to buy and maintain: $450K – $4 million initial purchase, plus staff costs to maintain
• Remote Instrumentation
  – Remote access to instruments, related devices, and their data resources via the Internet
• Benefits
  – Access for remote students and researchers
  – Return on Investment (ROI) for instrument labs
  – Avoids duplication of instrument investments for funding agencies

Remote Instrumentation Application Scenario

OSC’s Remote Instrumentation Program
• RI cyberinfrastructure development for Ohio-based universities uses OSC’s state-wide resources
  – Networking, HPC, Storage, Analytics
• Pilot program funded by the Ohio Board of Regents
• Goal: “Leverage Ohio’s investments in scientific instruments, wide area networking, high-performance computing, and data storage to foster academia-industry collaborations involving remote instrumentation”

OSC’s Remote Instrumentation Partners
• The Ohio State University
  – Scanning Electron Microscope, Material Science Engg. Dept.
  – Raman Spectrometer, Chemistry Dept.
  – McGraw-Hill Telescope, Astronomy Dept.
• Miami University
  – NMR Spectrometer, Chemistry and Biochemistry Dept.
  – Unipulsed EPR Spectrometer, Chemistry and Biochemistry Dept.
  – Transmission Electron Microscope, Geography Dept.
• Ohio University
  – Nuclear Accelerator, Physics and Astronomy Dept.
Remote Instrumentation Components
• Instrument Lab Site / Remote User Site
  – Remote Observation
  – Resource Scheduling
  – Remote Operation
  – Sample Handling
  – Voice/Text Chat
  – Lab Notebook
• OSC
  – Use Policy
  – Web-portal Development
  – Usage Billing
  – Data Storage Management
  – Real-time Analytics
  – Network and Data Security

Technical Challenges
• Network bandwidth
  – Last-mile bottlenecks lead to improper operation and can cause expensive instrument damage
  – Frame rate and video quality tradeoffs
• Communications
  – Remote User(s) and Operator coordination
  – VNC/RDP, VoIP, Videoconference, Presence, Control passing, Web cam
  – Simultaneous multi-device views for user workflows
• Dead man's switch
  – Fail-safe method to stop the service in case the instrument Operator becomes incapacitated
• File system access
  – Web services
  – User accounts, data read on the instrument file system, data read/write for analytics on the mass storage file system
• Network Security
  – VPN, Ports, Firewall rules, Encryption/Authentication

Policy Challenges
• Scheduling Policy
  – Prioritizing users: PI/Co-PI, Graduate Students, Industry
  – Synchronizing calendars of devices and personnel
  – Sample handling
• Licensing
  – Remote users' observation/operation and analytics
• Service Level Agreements
  – Vendor
  – ISP/ASP
• Safe-use Policy
  – Expert privileges, Novice privileges
• Billing
  – Setup surcharge, Fee/hr, Fee/session, Resource units

Case Studies
• Account of OSC experiences with cyber-enabling various kinds of scientific instruments
  – Solutions evaluated
  – Solutions developed
  – Open issues
• Three Case Studies
  – OSU Material Science and Engg. Dept. – Electron Microscopes
  – OSU Chemistry Dept. – Raman Spectrometer
  – MU Biochemistry Dept. – 850 MHz NMR

Case Study-I: OSU CAMM
• OSU Center for Accelerated Maturation of Materials (CAMM) has acquired high-end Electron Microscopes
  – Used for materials modeling studies at the sub-angstrom level
• OSC provides networking, analytics, and storage support for remote microscopy
  – Permanent console at Stark State for Timken access
• Hardware-based (ThinkLogical) KVMoIP solution
  – Image processing of samples (automated with MATLAB) for the Analytics service
  – Lab Notebook for image management
• Remote Microscopy Demonstrations
  – Supercomputing, Tampa, FL (Nov 2006)
  – Internet2 Fall Member Meeting, Chicago, IL (Dec 2006)
  – Stark State University, Canton, OH (Mar 2007)

Network Connection Quality vs. User Control
• Higher TCP throughput from mouse and keyboard activity on poor network connections reflects increased user effort: “congestion begets more congestion”
• Keyboard/mouse traffic and time for an Expert to complete the same use-case (Task-1 through Task-3):
  – 1 Gbps LAN: ~60 B/s, ~60 s – user expends minimum effort with keyboard and mouse
  – Public 100 Mbps LAN: ~900 B/s, ~100 s – user expends notably more effort with keyboard and mouse
  – 100 Mbps WAN: ~1400 B/s, ~140 s – user expends a “lot” of effort with keyboard and mouse

Image Processing
• Automated MATLAB processing of electron microscope images (a minimal sketch of such a pipeline follows this slide)
  – Alternative to Adobe Photoshop (Fovea Pro 4 plug-in) filters that take multiple days to process
  – Filters: e.g., Image Blurring/De-Blurring, Image Dilation/Opening
  – MATLAB GUI developed for testing sample filters before batch jobs
  – Pipeline: Inputs → Gaussian Blur → Dilate Image → Processed Image
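As a rough illustration of the batch-filtering workflow above, here is a minimal sketch in Python (standing in for the MATLAB scripts described on the slide); the directory layout, file names, and filter parameters are assumptions, not the actual CAMM pipeline.

```python
# Hypothetical batch-filtering sketch (Python stand-in for the MATLAB pipeline
# on the Image Processing slide); paths and parameters are illustrative only.
import glob
import numpy as np
from PIL import Image
from scipy import ndimage

def process_micrograph(path, sigma=2.0, dilation_size=3):
    """Apply a Gaussian blur followed by greyscale dilation to one image."""
    img = np.asarray(Image.open(path).convert("L"), dtype=float)
    blurred = ndimage.gaussian_filter(img, sigma=sigma)                    # Gaussian blur
    dilated = ndimage.grey_dilation(blurred, size=(dilation_size,) * 2)    # dilation
    return np.clip(dilated, 0, 255).astype(np.uint8)

# Batch job over a directory of electron-microscope images (path is hypothetical)
for path in glob.glob("micrographs/*.tif"):
    out = process_micrograph(path)
    Image.fromarray(out).save(path.replace(".tif", "_processed.png"))
```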
Case Study-II: OSU Chemistry
• Recent purchase of a Combined Raman-FTIR Microprobe
  – To obtain complementary Raman-IR information about chemicals
• OSC custom-developed the Remote Instrumentation Collaboration Environment (RICE) software
  – Enables local students and researchers to work from the comfort of their offices or homes
  – Access for remote collaborators: California State University, Dominguez Hills, CA; Oakwood University, Huntsville, AL
• Active RICE testing and concurrent development in progress
  – “Pink Screen” GPU problem
  – Dual-screen resolution issues
  – Overlay error issues in Vendor software

OSU Chemistry Instrument Lab
• 4XEM Webcam Live View on Web browser
• Dual-monitor PC with advanced network & multimedia cards
• Cyber-enabled Instrument

“Pink Screen” GPU Problem
• Software-Software VNC issue – OS is not aware of GPU video processing
• Solution: Use Hardware-Software KVMoIP (e.g., Adder) or Hardware-Hardware KVMoIP (e.g., ThinkLogical)
• Figure: Remote VNC Client View

Dual-screen Resolution Problem
• Default VNC (i.e., UltraVNC) distribution issue with dual monitors and an extended desktop
• Figure: Left Monitor / Right Monitor in the Remote VNC Client View

Dual-screen Resolution OSC Solution
• Solution: OSC patch with increased image geometry
• Figure: Left Monitor / Right Monitor in the Remote RICE Client View

Overlay Error Issue in Vendor Software
• Remote VNC client refresh causes a local video overlay error
• Solution: Ad hoc trials; vendor support notified
• Figure: Local View with error message (Left Monitor / Right Monitor)

OSU RICE Solution Screenshot

OSU RICE Solution Features
• “Network-aware” video encoding (a minimal sketch appears after the web-portal slides)
  – Optimizes frame rates based on available network bandwidth
  – Manual video-quality adjustment slider
• “Network-aware” action blocking
  – Warns the user of network congestion
  – Blocks user actions during high network congestion scenarios
• Collaboration tools for Multi-user support
  – Peer-to-Peer VoIP / Multi-user VoIP
  – Multi-user Colored-text Chat
  – Multi-user Presence
  – Multi-user Control-lock passing
• Multiple display resolutions
  – Small screen
  – Full screen
  – Dual screen (remote site with dual monitors and extended desktop)

OSU RICE Solution Demo

Case Study-III: MU Biochemistry
• Recent purchase of an 850 MHz NMR – first of its kind in North America
• For studying supramolecular architectures and functional materials
• RICE integrated with a web-portal for management of remote instrumentation sessions, user collaboration, and data
  – Access for remote collaborators: Bowling Green State University, Ohio University, Muskingum College, Talawanda High School

MU-NMR System Deployment

Web-portal User Work-flows

Web-portal Features
• User Accounts and Privileges
• Management of Instruments, Projects, Samples, Sessions, Experiments
• KVMoIP and RICE access control (a minimal access-control sketch follows the screenshots slide)
• Asynchronous chat for remote monitoring of experiment progress
• Experiment data archival on OSC storage
• Analysis of stored data sets using OSC-hosted TopSpin software

Web-portal Architecture

Web-portal and RICE Screenshots
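The web-portal's session and access-control features are only named on the slides above; the following is a minimal, hypothetical sketch of how a portal might gate KVMoIP/RICE control to a user's reserved session window. The `Session` data model, role names, and times are illustrative assumptions, not the OSC portal's actual design.

```python
# Hypothetical sketch of session-based access control for remote instrument use;
# the data model and role names are assumptions, not the OSC portal's API.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Session:
    instrument: str          # e.g., "MU 850 MHz NMR"
    user: str                # portal account that reserved the slot
    role: str                # "expert" or "novice" (safe-use policy)
    start: datetime
    end: datetime

def can_access(session: Session, user: str, now: datetime) -> bool:
    """Grant KVMoIP/RICE control only to the reserving user, inside the slot."""
    return user == session.user and session.start <= now <= session.end

# Example: a novice user checks access shortly after the reserved slot begins
slot = Session("MU 850 MHz NMR", "student1", "novice",
               datetime(2008, 12, 10, 9, 0), datetime(2008, 12, 10, 11, 0))
print(can_access(slot, "student1", datetime(2008, 12, 10, 9, 30)))  # True
print(can_access(slot, "student2", datetime(2008, 12, 10, 9, 30)))  # False
```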
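Similarly, the "network-aware" video encoding and action blocking from the OSU RICE Solution Features slide are described only at the feature level; the sketch below illustrates the general idea, with bandwidth thresholds and frame rates that are assumptions rather than RICE's actual values.

```python
# Minimal sketch of "network-aware" frame-rate adaptation and action blocking;
# the bandwidth thresholds and frame rates are illustrative assumptions only.
def choose_frame_rate(available_kbps: float) -> int:
    """Pick a video frame rate that fits the measured available bandwidth."""
    if available_kbps > 5000:
        return 30          # plenty of headroom: full frame rate
    if available_kbps > 1000:
        return 15          # moderate bandwidth: reduced frame rate
    return 5               # constrained link: minimal frame rate

def actions_allowed(available_kbps: float, congestion_threshold_kbps: float = 250) -> bool:
    """Block keyboard/mouse forwarding when the link is heavily congested,
    so a lagging operator cannot damage the instrument."""
    return available_kbps > congestion_threshold_kbps

# Example: on a congested link, the client drops the frame rate and warns the
# user that control actions are temporarily blocked.
bandwidth = 200.0  # kbps, hypothetical measurement
fps = choose_frame_rate(bandwidth)
if not actions_allowed(bandwidth):
    print(f"Network congestion detected: streaming at {fps} fps, control actions blocked")
```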
RICE Use-cases for Research and Training
• Remote participants can view an expert (also remote!) controlling a scientific instrument
  – Efficiently: Multi-party VoIP, Presence, and Chat collaboration
  – Reliably: Network awareness mitigates instrument damage
• The expert can pass control to remote participants
  – Train students to operate the instrument during class
  – Allow another expert to examine the sample under study
• Researchers and Students can conduct experiments at their assigned slots on the instruments

Conclusion
• Developing cyberinfrastructures for RI requires:
  (a) Understanding and overcoming multi-disciplinary challenges to develop solutions
  (b) Developing reconfigurable-and-integrated solutions that must be tailored on a per-instrument basis
  (c) Close collaborations between instrument labs, infrastructure providers, and application developers

Future Directions
• “Reconfigurable-and-Integrated” Tools and Web-portals
  – RICE, Wikis, Lab Notebook, Mailing lists, Calendar, …
• Human-centered Remote Instrumentation solutions for an “at-the-instrument” experience
  – Human-aware Codec Adaptation, ROI Video Encoding

Thank you!

MU RICE Solution Demo