USC Research Cyberinfrastructure (RCI) Activities: 2014, 2nd Quarter
Phil Moore, Director of Research Cyberinfrastructure, phil@sc.edu
http://www.sc.edu/rci

Management

- RCI hosted a four-day XSEDE HPC summer bootcamp workshop covering hybrid computing, MPI, OpenMP, and OpenACC; 17 students participated
- Relocated the Planck HPC cluster to Swearingen
- Installed the RCI "Condo" base cluster (9 nodes, 180 cores) and additional Institute for Mind Brain compute nodes (6 nodes, 120 cores) in the UTS Annex datacenter
- Received a Dell "Seed" server (60 Intel i7 cores) for evaluation
- Organized a meeting with CEC and CAS IT staff to discuss current MATLAB licenses
- Organized a meeting with college IT staff and USC MathWorks sales representatives to discuss a campus academic license
- Sponsored a student workshop taught by a MathWorks applications engineer presenting features of the latest MATLAB release and toolboxes; 114 students attended
- Discussed the Open Science Grid and access to high-energy physics data (Pordes, Moore, McSwain)
- Participated in a collaboration meeting with the VPR, SRNL, and research faculty (Moore)
- Accepted an invitation to review ACM Modeling and Simulation 2015 conference submissions (Moore)
- Interview published in a UofSC Today news article covering RCI group expertise, USC High Performance Computing (HPC) activities, and contact information (Moore)

Conferences and Workshops

- ACM Modeling and Simulation 2014 Conference, Tampa, FL (Moore, Torkian)

Research Faculty Engagement

A current list of HPC users and applications is available from Dr. Phil Moore (phil@sc.edu).

- USC School of Medicine (M. Nagarkatti, Yang, Moore): Discussed research applications, network connectivity, and remote data sites
- Marine Science (Morris, Torkian, Moore): Converted an Excel/Visual Basic salt marsh model to Python and Matplotlib (a minimal sketch of this kind of port follows this list)
- McCausland Brain Imaging Center (Rorden, Torkian): Achieved a 100X speedup of FSL brain imaging calculations using an NVIDIA GPU and Parallel MATLAB for a grant proposal
- Physics (Johnson, Moore, Torkian, Sagona, Elger, Students): Hired graduate and undergraduate students for ASPIRE Python software development projects
- Mechanical Engineering (Cao, Torkian, Moore): Discussed software development collaboration
- Institute for Mind Brain (Richards, Henderson, Fillmore, Heller, Moore, Sagona, Torkian): Reviewed HPC hardware and software requirements for brain simulation and analysis
- Electron Microscopy Center (Ghoshroy, Moore): Discussed collaboration opportunities
- Earth and Ocean Sciences (Owens, Voulgaris, Moore): Discussed potential collaboration topics
- Computer Science and Engineering (Matthews, Sagona, Elger): Configured an HPC cluster resource and created accounts for a computer science parallel programming class
- Tested an HPC cluster for BLAST simulations (Norman, Sagona, Student); hired an undergraduate student to restore the HPC cluster located in the School of Public Health
- Earth Science Research Institute (C. Knapp, Torkian): Migrated a petroleum research database from Access to PostgreSQL (a schematic migration sketch also follows this list)
- English Department (Gavin, Sagona, Torkian): Installed an application on the HPC cluster for modeling complex systems
- Philosophy Department (Dickson, Lin, Dougherty, Sagona): Discussed a development environment on the HPC cluster for a multi-agent neural network GPU application
- Computer Science and Engineering (Bakos, Gao, Moore): Moore serves on Yang Gao's Ph.D. committee
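As a minimal sketch of the kind of spreadsheet-to-Python port done for the Marine Science salt marsh model (not the actual project model), the following uses NumPy and Matplotlib to compute and plot a placeholder biomass-versus-depth curve. The function name, parabolic response form, and all parameter values are illustrative assumptions.

```python
# Minimal sketch of porting a spreadsheet-style salt marsh model to
# Python/NumPy/Matplotlib. The parabolic biomass-vs-depth response and
# all parameter values are placeholders, not the project's actual model.
import numpy as np
import matplotlib.pyplot as plt

def biomass(depth_cm, b_max=2500.0, d_min=20.0, d_max=80.0):
    """Hypothetical standing biomass (g/m^2) over a tolerated depth range (cm)."""
    depth = np.clip(depth_cm, d_min, d_max)
    # Parabolic response: zero at the depth limits, peak at the midpoint.
    return b_max * 4.0 * (depth - d_min) * (d_max - depth) / (d_max - d_min) ** 2

depths = np.linspace(0.0, 100.0, 200)
plt.plot(depths, biomass(depths))
plt.xlabel("Depth below mean high water (cm)")
plt.ylabel("Standing biomass (g/m$^2$)")
plt.title("Placeholder salt marsh biomass curve")
plt.savefig("salt_marsh_sketch.png", dpi=150)
```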
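For the Earth Science Research Institute item above, the sketch below shows the general shape of an Access-to-PostgreSQL table copy, assuming pyodbc with the Microsoft Access ODBC driver (Windows) on the source side and psycopg2 on the PostgreSQL side. The table, columns, file path, and connection details are hypothetical placeholders; a real migration would also recreate indexes, constraints, and relationships.

```python
# Schematic copy of one table from an Access (.accdb) file into PostgreSQL.
# Table name, columns, and connection details are hypothetical.
import pyodbc
import psycopg2

access = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=C:\data\petroleum.accdb"
)
pg = psycopg2.connect(dbname="petroleum", user="rci", host="localhost")

# Pull the source rows from Access.
rows = access.cursor().execute("SELECT well_id, county, depth_ft FROM wells").fetchall()

# Create the target table and bulk-insert; the connection context manager commits.
with pg, pg.cursor() as cur:
    cur.execute(
        "CREATE TABLE IF NOT EXISTS wells "
        "(well_id text PRIMARY KEY, county text, depth_ft numeric)"
    )
    cur.executemany(
        "INSERT INTO wells (well_id, county, depth_ft) VALUES (%s, %s, %s)",
        [tuple(r) for r in rows],
    )

access.close()
pg.close()
```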
Technical Meetings

- UTS (McSwain, Wagner, Minor, Miller, Moore, Sagona, Elger): Technical meeting to document network and power requirements for the new RCI condo cluster in the Annex datacenter
- Hewlett-Packard (Moore, Sagona): Technology updates from HP engineers providing information about current and future HPC systems and cluster data storage solutions