ADVANCE TO THE NEXT LEVEL: The Soft Grid and the Future of Smart Grid Analytics
Ben Kellison, Analyst, Smart Grid, GTM Research

Agenda
- The Emergence of the Soft Grid
- The Rise of Big Data
- What Is Analytics for Utilities?
- Driving Adoption of Analytics
- Trends and Predictions
- Guiding Investment in Analytics

The Emergence of the "Soft Grid"
August 2009: GTM announces the Soft Grid. In the article:
- AMI and infrastructure vendors are largely established; software will be the future
- Smart Grid is the platform; apps are needed
- 2010 will usher in a period of increased entrepreneurial interest and VC investment in the software needed to support and enable all major smart grid sub-markets

GTM Research and the Soft Grid in 2012
- Q3 2012: Soft Grid and Analytics smart grid market research report
- August 14th and 15th: Soft Grid Conference in San Francisco

Big Data's Growth
Global data is reported to be doubling every 2 years, moving from gigabytes to terabytes to petabytes to exabytes (a change in orders of magnitude). The exponential growth of data is putting a strain on traditional business intelligence (BI) and analytics solutions.
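The doubling claim above compounds quickly: doubling every 2 years means roughly a 32x increase per decade. A minimal sketch of that arithmetic (the function name is ours, for illustration only):

```python
def growth_factor(years: float, doubling_period: float = 2.0) -> float:
    """Data-volume multiplier if volume doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

print(growth_factor(10))  # 32.0  (one decade: five doublings)
print(growth_factor(20))  # 1024.0 (two decades: ten doublings)
```

This is why the units in the slide jump whole prefixes (terabytes to petabytes to exabytes) within a few years rather than gradually.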
Utilities are not alone:
- A Boeing jet engine generates 10 terabytes every 30 minutes
- Walmart processes roughly 1 million customer transactions per day, feeding a database of roughly 2.5 petabytes
- Enterprises are creating so much data that they now produce digital "exhaust data"; soon they will need the Waste Management of big data

Big Data's Growth (continued)
- There is no exact metric for "big data"; it now typically means dozens of terabytes up to several petabytes
- Structured vs. unstructured data (size, source, type, and velocity)
- 5 billion cell phones in use in 2010 (three-quarters of the global population)
- 235 terabytes of data stored by the Library of Congress (2011); 1 exabyte = 4,000 times the Library of Congress
- In 2010, the world added 13 exabytes (enterprises: 7 exabytes; consumers: 6 exabytes)
- $600 buys a disk drive that can store the world's entire music collection

Moving from Infrastructure to Software
One technology revolution begets the next: we are "moving up the stack" from infrastructure, to data networking, to end-to-end systems, to real-time complex event processing, or "analytics." We are on the cusp of a tremendous wave of innovation in the utility sector that will forever change the way utilities operate (technology change, organizational/business-process change, new services). "Analytics" provides unprecedented command and control.

Data Fusion: Multi-Factor Anomaly Assessment
Complex event processing rule engine: if 10 meters gasp, 1 transformer irregularity is detected, and 5 customer outage tweets appear, then take the following actions: _______.
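The rule-engine example above can be sketched as a simple threshold check over one event window. Everything here (the schema, the thresholds, the action names) is hypothetical, not from any specific CEP product:

```python
from dataclasses import dataclass

@dataclass
class GridEvents:
    """Event counts collected in one sliding time window (hypothetical schema)."""
    meter_gasps: int = 0               # "last gasp" messages from dying smart meters
    transformer_irregularities: int = 0
    outage_tweets: int = 0             # social-media outage reports

def outage_rule(events: GridEvents) -> list:
    """Fire response actions when all three anomaly signals cross their thresholds,
    mirroring the slide's 'if 10 meters gasp, 1 transformer irregularity, and
    5 outage tweets, then act' rule."""
    if (events.meter_gasps >= 10
            and events.transformer_irregularities >= 1
            and events.outage_tweets >= 5):
        return ["dispatch_crew", "open_outage_ticket", "notify_customers"]
    return []

print(outage_rule(GridEvents(12, 1, 7)))  # → ['dispatch_crew', 'open_outage_ticket', 'notify_customers']
print(outage_rule(GridEvents(3, 0, 1)))   # → []
```

A production CEP engine adds time-windowing, stream joins, and rule management on top, but the core idea is exactly this fusion of multiple weak signals into one confident action.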
Source: GTM Research, STI

Extension of Analytics from Historic to Real-Time
Predictive Analytics
- Complete situational awareness
- Business intelligence (BI)
- Energy trading with a "live look" at the grid
- Simulation/visualization
Source: GTM Research

Grid Optimization and Operational Intelligence
- Asset management analytics
- Crisis management analytics
- DMS analytics
- Outage management analytics / fault detection and correction
- Weather/location data
- Mobile workforce management
- Energy theft
- Behavioral analytics
- Tiered pricing
- Trading and selling negawatts (DR)
- Building energy management
- Power analytics (load flow)
- Social media data integration
- DG/EV/microgrid analytics

Coming to Terms with Analytics
- Cluster analysis (unsupervised learning, used for data mining)
- Data fusion and integration (analyzing data from multiple sources)
- Data mining (statistics + machine learning + database management)
- Genetic algorithms (aka evolutionary algorithms: survival of the fittest)
- Neural networks (nonlinear pattern recognition)
- Network analysis (discovering the value of certain nodes)
- Grid optimization (redesigning complex systems according to measures such as cost, reliability, and latency)
- Machine learning / artificial intelligence (AI): algorithms that allow machines to perform better over time
- Predictive modeling
- Signal processing
- Spatial analysis
- Simulation
- Time series analysis
- Visualization

Drivers for Active Smart Grid Analytics (too many to list)
- Establish the true ROI or effectiveness of DR programs and AMI data; justify the billions of dollars that have been spent on AMI infrastructure
- Identify financial issues in time-of-use pricing, smart meter integration, and the meter-to-cash process (discovering incorrect billing, meter reads, etc.)
- Fine-tune the understanding of transformer issues and load anomalies based on granular customer data rather than aggregate peak-load expectations
- Better customer segmentation: which DR programs to offer customers, asset deployment, etc.

Drivers for Active Smart Grid Analytics (cont.)
- Using geospatial intelligence to visualize grid operations, crisis management, and customer behavior and patterns
- The growth of "virtual power plants," especially as CAISO and ERCOT begin to gain discrete, granular data at the customer-by-customer level
- The emerging ICCP standard (Inter-Control Center Communications Protocol), specifically IEC 60870 Part 6 (IEC 60870-6/TASE.2): data exchange over WANs between utility control centers, utilities, power pools, regional control centers, and non-utility generators
- Increased speed of action with visual decision support: scientific research supports that visual perception aids cognition faster than other methods (statistics, reading, listening)

GTM's Ten Trends & Predictions
1) Many of the companies that will lead the utility analytics transformation are NOT in this market today. They are expanding from other industries, or realizing that existing algorithms (e.g., from financial services) fit a market need in the utility space.
2) The first task of Big Data for utilities has been meter data (via MDM systems); transformer sensors, PV, EVs, and other grid assets are next.
3) Nine out of the nine finalists for IBM's Global Entrepreneur of the Year title are analytics startups. That's either an anomaly or proof that analytics is on the cusp of transforming every industry. IBM expects to have $16 billion in annual analytics revenue by 2015.

GTM's Ten Trends & Predictions (cont.)
4) Hadoop (and Cassandra) are for real: open-source platforms/processing engines (and database management systems) for the distributed processing of large data sets across clusters of computers using a simple programming model. IBM, Oracle, EMC, and Microsoft are all now adopting them (none has a strong history of using open software).
5) The talent pool in big data/analytics is very shallow at the moment. For anyone with even basic knowledge of Hadoop and/or Cassandra, getting a job is a piece of cake, but there just aren't that many people in the world who understand the science of core big data.
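The "simple programming model" behind Hadoop is MapReduce: a map step that emits key-value pairs, a shuffle that groups them by key, and a reduce step that aggregates each group. A minimal in-process sketch of the idea (no cluster; the meter records and feeder names are hypothetical), summing kWh per feeder:

```python
from collections import defaultdict

# Hypothetical meter records: (feeder_id, kWh reading)
records = [("feeder-1", 3.2), ("feeder-2", 1.1), ("feeder-1", 0.7)]

# Map: emit (key, value) pairs -- here the records are already in that shape.
mapped = [(feeder, kwh) for feeder, kwh in records]

# Shuffle: group all values by key.
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: aggregate each group independently (this is the step Hadoop
# parallelizes across cluster nodes).
totals = {key: round(sum(values), 1) for key, values in groups.items()}
print(totals)  # → {'feeder-1': 3.9, 'feeder-2': 1.1}
```

Because each reduce runs on an independent key group, the same three-step program scales from this toy list to petabytes of meter data spread over a cluster, which is exactly the appeal for MDM-scale workloads.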
Data analytics is becoming a science and a field in its own right; master's degrees in it are now being offered for the first time.

GTM's Ten Trends & Predictions (cont.)
6) The VC landscape will continue to be 'very hot' on big data. Cloudera and Hortonworks, which build the infrastructure layer for big data, have received nearly $90M in funding in the last 1-2 years. But the next wave is still waiting: if you want to E.T.L. (extract, transform, and load) the data, the apps space is light. 2012 will be the year of application innovation for analytics/big data.
7) Utilities are not saying 'no' to the cloud. NoSQL: massive unstructured-data opportunities. The relational data model is falling; a whole new paradigm is opening up, and it is no longer one server, one schema.
8) Large-scale enterprise software-as-a-service platforms are becoming viable, due to the low cost of capital, flexibility and easy upgrades, and leveraged third-party expertise.

GTM's Ten Trends & Predictions (cont.)
9) Analytics will end the "treat-every-customer-the-same" regulatory model. The mindset in areas like asset deployment (capex) will shift as treating every customer identically proves to be a waste of money in optimizing and running the system. Customers will still deserve equal service, but only to a point: the power quality needs of a Google datacenter and Mr. Jones's house are not the same.
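Trend 6's E.T.L. acronym is concrete enough to sketch. A minimal extract-transform-load pass over hypothetical meter readings, using an in-memory CSV as the stand-in source feed and SQLite as the stand-in warehouse (all names and values are illustrative):

```python
import csv
import io
import sqlite3

# Extract: read raw rows (an in-memory CSV standing in for a real feed).
raw = "meter_id,kwh\nM-1,3.5\nM-2,\nM-3,2.0\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop rows with missing readings and cast strings to numbers.
clean = [(r["meter_id"], float(r["kwh"])) for r in rows if r["kwh"]]

# Load: insert into the target store and query it.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (meter_id TEXT, kwh REAL)")
db.executemany("INSERT INTO readings VALUES (?, ?)", clean)
total = db.execute("SELECT SUM(kwh) FROM readings").fetchone()[0]
print(total)  # → 5.5
```

The point of the trend is that this plumbing layer (Cloudera, Hortonworks, and friends at much larger scale) is well funded, while the application layer that turns the loaded data into decisions is still thin.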
10) Software is only as valuable as its ability to integrate. This is a huge moment (2012-2016) for data/systems integrators.

Leading Vendors in Analytics
Established leaders: EMC, IBM, Informatica Corporation, Information Builders, Microsoft, MicroStrategy, Oracle, OSIsoft, SAP, SAS, Teradata, VMware
New analytics companies: Algorithmics, Cloudera, Localytics, QlikView, Splunk, Tableau Software, Tibco Software
SoftGrid companies: Aclara, DataRaker, EcoFactor, Ecologic Analytics (acquired), eMeter (acquired), Energent, Opower, Power Analytics, SilverSpring/EMC (partners), Space-Time Insight, Tendril

Existing IT Architecture Challenge: the "Accidental" Smart Utility
Well-designed smart utility IT enterprises are very rare today; instead, "accidental architecture complexity" is what we find. In the past, ad hoc point-to-point integration between pairs of applications was sufficient to handle basic needs, like entering outage reports from customer service applications into an outage management system: "stovepipe spaghetti." Utilities need a service-oriented architecture (SOA, or "enterprise service bus") and a flexible, multi-disciplinary approach.
Key insight: to extract the greatest value, you need the right tools and the right architecture, so that you can offer self-service (instant web-based access), speed (in-memory analytics), wide data access, and collaboration.

Utility Spending on SoftGrid Big Data: Design/Strategy [BEFORE]
1) Define and integrate big data (asking the right questions!): access, search, integration, discovery, reporting, system upkeep, etc.
2) Identify the necessary components to better manage it:
- Access for all employees (i.e., self-service); in-memory analytics: instantaneous, no waiting, greater amounts of data in real time, an ad hoc approach
- Collaboration with others internally and externally: data sharing; access, evaluate, collaborate, share
- Data mash-up/fusion, mixing data to create new blended data: requires a simple, seamless integration process, the ability to perform calculations on shared programs, systems, and worksheets, and the ability to integrate blended data and make it quickly visible
3) Create actionable intelligence: getting value from big data. High-speed, real-time data is especially important in making market pricing decisions, critical control and protection decisions, etc.

Big Data: Extracting Insight and Value [AFTER]
- Enabling advanced modeling and simulation to discover market peak-load demands, expose renewable energy variability, and improve overall grid performance
- Innovating new business models, products, and services
- Segmenting populations to customize actions and tailor levels of service (prediction: electric bills will mirror cable/telco bills in 5 years)
- Replacing or supporting human decision-making with automated algorithms: control and protection in the face of more DG integration, extreme weather, and the growth of the digital economy (power quality) will require sub-second reactions

Grid 2.0
"The replacement value of the world's physical infrastructure is tens of trillions of dollars. It gets more and more expensive to replace it," he said. "We have to maintain it, and we have to manage it more effectively." - Steve Mills, Senior Vice President of Software, IBM

Further Reading:
- "How Target Figured Out a Teen Girl Was Pregnant Before Her Father Did" (Forbes, Feb. 2012)
- "How Companies Learn Your Secrets," New York Times Magazine (Feb. 16, 2012)

Thank you

Coming Up Next: a panel at 8:00 am delving deeper into analytics with:
- Scott Bussier, Ecologic Analytics
- Larry Chalupsky, Grid2020
- Bill Kallock, Integral Analytics

Grid Analytics: grid awareness under different conditions
Graph Source: STI