In an effort to open up the European gas and electricity markets, the European Union adopted the Third Energy Package in 2009, a legislative push that obliges member states to roll out smart meters and to facilitate dynamic electricity pricing. For utility companies, this translates into substantial capital investments and direct costs in producing, distributing, installing and deploying smart meters.
Smart meters transmit energy consumption readings at a fine resolution of 15 minutes or even finer. The sheer volume of these readings is substantial: this is Big Data. These measurements have to be collected, stored and processed for billing and other purposes, which means utility companies have to ratchet up their data center and Information Technology operations substantially, further adding to the indirect costs of the regulatory push. To derive additional benefits and value from the aggregated data, these companies should also develop or source Big Data competencies.
Until now, in most European member states, private households have been invoiced only once a year, while business consumers have been invoiced more frequently. Regardless of the billing frequency, both private and business customers received only basic information about their energy consumption, and the consumption data was collected manually by the utilities. Consequently, utility companies have a dearth of quality data on their consumers' energy consumption patterns. This has limited their ability to engage with their customers, understand their diverse needs, develop innovative solutions and co-create value through a common platform.
The metering market is currently undergoing a transformation, as smart metering systems are increasingly deployed for both private and business customers. These meters are increasingly viewed as the foundation of a consumer- and data-centric approach to decision-making, demand response planning and business strategy for utility companies worldwide.
IT edge to energy production
Through an IT-enabled platform that aggregates smart meter data, energy usage patterns can be understood at a much finer granularity. Such a deep understanding is a prerequisite for a utility company launching a dynamic energy pricing model. This demands excellent analytics tools that can process large volumes of consumer data, identify usage patterns, reveal common denominators and enable dynamic pricing mechanisms.
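As a minimal sketch of what such pattern analysis involves, the fragment below aggregates hypothetical 15-minute readings into an hourly load profile and flags peak hours. The data, threshold and function names are illustrative assumptions, not part of any vendor product:

```python
from collections import defaultdict

# Hypothetical 15-minute smart meter readings as (hour_of_day, kWh) pairs.
# In practice these would stream in from the metering head-end system.
readings = [
    (0, 0.10), (0, 0.12), (7, 0.55), (7, 0.60),
    (12, 0.30), (18, 0.80), (18, 0.75), (23, 0.15),
]

def hourly_load_profile(readings):
    """Aggregate 15-minute readings into an average load per hour of day."""
    totals, counts = defaultdict(float), defaultdict(int)
    for hour, kwh in readings:
        totals[hour] += kwh
        counts[hour] += 1
    return {h: totals[h] / counts[h] for h in totals}

def peak_hours(profile, threshold=0.5):
    """Hours whose average load exceeds an (assumed) peak threshold in kWh."""
    return sorted(h for h, load in profile.items() if load >= threshold)

profile = hourly_load_profile(readings)
print(peak_hours(profile))  # morning and evening peaks in this toy data
```

At production scale the same aggregation would run on a distributed framework rather than in-memory dictionaries, but the analytical step is identical: reduce raw interval data to profiles that pricing models can act on.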
By studying the patterns revealed by smart meter readings, incentive programs and dynamic pricing slabs could be offered to motivate users to reduce energy consumption or shift it to off-peak periods. This creates a win-win scenario for both the utility companies and the consumers: consumers benefit from cheaper energy prices during off-peak periods, while utility companies can continue running their plants and generators at a constant capacity and energy output level. The consumer surplus thus created traverses the various elements of the value chain, benefiting all participants in the marketplace.
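The pricing mechanics described above can be sketched with a toy two-tier time-of-use tariff; the rates, peak window and usage figures are hypothetical, and real tariffs vary by utility and regulator:

```python
# Assumed two-tier time-of-use tariff (EUR per kWh) for illustration only.
PEAK_HOURS = range(8, 20)            # 08:00-20:00 counts as peak
PEAK_RATE, OFFPEAK_RATE = 0.30, 0.18

def bill(readings):
    """Price a list of (hour_of_day, kWh) readings under the tariff."""
    return sum(
        kwh * (PEAK_RATE if hour in PEAK_HOURS else OFFPEAK_RATE)
        for hour, kwh in readings
    )

usage = [(9, 2.0), (19, 3.0), (23, 1.0)]    # mostly peak-time consumption
shifted = [(9, 2.0), (22, 3.0), (23, 1.0)]  # same load moved off-peak

print(round(bill(usage), 2))    # 1.68 EUR
print(round(bill(shifted), 2))  # 1.32 EUR -- the consumer's incentive
```

The gap between the two bills is precisely the incentive that nudges the consumer to shift load, while the utility gains a flatter demand curve.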
Understanding smart meter readings also reveals incisive insights into the lives of customers. As machines and devices increasingly connect to the internet, it is now possible to profile the energy consumption and usage patterns of this equipment and propose pre-emptive checks, maintenance, replacements, etc.
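As an illustrative sketch of such device-level profiling, the fragment below flags an appliance whose latest consumption drifts outside its historical baseline. The readings and the 2-sigma threshold are assumptions for the sake of the example:

```python
import statistics

# Hypothetical weekly energy draw (kWh) of a healthy appliance, and a
# new reading that may indicate wear (e.g. a compressor working harder).
history = [4.9, 5.1, 5.0, 4.8, 5.2, 5.0, 4.9]
latest = 6.4

mean = statistics.mean(history)
std = statistics.stdev(history)

# Flag for pre-emptive maintenance when the latest reading deviates by
# more than two standard deviations from the baseline (assumed rule).
needs_check = abs(latest - mean) > 2 * std
print(needs_check)  # True -> schedule a maintenance visit
```

Production systems would use richer models than a simple sigma rule, but the principle of comparing live consumption against a learned baseline is the same.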
Prosumers and Micro-markets
With increasing consumer awareness of renewable energy sources, such as photovoltaic panels, rooftop windmills and heat pumps, utility companies could collaborate with households by subsidizing their investments and purchasing their surplus energy output at competitive prices. Such a "prosumer" (producer-consumer) strategy calls for a micro-market approach, which in turn requires cloud infrastructure, a computing framework, a customer relationship management system and innovative analytics capabilities.
Furthermore, with the growing interest in electric cars, it is now possible to create value by leveraging this distributed infrastructure at marginal cost. The batteries inside electric cars have the capacity to cater to the energy needs of an average household for almost a week. Utility companies could leverage this infrastructure by charging these batteries at night, when energy prices are typically lower, while maintaining a constant generation capacity. During the daytime, when energy prices are relatively higher due to greater demand from industrial and business customers, the utility companies could tap into these electric cars to maintain a constant power supply without having to fire up a coal plant or operate a nuclear station.
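A back-of-envelope check of the claims above, under stated illustrative assumptions (a roughly 60 kWh battery, roughly 10 kWh/day of household consumption, and hypothetical night/day tariffs; actual figures vary widely):

```python
# Assumed figures for illustration only.
EV_BATTERY_KWH = 60            # mid-size electric car battery
HOUSEHOLD_KWH_PER_DAY = 10     # average European household

days_of_autonomy = EV_BATTERY_KWH / HOUSEHOLD_KWH_PER_DAY
print(days_of_autonomy)  # ~6 days, i.e. "almost a week"

# Simple vehicle-to-grid arbitrage: charge at a night rate, discharge at a
# day rate, with an assumed 10% round-trip efficiency loss.
NIGHT_RATE, DAY_RATE, EFFICIENCY = 0.18, 0.30, 0.90
margin_per_cycle = EV_BATTERY_KWH * (DAY_RATE * EFFICIENCY - NIGHT_RATE)
print(round(margin_per_cycle, 2))  # EUR per full charge/discharge cycle
```

Even this crude arithmetic shows why aggregating many such batteries is attractive: a few euros of arbitrage margin per vehicle per cycle, multiplied across a fleet, funds the platform that coordinates it.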
Through an analytics-driven understanding of the underlying infrastructure, consumer behavior and usage patterns, utility companies could recruit these consumers into a "crowd-sourced energy management" platform, with the goal of better demand response planning, capacity planning, energy storage and distribution management.
Current scenario in the IT department
Currently, the IT landscape across the utilities industry is heterogeneous, with a myriad of data analysis and capacity planning applications. Historically, these applications were built one-off, with a heavy emphasis on coding. Despite their effectiveness, their functionality was limited: they consume data in standard formats, apply standard algorithms and present results through standard pre-defined reports. Over 75% of the total code is dedicated to "infrastructure" processes, such as data aggregation, data clean-up (preparation), application flow, storage of results, etc.
However, smart meters and the underlying smart grid infrastructure have opened a Pandora's box: utility companies are now faced with an enormous deluge of incoming data from sensors, customers, devices, etc. These data reside in several "system of record" archives that scale to trillions of records, representing tens of petabytes of data that require further processing to extract information. Such a data tsunami puts the traditional approach in jeopardy and mandates an entirely new approach to data analytics. Competitiveness, hence, hinges not just on IT infrastructure, scalability, quality and quantity, attributes that were traditionally bastions of competitiveness, but also on extracting relevant insights, obvious and subtle, at lower cost and with decreased time to market.
Enter the Data Scientist
Hence, there is an increasing need to replace the current approach with one that spans all elements of the corporate value chain, bridging corporate and business strategy, IT strategy, software development, algorithm construction, mathematics, and the natural and social sciences. This is the profile that Thomas H. Davenport, writing in the Harvard Business Review, calls the "Data Scientist".
In short, utility companies should seek professionals who understand the entire spectrum of the value chain, from the "bigger picture" (the broader overarching corporate strategy, marketplace, competitive landscape, operations management, etc.) down to the intricate patterns, structures, correlations and causations represented by data. Data scientists understand how to dissect a business problem and translate it into manageable units, examine data for insights, select the appropriate mathematical approaches, test different hypotheses to choose the best approach, derive insights from the data, and communicate these insights, results and business impacts cogently to all stakeholders.
As can be inferred, these capabilities are beyond the realm of traditional utility companies, whose competitive advantages have been in building power plants, operating those plants and distributing energy with minimal or no downtime. For these organizations, Information Technology (IT) has largely been an enabler of other business functions.
However, these companies are increasingly realizing that applying analytics capabilities to the large volumes of data they are currently aggregating could provide consistent, high-quality insights into the operational dynamics of the business at different levels of granularity. Achieving the maturity to operate analytics functions in-house mandates substantial investments in both resources and capabilities, elements that take time to nurture and develop.
Furthermore, operating a Big Data / analytics-centric data center is beyond the capabilities of IT departments that have traditionally evolved with the growing business needs of these organizations. The core axiom of the IT departments in these entities is to deliver a superior quality of service for the existing portfolio with minimal or no downtime. An IT outage of a few minutes may have ramifications throughout the organization, from manufacturing and finance to accounting and logistics, and may have a substantial financial impact. Hence, these departments are usually resource-constrained, unwilling to "experiment" with new technologies and wary of deploying "bleeding edge" products into their existing landscape for fear that these technologies lack a proven pedigree.
Opportunities for IT outsourcing companies
It is easy to see the Catch-22 situation prevalent in these organizations: the IT department, in its ambition to become a strategic asset, contends with providing a high-quality service while, at the same time, seeking to deliver value-adding services by deploying innovative solutions and technologies that have a direct impact on the company's top and bottom lines.
However, with the increasing adoption of public and private clouds, outsourcing companies, vendors, service providers and consulting companies could develop custom solutions that address these pain points and break the deadlock that most such IT departments are currently grappling with.
By deploying and operating parallel IT infrastructure, aggregating data, analyzing it for patterns and delivering business insights, value-adding services can be delivered with no disruption to business continuity. Working with customer data requires appropriate governance controls and mechanisms, properly implemented risk management frameworks and compliance with strict regulatory standards. The landscape becomes more complicated still when processing data pertaining to consumers' personal information. These constraints present opportunities for companies that specialize in these areas. In addition, data analytics can render additional insights by integrating relevant structured and unstructured data from within and outside the organization, correlating distributed data sets, etc. These are capabilities that service providers have to develop, in addition to the technical capabilities around Big Data that they are currently building.
In closing, the utility industry can no longer rely on perfecting upstream activities, i.e. power generation and distribution, as a source of competitive advantage. To continue capturing value, it is imperative that these companies begin treating Information Technology (IT) as a strategic asset and making substantial investments in infrastructure and analytics capabilities. Much of the value to be captured in the future stems from downstream activities, i.e. the path towards the customers. Understanding its customer base is the first step towards creating sustainable value. This means investing in customer relationship management systems, developing interfaces to capture customer feedback through traditional and emerging channels, such as Twitter, Facebook and blogs, understanding smart meter data and mining it for obvious and subtle intelligence by relating disparate data sets through robust analytics.
About the author: Mithun Sridharan is a General Manager at BlueOS LLC, an advisory based in Germany, where he is responsible for driving the strategic sales initiatives and managing customer engagements in Digital transformation & Analytics. Prior to BlueOS, he was an Account Manager with Oracle Corporation, where he drove strategic partnerships with key enterprise accounts and major Independent Software Vendors in Europe and the USA. He brings with him over a decade of international experience in Management Consulting, Business development, Strategic Marketing & Product Management. He holds an MBA from ESMT Berlin and a Master of Science (MSc) from Christian Albrechts University of Kiel. He is a Harvard Manage Mentor Leadership Plus graduate, an SAP certified Business Intelligence Professional, a Project Management Professional (PMP) and a Certified Information Systems Auditor (CISA). He also served as the Communication Chair for the German Outsourcing Association in 2013 and is based in Heidelberg, Germany.
Contact: Mobile: +49 176 9792 4897 eMail: firstname.lastname@example.org
LinkedIn: http://de.linkedin.com/in/mithun Twitter: http://twitter.com/jixtacom