Becoming data-driven will be a difficult transition for utilities, but it’s a journey worth making, writes Charlotte Brown, Energy Sector Director at data science consultancy Tessella, part of Capgemini.
Utilities are undergoing changes unlike anything they have experienced before. New distributed energy sources, electrification of transport and heating, sustainability goals, and changing consumer behaviours will all upend electricity, gas and water.
This will mean significant decisions about where to put new generation, transmission, and storage infrastructure. Even without this disruption, there is an imperative to get better at maintaining and managing what we already have – no one can be pleased about the 3bn litres of water leaked every day.
More detailed usage data from smart meters can help forecast demand. IIoT sensors and actuators throughout generation, distribution and storage can give detailed data on assets. But getting meaningful insights is not as simple as gathering data and spotting correlations.
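As a minimal illustration of the forecasting idea, the sketch below builds a demand profile from smart meter readings: it averages historical consumption in each half-hour slot and uses that profile as a seasonal-naive baseline forecast for the next day. The data shape and numbers are hypothetical, and real forecasting would layer in far more signals; this only shows the simplest starting point.

```python
from collections import defaultdict

def profile_forecast(readings):
    """Seasonal-naive baseline: forecast next-day demand in each
    half-hour slot as the mean of historical readings in that slot.

    readings: iterable of (day, slot, kwh) tuples (hypothetical format).
    Returns: dict mapping slot -> forecast kWh.
    """
    totals = defaultdict(float)
    counts = defaultdict(int)
    for day, slot, kwh in readings:
        totals[slot] += kwh
        counts[slot] += 1
    return {slot: totals[slot] / counts[slot] for slot in totals}

# Hypothetical history: two days, two half-hour slots each.
history = [
    (1, 0, 0.30), (1, 1, 0.50),
    (2, 0, 0.34), (2, 1, 0.46),
]
forecast = profile_forecast(history)  # slot 0 -> 0.32, slot 1 -> 0.48
```

In practice a model like this is only a benchmark that more sophisticated methods must beat, but it makes the point that raw meter readings need aggregation and structure before they say anything about demand.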
Finding the signal in the noise
To make meaningful decisions about infrastructure investments or upgrades, we need highly precise predictions, not general trends.
Knowing that energy demand rises during halftime at football matches is easy enough with the data we have. Knowing when a local population is going to see a sudden surge in EV drivers, or a sewer is going to flood, requires a much more nuanced approach.
This involves finding the key insights in lots of noisy sensor and smart meter data. It may mean layering in other diverse data sources, such as weather, geological, consumer behaviour, engineering data on asset design, etc.
To go beyond correlations and deliver insights that truly explain what’s going on with our assets and the diverse populations they serve, we need reliable data and the right combination of skills to extract meaning from it.
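The "layering in other data sources" step above can be sketched in a few lines: join each sensor reading with the matching weather record so a downstream model can separate weather-driven behaviour from genuine anomalies. The field names and join key here are assumptions for illustration; real pipelines would align on precise timestamps and locations.

```python
def enrich_with_weather(sensor_rows, weather_by_day):
    """Layer a second data source onto sensor readings: attach the
    day's weather record to each row. Rows with no matching weather
    keep a None placeholder rather than being silently dropped."""
    enriched = []
    for row in sensor_rows:
        weather = weather_by_day.get(row["day"], {"temp_c": None})
        enriched.append({**row, **weather})
    return enriched

# Hypothetical flow-sensor readings and daily weather.
sensors = [{"day": 1, "flow": 12.0}, {"day": 2, "flow": 30.0}]
weather = {1: {"temp_c": 18.0}}
enriched = enrich_with_weather(sensors, weather)
```

The design choice worth noting is that missing context is made explicit (a `None` field) instead of discarding the reading – gaps in the joined data are themselves a signal about where collection needs improving.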
The challenges of utilities data
The hardest bit is getting meaningful data in the first place to feed models. Utilities, like many engineering organisations, were not built around data and so face a number of legacy challenges. These include:
- Insufficient data: Much infrastructure is underground and hard to collect data on. And the data we do have is often not quite what we want (the nearest address to a burst pipe, rather than its GPS coordinates). We need to find clever ways to infer what we need from imperfect sources.
- Hard-to-find data: A lot of legacy data exists as blueprints, maps and handwritten notes that must be located and digitised before they can be fed into models.
- Engineering data is not consumer data: Even where they have data, it is measurements of physical systems such as pipe corrosion and mechanical stresses, not consumer patterns. Engineers often worry – understandably – that data teams will ignore the real-world context and just look for correlations. Understanding different types of data and creating conversations between the two are important to unblock this.
- Making insights explainable: When there is a risk of not keeping the lights on or the water flowing, complex techniques like machine learning must be explainable and understandable before they will be embraced – an area where the data industry has often fallen short.
- Insufficient data experts: Data science teams are usually small and overstretched and can be distant from the business itself. Teams may have expertise, but not the range of skills or remit to do everything needed. A combination of data management, data engineering and modelling is needed, but also people who understand engineering data and what it represents in the real world. Critically, teams must include people who can bridge the gap between subject matter experts and data experts and the wider business.
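To make the explainability point concrete, here is a hedged sketch of the simplest explainable model: a linear risk score whose per-feature contributions can be shown to an engineer alongside the score itself. The risk factors and weights are invented for illustration, not a real failure model.

```python
def explain_score(features, weights, bias=0.0):
    """Score an asset with a linear model and return per-feature
    contributions, so engineers can see *why* the score is high."""
    contributions = {name: weights[name] * value
                     for name, value in features.items()}
    score = bias + sum(contributions.values())
    return score, contributions

# Hypothetical pipe-failure risk factors and hand-set weights.
pipe = {"age_years": 40, "corrosion_index": 0.7, "pressure_bar": 5}
w = {"age_years": 0.01, "corrosion_index": 1.0, "pressure_bar": 0.05}
score, why = explain_score(pipe, w)
# 'why' shows corrosion dominates this asset's score.
```

More powerful machine learning techniques need heavier explanation tooling, but the principle is the same: every prediction should come with a breakdown an engineer can challenge against the real-world physics they know.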
Building a data-driven organisation
For those that get this right, the opportunities from data are huge. The challenges of drawing insight from complex data are significant, but these can gradually be overcome with the right approaches to data management, modelling, skills, and collaborations.
The key is to start small. Identify viable use cases where you have good data and can quickly deliver valuable insights, and focus on these. Use the process to identify gaps in data collection, data management, skills, and communication structures, and start to plug these. Gradually move onto slightly harder projects whilst feeding in learnings and advancing your digital maturity.
Getting all this right will eventually allow utilities to become insight-driven. Such an organisation would have mastery of its data. It would constantly launch data projects, build predictive models and use the insights to make more effective decisions on managing and upgrading infrastructure. This is a continuous journey, not a single project.
Tessella’s new whitepaper, How Utilities Can Move to Insight-Driven Operations, discusses these and other challenges in more detail, and how to overcome them.