Everywhere you look, data and digital technology are disrupting industries around the world. It’s no different for the energy sector; the traditional models of generation and retail are being flipped on their heads and distribution and consumption models won’t be far behind.
The software decisions our industry makes over the next five years will determine the economic and social impact of change, and one area causing real consternation is data infrastructure. This is unsurprising given the rate and scale at which data is being collected and the plethora of new service providers entering the market with advances in generation, storage and metering tech. The pace of change is such that any business without a strong data practice in place (or plans to establish one) will certainly feel the chasm widening between itself and those that do.
But here’s the catch: collecting data isn’t useful unless it can be stored, organised, and accessed in a way that supports good decision-making. This is an important point because while data infrastructure underpins a strong data practice, it’s an area where the energy sector lags behind other industries.
There is a silver lining to being behind
In many cases, the energy industry is lagging because it operates on legacy systems. Having worked with many organisations across a range of sectors to improve their data infrastructure, I have come across systems, particularly in the energy sector, that I had never seen before in my career, and it’s only through tenacity, not compatibility, that some of the tech relics in this sector work at all (and while I’m certainly not an old geezer, I’m no spring chicken either). Some of the technology dominating the stacks of generators and retailers dates from the turn of the century and is not equipped to deal with the data requirements of today. While our systems tend to comply with the basics, such as regulatory requirements, from an operational perspective we need to modernise them to make them usable and accessible at scale.
The silver lining here is that others have paved the way in data infrastructure. This is a rare situation where you can solve your modernisation problem just by throwing money at it! Pioneering usually demands significant investment in research and development because you’re blazing a new trail. The advantage of walking a well-trodden path is that you don’t need to expend nearly as much energy; you simply bring in the seeds and sow the crops by adopting known and prudent solutions. Of course, I am being overly simplistic here. The downside to lagging is that it takes a while to catch up, and unless you really put the hammer down and invest the time and capital to make significant strides forward, you’re likely to remain a step behind.
The flow-on effect of legacy
Here’s the other challenge of having a legacy system in place: you’ve built teams and skill sets around maintaining it. Why is this an issue? Because those teams are likely so busy keeping the system running that they have no time to modernise it.
The energy industry has traditionally been dominated by engineers of all shapes and sizes, so it has a strong engineering mindset. Upgrading to modern technology shouldn’t daunt people with these skills; they’ve built dams, wind farms and huge transmission networks, so we know it’s not a capability issue. The barriers are time and cost. A lot is baked into the initial set-up costs of the big systems whose job is to generate revenue. The legacy system is likely the product of one large vendor, augmented by a skilled in-house team that maintains it. For the internal team, maintenance is a full-time job; there is no time left to develop and modernise as opportunities arise. And once you’ve fallen into the vicious cycle of not modernising, you’re stuck with an ancient system.
Here’s my favourite example of this conundrum. Before I became a data scientist, I was an astronomer. At my workplace there was one telescope, and one computer that ran its focusing systems. It doesn’t take an expert to understand that if you can’t focus the telescope, you can’t see anything, which means you can’t use the facility (and these facilities burn through millions of dollars a night). The focusing computer was a Solaris system; Sun Microsystems installed it somewhere in the vicinity of 1994. This computer drove me mad; it was so old, and there were many better, more modern alternatives. I often wondered, “why is this thing still here?”. When I last checked in 2014 it was still there, and I’d hazard a guess that’s still the case today. If it is, it’s not for lack of a need to replace it with modern equipment, but for lack of time to do so.
Bridging the gap between old and new
Here’s another challenge: managing the gap between modernising infrastructure and the desire to catapult forward to shiny new tech like Robotic Process Automation (RPA) and Artificial Intelligence (AI). Data moves a lot faster than physical infrastructure, and the benefits of these new technologies are becoming more widely known, so it’s easy to see why C-suites and boards around the world are pushing to get them in place. However, there is often a poor understanding of what these technologies can actually deliver, particularly in a specific industry, of the transformative potential of strong data infrastructure, and indeed of what’s involved in implementing them.
We’ve just reached the peak of the hype cycle in energy, whereas other sectors passed this point some time ago. The telecommunications sector is a good example. While telcos have always been technically advanced, they too have undergone a digital and data transformation over the past decade and are now in a place where they are realising the value of modern tech. Telcos spent a lot of time collectively figuring out the best way forward and laying the groundwork. It’s easy to look at telco as a case study and want to emulate its current position, but what we don’t see is the decade of investment in infrastructure behind it. In other words, it’s easy to say “this company is getting great benefits from RPA, we should implement it too”, but the reality is it’s not something that happens quickly. The biggest lag, in my opinion, is in understanding the journey to data maturity. Moving away from legacy technology to modern infrastructure, implementing new tools like AI and RPA, and establishing a strong data practice is a process. There are ways to bridge the gap and steps you can take to move forward… but that’s for another day and another blog.
If your business is operating legacy systems and you need some advice on how to modernise your infrastructure to be fit for purpose and keep pace with the industry, get in touch with the team at Flux.