The energy world is undergoing massive change, rethinking systems designed more than a century ago to make room for the rise of smarter, cleaner technologies. It’s an exciting time – virtually every industry is electrifying in some way, electric vehicles (EVs) are gaining market traction, and there is an active transition to support Distributed Energy Resources (DERs), “small-scale energy resources” usually situated near sites of electricity use, such as rooftop solar panels and battery storage. That last one is a big deal, and as the International Energy Agency (IEA) points out, the rapid expansion of DERs will “transform not only the way electricity is generated, but also how it is traded, delivered and consumed” moving forward.
To an observer, all this change is positive, sustainable, and long overdue. But practically speaking, the rapid acceleration of renewable energy and electrification is straining the limits of our grid. Along with the pressure from renewables, the world’s power systems also face critical challenges from extreme weather events tied to ongoing climate change – droughts in Europe, heatwaves in India, severe winter storms in the US – all driving a sharp rise in inspection, maintenance, and repair costs. Leaders in the utility sector are now laser-focused on grid modernization, reliability, and resilience.
Take a Picture, It’ll Last Longer
For utility companies, equipment is often their most important asset, and it requires constant, meticulous upkeep. Performing this upkeep depends on a steady stream of data (usually in the form of images) that utilities can analyze to detect operational anomalies. That data is gathered in many ways, from drones and fixed-wing aircraft to line workers physically walking a site. And with newer technology like UAVs and high-resolution helicopter cameras, the sheer volume of data has increased astronomically. We know from our conversations with utility companies that many are now gathering 5-10X more data than they did just a few years ago.
All this data is making the already slow inspection cycle even slower. On average, utilities spend the equivalent of 6-8 months of labor hours per year analyzing inspection data (a figure shared in an interview with a West Coast utility customer that collects 10 million images per year). A big reason for this glut is that the analysis is still largely done manually, and when a company captures millions of inspection images each year, the process becomes wildly unscalable. In fact, anomaly analysis is so time-consuming that most of the data is outdated by the time it’s actually reviewed, leading to inaccurate information at best and repeat inspections or dangerous conditions at worst. This is a big issue with high stakes: analysts estimate that the power sector loses $170 billion each year to network failures, forced shutdowns, and mass disasters.
Building the Utility of the Future with AI-Powered Infrastructure Inspections
Making our grid more reliable and resilient will take two things – money, and time. Thankfully this is where new technology and innovation can help streamline the inspection process. The impact of artificial intelligence (AI) and machine learning (ML) on the utilities sector cannot be overstated. AI/ML is right at home in this data-rich environment, and as the volume of data gets larger, AI’s ability to translate mountains of information into meaningful insights gets better. According to Utility Dive, there is “already a broad agreement in the industry that [AI/ML] has the potential to identify equipment at risk of failure in a manner that is much faster and safer than the current method” which relies on manual inspections.
While the promise of this technology is undisputed, building your own customized AI/ML program in-house is a slow, labor-intensive process fraught with complications and roadblocks. These challenges have caused many utility companies to seek out additional support from external consultants and vendors.
3 Things to Consider When Evaluating a Potential AI/ML Partner
When looking for an AI/ML partner, actions matter more than words. Plenty of slick companies will promise the moon, so utility leaders should drill down on several important metrics to accurately evaluate impact. Among the most important is how the vendor describes and delivers the following:
Growth of the Model Over Time – Building varied datasets (data that has a lot of anomalies to analyze) takes a significant amount of time (often several years) and certain types of anomalies don’t occur with a high-enough frequency to train a successful AI model. For example, training an algorithm to spot things like rot, woodpecker holes, or rusted dampers can be challenging if they don’t occur often in your region. So, be sure to ask the AI/ML vendor not only about the quantity of their datasets, but also their quality and variety.
Speed – Time is money, and any reputable AI/ML vendor should be able to clearly show how their offering speeds up the inspection process. For example, Buzz Solutions partnered with the New York Power Authority (NYPA) to deliver an AI-based platform designed to significantly reduce the time required for inspection and analysis. The result was a program that could analyze asset images in hours or days, instead of the months it had taken before. This time savings allowed NYPA maintenance groups to prioritize repairs and reduce the risk of failure.
Quality/Accuracy – In the absence of real data for AI/ML programs, companies sometimes fill the gaps with synthetic data (i.e., data artificially generated by computer algorithms). It’s a popular practice, and analysts predict that 60% of all data used in AI development will be synthetic (rather than real) by as soon as 2024. But while synthetic data works for theoretical scenarios, it performs poorly in real-world environments, where you need real-world data (and human-in-the-loop interventions) to self-correct. Consider asking the vendor about their mix of real vs. synthetic data to ensure the split makes sense.
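The first consideration above – dataset variety and rare anomaly classes – can be made concrete with a short sketch. The labels, counts, and 1% rarity threshold below are purely hypothetical, not drawn from any real utility dataset; the point is simply how one might audit a labeled inspection dataset to flag anomaly types too infrequent to train a reliable model on.

```python
from collections import Counter

# Hypothetical labeled inspection dataset: one anomaly label per image.
# Class names and counts are illustrative assumptions, not real utility data.
labels = (
    ["corroded_connector"] * 480
    + ["broken_insulator"] * 350
    + ["vegetation_encroachment"] * 160
    + ["woodpecker_damage"] * 7
    + ["wood_rot"] * 3
)

counts = Counter(labels)
total = sum(counts.values())
RARE_THRESHOLD = 0.01  # flag any class under 1% of all samples

# Classes this sparse usually need targeted data collection, oversampling,
# or augmentation before a detector can learn them reliably.
rare_classes = {cls for cls, n in counts.items() if n / total < RARE_THRESHOLD}
print(rare_classes)
```

In this toy example, woodpecker damage and rot fall below the threshold, which is exactly the kind of gap worth asking a vendor about: how do they handle classes their (or your) region simply doesn’t produce often enough?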
And remember, the work doesn’t end once you’ve identified potential partners. One idea from Gartner is to hold regular “AI Bake-Off” events – described as “fast-paced, informative sessions that let you see vendors side-by-side using scripted demos and a common dataset in a controlled setting” – to evaluate the strengths and weaknesses of each. This process establishes clear metrics tied to the scalability and reliability of each vendor’s AI/ML algorithms, and aligns those metrics with utility business goals.
Powering the Future of the Utility Industry
From more efficient workflow integrations to sophisticated AI anomaly detection, the utility industry is on a far brighter path than even a few years ago. This innovation will need to continue, though, especially as transmission and distribution (T&D) inspection mandates are set to double by 2030 and the government has named energy infrastructure maintenance and defense as top national security priorities.
There is more work ahead, but one day we’ll look back at this time as a watershed period, a moment when industry leaders stepped up to invest in the future of our energy grid and bring utilities into the modern era.