Let the agents handle it: Automating the energy lifecycle with LLMs // Adam Sroka // Agents in Production 2025
SPEAKER

Dr. Adam Sroka, Director of Hypercube Consulting, is an experienced data and AI leader helping organizations unlock value from data by delivering enterprise-scale solutions and building high-performing data and analytics teams from the ground up. Adam shares his thoughts and ideas through public speaking, tech community events, on his blog, and in his podcast.
Many organizations aren't getting the most out of their data and many data professionals struggle to communicate their results or the complexity and value of their work in a way that business stakeholders can relate to. Being able to understand both the technology and how it translates to real benefits is key.
Simply hiring the most capable people often isn’t enough. The solution is a mix of clear and explicit communication, strong fundamentals and engineering discipline, and an appetite to experiment and iterate to success quickly.
If this is something you're struggling with - either as an organization finding its feet with data and AI or as a data professional - the approaches and systems Adam has developed over his career will be able to help, so please reach out.
Cutting-edge data technologies are redefining every industry and adopting these new ways of working can be difficult and frustrating. One day, there will be best practices and playbooks for how to maximize the value of your data and teams, but until then Adam is eager to share his experiences in both business and data and shed some light on what works.
SUMMARY
What happens when AI agents stop just suggesting next steps – and start running the project? In this talk, Dr. Adam Sroka (CEO & Co-Founder, Hypercube) shares the learnings behind Jellyfish, an agentic AI platform designed to manage the end-to-end lifecycle of renewable energy assets and deliver 50% efficiency gains. Built to replace time-consuming, manual work done by project managers, analysts, and engineers, Jellyfish combines proprietary AI models with multi-agent workflows to automate planning, data collation, reporting, and real-time analysis – with human oversight built in. Adam will break down the system architecture, from workflow design and automation strategies to real-time analytics and user accessibility. He’ll also touch on the commercialization path, including IP considerations, and the broader role platforms like this will play in accelerating net-zero targets.
TRANSCRIPT
Adam Sroka [00:00:00]: Thank you very much Skylar. Appreciate it. There's a big picture of me. I hope everyone's enjoying the event so far. Some unbelievable talks. More to come later. So yeah, here's my little segment for those of you that don't know me.
Adam Sroka [00:00:26]: I'm Adam Sroka, CEO and co-founder of a company called Hypercube, and I'm going to be talking about my two favorite topics today: large energy projects and large language models. So I'll take a whirlwind tour of some of the stuff that we're up to at Hypercube. Any questions, please let me know or drop them in the chat and I'll endeavor to answer them. So who are we? We are Hypercube Consulting. As I said, I am based in gloomy Edinburgh. Today we are the UK's leading data and AI consultancy for the energy sector. So unlike a lot of other tech consultancies, we only work in energy and do energy projects.
Adam Sroka [00:01:14]: We have lots of interesting partnerships. We have expanded into the US and we're looking at coming to the EU this year. And I would say our specialism is really bringing a lot of deep domain expertise to energy sector projects that focus on data and AI. I grew up as a data scientist and spotted that the energy sector was a little bit backwards and there were a lot of data-hungry use cases coming for it. So I thought I would jump ship and see if I could help accelerate the transition to net zero via my data and AI skills. It's a really interesting space if you're not familiar with it, because there's lots of crunchy hardware, there's geography to deal with, there are physical constraints, and there's a lot of financial modeling that goes on. So if you like data, IoT, finance, and time series, it's really, really good fun. So today we're going to be talking about one of the products that we are launching from Hypercube, which has the codename Jellyfish, which made its way onto the talk title.
Adam Sroka [00:02:21]: So that was me trying to come up with a thumbnail name, and we haven't given it a better one since, so answers on a postcard. But effectively it's an agentic AI platform for large energy infrastructure. Why energy infrastructure? Well, a big number: the 2025 estimated global spend on energy infrastructure is set to tip just over US$3.3 trillion. That is a bonkers number. It's estimated that $150 trillion will need to be spent globally in the next 15 to 20 years to build all of the infrastructure that we need to meet net zero targets. That's an eye-watering sum of money, for those of you that aren't familiar.
Adam Sroka [00:03:09]: Even a small onshore wind farm producing maybe 100 to 200 megawatts of power can cost north of $500 million. A lot of money. And they take a long time to build. This is an industry that's getting busier, it's getting more competitive, there are more people entering, the space is becoming more attractive for large capital expenditure, and there's a lot to manage. To illustrate some of that, some of the timeline: I've talked about that huge number, the global spend. It takes about 10 years to build an offshore wind farm. So take an offshore wind farm, but similar kinds of timescales apply to things like big oil rigs or onshore wind.
Adam Sroka [00:04:07]: They vary a little bit, but illustratively, for an offshore wind farm there are several stages. You do origination, where you try and organize the capital and the consenting for this kind of project to go ahead. You then find sites, and then go into site survey and design. And this can take years because of things like ecological constraints and local network infrastructure. You can't plonk a wind farm anywhere, right? It has to be able to connect to the grid and deliver its power somewhere. These are really constrained high-voltage systems that take years to move the right people, skills, and hardware to connect things up. But if you get your survey and design done, and you get through planning permission and all the local authorities, you get to what's called offtake and consent award, which is going to take a year. This is a financial exercise where you're dealing with lawyers and banks and how this thing's going to make money, and so on and so forth.
Adam Sroka [00:05:04]: Eventually you get to financial close, which can take another year. This is saying: right, I've actually got a wind farm site, I've got all the planning, over 34 years I'm going to make a profit, so can I raise $500 million to go and build the thing? And you get a big fat yes and you're really happy and you go into the construction phase. You've spent almost eight years and you've not done anything. You've not even put a shovel in the ground. And then this number varies, anywhere from kind of three to seven years to build these things, depending on scale, where they are, how hard it is to get to. These are all average numbers from Europe and the UK. We're looking at 12 years before you've turned the thing on, right? And then you get 20, 30 years out of the lifetime of these, and hopefully you reap all the benefits and all that money back.
Adam Sroka [00:05:55]: This is a complicated process with lots of people involved. We are working with one organization that's building an onshore wind farm. They've got 120 people, all finance, accountant, and project manager types. Working with them and others in the space, we came across another company that was building large batteries, and they said that at one point in their project, if one woman left, the whole thing would have ground to a halt, because the whole project was in her head. She was so pivotal that it would have been too hard to unpick what she was doing. And they were talking to us because they wanted to scale from doing two projects a year to doing 12, and they just thought: we don't understand how this would be possible. At one point they said to me, we are dealing with millions of words spread across thousands of documents written by hundreds of people across dozens of companies.
Adam Sroka [00:06:47]: Just think: you've got planning, you've got ecology, you've got logistics and supply chain, you've got to talk to banks, talk to lawyers, talk to local landowners. All of these things involve lots of parties across company boundaries. It gets really messy, and we're still using tools developed in the 80s, like PDFs, emails, phone calls, and rainbow-colored spreadsheets, to manage these things. Why? Well, first of all, no software specifically exists for this problem space. And secondly, computers were bad at this stuff until a few years ago. Reading a 300-page commercial contract and making sense of it, or understanding meeting notes, planning permissions, and planning consent forms: computers just weren't good at this until the advent of large language models and the current hype train. So almost zero innovation has happened in this multi-trillion-dollar industry in decades. So now's the time, right? And we've spotted this. We think, from some initial pilots that we've done, there are enormous savings that could be made that would allow these groups of people to go a lot faster, be a lot more efficient, and do more interesting things with their time, and ultimately make developing a wind farm a little bit cheaper and more effective.
Adam Sroka [00:08:12]: So we get to net zero faster. That's what we're really excited about. How does it work? What is it? Well, the way we've built this and knitted it all together is that instead of trying to build one big agent or one big piece of software to rule the whole thing, we have gone down the route of really razor-thin agents that do pretty simple tasks, and bundling hundreds of them together in big long chains that map to the business processes and what the humans in these businesses are doing. In our initial funding application, when we were looking at putting this together in one of our bids, we did a lot of research and worked with operations research teams and some of our customers, and we came up with this high-level process we just call our change capture flow. Some of this is kind of pseudo-business speak, but at the bottom left it basically says: we will take every process, every data-input-to-data-output event that happens in that chain, and map it as one great big graph. So what we're doing here is thinking: every time a phone call, a document, some sort of form, some sort of permission occurs across the development of this wind farm, we want to capture what were the inputs, what were the actions, and what were the outputs.
Adam Sroka [00:09:37]: And this has taken a long time to do, even for a very thin sliver across one of these projects, because they are very complicated. But the beauty of this approach, and the way you can model things in this way, is that you can build really thin layers aligned to a single use case. You pick one stakeholder in the business, you master what they're doing, and you try to deeply understand the way that thread works, and then you can layer more and more on top. And then all we do is basically have this process that says: every time an element is updated or new data comes in, our graph is updated. We have an integration layer that pulls that in, which is MCP-based these days. We started this about eight months ago, but it's MCP-based now. And actually a lot of the work we do is centered on SharePoint and things like that.
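The graph idea described here (capture inputs, actions, and outputs for every event, then propagate changes to downstream elements) can be sketched minimally like this. All class, variable, and element names below are illustrative assumptions, not Hypercube's actual implementation:

```python
from dataclasses import dataclass, field

# One node per business artefact: a contract, a spreadsheet, a Gantt chart.
@dataclass
class Element:
    name: str
    downstream: list = field(default_factory=list)  # elements that depend on this one

# A tiny slice of a project graph: inbox -> contract -> Gantt chart.
gantt = Element("construction_gantt")
contract = Element("ecology_contract", downstream=[gantt])
inbox = Element("outlook_inbox", downstream=[contract])

def cascade(element, visited=None):
    """Return every element affected by a change to `element`, in visit order."""
    if visited is None:
        visited = []
    for child in element.downstream:
        if child not in visited:
            visited.append(child)
            cascade(child, visited)
    return visited

# A new email arrives: everything downstream of the inbox needs revisiting.
print([e.name for e in cascade(inbox)])  # ['ecology_contract', 'construction_gantt']
```

The point of the thin-sliver approach is that a graph like this starts with one stakeholder's thread and grows by adding nodes and edges, not by rewriting the traversal.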
Adam Sroka [00:10:31]: These organizations are very Microsoft Office based, so it's quite nice. There aren't hundreds of systems we need to integrate with, nothing really complicated: it's Microsoft. It's SharePoint, it's Excel, it's PowerPoint and PDFs via Outlook email. So a new email comes in with a new PDF that's an update to a contract. We have an integration layer: an agent that understands that event has occurred, spots that it is relevant, and classifies that it is relevant to something on our graph of events for the project.
Adam Sroka [00:11:05]: We detect the change, and we use a large language model to parse the object and understand what was in it and what its nature was. Was it a transcript from a call? Was it an email? Was it an update to a cell in a spreadsheet? Then we classify that change. In the initial stages of this we are still keeping things very human-in-the-loop until this classification agent (or agents; "agent" is a kind of catchy term for a few systems here) matures. It's akin to a recommendation system: I believe this data change affects this element. Effectively, until that gets to a very high level of confidence, like north of 90%, we are keeping a human approval gate here to say: yeah, okay, that does affect that. When that is up and running, and in the pilot we've got to where this happens, then it becomes an autonomous thing that can do all this automatically. It then updates the element that's downstream of it in the graph, based on what we've mapped, and it cascades changes through the graph: if that email comes in, I need to update that spreadsheet, or if I change a value in a spreadsheet, then that has a knock-on effect on this Gantt chart that now needs updated.
Adam Sroka [00:12:21]: So another agent will come along and change that element. And again, for every element we're mapping one agent to it that really understands what's going on in that business process, that has all the context, all the rules, all the evals, with the job of doing that one piece. We just chain hundreds of these things together and it cascades through the graph. Again, we are recommending changes at this stage, because we want to get this right; we don't want to confuse or delay these projects, and this is still under development. And then, if necessary, we'll get to the point where this is recommending new actions, things that a human might do that haven't been automated into the graph of changes and business processes. We recommend to the human, again that human-in-the-loop element, and they can act on it or not. How are we coming up with this graph? Good old-fashioned consulting business analysis: just deeply understanding what the customer does, talking to them, working through their problems.
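The human approval gate described above (autonomous only once the classifier is north of 90% confidence) amounts to a simple routing rule. The 90% threshold comes from the talk; the function and field names are hypothetical:

```python
AUTO_APPROVE_THRESHOLD = 0.90  # "north of 90%" before the human gate is removed

def route_change(classification):
    """Route a classified change: apply it automatically if the classifier is
    confident enough, otherwise queue it for human approval.
    `classification` is an (element_name, confidence) pair from the classifier agent."""
    element, confidence = classification
    if confidence >= AUTO_APPROVE_THRESHOLD:
        return ("auto_apply", element)
    return ("human_review", element)

print(route_change(("ecology_contract", 0.97)))  # ('auto_apply', 'ecology_contract')
print(route_change(("ecology_contract", 0.55)))  # ('human_review', 'ecology_contract')
```

In a pilot like the one described, everything would initially route to `human_review` regardless, and the threshold branch only takes over once the classifier has earned trust.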
Adam Sroka [00:13:20]: What does the architecture look like? It's probably fairly straightforward; I hope that's large enough for everyone to see. On the left we have a load of systems: emails, spreadsheets, the stuff I've talked about. We have an integration layer, so depending on the context we have MCP, we have SFTP servers, sometimes we've got subscriptions to topics, and so on. Then we have a pretty basic data platform that can understand and process those. And then we have this overarching orchestrator that does a lot of the job of calling the change data capture flow process and the action fulfillment library.
Adam Sroka [00:14:06]: So: which agents do I call? And then there's a huge library of agents for it to work with, and then it moves on to autonomous or semi-autonomous downstream actions, as you can see on the right, which may call on other integration layers and go back around the loop. We are in the early stages of designing UIs for this thing, but most of our customers in this pilot stage are really fond of just a simple Streamlit or Chainlit, ChatGPT-type interface where they can talk to what's going on. But the majority of the interface actually ends up being email and spreadsheets; it's in the office tools they're using day to day. One of the things to say about this that is quite challenging is permissions and how permissions cascade. So we're really focused on Azure at the moment, Active Directory and things like that, because that is a hard problem to crack. We don't want data leakage across division boundaries or things that shouldn't be happening.
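The orchestrator-plus-agent-library pattern described here can be sketched as a lookup from event type to a chain of thin agents. The event types, agent names, and registry structure below are all hypothetical illustrations of the pattern, not the product's actual library:

```python
# Hypothetical action fulfillment library: each event type maps to an ordered
# chain of razor-thin agents, each responsible for one small step.
AGENT_LIBRARY = {
    "contract_update": ["parse_document", "check_penalties", "update_gantt", "draft_exec_report"],
    "spreadsheet_edit": ["parse_cells", "update_financial_model"],
}

def orchestrate(event_type):
    """Look up the agent chain for an event and run it in order.
    Here 'running' an agent is a stand-in string; in reality each would be an LLM call."""
    chain = AGENT_LIBRARY.get(event_type)
    if chain is None:
        raise ValueError(f"no agents registered for {event_type!r}")
    results = []
    for agent in chain:
        results.append(f"{agent}:done")  # placeholder for invoking the agent
    return results

print(orchestrate("spreadsheet_edit"))  # ['parse_cells:done', 'update_financial_model:done']
```

Keeping the registry as plain data is what makes "layer more use cases on top" cheap: a new stakeholder's workflow is a new entry, not new orchestration code.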
Adam Sroka [00:15:08]: There are some good solutions out there, but some of the data in this space is really, really sensitive, so we're just being extra cautious at the moment. I'll run through this; you'll probably know all this stuff, but this is effectively how it all threads together from a workflow point of view. The example I like to use here: my wife's an ecologist. If you're in the UK and you want to go and build a wind farm, a big battery, a solar farm, or a gas power station, you have to get an ecologist to come along and check that there are no bats or badgers in the region, and if there are, you have to move them. Bats can only be surveyed up until the end of September. So imagine that in the last week of September, your bat survey is scheduled, and once that's approved, you can move to the next stage of the planning process.
Adam Sroka [00:15:58]: Well, my wife, or your ecological consultant, shows up and phones you at night, because it's late and they have to do this at night: oh, by the way, it's raining, we can't do the survey, so it's been cancelled. And actually it's the end of the bat season, so we can't do the next survey till March. That phone call, unbeknownst to the ecologist, has just created a six-month delay to revenue on a 600 million pound project. And what if that phone call, or that email, or however it comes through, lands in the inbox of someone who's on holiday, or someone who's not available, or someone who doesn't understand how important it is? That is a kind of category-one red alert problem. You need to find another route, someone who can do the survey outside the normal window, as soon as possible, because this is on the critical path for the whole project.
Adam Sroka [00:16:49]: So that email comes in, effectively, into an end-user application like Outlook. Well, we have a monitoring service, and you'll see it's called Tesseract here; we kind of switched from Jellyfish to Tesseract at one point, and we still aren't settled on a name. The monitoring service detects that some data has come into the system and it affects an element on the graph. The orchestrator agent is then responsible for calling the library of agents that are relevant, and it goes through, finds the task-specific agents, and does the processing. It then might do things like check the contract with the ecology consultancy for slippage penalties. It might also update the critical path on a Gantt chart in a separate work stream.
Adam Sroka [00:17:30]: It might also write an executive report to say this has happened, so that when the CEO comes in in the morning with his hair on fire, he has a good understanding of what's there. Then it sends the prompt, with all the context and everything it needs, on to the foundation services like Claude or Gemini, and that sends the response back to the agent, which does the job of processing it into the downstream actions and so on. Finally, the end user is alerted within the workflow to approve or refine what came out of this setup. And once you've built all that, there are some really clever, fun things you can do with it, like chat with the project. A lot of these wind farms, for example, have a really significant impact on local government, so they have to answer lots of questions from local government. Well, we've proven that you can load all the context of the latest status of the project into a portal and give local government access to that. You can do things like automatically update financial reports, speak to your bankers, and so on and so forth.
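The knock-on effect of the cancelled bat survey is essentially a critical-path recalculation: push one milestone out and everything downstream of it slips by the same amount. A minimal sketch, with made-up dates and milestone names:

```python
from datetime import date, timedelta

# Hypothetical critical path: survey -> planning submission -> financial close.
milestones = {
    "bat_survey": date(2025, 9, 28),
    "planning_submission": date(2025, 11, 1),
    "financial_close": date(2026, 9, 1),
}

def apply_slip(milestones, task, new_date):
    """Push `task` to `new_date` and shift every milestone scheduled on or
    after it by the same delay. Returns a new schedule; no slip, no change."""
    delay = new_date - milestones[task]
    if delay <= timedelta(0):
        return dict(milestones)
    return {name: d + delay if d >= milestones[task] else d
            for name, d in milestones.items()}

# Rain cancels the survey; the next bat window opens in March.
slipped = apply_slip(milestones, "bat_survey", date(2026, 3, 28))
print(slipped["financial_close"])  # 2027-03-01: roughly six months of lost revenue
```

This is the calculation the agent would run before drafting the executive report: the six-month survey delay is what turns one rainy night into a six-month delay to revenue.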
Adam Sroka [00:18:36]: It's proving to be a really useful, really valuable tool to a lot of the pilot customers, and we're really excited by it. So that's me for now. Thank you. I've got time for questions, I think. I just want to say, if anyone wants to hear a bit more about what we do, or is keen on looking at how you build these kinds of business cases, we have a guide that we wrote. There's a QR code here, but I can share that. You'll find us at [email protected]. I'd love to nerd out about this stuff or what you're up to. And thank you very much for your time today.
Skylar Payne [00:19:10]: Awesome. Thank you so much. That was great. You know, this whole time I was watching this, I was like, when are you gonna release this to manage my life? I was just thinking about all the times where, oh, this thing changed and now I gotta go update all these other things, and usually I just don't do it.
Adam Sroka [00:19:31]: Yeah. Well, the thing is, go back to the big $3.3 trillion number. When your project is a $600 million wind farm, avoiding that little delay is actually really valuable, so it's worth investing the time. And the beauty of this space is that wind farm developers, for example, aren't competitive with each other, none of them. I don't care if you build your wind farm faster than me; I just want to build mine fast. So they're quite happy to share data and insights, and therefore as a product it will learn and get better. The big challenge is mapping a kind of standard for all of the processes that go into it. That's the real tricky bit.
Adam Sroka [00:20:08]: But the actual tech and the, the agents, they're pretty simple.
Skylar Payne [00:20:12]: Yeah, totally cool. We have one question in the chat, and I think the best way to ask it is: do you see any irony in the fact that you're using agents and LLMs in the energy industry, given how much energy these AI systems tend to use?
Adam Sroka [00:20:37]: Yeah, good question. Ultimately I think they use a lot less energy than a project manager driving to an office in a diesel vehicle to do all this, right? These things are wasteful, but first of all they'll only get more efficient, and secondly, if it's to drive down the cost of building green energy infrastructure, then that's how I sleep at night; that's how I justify it to myself. But yeah, I understand the debate around the cost of large language models. Look at the Stargate project, or whatever it's called: $500 billion. When that came out, I said 30% plus of that budget will have to be energy infrastructure, because basically the main input to a data center is how much power it's going to use and where it's going to get power from.
Adam Sroka [00:21:32]: So it'll be interesting to see how much new energy infrastructure pops up across China and the US simply to support these new data centers for AI.
Skylar Payne [00:21:43]: Totally, totally. So, given we have a few minutes left, what do you think is next for Hypercube's journey into applying AI in this industry?
Adam Sroka [00:21:56]: Yeah, we're doing some other stuff. The other really fun project we're building is a compliance automation tool for big enterprises. A couple of enterprise customers bumped into the problem that they literally can't approve any large language models, because it's too scary from a governance perspective; no one has trodden the path yet. How do you get legal and infosec and IT and cyber compliance and data protection to just go: yeah, all right, fine, let GPT loose on all our customers' data? So we're building a tool to do testing and automation of compliance processes as well. That's been really fun, because we're doing live monitoring of the latest prompt injection attacks, bias detection, all this stuff. Is my energy company's public-facing chatbot teaching people how to make bombs and poison each other? It's been a really good fun project, and we're learning more and more as we go, but it moves at pace.
Skylar Payne [00:22:57]: Totally, totally awesome. Well, thank you for your time. We'll have to catch up again soon; we'd love to hear about what else you've been up to. But with that being said, take care, Adam.
