FogHorn Q+A: energy efficiency at the edge

FogHorn is a leader in developing edge-native AI solutions for commercial and industrial IoT. We spoke to Jason Plent, the company's Managing Director of EMEA, about its recent work in energy efficiency.

FogHorn is a rapidly growing company, and recently appointed Chris Penrose, who set up AT&T’s IoT division, as its new COO. FogHorn’s Edge AI platform has been deployed across a wide range of industries and for a number of different use cases. Smart buildings are one example, where the platform can be deployed for lower energy consumption, predictive maintenance, better security, or health and safety. Here, Jason Plent explains how the platform can improve the energy efficiency of smart buildings.

Can you define what edge computing means to you?

We’re edge native, so we focused on edge from absolutely the beginning. If you look at current on-premise solutions, they’re all vertical: you have your fire alarm system and your temperature system and your door control system. There is no built-in interoperability between those different verticals. The wonderful thing we can do is sit as the thin intelligent overlay, and we can bring the intelligence by combining all of those systems together, understanding those systems, and providing control back into them. So, in answer to this question, I would refer to my five pillars of edge: speed, latency, data quality, security, cost. If it’s all done at the edge you bring all of those systems together, and you avoid all those issues.

Why would you not just run your solution in the centralised cloud?

I refer you back to the previous question, which is the five pillars: speed, latency, data quality, security, cost. If you’re pushing up to the cloud, you’ve got the cost of the data transmission, you’ve got the downsampling and the low fidelity of the data, you’ve got speed issues, you’ve got cost issues.

What type of edge is being used (e.g. on-premise edge, device edge etc.)?

Edge gateways on premise are, right now, the most convenient way for customers to deploy. Looking forward, I can see our software being integrated more directly: we can go virtual, we can go on the network, we don’t have to go on a box. That flexibility in terms of platform deployment is really important. Physical location, in a way, is irrelevant; just put us near the source of the data.

FogHorn has been doing a lot of work looking at smart buildings and making them more energy efficient. Could you tell me about some of the use cases involved in this and the part that FogHorn’s Edge AI Platform plays?

The Florida school project was the first really big, at-scale energy project for us. The idea was very simple in the sense that we had a very complex technology capability that we had been able to apply to industrial manufacturing, then to oil and gas, and then to sports cars. So, we were working at super high speed with super high-fidelity data streams, processing them and pushing control signals back. We were then attempting to apply that in a sector unknown to us: could we go into something as simple as a school? Could we pick up data from the existing building management system (BMS)? And could we use the interoperability between the data streams to do something really clever with the control of that asset, if that asset was something like a high school?

The resounding answer was yes. We can take the existing BMS data flows and combine them with new sensor types. So, all of a sudden, the rather crude and rudimentary BMS now has this intelligent overlay, which is starting to forecast and link the data flows together in a way that’s not been seen before. What does that mean for the energy performance of the building? Well, it means things are, in quite simple terms, turned off automatically if they’re not being used. It turns out about 15% of Florida school rooms are empty at any one time, and yet they were still being heated or cooled. That’s a 15% energy saving on your bill right out of the gate, just from that basic feature of the system.
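The "turn it off if nobody is there" rule described here can be sketched in a few lines. This is a hypothetical illustration, not FogHorn's actual platform API: the room names, the occupancy feed, and the "maintain"/"setback" commands are all assumptions.

```python
def hvac_commands(occupancy):
    """Map live room occupancy to an HVAC command per room.

    Occupied rooms keep their scheduled set point; empty rooms are
    set back so they are no longer actively heated or cooled.
    """
    return {room: ("maintain" if occupied else "setback")
            for room, occupied in occupancy.items()}

# Illustrative snapshot: roughly one room in several is empty at any one time.
rooms = {"room-101": True, "room-102": False, "room-103": True}
print(hvac_commands(rooms))
# {'room-101': 'maintain', 'room-102': 'setback', 'room-103': 'maintain'}
```

The real value of the overlay described in the interview comes from combining such occupancy signals with the existing BMS data, rather than replacing the BMS outright.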

How much do these deployments cost? Is it realistic to expect that in the future this kind of technology will be integrated in all new buildings?

Absolutely. For example, we have such a wealth of old building stock that needs vast improvement. Government legislation means you can’t rent or lease certain stock in the UK because it doesn’t meet the efficiency criteria. Retrofitting an intelligent overlay to BMS systems in existing building stock is going to be an enormous business. A building built in 1990 has a security and fire alarm data stream and an electrical data stream, and that’s it. The buildings being put up now, I would fully expect to have lots of high-fidelity data streams built in. They would need an edge server to process those and then push control signals back out to make the building more efficient. You could do this for hundreds, if not thousands, of data streams. We already do it for electric cars, so why not buildings as well?

Cost-wise? Well, in most cases it’s incredibly cost effective. If you want a new BMS you would expect to spend 500 grand on something that gets you halfway there; our end-to-end solution is far less than this. From us you can have weather and occupancy, humidity and temperature, the fire alarm system, person identification, health and safety detection, and so on, all built into the mix. We can only do this because of the new sensor capability and the new processing capability.

What kind of companies do you see as important potential partners in these deployments?

Well, look at the supply chain. So, sensor developers and chip manufacturers: if Intel come out with something that’s twice as fast as what they have right now, that’s a bonus for us. Hardware: Dell, Microsoft and HP – we’re working with them on projects. Systems integrators: we’re a small Silicon Valley software team, so for large-scale deployments we need the Microsofts of the world. We need these big, engineering-capable, software-familiar systems integrators to help us physically deliver.

How important (if at all) are telcos in enabling edge computing?

I think massively important; 5G gives us the data streams that we need. So, it’s a natural place for us to sit as a closed-loop learning facility at the bottom of that 5G tower. We provide the closed loop: data flows at high density and high speed get pushed through our processing, and we can do complex analytics in real time at the edge. And then we can push out a really great answer to the telco customer, on behalf of the telco. We’re not interested in being seen as a branded entity; we’re quite happy for our engine to be part of the telco’s package to their customer. We think the telcos and other integrators see us as a part of that package.

Are there other applications in the energy sector that you can see the Edge AI platform being deployed for?

Yes, absolutely. Energy is stuck with medieval data: half-hourly data, six months late. If we get to one-minute or one-second data, delivered in three-minute segments, you can really see what’s going on in different parts of a location. Then we’ve reached the Renaissance. But I’d really like to get to where we can have granular data at the sub-second level; then we can start to predict the performance of assets based on their changing electrical consumption. We can make them more efficient by controlling that consumption and by controlling the behaviour. You can apply this to EV solar charging, for example. We can use analytics to decide when to store solar energy in a battery, when to use solar energy to charge the car, and when to use the electric grid instead. The AI is capable of understanding that the grid will be expensive in five hours’ time and it doesn’t want to buy then; it will guide the storage and then call on the storage when it’s needed for the manufacturing process, or heating and cooling, or whatever the function is.
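The charging decision outlined above (store surplus solar, draw on the battery when the grid is expensive, or fall back to the grid when it is cheap) can be sketched as a simple rule. This is a hypothetical illustration: the function name, thresholds, and price-forecast format are assumptions, not FogHorn's actual analytics engine.

```python
def charge_source(solar_kw, demand_kw, battery_soc, grid_price, price_forecast):
    """Decide where the next increment of energy should come from or go to.

    battery_soc is state of charge in [0, 1]; grid_price is the current
    tariff; price_forecast is a list of upcoming tariff prices.
    """
    surplus = solar_kw - demand_kw
    if surplus > 0:
        # Spare solar: bank it if the battery has room, otherwise use it directly.
        return "store_in_battery" if battery_soc < 1.0 else "charge_from_solar"
    # Demand exceeds solar: draw on storage when no cheaper grid window is coming,
    # i.e. the current price is at least the cheapest forecast price.
    if battery_soc > 0.2 and grid_price >= min(price_forecast):
        return "draw_from_battery"
    return "charge_from_grid"

print(charge_source(solar_kw=2, demand_kw=8, battery_soc=0.6,
                    grid_price=0.30, price_forecast=[0.18, 0.15]))
# draw_from_battery
```

A real controller would optimise over the whole forecast horizon rather than a single comparison, but the closed loop is the same: forecast, decide, act, and learn from the outcome.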

Do you see the green energy sector as a major vertical for edge-enabled solutions?

Yes. There are internal factors driving the need for edge, such as the commoditisation of sensors and the hardware and software capability, but there are external factors too: government legislation, climate change, rising utility costs, commodity costs – all of that is a nightmare. There are positive factors as well, stemming from increased demand and pressure for green energy: net zero, ESG, the circular economy, sustainability, decarbonisation. For example, some high street banks now have loan policies such that they will not lend money to companies unless the companies can prove their sustainability credentials. No more greenwash, where you take terrible data from six months ago and try to massage it into a sustainability report. It’s not granular enough, it’s not detailed enough. And you certainly didn’t use your potential for automated control.

We need to be serious about net zero. Everyone is suddenly announcing they’re going to do it by 2030, 2040, 2050. Right? Well, the only way you can do it is with proper data. Imagine a scenario where the AI is controlling the most expensive parts of your energy consumption in a much more efficient way. Because it’s got an eye towards weather, it’s got an eye towards occupancy, it’s keeping your building more secure, it’s making everything more efficient, it’s learning. It’s repeating the benefit of what it discovers. That closed loop of learning is getting more and more efficient all the time; eventually, it will reach a point where it’s brilliant at controlling what you already have. But you’re then going to add in these green renewable sources, your wind and your wave and your solar, and then it will start integrating those elements into its finessed control. And you will get rewarded by your investors, by the government, and by your sustainability team, which at the moment is sitting around embarrassed because they don’t have any data. It will be the system that gives them not only the data, but the control that they need to drive efficient, comprehensive, holistic performance.

Author: Matt Bamforth is a consultant, specialising in edge computing, telco cloud and 5G

