Smart Mobile Labs Q&A: Edge computing applications to limit the impact of COVID
The progression of software and IT services to the cloud has accelerated strongly over this period, driven by increased remote working and the need for maximum scalability in times of uncertain demand and supply of goods. That said, there has been redoubled interest in hybrid cloud architectures to ensure resilience – something which edge computing also has a role in supporting.
We interviewed Rüdiger Hnyk, Chief Product Officer at Smart Mobile Labs, to find out his views on COVID-19 and edge computing, as an edge application developer and innovator. Smart Mobile Labs provide video optimisation and delivery solutions for live streaming, leveraging mobile networks and edge computing. We previously discussed the edge computing use cases Smart Mobile Labs is exploring and commercialising in our report What edge developers want from telcos now.
More on STL Partners’ views on the impact of COVID-19 can be accessed via our report COVID-19: Now, next and after and webinar of the same title, which features panellists from Elisa in Finland (VP of Service Assurance and Cyber Security) and Dialog Axiata in Sri Lanka (COO).
You’ve been exploring the use of your technology in first responder situations, clearly something highly relevant to the COVID-19 situation. Could you explain how this application works?
When you are talking about an emergency environment, you usually have a number of ambulance vehicles in use; there may be 20-30 serving the area near the hospital. Imagine they are all equipped with a mobile phone which has our app. This app allows them to continuously live stream what they are seeing to one another, to the ambulances, to police cars and to a central control room (potentially in the hospital), all via our EVO server [on the network edge]. It shows you multiple views at the same time [see diagram below] – 12 different camera views from the field and from the hospital. Now, they can all communicate seamlessly.
So, when there is an incident [during the emergency situation], the controller can select one of the sources, zoom in, and automatically forward it to all mobile phones in the field. There is also integrated push-to-talk functionality; there’s an SOS channel that can be subscribed to, and the controller can, say, communicate with the whole group.
What’s great about this is that you can use normal mobile phones; there is no need to buy expensive equipment or anything like that.
Do you have other applications for the healthcare industry?
We can extend the first application to include a real-time overlay of the patient’s vital signs in the ambulance. This is just an HTML overlay, so all you need to do is access the hospital’s IT system and arrange it in a simple webpage using a template.
Where do edge computing and network technologies play a part?
The EVO server needs to be close to the mobile base station in an edge computing environment, so to speak, so that all the streams can be distributed locally, very quickly. Mobile phone carriers usually have one controller node [base station] which aggregates 5-10 distributed base stations in the area and serves 100,000 people or so in a city. Typically, our servers are in the base station shelter of the controller node. We also do campus/private deployments where the server may be in a truck or in a data room at the stadium, for example.
You can do this on a public network, or in a private [LTE/5G] network, or we can equip ambulance vehicles with a local base station. For military customers, they will set up a private 4G/5G network to ensure everyone can communicate securely. In Europe, namely Austria, Germany and Switzerland, regulation is supporting 5G campus networks and allowing enterprises to own a 5G licence (100 megahertz) without the need for a huge budget.
However, private networks are not necessary; in countries where there are advanced mobile networks, then a public network can be used. The EVO server secures the content; only group members can see the streams and, in order to get the app or launch the app, you need to log into the server and set a password.
Outside of private networks, is there other network functionality that you are trying to leverage as an app developer?
With the mobile phone carriers, we are discussing distributing via LTE broadcast mode. The idea is that when you have such an incident – and it’s similar in sporting events – you have a problem with scalability in the downlink. So, if the app is being used by 100,000 people at Wembley Stadium, for example, then it would make sense to use broadcast mode. This is a certain mode available in 4G (and 5G) which is not heavily used today and was originally designed for TV distribution. It could be used for ad-hoc live streaming to an unlimited number of people in the cell. This ensures there is always enough capacity and avoids overloading the cell, by using only a small part of the spectrum in broadcast mode. You would still use unicast in the uplink [because it is unique data, so cannot be broadcast]. We are also involved in a European project called LIPS, which aims to converge production and distribution networks.
How critical is it to be at the base station, rather than a central office location, for latency?
It’s not a big difference. It’s just further optimised, of course. Our application achieves 150 milliseconds, end-to-end. There’s actually a slide on this here. [In terms of latency distribution] 50 milliseconds are in the display on the receiving side, probably 70 on the sender side for the encoding in the phone, and the remaining 30 milliseconds are in the network. So, if you put the server not at the base station, but in the closest Google Cloud or Microsoft Azure PoP, then you add 10-20 milliseconds. [In that case] the resulting 160-170 milliseconds of latency is hardly noticeable.
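As a quick sanity check, the latency budget quoted above adds up as follows. This is only an illustrative sketch using the figures from the interview; the variable names and the 15 ms midpoint for the cloud-PoP penalty are our own assumptions.

```python
# Back-of-the-envelope latency budget for the phone-to-phone streaming path,
# using the figures quoted in the interview (all values in milliseconds).
ENCODE_MS = 70    # encoding on the sender's phone
NETWORK_MS = 30   # network transit when the server sits at the base station
DISPLAY_MS = 50   # decode and display on the receiving phone

edge_total = ENCODE_MS + NETWORK_MS + DISPLAY_MS
print(f"Edge deployment: {edge_total} ms end-to-end")  # 150 ms

# Moving the server to a nearby hyperscaler city PoP adds a quoted 10-20 ms;
# 15 ms is our assumed midpoint for illustration.
cloud_pop_total = edge_total + 15
print(f"Cloud-PoP deployment: ~{cloud_pop_total} ms end-to-end")  # ~165 ms
```

The point of the breakdown is that the network leg is already the smallest component, which is why the extra cloud-PoP hop is barely perceptible.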
Why not use AWS/Azure then?
First, this will differ country-to-country. In the US, UK and Europe, the network is good, plus AWS is now using city PoPs [but this is not the case everywhere]. In these countries, they may have 20-30 smaller PoPs, but we [developers] cannot select them for a server deployment. You would only get this performance if you stream to the [hyperscalers’] caching servers in the city PoPs. But they reserve these features for the very big customers only. You need to be YouTube or Netflix in order to negotiate the use of these city PoPs. A hospital [for example] can go on the AWS portal and rent a server, but will only be able to select from the available regions, such as “Europe – West” or “Europe – South”.
What server requirements do you need?
You can use a very simple server, actually; you only need a quad-core machine with 8GB of RAM, because it’s just doing stream forwarding. You can run this up to the maximum of the 10G ports you typically have there. We do limit the number of people participating in the application to 1,200; if you have 5Mb/s HD streaming, this is equivalent to about 6Gb/s in total, so it would use much of the 10G port.
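The sizing arithmetic behind those limits can be checked in a few lines. This is a sketch using only the numbers given in the interview; the variable names are ours.

```python
# Aggregate downlink throughput for the quoted server limits:
# up to 1,200 participants, each receiving a 5 Mb/s HD stream.
PARTICIPANTS = 1200
STREAM_MBPS = 5          # megabits per second per HD stream
PORT_GBPS = 10           # a typical 10G network port

total_gbps = PARTICIPANTS * STREAM_MBPS / 1000  # convert Mb/s to Gb/s
utilisation = total_gbps / PORT_GBPS

print(f"Total throughput: {total_gbps} Gb/s")        # 6.0 Gb/s
print(f"10G port utilisation: {utilisation:.0%}")    # 60%
```

So the 1,200-participant cap keeps peak traffic at roughly 6 Gb/s, comfortably inside a single 10G port with headroom to spare.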
How replicable is this technology to other use cases / scenarios?
The Coronavirus environment, and the heightened demand on emergency and healthcare services, mean that this can be used for other medical events as well. But it can also be applied to other use cases. What we are applying is our standard app, which is what you see here. Some of the key functionalities that can be leveraged in other situations are the ability to receive channels, send channels from your own camera and view colleagues on a map, plus the embedded push-to-talk.
The interesting aspect is that Microsoft Teams is using the same technologies we are using. But we are optimised for speed. This means that the app is programmed directly against the hardware codecs in the phone. So we are achieving 150-millisecond phone-to-phone round-trip latency, which is hardly perceptible to human beings. It’s the fastest technical setup you can achieve today on a mobile app without expensive (TV) cameras. Modern phones have such good hardware support that the battery lasts through 8 hours of streaming in HD quality. For example, we have tested the Samsung S series, the S10 and S20.
In today’s [COVID] environment, this technology is highly relevant for allowing people to continue business as usual and ensuring remote working can operate smoothly, by replicating as much of the physical experience as possible.
All in all, we are working towards an infrastructure vision which looks a little like this [see above diagram]. In your country, you have your base stations and you have your edge compute services. For example, you have our application running on them, and then you can get all these real-time streams with the quickest path to the action. A clear beneficiary will be autonomous driving, because this also needs local decision making, using data from cameras, from other cars or from cameras at crossings, and then displaying the outcomes as an overlay on the windscreen, like a third mirror in your car.
How mature is your application? Who are your main customers today?
We are shipping this to customers – it is a real product. You can download and test it today via our SML portal – this is what we call the director app, because you act like a movie director! You can go ahead today and test it out yourself using one of our public servers.
Today, we have many customers in the sports domain, using EVO server to enable new features in sports stadiums for the spectators, for example Formula 1 or a football match.
It’s also applicable to the security world. We are shipping it today to police and security groups, for example when security personnel need to secure a certain area. They sometimes also use drones to monitor a fence, for example. In these cases, you need real-time imagery and edge computing because the security personnel are part of the action, viewing a stream of the scene in front of them in real time – similar to an augmented reality scenario. In essence, you are part of the action. And of course you want it to be low latency, as it’s critical; you need to get the information as fast as possible.
We have some deployments in Japan and China, but Western countries are the main markets for us. This is generally in line with where the 5G hype currently is. End-consumers and the mobile carriers are looking for new applications to prove 5G technology, and that’s where we feel our applications are key. It’s autonomous driving, it’s gaming – mobile gaming – it’s multiscreen entertainment, IT security, etc. It’s also e-health, like we discussed in the Coronavirus use cases. It’s all about grouping and distributing streams of high interest locally to mobile phones so that people are better informed and closer to the action.
Special thanks to Rüdiger for this interview:
Chief Product Officer @ Smart Mobile Labs