What is an edge server?
‘Edge server’ refers to a server (compute resource) that runs processing at an edge location, which can be anywhere along the edge spectrum – usually from the on-premises edge to the regional edge. ‘Edge node’ is another frequently used term, which can refer either to a broader set of compute resources (i.e. including end-devices) or to a cluster of edge servers.
The nature of an edge server differs across different types of edges, depending on the use case and where the edge compute resource is deployed.
CDN edge server or regional edge
As seen in our edge computing investments webinar, most capital being deployed today is going into data centre facilities at the regional edge. This is being driven in part by CDN providers looking to distribute their locations further, and by increased demand for data centres at a more local level. For example, in the U.S., latency remains significant even in Tier 2 cities as large as Austin, Texas. These facilities can be almost as large as a hyperscale data centre, so the edge nodes are likely to be standard data centre servers.
Network edge server
At the network edge, most edge nodes will reside in data centre-like environments, particularly in the next few years, given that telcos will largely leverage existing data centres in their networks. However, as edge computing expands into deeper parts of the network, such as deployments at base stations, the environment will differ from a traditional data centre. For example, in a smaller-scale deployment, cooling will be delivered differently than in a hyperscale data centre with thousands of servers. In some cases, edge compute could be deployed alongside small cell infrastructure, which means the edge server would likely be standalone and need to be ruggedised, since it will not have an enclosure to ‘live in.’
On-premises edge server
At the enterprise or on-premises edge – i.e. edge computing at a factory, shopping centre, office space, etc. – edge servers can take different shapes and forms. Some edge deployments will sit in on-premises data centres and take the form of a standard data centre server. However, in industrial edge deployments in particular, a single edge device may run the workloads, for example at an oil rig. Given the harsh environment, this device would need to be ruggedised. In retail, the requirements are totally different: retailers have limited space to install an enclosure for the edge node and need equipment that can be hidden from view as much as possible. Lastly, telcos and OEMs are exploring adapting existing customer premises equipment used for networking to host non-networking applications. These could be enterprise CPE boxes, Wi-Fi gateways or programmable logic controllers in industrial settings.
Device edge node
STL Partners defines the device edge as edge compute residing either on the end-device itself (e.g. a smart camera) or on a separate small device attached to the end-device. One example of this is asset monitoring: manufacturers are attaching small edge nodes to their customers’ assets to monitor their condition and use the analytics to provide new services. These edge nodes are less likely to be ‘servers’ per se; they may instead take the form of a small computer or simply additional processing hardware installed on the end-device.
Key trends in edge hardware
The nature of edge servers is evolving. Some of the trends we are seeing, particularly in data centre servers, may extend into edge servers, whereas others are still open questions.
1. Hardware-as-a-Service
One of the key factors behind cloud’s take-off is its attractive “as-a-Service” business model, which allows customers to spread costs over time in an OPEX-based model rather than pay CAPEX up-front to build IT infrastructure. To replicate the advantages of the cloud commercial model at the edge, we are seeing the growth of Hardware-as-a-Service models, in which the customer pays for the server through a recurring fee. This can be a subscription model, a consumption-based model or a managed IT services fee. HPE, with its GreenLake portfolio, is one of the earliest proponents, but others in the industry, such as Lenovo, Dell and AWS, have jumped on the trend too.
2. COTS vs. specialised hardware
One of the challenges for edge computing solution providers and infrastructure developers is determining the processing capabilities within the edge server. Some use cases that involve heavy visual data processing or image rendering will need GPUs (Graphics Processing Units). Others that need high-performance computing or low-latency, high-throughput processing may require specialised hardware accelerators, such as FPGAs (Field Programmable Gate Arrays) or ASICs (Application-Specific Integrated Circuits). However, for an edge data centre operator, or anyone designing a blueprint for edge infrastructure, it is harder to achieve economies of scale with specialised hardware than with COTS (commercial off-the-shelf) CPUs.
3. Storage and compute convergence
In some use cases, edge servers need to be as small as possible, for example when attached to an existing asset to monitor its performance (condition-based monitoring). Converged infrastructure allows the same hardware to be used for both storage and compute, simplifying a deployment by avoiding the need for separate hardware for each process. We are starting to see an extension of this in computational storage, which moves compute even closer to storage to reduce the amount of data that needs to travel between the two – particularly beneficial for ultra-low latency use cases.
4. White box CPE
In telecoms, some operators see their edge computing opportunity in changing the nature of customer premises equipment (CPE). The industry has been opening up the previously vendor-locked CPE by disaggregating software (network services) from the underlying infrastructure. For example, Verizon worked with ADVA to create its universal CPE platform, allowing customers to use COTS hardware and select network services from multiple vendors. We have covered this at length in a previous report, SDN / NFV: Early Telco Leaders in the Enterprise Market. The next step is for these same boxes to run non-networking functions and become edge compute platforms in their own right by adding an IaaS layer. For example, a bank’s retail branch can use the CPE to run its branch networking services, but also to process workloads for enterprise applications, such as video security analytics, IT security and access management.
About Dalia Adib
Edge computing practice lead
Dalia is the Edge Computing Practice Lead at STL Partners and has led major consulting projects with Tier-1 operators in Europe and Asia Pacific on edge computing strategies, use cases and commercial models. She co-authored the research report “Edge Computing: Five Viable Business Models” and has been an active speaker at events including Edge Europe and Data Cloud Congress. Outside of edge computing, she supports clients in areas such as 5G, blockchain, digital transformation and IoT.