Although much of the excitement around edge computing is in the applications and platforms, hardware is still a critical component. The key questions we answer in this article are: what is an edge server, how does it differ across use cases and types of edge, and what are the key trends?
What is an edge server?
Edge servers are servers (compute resources) that run processing at an edge location, which can sit anywhere along the edge spectrum – usually from the on-premises edge to the regional edge.
An edge node is a broader term: it can refer to a wider set of compute resources, including end-devices, or to a cluster of edge servers.
The nature of an edge server differs across the types of edge, depending on the use case and where the edge compute resource is deployed.
CDN edge server
A CDN edge server is a server that exists at the regional edge and supports CDN workloads in a more distributed manner than is done today. Our edge computing investments webinar indicates that most capital being deployed today is going into data centre facilities at the regional edge. For example, in the U.S., latency remains significant even in Tier 2 cities as large as Austin, Texas.
Network edge server
At the network edge, most edge nodes will reside in data centre-like environments, particularly in the next few years, given that telcos will largely leverage existing data centres in their networks. However, as edge computing expands into deeper parts of the network, such as a deployment at a base station, the environment will be different to a traditional data centre. For example, in a smaller-scale deployment, cooling will be delivered differently compared to a hyperscale data centre with thousands of servers. In some cases, edge could be deployed alongside small cell infrastructure, which would mean the edge server would likely be standalone and need to be ruggedised, since it will not have an enclosure to ‘live in’.
On-premises edge server
At the enterprise or on-premises edge, i.e. edge computing at a factory, shopping centre, office space, etc., edge servers can take different shapes and forms. Some edge deployments will be in on-premises data centres and take the form of a standard data centre server. However, in industrial edge deployments in particular, there may be a single edge device running workloads, for example at an oil rig. Given the harsh environment, this would need to be ruggedised. In retail, the requirements are totally different: retailers have limited space to install an enclosure for the edge node and need equipment that can be hidden away from view as much as possible. Lastly, telcos and OEMs are exploring changing existing customer premises equipment used for networking to host non-networking applications. This equipment could be enterprise CPE boxes, Wi-Fi gateways or programmable logic controllers in industrial settings.
Device edge node
STL Partners defines the device edge as either edge compute residing on the end-device (e.g. a smart camera) or a separate small device attached to the end-device. One example of this is asset monitoring: manufacturers are attaching small edge nodes to their customers’ assets so they can monitor the assets’ condition and use the analytics to provide new services. These edge nodes are less likely to be ‘servers’ per se; rather, they take the form of a small computer or simply additional processing hardware installed on the end-device.
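To illustrate the kind of on-asset processing involved, the sketch below shows a minimal condition-monitoring loop that could run on a small edge node attached to a machine: it samples a vibration sensor locally and only sends a summary alert upstream when readings drift above a threshold. The sensor-reading and alert-publishing functions, and the threshold itself, are hypothetical placeholders rather than any vendor's API.

```python
import statistics
import time

VIBRATION_THRESHOLD_MM_S = 7.0   # illustrative limit only
WINDOW_SIZE = 60                 # number of recent samples kept locally


def read_vibration_sensor() -> float:
    """Placeholder for reading a vibration value (mm/s) from the attached asset."""
    raise NotImplementedError("replace with the real sensor driver")


def publish_alert(message: str) -> None:
    """Placeholder for sending a small alert message upstream (e.g. over MQTT)."""
    raise NotImplementedError("replace with the real uplink")


def monitor_asset() -> None:
    window: list[float] = []
    while True:
        window.append(read_vibration_sensor())
        window = window[-WINDOW_SIZE:]  # keep only the most recent samples
        if len(window) == WINDOW_SIZE:
            mean_vibration = statistics.mean(window)
            if mean_vibration > VIBRATION_THRESHOLD_MM_S:
                # Only a short summary leaves the device; raw samples stay local.
                publish_alert(f"High vibration: mean={mean_vibration:.1f} mm/s")
        time.sleep(1.0)
```

The point of the pattern is that the analytics run next to the asset, so only exceptions travel over the network rather than a continuous raw data stream.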
Key trends related to edge computing servers
The nature of edge servers is evolving. Some of the trends we are seeing, particularly in data centre servers, may extend into edge servers, whereas others are still open questions.
1. Hardware-as-a-service
One of the key factors behind the cloud's success is its attractive “as-a-Service” business model, which allowed customers to spread costs over time in an OPEX-based model, rather than pay CAPEX up-front to build IT infrastructure. In order to replicate the advantages of the cloud commercial model at the edge, we are seeing the growth of Hardware-as-a-Service models, in which the customer pays for the server through a recurring fee. This can be a subscription model, a consumption-based model or a managed IT services fee. HPE, with its GreenLake portfolio, is one of the earliest proponents, but others in the industry, such as Lenovo, Dell and AWS, have jumped on the trend too.
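A back-of-the-envelope comparison makes the commercial logic concrete. All figures below are illustrative assumptions, not vendor pricing: a hypothetical edge server bought outright versus the same capacity consumed as a monthly service over three years.

```python
# Illustrative only: every price here is an assumption, not a quote from any vendor.
upfront_capex = 12_000        # buy the edge server outright (USD)
annual_support = 1_200        # maintenance contract per year (USD)
monthly_service_fee = 450     # hardware-as-a-service subscription (USD/month)
term_months = 36

capex_total = upfront_capex + annual_support * (term_months // 12)
haas_total = monthly_service_fee * term_months

print(f"CAPEX model over {term_months} months: ${capex_total:,}")   # $15,600
print(f"HaaS model over {term_months} months:  ${haas_total:,}")    # $16,200
# The as-a-service total can come out higher, but the cost is spread over time,
# sits in OPEX budgets and typically bundles management and hardware refresh.
```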
2. COTS vs. specialised hardware
One of the challenges for edge computing solution providers and infrastructure developers is determining the processing capabilities within the edge server. Some use cases that require heavy visual data processing or image rendering will need GPUs (graphics processing units). Others that need high-performance computing or low latency, high throughput processing may require specialised hardware accelerators, such as FPGAs (field programmable gate arrays) or ASICs (application-specific integrated circuits). However, for an edge data centre operator, or anyone designing a blueprint for edge infrastructure, it is harder to achieve economies of scale with specialised hardware than with COTS (commercial off-the-shelf) CPUs.
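One practical consequence is that software deployed across a heterogeneous edge server fleet often has to discover what silicon it is running on and pick a backend accordingly. The sketch below is a simplified illustration of that pattern for a hypothetical application with separate GPU and CPU processing paths; checking for NVIDIA driver tooling via nvidia-smi is just one possible probe, not a complete detection strategy.

```python
import shutil


def pick_backend() -> str:
    """Choose a processing backend based on what the edge server exposes.

    Simplified illustration: a real deployment would also probe for FPGAs or
    ASICs via their vendor drivers and validate performance before committing.
    """
    if shutil.which("nvidia-smi"):  # NVIDIA driver tooling present -> assume a GPU
        return "gpu"
    return "cpu"                    # COTS fallback that works on any server


backend = pick_backend()
print(f"Running the video-analytics pipeline on the {backend} backend")
```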
3. Storage and compute convergence
In some use cases, edge servers need to be as small as possible, for example if the server is an attachment to an existing asset and is used to monitor that asset’s performance (condition-based monitoring). Converged infrastructure allows the same hardware to be used for both storage and compute, simplifying a deployment by avoiding the need for separate hardware for each. We are starting to see an extension of this in computational storage, which moves compute even closer to storage to reduce the amount of data that needs to travel between the two – particularly beneficial for ultra-low latency use cases.
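The benefit is easiest to see as a data-movement comparison. The sketch below is purely illustrative, with hypothetical record counts and sizes, and contrasts shipping a full dataset to a separate compute tier against pushing a simple filter down to where the data sits – the idea behind computational storage.

```python
# Illustrative assumptions: 10 million records of 1 KB each, 0.1% of them relevant.
records = 10_000_000
record_size_kb = 1
match_rate = 0.001

moved_without_pushdown_gb = records * record_size_kb / 1_000_000
moved_with_pushdown_gb = records * match_rate * record_size_kb / 1_000_000

print(f"Data moved when filtering after transfer:     {moved_without_pushdown_gb:.1f} GB")
print(f"Data moved when filtering at the storage layer: {moved_with_pushdown_gb:.3f} GB")
# Moving ~10 GB versus ~0.01 GB over the same link is the gap that computational
# storage aims to exploit for latency-sensitive edge workloads.
```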
4. White box CPE
In telecoms, some operators see their edge computing opportunity in changing the nature of customer premises equipment (CPE). The industry has been opening up previously vendor-locked CPE by disaggregating the software (network services) from the underlying infrastructure. For example, Verizon worked with ADVA to create its universal CPE platform, allowing customers to use COTS hardware and select network services from multiple vendors. We have covered this at length in a previous report, SDN / NFV: Early Telco Leaders in the Enterprise Market. The next step is for these same boxes to run non-networking functions and become edge compute platforms in their own right by adding an IaaS layer. For example, a bank’s retail branch can use the CPE to run its branch networking services, but also to process workloads for enterprise applications, such as analytics for video security, IT security and access management.
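To make the idea concrete, the sketch below uses the Docker SDK for Python to start an enterprise workload on a uCPE box that is already hosting its network services, assuming the box exposes a standard container runtime to the enterprise. The image name, resource limits and CPU pinning are placeholders for illustration, not details of any specific vendor's platform.

```python
import docker

# Assumes the uCPE box runs a container engine alongside its network functions
# and that this script executes on (or can reach) that engine.
client = docker.from_env()

analytics = client.containers.run(
    "example.com/branch/video-analytics:latest",  # hypothetical enterprise workload image
    name="branch-video-analytics",
    detach=True,
    restart_policy={"Name": "always"},            # survive reboots of the CPE
    mem_limit="2g",                               # leave headroom for the network services
    cpuset_cpus="2,3",                            # keep off the cores serving networking
)

print(f"Started {analytics.name} ({analytics.short_id}) on the uCPE")
```

The design point is simply co-existence: the same white box keeps running its networking stack while spare capacity is carved out, with explicit resource limits, for enterprise edge workloads.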
Is a server a network edge device?
A server is not a network edge device. A server is the hardware on which edge software workloads run – this could be either on customer premises or at a site within a telecoms operator’s network. A device tends to refer to a piece of hardware or end-point typically owned by the end-customer; for example, devices could include IoT sensors or cameras.
In conclusion, edge computing servers are servers distributed beyond core data centres, closer to end users. They will run many different edge workloads, spanning from IoT to video analytics to augmented reality solutions.
Read more about edge computing
Edge computing market overview
This 33-page document provides a summary of our insights from our edge computing research and consulting work.
When will edge private cloud supplant colocation?
This article outlines the core differences between edge colocation and edge private cloud and explores the merits and drawbacks of each.
Edge computing types: 4 edge types defined
There are four main types of edge computing: network edge, regional edge, on-premise edge and on-device edge. The type of workload and its requirements will determine which type of edge is most applicable in what circumstances.
Edge computing in universal CPE (uCPE)
Enterprise networking has evolved in the last decade, moving away from proprietary appliances towards universal platforms that allow flexibility and choice in how enterprise network services and functions are managed. The question is: can these platforms be extended to provide edge computing for non-network services? This article evaluates the opportunity and provides examples of companies innovating in the edge uCPE space.