Making the Case for Edge Data Centers (EDC)

Many organizations' computing strategies are at a crossroads. The transition from on-premise to off-premise computing in the cloud is well established, but another trend is gaining momentum: the explosive growth of IP-based devices within enterprise environments, driven by the Internet of Things (IoT). Cloud computing provides scale and efficiency for traditional IT workloads, but IoT requires near-real-time exchange of information between endpoint devices and the platforms that supply data-rich analytics and operational control. These seemingly divergent concepts call for new design approaches, because cloud-based computing and IoT applications will coexist as organizations seek to leverage the benefits of each within their IT operations.

 

Edge computing extends cloud networking, computing, and storage services out to edge devices.

Fog computing is a derivative of edge computing in which processing, storage, and communications are handled locally by an IoT gateway (or fog node) before traffic is routed over the internet backbone. Because data is processed close to the devices that generate it, this architecture greatly reduces transmission latency compared with sending everything to a centralized cloud.

[Graphic: Edge devices]

For example, in a hospital system, a network of edge or fog nodes might be deployed to support remote patient monitoring devices, handling the data more efficiently and securely than sending it all to a centralized data center. Sensor data would be processed faster, allowing clinicians to respond more rapidly to critically ill patients. The data would also be more secure because it would be decentralized and stored locally, enabling clinicians to access it through the edge network rather than depending on backhaul connections to the network core.
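To make the pattern concrete, here is a minimal sketch of a fog node evaluating patient vitals locally and forwarding only alerts and a compact summary upstream. The device IDs, thresholds, and payload format are illustrative assumptions, not part of any specific monitoring product.

```python
# Sketch of local processing at a fog node (IoT gateway), assuming bedside
# monitors push vitals readings to it. Only alerts and small summaries would
# travel over the backhaul; names and thresholds below are hypothetical.

from dataclasses import dataclass
from statistics import mean
from typing import List


@dataclass
class VitalsReading:
    device_id: str
    heart_rate: int   # beats per minute
    spo2: float       # blood-oxygen saturation, percent


# Hypothetical alert thresholds; real clinical rules would be far richer.
HEART_RATE_MAX = 130
SPO2_MIN = 90.0


def process_locally(readings: List[VitalsReading]) -> dict:
    """Evaluate readings at the fog node and build a compact upstream payload."""
    alerts = [
        r for r in readings
        if r.heart_rate > HEART_RATE_MAX or r.spo2 < SPO2_MIN
    ]
    return {
        "devices": len({r.device_id for r in readings}),
        "avg_heart_rate": round(mean(r.heart_rate for r in readings), 1),
        "avg_spo2": round(mean(r.spo2 for r in readings), 1),
        "alerts": [r.device_id for r in alerts],
    }


if __name__ == "__main__":
    batch = [
        VitalsReading("bed-101", 82, 97.5),
        VitalsReading("bed-102", 141, 88.2),  # would trigger a local alert
    ]
    # In practice the fog node would raise alerts on the local edge network
    # immediately and send only this small summary toward the core.
    print(process_locally(batch))
```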

Given that the number of connected/IoT devices is rapidly increasing, and these applications often require lower latency to function optimally, edge computing must be considered as part of an overall data center strategy.

Depending on your organization’s needs, you may require a mix of edge and cloud computing and possibly an on- or off-premise data center.

Edge Data Center Considerations

Edge Applications: Three broad categories of applications centered on IoT, content delivery, and disruptive technologies (e.g., blockchain, AI, and autonomous vehicles).

Physical Location: Geology and topology, gravity and interconnection, and the last mile.

Availability and Latency: Process data and services as close to the end user as possible, with an architecture that allows compute and content delivery to happen within 10 milliseconds (a simple latency check sketch follows this list).

Implementation Strategy: A built environment that can take several physical forms, such as traditional data center designs, micro data centers, light poles, and pedestal-mount enclosures.

Standardization vs. Customization: Favor ease of deployment and standardized, modular designs to meet speed and scale requirements.

Physical Infrastructure: Power distribution, cooling method, connectivity, and physical security.

Efficiency of Deployment: Material site readiness, specialized installation services, and total project management.
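As a rough way to test the 10-millisecond target above, the sketch below times a TCP connection to a candidate edge site and to a regional cloud endpoint and compares each against the budget. The hostnames are placeholders, and TCP connect time is only a coarse proxy for application-level latency.

```python
# Rough latency-budget check, assuming placeholder endpoints for a candidate
# edge site and a regional cloud region. Replace the hostnames with real ones.

import socket
import time

LATENCY_BUDGET_MS = 10.0


def connect_time_ms(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Return the time in milliseconds to open a TCP connection to host:port."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0


if __name__ == "__main__":
    candidates = [
        "edge-site.example.net",      # placeholder edge data center
        "cloud-region.example.com",   # placeholder regional cloud region
    ]
    for host in candidates:
        try:
            ms = connect_time_ms(host)
            verdict = "within" if ms <= LATENCY_BUDGET_MS else "over"
            print(f"{host}: {ms:.1f} ms ({verdict} the {LATENCY_BUDGET_MS} ms budget)")
        except OSError as exc:
            print(f"{host}: unreachable ({exc})")
```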

To help you learn more about developing a data center strategy, Anixter's technical experts have created a variety of free resources on data center challenges, trends, best practices, and technology solutions.


Ask an expert

To learn more about the latest data center technology and standards, contact us to speak to an Anixter technology expert.