Making the Case for Edge Data Centers (EDC)

Cloud computing provides scale and efficiency in managing traditional IT workloads, but the nature of IoT requires near real-time exchange of information between endpoint devices and the platforms that supply data-rich analytics and operational control. Because these seemingly divergent technologies will coexist as organizations seek to leverage the benefits of each within their IT operations, newer design approaches are required.

Learn more about this topic by reading our TECHbrief on Edge Data Centers.


Transcript: Making the Case for Edge Data Centers (EDC)

Hi, I’m Andy Jimenez.

The transition from on-premise to off-premise computing in the cloud is well established, but another trend gaining momentum is the explosive growth of IP-based devices within enterprise environments, driven by the Internet of Things (IoT). While the cloud provides scale and efficiency in managing traditional IT workloads, the explosion of data, devices and interactions that accompanies IoT requires near real-time exchange of information. Because of the centralized architecture of cloud computing, there will always be inherent network latency and performance issues, especially for devices and data that reside far from a centralized public or private cloud data center.

This is where edge computing comes into play. Rather than conducting processing, storage and communications off-premise in the cloud, edge computing brings processing close to the data source, so data doesn't need to be sent to the cloud or other centralized systems. By eliminating the distance and time it takes to exchange data between local and centralized sources, edge computing greatly improves speed and performance.

As organizations seek to leverage the benefits of both IoT applications and cloud computing, newer design approaches will be required to enable both technologies to coexist within a single environment. One such example is fog computing, a derivative of edge computing, in which processing, storage and communications are conducted locally at an IoT gateway, or fog node, before results are routed over the internet backbone. The fog node acts as a bridge between edge devices and centralized resources, greatly reducing transmission latency because data is handled nearby rather than at a distant data center.
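The transcript doesn't prescribe any particular implementation, but a minimal sketch may help make the fog node pattern concrete. The Python below is illustrative only: the cloud endpoint URL, sensor IDs, sampling interval and aggregation window are all assumptions. It shows a gateway polling local sensors, aggregating raw readings on the premises, and forwarding only a compact summary upstream.

```python
import json
import random
import statistics
import time

CLOUD_ENDPOINT = "https://example.com/ingest"  # hypothetical upstream URL
WINDOW_SECONDS = 10                            # local aggregation window

def read_sensor(sensor_id: str) -> float:
    """Stand-in for a local field-bus read; returns a simulated reading."""
    return 20.0 + random.random() * 5.0

def forward_to_cloud(summary: dict) -> None:
    """Stand-in for an upstream HTTP/MQTT publish over the backbone."""
    print(f"-> {CLOUD_ENDPOINT}: {json.dumps(summary)}")

def fog_node_loop(sensor_ids: list[str]) -> None:
    """Aggregate raw readings locally and forward only a compact summary,
    so the high-rate raw data never has to cross the WAN."""
    while True:
        window: list[float] = []
        deadline = time.monotonic() + WINDOW_SECONDS
        while time.monotonic() < deadline:
            for sid in sensor_ids:
                window.append(read_sensor(sid))
            time.sleep(1.0)  # local sampling interval
        forward_to_cloud({
            "samples": len(window),
            "mean": round(statistics.mean(window), 2),
            "max": round(max(window), 2),
        })

if __name__ == "__main__":
    fog_node_loop(["temp-01", "temp-02"])
```

The design point is that the high-rate raw data stays on the local network; only the summary crosses the backbone, which is where the latency and bandwidth savings come from.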

For example, as utilities rely more heavily on real-time data to run their systems efficiently, they may deploy a network of edge or fog nodes to support remote monitoring devices at substations, transmitting data more efficiently and securely than sending it all to a centralized data center. Sometimes this data originates in remote areas, so processing near the source is essential. Other times, data needs to be collected from several sensors. Fog computing architectures could be developed to solve both of these issues. With fog computing, sensor data would be processed faster, allowing the utility to respond more rapidly to an issue. The data would also be more secure because it would be decentralized and stored locally, enabling the utility to access it through the edge network rather than depending on backhaul connections to the network core.
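Continuing the utility example, here is a minimal sketch, assuming a simulated voltage sensor, an illustrative alarm threshold and a local SQLite store, of the two behaviors described above: reacting to an issue at the edge without a cloud round trip, and keeping readings in decentralized local storage that stays queryable over the edge network.

```python
import random
import sqlite3
import time

OVERVOLTAGE_LIMIT = 126.0  # illustrative alarm threshold (volts)

def read_voltage() -> float:
    """Stand-in for polling a substation monitoring device."""
    return 118.0 + random.random() * 10.0

def trip_local_alarm(value: float) -> None:
    """React at the edge immediately; no cloud round trip required."""
    print(f"ALARM: {value:.1f} V exceeds {OVERVOLTAGE_LIMIT} V")

# Readings are kept in a local database at the substation, so the
# utility can query them over the edge network even if the backhaul
# link to the network core is down.
db = sqlite3.connect("substation.db")
db.execute("CREATE TABLE IF NOT EXISTS readings (ts REAL, volts REAL)")

for _ in range(60):  # one minute of once-per-second sampling
    v = read_voltage()
    db.execute("INSERT INTO readings VALUES (?, ?)", (time.time(), v))
    if v > OVERVOLTAGE_LIMIT:
        trip_local_alarm(v)
    time.sleep(1.0)
db.commit()
```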

As the edge data center market evolves, the physical constraints related to a broad range of on-premise and off-premise installation environments will need to be addressed with an implementation strategy that caters to repeatability, scale, and flexibility.

Depending on your organization’s needs, you may require a mix of fog, edge and cloud computing and possibly an on- or off-premise data center.

To learn more about developing a data center strategy, read our TECHbrief on this topic or visit anixter.com/datacenter.