This article is part of a VB special issue. Read the full series here: The future of the data center: Handling greater and greater demands.

In recent years, many data centers have transitioned from a centralized model to an edge computing approach, which brings computing closer to the point of use, reducing latency and improving performance.

The impetus for this change has been a surge in AI-driven software and the widespread adoption of 5G technologies for more efficient data packet delivery.

Enterprises, cloud service providers and telecom companies are investing heavily in digital edge data centers to optimize applications like streaming video, telemedicine and factory automation. Additionally, edge data centers facilitate implementation of advanced technologies such as augmented and virtual reality (AR/VR) and autonomous vehicles.

This paradigm shift entails a redefinition of data centers and a complete overhaul of their architecture.

Shift to the edge, gain an edge 

To gain an edge in today’s fast-paced, fiercely competitive economy, organizations are combining edge computing with infrastructure modernization and other IT innovations to meet the demands of the digital age.

Specifically, enterprises are shifting their data center strategy toward high-performance edge computing to meet AI’s compute requirements.

“A high-performance edge infrastructure allows AI computation to happen near the data center at the edge of a network instead of a private data center,” Utpal Mangla, general manager of distributed edge, sovereign cloud and partnerships at IBM, told VentureBeat. “Since the data is being analyzed locally, there is potentially better availability and more real-time analytics.

“It also helps to reduce networking costs,” he added.

Incorporating high-performance computing (HPC) capabilities at the edge helps organizations respond promptly to evolving market conditions and analyze data sources at greater speed and scale. HPC at the edge provides not only the necessary computational power and speed but also scalability.

“When relying solely on traditional [centralized] data centers, a company risks data breaches and lagging [behind] the competition by not being able to support real-time applications,” Erik Pounds, senior director for enterprise AI at Nvidia, told VentureBeat. “AI applications are increasing at an incredible rate, which is why businesses are now quickly moving to high-performance edge computing at data centers that can support multiple AI applications running at any given time.”

Leveraging 5G for high-performance edge networks and data centers

Edge computing has gained much momentum thanks to its effectiveness at tackling the challenges of data-centric workloads. By retaining sensitive data on-premises, edge computing minimizes data flow to and from distant data centers, optimizing resource use and enhancing efficiency.

Industry experts believe enterprises must carefully weigh data processing requirements both in the central data center or cloud and at the network’s edge to successfully scale AI. This is particularly crucial for next-gen use cases such as generative AI, which demand substantial computing power.

Gartner has predicted that by 2025, more than 50% of enterprise-managed data will originate and undergo processing outside the data center or cloud. This underscores the necessity of embracing high-performance edge computing, which uses a distributed architecture to process data and deliver services close to the users.

Pounds said that shifting computing resources closer to data collection and consumption points is crucial for next-generation devices and applications. It reduces latency, particularly for real-time processing, and speeds operations for many devices, from wearables to autonomous machines.

According to Pounds, 5G technology has enhanced network capabilities by reducing latency, increasing bandwidth and supporting software-defined networking. 5G’s expanded networking capacity facilitates the development of new real-time applications.

“For consumers, this may mean immersive technologies such as virtual and augmented reality and greater automation through autonomous machines and robotics for enterprises. Low latency is critical to ensure [that] the desired outcomes of these applications, from user experience to functional safety, are accomplished,” Pounds told VentureBeat. “This is only possible by distributing high-performance computing to the edge and closer to 5G networks.”

Likewise, IBM’s Mangla told VentureBeat that enterprises need to harness data where it is located to benefit from edge computing and 5G speeds.

He cited IBM’s recent collaboration with Bharti Airtel as an illustration of this approach. In this partnership, IBM is working with the telecom company to offer secure edge cloud services to enterprises. 

“We designed the platform to enable large enterprises across multiple industries, including manufacturing and automotive, to accelerate innovative solutions securely at the edge while helping data centers reduce latency and transmission costs,” said Mangla.

Deploying high-performance edge at data centers for AI/ML workload management 

Scalability is another critical consideration. Edge computing in data centers enables an increase in connected devices by reducing the strain on centralized infrastructure. This shift toward localized processing opens doors to better device connectivity.

Today, retailers are using edge AI to improve customer experience and inventory management and to analyze customer behavior for optimal product placement in aisles. With edge computing’s rapid data processing, retailers can introduce voice ordering or product-search functionalities to enhance the shopping experience.

Similarly, edge-based AI is being employed in industrial settings to ensure functional safety, including collision detection and avoidance for autonomous mobile robots (AMRs); robot-human interaction; and detection of barrier breaches or incorrect use of personal protective equipment.

“High-performance edge computing allows AI applications to be tested and experimented [with] within controlled environments without affecting the rest of an organization’s operations. Once success is achieved, it can be further scaled and distributed by combining the power of data center architectures,” said Nvidia’s Pounds. “This can lead to iterative improvements, which can add up to major improvements in efficiency and effectiveness.”

According to Rosa Guntrip, senior manager of product marketing at Red Hat, a synergistic relationship between high-performance edge and the data center is essential for AI/ML deployments.

“Once the models have been tuned and are ready for production, the AI-powered intelligent application can be deployed (and automatically updated as needed). The intelligent AI-powered application running at the edge can now help make real-time decisions based on the data it is processing,” Guntrip told VentureBeat. 

She explained that each edge device maintains a local model, enabling data analysis and decision-making at the point of data generation. Additionally, it can periodically share newly collected data from sensors, cameras and other sources with the global model at the core data center. This ensures that the AI/ML model remains accurate by incorporating the latest data updates as necessary.
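In code, that local-inference-plus-periodic-sync pattern might look something like the minimal Python sketch below. The endpoint paths, payload shape and sync interval are illustrative assumptions made for this article, not details of Red Hat’s products.

```python
# A minimal sketch of the edge/core sync pattern Guntrip describes.
# URLs, payload shapes and the sync interval are illustrative assumptions.
import json
import time
import urllib.request

CORE_URL = "https://core.example.com"  # hypothetical core data center endpoint
SYNC_INTERVAL_S = 300                  # assumed interval between syncs

local_model_version = 0
pending_samples = []                   # data collected since the last sync


def infer_locally(sample):
    """Real-time decision made at the edge with the local model."""
    # Placeholder rule standing in for a real model's forward pass.
    return "alert" if sample.get("reading", 0) > 0.9 else "ok"


def sync_with_core():
    """Push newly collected data up; pull a refreshed global model down."""
    global local_model_version, pending_samples
    body = json.dumps({"samples": pending_samples}).encode()
    req = urllib.request.Request(
        f"{CORE_URL}/samples", data=body,
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)        # share sensor/camera data with the core
    pending_samples = []
    with urllib.request.urlopen(f"{CORE_URL}/model/latest") as resp:
        meta = json.load(resp)
    if meta["version"] > local_model_version:
        local_model_version = meta["version"]  # adopt the retrained global model


while True:
    sample = {"reading": 0.95, "ts": time.time()}  # stand-in for a sensor read
    print(infer_locally(sample))  # decision at the point of data generation
    pending_samples.append(sample)
    sync_with_core()
    time.sleep(SYNC_INTERVAL_S)
```

The important property is that the decision is made before any network call; the sync with the core is periodic background traffic, not a dependency of every inference.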

Hand in hand

“We expect centralized AI and distributed AI to co-exist. You already likely have AI running inside your phone, and use centralized AI like ChatGPT. These will ultimately lead to the deployment of AI onto large networks. We call that distributed AI,” John Graham-Cumming, chief technology officer at Cloudflare, told VentureBeat.

Graham-Cumming said that distributed AI allows for the rapid iteration of AI models, providing end users with the most up-to-date versions. Additionally, this approach enables low-power devices to access AI capabilities by connecting to a nearby “supercloud” data center for inference.

“With large distributed networks, code and data can be brought close to the 5G network, close to the end user, ensuring there’s no penalty no matter where the end user is,” he said. “At Cloudflare, we now have 300 distributed computing system-based data centers worldwide linked through the edge that operate as a massive system capable of moving data and code around for the highest performance.”

Madhav Srinath, CEO/CTO of cloud analytics firm NexusLeap, pointed out that AI practitioners consistently grapple with balancing accuracy and feasibility. Conducting additional model training directly at the edge eliminates unnecessary data transfer.

Srinath explained that since AI models often have a substantial data center footprint, transmitting the entire model over traditional data center networks carries more risk than incrementally training a model already deployed at the edge.
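A toy example makes the contrast concrete: one on-device gradient step touches only data that is already local, whereas replacing the model outright means moving a payload that scales with the model’s size. The linear model below is an illustrative stand-in, not any vendor’s implementation.

```python
# A single incremental training step performed at the edge.
weights = [0.5, -0.2, 0.1]  # model already deployed on the edge device


def incremental_step(features, target, lr=0.01):
    """One on-device gradient step; nothing is transmitted."""
    pred = sum(w * x for w, x in zip(weights, features))
    err = pred - target
    for i, x in enumerate(features):
        weights[i] -= lr * err * x  # update weights in place


# The alternative, pulling a fully retrained model over the network,
# is a transfer whose size grows with the model, not with the new data.
incremental_step([1.0, 0.0, 2.0], target=0.8)
```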

He said that if the end user’s device lacks the computational capabilities to perform inference, it must communicate with the centralized infrastructure to overcome its computational limitations.

“Such methods inevitably introduce data transfer costs and amplify the risk of operational failure,” Srinath told VentureBeat. “However, if edge devices are [incorporated] within data center infrastructures to handle AI processes demanding high computational power during the inference stage, it significantly bolsters the overall application’s reliability.”
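The pattern Srinath describes boils down to a local-first inference path, with the centralized infrastructure as a fallback. The sketch below assumes a hypothetical central endpoint and a simple capability flag; both are illustrations, not NexusLeap’s design.

```python
# A minimal sketch of local-first inference with a centralized fallback.
import json
import urllib.request

CENTRAL_URL = "https://inference.example.com/predict"  # hypothetical endpoint
LOCAL_CAPABLE = True  # whether this device can run the model itself


def infer_local(features):
    # Stand-in for an on-device model; no data leaves the edge.
    return sum(features) / len(features)


def infer_remote(features):
    # Every remote call adds transfer cost and a new failure mode.
    body = json.dumps({"features": features}).encode()
    req = urllib.request.Request(
        CENTRAL_URL, data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=2) as resp:
        return json.load(resp)["prediction"]


def predict(features):
    if LOCAL_CAPABLE:
        return infer_local(features)  # fast path: no network round trip
    return infer_remote(features)     # slow path: centralized inference


print(predict([0.2, 0.4, 0.9]))
```

Making the local path the common case is what keeps the transfer cost and the extra failure mode off the critical path.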

The essence of security in edge data centers

Maintaining data privacy and security is crucial for highly regulated industries such as financial services and healthcare, where handling sensitive customer data is a trust-based responsibility. Operating within an edge computing framework allows data to be processed closer to its origin, minimizing the need for data transmission across multiple channels and reducing the potential for attacks.

“By keeping most of the data on the edge device, rather than transmitting it over the internet to servers, the attack surface for a threat actor to get that data is smaller,” Nancy Wang, GM of data protection at AWS, told VentureBeat.

Wang said that integrating edge computing into data centers also facilitates compliance with data residency and sovereignty requirements. While public cloud services are progressively expanding their reach, specific countries and sub-national governments are enforcing regulations mandating that certain data remain within their borders.

“If you do not have access to a data center or cloud provider within that country, then using an edge computing device would be a compliant way to do business in that country,” she added.

However, regardless of the data’s location, comprehensive management of security postures remains essential to ensure data protection.

Guntrip from Red Hat emphasized that although the potential attack surface may expand with the additional locations the edge introduces, containing a breach to a single site is easier than remediating a problem across the entire architecture.

She said that implementing an edge computing strategy enables companies to streamline operations and enhance security across data center environments. This can be achieved through automated provisioning and hardening, efficient management, predefined configurations, and orchestration.

“The key is to make security for the full supply chain a priority from the outset — from the underlying operating software to the network connectivity, and edge applications across physical and virtual elements, from mobile endpoints to business applications hosted in the public cloud,” Guntrip told VentureBeat. “Measures like role-based access control for application users, and SSL (secure sockets layer) to encrypt the application as it is created can help keep data secure along its journey.”

“Data center security can be enhanced through edge computing by reducing data breaches during transfer to a data center, as well as the ability to compute and analyze data offline,” added Nvidia’s Pounds. “With AI at the edge, data can now be pre-processed, and protected information can be obscured before it is ever seen by humans or sent to a data center. Additionally, real-time decision-making means real-time safety or security measures when a risk is detected.”
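As a concrete illustration of the pre-processing Pounds describes, the short sketch below pseudonymizes protected fields on the device before a record is ever transmitted. The field names and salt handling are assumptions made for the example, not an Nvidia API.

```python
# Obscure protected fields at the edge before anything leaves the device.
import hashlib

PROTECTED_FIELDS = {"name", "email", "license_plate"}  # assumed PII keys
SALT = b"rotate-me-per-deployment"  # assumed device-local salt


def obscure(record):
    """Replace protected values with salted one-way hashes on-device."""
    out = {}
    for key, value in record.items():
        if key in PROTECTED_FIELDS:
            digest = hashlib.sha256(SALT + str(value).encode()).hexdigest()
            out[key] = digest[:12]  # a pseudonym usable for joins, not recovery
        else:
            out[key] = value
    return out


raw = {"name": "Jane Doe", "email": "jane@example.com", "dwell_time_s": 42}
safe = obscure(raw)  # only `safe` ever leaves the edge device
```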

IBM’s Mangla emphasized that to fully realize the benefits of processing data at the edge, companies need a unified view across all of their environments: private cloud, public cloud, on-premises infrastructure and the edge.

“With a single point of control, organizations can seamlessly manage complexities and differences within their infrastructure, ultimately making it easier to keep data secured and compliant with regulations,” he said. 

The future of data centers and high-performance computing

Nvidia’s Pounds said that HPC at the edge will continue to offer definitive advantages to organizations. By bringing computing power closer to the point of data collection and consumption, edge data centers enable the real-time applications that today’s generative AI-driven software requires.

“It comes with challenges, namely, ensuring data is accurate and consistent across edge devices. Investments in infrastructure can also be challenging, though the return in the form of greater efficiency typically justifies the investment,” said Pounds. “Orchestration is the key.”

Red Hat’s Guntrip said that edge computing for data centers allows organizations to deploy latency-sensitive applications, ensuring a seamless user experience regardless of location. This approach also ensures compliance with regulatory requirements by keeping data within specific geographic boundaries as it is stored and processed.

Guntrip further highlighted that edge computing helps organizations reduce the amount of data sent to the cloud for processing, resulting in improved site resilience and optimized resource utilization and costs. This proves beneficial in addressing specific use cases or problems as they arise.

“Organizations now really need to think about how to manage and maintain these highly scaled-out deployments with their existing teams and established tools and processes, with a relentless approach to automation and security to accelerate adoption whilst reducing operational costs,” she said. 
