Your Data Center is moving … to the EDGE

BrandPost
Jan 04, 2021
5 mins
Data Center

Data collection and analysis is moving from data centers to centers of data, necessitating changes in operational and infrastructure strategies.

By John Gray, Data Center Product Marketing at Aruba, a Hewlett Packard Enterprise company

In the next few years, it is estimated that half of all the data we create will be processed at the edge, near the point where it is produced, for collection and analysis. The data center will remain a critical piece of IT infrastructure, but its role is morphing to support rapidly evolving remote locations with full IT stacks of compute, storage, and networking resources.

Traditional enterprise data centers are being augmented by distributed and virtualized data centers, extending out toward where data is captured: public clouds and a growing number of edge data centers. This distribution of data collection and analysis signals a move from data centers to centers of data, changing infrastructure and operational strategies along the way.

One catalyst for this evolution is the changing nature of data. The number of network-connected cameras alone will double in the next five years – each pumping out huge amounts of data. Already, three-quarters of network endpoints are IoT devices. Meanwhile, traditional enterprise application delivery and physical plant operations are moving onto common infrastructure. Real-time decision-making and action will become increasingly desirable as more physical plant machinery and infrastructure gets connected.

Take, for example, modern high-speed manufacturing processes that use video and sensors to ensure higher quality and safety. In the oil & gas industry, leaks and equipment malfunctions are now closely monitored and quickly rectified using distributed automated systems. In the public sector, smart cities will rely on cameras, sensors, and other IoT devices to monitor public safety, optimize traffic, maintain infrastructure, and promote energy efficiency.

In all industries, a variety of infrastructure – compute/GPUs, VMs, containerized apps, storage (HCI, AI, and ML), and wireless, wired, and WAN network devices – will be needed to support these requirements at the “edge.”

According to IDC, by 2023, more than 50% of new enterprise IT infrastructure will be allocated to edge environments, and these edge environments will have to be connected reliably and securely to data center and cloud networks.

But as enterprises make this shift, many find themselves unprepared. An increasing number of enterprise edges are built on siloed compute and storage systems, with disjointed network architectures not only in the campus, data center, and branch, but now at emerging edge locations. With this explosion of centers of data, achieving scale using legacy approaches is hard. Perhaps impossible.

Traditional data centers must be reimagined and redesigned to account for the new distributed, edge data centers. Consider these five recommendations as you make this transition.

Unify your infrastructure – Architect around a unified, cloud-managed network infrastructure and operating model across your extended enterprise (campus, branches, and traditional and emerging centers of data). It is far easier to manage an expanding enterprise network architecture based on a common operating model (operating system, L2/3 feature sets, and tooling) than to manage different architectures across various enterprise domains.

Embrace automation – Implement AI-driven, software-defined networking that can help automate manual, mundane, and complex tasks often associated with networking.  Emerging AIOps can accelerate this transformation by combining AI-based problem identification with automation to ensure the highest degree of network availability and performance. Some solutions also integrate directly into widely used compute, storage, and virtualization management stacks to provide an even more streamlined IT infrastructure.
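As a purely illustrative sketch of the closed-loop idea behind such automation, the snippet below polls port telemetry and restarts ports whose error rate crosses a threshold. The controller URL, endpoints, and data model are hypothetical placeholders, not a specific Aruba or AIOps API; a real platform would pair model-driven anomaly detection and change control with the remediation step rather than a fixed threshold.

import requests  # HTTP client; every endpoint below is a hypothetical placeholder

CONTROLLER = "https://network-controller.example.com/api/v1"  # placeholder URL
HEADERS = {"Authorization": "Bearer <api-token>"}             # placeholder credentials

def remediate_unhealthy_ports(max_error_rate: float = 0.05) -> None:
    """Poll port telemetry and restart any port whose error rate is too high."""
    ports = requests.get(f"{CONTROLLER}/ports/stats", headers=HEADERS, timeout=10).json()
    for port in ports:
        if port["error_rate"] > max_error_rate:
            # In a production workflow this action would be gated by change control.
            requests.post(f"{CONTROLLER}/ports/{port['id']}/restart", headers=HEADERS, timeout=10)
            print(f"Restarted port {port['id']} (error rate {port['error_rate']:.1%})")

if __name__ == "__main__":
    remediate_unhealthy_ports()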

Deploy integrated solutions – While some organizations may have the time, energy, and in-house expertise to design and integrate custom IT solutions, many providers offer ready-to-deploy, pre-engineered, workload-optimized solutions that simplify and speed IT service delivery while reducing the time, risk, and expertise needed to deploy complex systems. Look for solutions that include networking in the stack, versus tacking it on after the fact. For example, consider hyper-converged solutions that integrate both physical and virtual networking.

Implement Zero Trust security – Security must be at the forefront of any architecture discussion. The key is building security from the ground up and extending it across all areas of the network. By default, Zero Trust networks enable the following (see the sketch after this list):

  • Authentication of every user, device, traffic flow, and “thing” on the network.
  • Centralized creation of role-based policies and implementation across all aspects of the intelligent edge, from the core data center out to every edge and IoT device.
  • Dynamic segmentation of users, devices, and tenants (and their traffic) no matter where they enter the network and how they traverse it.
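To make the policy idea concrete, here is a minimal conceptual sketch of role-based access with dynamic segmentation, assuming a handful of invented roles and destinations; real deployments would pull these rules from a central policy engine and enforce them at every switch, AP, and gateway.

from dataclasses import dataclass

# Invented roles and rules for illustration only.
ROLE_POLICIES = {
    "employee":   {"segment": "corp",      "allow": {"internet", "corp-apps"}},
    "camera":     {"segment": "iot-video", "allow": {"video-recorder"}},
    "plc-sensor": {"segment": "ot",        "allow": {"scada-server"}},
    "guest":      {"segment": "guest",     "allow": {"internet"}},
}

@dataclass
class Endpoint:
    mac: str
    role: str  # assigned after authentication (e.g. 802.1X or device profiling)

def authorize(endpoint: Endpoint, destination: str) -> bool:
    """Allow traffic only if the endpoint's role explicitly permits the destination.

    The same rule applies wherever the endpoint attaches (campus switch, branch AP,
    or edge data center): policy follows the identity, not the port or VLAN.
    """
    policy = ROLE_POLICIES.get(endpoint.role)
    return policy is not None and destination in policy["allow"]

# A camera may reach the video recorder, but nothing else.
cam = Endpoint(mac="aa:bb:cc:dd:ee:01", role="camera")
print(authorize(cam, "video-recorder"))  # True
print(authorize(cam, "internet"))        # False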

Push the easy button – Achieve scale through the cloud and pay as you go. There is a lot to do: striking a balance between all on-premises IT, a move to the cloud, and a new, expanding edge. Let's face it, orchestrating (and paying for) this complex IT estate can be daunting.

The good news is that enterprises have new options for cloud services and infrastructure-as-a-service for their workloads – on-premises, fully managed in a pay-per-use model. Enterprises should strongly consider and evaluate these options to increase efficiency, move faster, and deliver better business outcomes.

To learn more, visit our website and discover Aruba data center networking solutions.