Mentors of Digital Innovation

JC2 Weblog

The ‘CIO Two Cents’ blog features insights from Yvette Kanouff, partner at JC2 Ventures. Learn what’s on the mind of CIOs at this moment in time.


An Edgy Future – the Ongoing Pendulum of Central and Decentralized Computing

Volume 1 - Issue 4 ~ May 2, 2022

 

Welcome to the fourth edition of the “CIO Two Cents” newsletter from me, a Partner at JC2 Ventures. Read on for insights into what is on the minds of CIOs at this moment in time.

The JC2 Ventures team (John J. Chambers, Shannon Pina, John T. Chambers, me, and Pankaj Patel)

 
 

An Edgy Future – the Ongoing Pendulum of Central and Decentralized Computing

I find it fascinating how we have swung the pendulum of data center centralization and decentralization over the past few decades. Think about it: we started with one (giant) single-user computer – total decentralization; then in the 1980s we were thrilled with multi-tasking operating systems and centralization with huge mainframes and disk farms; we then decentralized with personal computers, the ongoing evolution of connectivity, and the adoption of the internet. Can you believe that in 1995 only 14% of US adults had internet access? By 2000, the share of American households online had risen to 46%. Then in 2006 Amazon launched the Amazon Elastic Compute Cloud, and the rest is centralization history. Now we are adding the concept of the edge as we consider new and emerging concepts like Web 3.0. I love it… decentralization is where the pendulum is leading us once again.

 

The edge offers us flexible data management, pre-processing before we move to the cloud, reliability, new levels of scale, and a hugely important low-latency connection.

— Yvette Kanouff


 

To be fair, centralization and decentralization have always worked in parallel. Consider content delivery networks (CDNs) – we had to build these in the 1990s to cache content closer to users. I remember writing a paper with the awful title of “Hierarchical Storage Networks” in the mid-1990s, where I defined the need for distributed storage servers for video content. Akamai came up with a much better name in 1998.

But I digress – back to the edge and the concept of cloud and data moving to the edge. While Gartner predicts that 80% of enterprises will shut down their own data centers in favor of the cloud by 2025, the edge computing market is expected to grow at a CAGR of between 25% and 50% over that time, according to various research firms. There are many reasons to augment cloud computing with edge networks, especially in today’s environment where we are starting to consider what the next generation of the internet will look like. While the cloud offered us consistent maintenance, security, and cost savings on staff, power, and facilities, the edge offers us flexible data management, pre-processing before we move to the cloud, reliability, new levels of scale, and a hugely important low-latency (potentially sub-1ms) connection.

I find that edge computing still means different things for different applications, and each is valuable. The idea of the edge is that you’re moving containers or applications away from the central cloud. Take the startup Pensando, for example, which recently announced its sale to AMD. Pensando, with customers including Microsoft Azure, Oracle Cloud, IBM Cloud, and Goldman Sachs, builds a card that offloads the cloud CPU, similar to what AWS does with Nitro. This means you can build an ‘edge’ that’s still inside your data center or cloud: it creates an edge at the input/output of that cloud to take on heavy compute or security work. The cloud computing is freed to do what it’s supposed to do, while critical, heavy-compute items like encryption/decryption and firewalls are processed at the cloud-edge.

This is different from applications like 5G, where you have multi-access edge compute (MEC). MEC enables compute resources to be physically located close to the user, as opposed to the cloud, which may be physically distant. Placing the compute close to the user enables very low-latency applications to utilize the MEC. It also enables data to sit locally, with high performance and increased security. Combining CPU offload with MEC is also a valuable and needed edge-at-the-edge implementation of hardware optimization and security. MEC is seen as a great opportunity for smart cities, autonomous vehicles, and IoT.

In some cases, edge computing means that the compute is done on end-user devices, which double as deep-edge devices for things like pre-processing, so that cloud processing is optimized.
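To make that pre-processing idea concrete, here is a minimal sketch – a hypothetical function with made-up thresholds, not any vendor’s actual pipeline – of a deep-edge device filtering and summarizing raw sensor readings locally so that only a compact payload travels to the cloud:

```python
# Hypothetical sketch: a deep-edge device pre-processes raw sensor
# readings locally, uploading only a compact summary to the cloud.
from statistics import mean

def preprocess_on_edge(readings, threshold=75.0):
    """Filter out sensor glitches and summarize locally, so the cloud
    receives one small aggregate instead of every raw sample."""
    valid = [r for r in readings if 0.0 <= r <= 150.0]  # drop out-of-range glitches
    return {
        "count": len(valid),
        "mean": round(mean(valid), 2) if valid else None,
        "alerts": sum(1 for r in valid if r > threshold),  # flag hot readings
    }

# The raw samples stay on the device; only the summary payload moves.
raw = [71.2, 69.8, 200.5, 76.4, 73.1, -3.0, 80.2]
payload = preprocess_on_edge(raw)
```

The point of the sketch is the shape of the traffic: seven raw samples in, one small dictionary out, which is the optimization the deep edge provides for the cloud.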

Whichever our favorite implementation of the edge is, it’s here, it’s valuable, and it’s our new decentralization. Maybe it’s an augmentation to centralization, but I prefer to think of it as the evolution of computing and how we continue to swing the pendulum.

 

Moving fast? I've got you covered:

(1) We are adding the concept of the edge as we consider new and emerging concepts like Web 3.0, moving businesses back towards decentralization after years of centralized cloud platforms.

(2) While the cloud offered consistent maintenance, security, and cost savings on staff, power, and facilities, the edge offers flexible data management, pre-processing before we move to the cloud, reliability, new levels of scale, and a hugely important low-latency (potentially sub-1ms) connection.


Image of the Moment

 

Edge computing is playing an increasingly critical role in business processes and functions. Leaders, like HPE CEO Antonio Neri, have been bullish on the transformation to edge computing.

 

Your Thoughts?


More CIO insights to come! Until next time.

Yours truly,

Yvette Kanouff