Op-Ed

Edge or Core Computing? It’s Time to Rethink the Question

As IT evolves rapidly, value awaits companies whose data sticks to the middle.

Written by Simon Ninan | 7 min | July 18, 2025


When the chicken crosses the road, the self-driving car can’t contemplate “Why?” 

Life-and-death decisions may need to be made in fractions of a millisecond. There’s simply not enough time for a round trip across the network to a central server. The decision to swerve, brake or accelerate must be instantaneous, and it must occur close to the source of the data: at the edge. It must come from within the car.

The rapid evolution of edge computing capabilities has fueled a boom in autonomous vehicles and other innovations at the edge, not to mention technologies such as 5G, the Internet of Things (IoT) and generative AI (GenAI) that have enabled a slew of new use cases. Back in 2018, when companies generated just 10 percent of all data outside of a core, central location, Gartner predicted that the edge would be responsible for 75 percent of enterprise data by 2025. Gartner also estimates that the number of IoT devices will triple from 2020 to 2030. IDC similarly forecasts a $150 billion spike in edge computing spending through 2028.

This proliferation of edge devices is increasing the volume, diversity and complexity of data. Use cases vary substantially in their requirements for timeliness, computational sophistication, personalization and several other characteristics. In this increasingly distributed context, the ‘traditional’ computing architectures of today may not work for tomorrow. No single size fits all. What does that mean for core computing, then?

In the case of IoT sensor data, as in our self-driving car example, as much as 99 percent of sensor data has historically been discarded. But, as Gartner points out, there is a “need to extract and preserve greater value from sensor data at the edge.” On the one hand, there may be regulatory and liability reasons why the information must be retained after the self-driving car’s split-second response to events on the road. On the other, aggregating such data can be incredibly valuable in providing additional intelligence to improve future decision-making via machine learning.

Data captured at the edge triggers decisions at the edge. That data and the outcomes of those decisions are then transferred to the core for aggregation and learning. That, in turn, updates the decision models to increase their robustness, and the updated models are subsequently pushed back out to the edge to drive improved decision-making. It is a virtuous information cycle of steady improvements in intelligence and automation, made possible in a world of high connectivity and ever-stronger computing capabilities.
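To make that cycle concrete, here is a minimal sketch in Python. The class names, the single-threshold “model” and the retraining rule are all invented for illustration; a real system would involve telemetry pipelines and machine-learning frameworks rather than a toy threshold.

```python
# Minimal, illustrative sketch of the edge-to-core learning loop described
# above. All names (EdgeNode, CoreAggregator, etc.) are hypothetical, and the
# "model" is reduced to a single braking threshold to keep it self-contained.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Event:
    obstacle_distance_m: float   # sensor reading captured at the edge
    braked: bool                 # decision taken locally, in real time


@dataclass
class EdgeNode:
    brake_threshold_m: float = 5.0               # current local decision model
    buffer: List[Event] = field(default_factory=list)

    def decide(self, obstacle_distance_m: float) -> bool:
        # The decision is made entirely on-device; no round trip to the core.
        braked = obstacle_distance_m < self.brake_threshold_m
        self.buffer.append(Event(obstacle_distance_m, braked))
        return braked

    def sync_with_core(self, core: "CoreAggregator") -> None:
        # When connectivity allows, ship buffered outcomes to the core
        # and pull down the latest model parameters.
        core.ingest(self.buffer)
        self.buffer = []
        self.brake_threshold_m = core.latest_threshold_m


@dataclass
class CoreAggregator:
    history: List[Event] = field(default_factory=list)
    latest_threshold_m: float = 5.0

    def ingest(self, events: List[Event]) -> None:
        self.history.extend(events)
        self._retrain()

    def _retrain(self) -> None:
        # Stand-in for fleet-wide learning: widen the safety margin if the
        # fleet has been braking too close to obstacles.
        close_calls = [e for e in self.history
                       if e.braked and e.obstacle_distance_m < 2.0]
        if close_calls:
            self.latest_threshold_m = 6.0


core = CoreAggregator()
car = EdgeNode()
car.decide(1.5)               # the wayward chicken: decided locally, instantly
car.sync_with_core(core)      # outcome aggregated at the core, model pushed back
print(car.brake_threshold_m)  # 6.0 -- the whole fleet now brakes earlier
```

The point of the sketch is the shape of the loop: the edge decides on its own, the core learns from everyone, and the edge gets smarter on the next sync.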

A solitary wayward chicken encountered by a single car is a valuable data point that can be used to teach an entire fleet to drive more safely.

Edge, Core and the Swinging IT Pendulum

History repeats itself. Over the past several decades, as technology has evolved, the IT pendulum has swung dramatically enough to test the limits of our adaptability. And with these swings, we see old models return in new avatars.

In the early stages of networked computing, centralization ruled IT infrastructure. Astronomically large and expensive mainframes dominated the scene, surrounded by “dumb” terminals. Then came personal computers, which began the trend toward decentralization as the so-called edge grew more powerful. The stronger the processing power within the PC or Mac, the less it had to rely on centralized systems. And as connectivity grew, these computers went mobile as notebooks and laptops.

But then, along came cloud computing, with its promise of efficiency, flexibility and scale: why put so much burden on the edge when you could reduce costs and get numerous other advantages through the cloud?  

The well-known Moore’s Law has not failed: computing power continues to improve exponentially. And so, predictably, as IoT technologies have proliferated, computing at the edge has been strengthening by the day. As a result, we’re seeing the pendulum swing yet again: the cloud is ceding some of its power to the edge. To be clear, today we often use the term “edge” to describe both endpoints (the individual devices themselves) and the distributed computing and storage capabilities close to those endpoints, away from the ‘core’ centralized locations managed by the organization.

Now, we’ve reached the point where endpoints are no longer solely for data generation. Smart buildings, remote medical devices and self-driving cars function as miniature data centers, individually processing and responding to information in real time. I had a front-row seat to this pendulum swing when I supported a Japanese automaker’s connected car program. It was stunning to see how centralized technologies and systems of record were giving way to distributed approaches, in what seemed like no time at all.

"Big data supports more data-driven decision making and new use cases...on the flip side, though, this could make data complex and unwieldy. "

Even as the world of IT infrastructure is always in a state of change, one idea that operates as an anchor of sorts is data gravity. Put simply, data gravity is the principle that as organizations become smarter and their datasets grow larger and more sophisticated, that data tends to attract other data, applications and infrastructure, which build up around it. This draws greater attention and investment to core locations and pulls the organization toward increased centralization. Such an approach has its benefits: big data supports more data-driven decision-making and new use cases, allowing for innovation, collaboration, efficiency and other competitive advantages. On the flip side, though, this could make data complex and unwieldy, limiting data mobility and flexibility, and potentially adding other challenges such as creating data silos, increasing costs and even becoming a bottleneck for time-sensitive applications. GenAI has only added a new layer of complexity and opportunity to this conundrum.

Once again, we find ourselves staring at the swinging pendulum, wondering where to invest our resources. But maybe the answer doesn’t lie in swinging between extremes. Good strategy is not about choosing one or the other; it is about finding the right balance.

Look for Your Own Express Lane

The truth is that the future of enterprise IT will lie in designing for efficiency, seamlessness and effectiveness across the edge and the core. This requires truly hybrid approaches designed to best support the specific use cases the business needs. Positioning ourselves at the right point in the middle allows us to operate fluidly between these paradigms and build the flexibility and scalability necessary to shift as business and technical needs evolve.

That’s where the hybrid cloud comes in, enabling seamless access, management and optimization of data, regardless of its location. Hybrid cloud solutions leverage the best of both computing worlds. These architectures allow for the capture, storage, movement and processing of data at the right place and at the right time. And, flexible hybrid cloud business models (such as ‘as-a-service’ or consumption models) allow for the framing of these solutions in the form of delivered use cases and business outcomes, rather than just in terms of deployed hardware and software products.

Every use case will have its own requirements, and hence its own endpoint vs. edge vs. core design that will need to be worked through.

Take cargo ships, for example. When they’re in remote offshore waters, they have to operate with wireless Internet connectivity that may be unreliable or non-existent. Bandwidth will depend on whether they are close to onshore mobile networks or have to rely on satellite communications. This limits their ability to move data in high volumes, such as for AI inference, machine learning or deep learning workloads, which can quickly overwhelm the available bandwidth. To top it off, ships have limited access to power, so onboard computing must be lean in its consumption. However, the right edge/core designs can ensure that these considerations are all accounted for (a simple sketch of this behavior follows the list):

  • Data generation across a host of onboard IoT sensors that feed into aggregated maritime telemetry;
  • Edge AI appliances that collect this data and run AI inference, allowing on-device data processing for off-grid applications; and
  • A fast built-in Ship Area Network (SAN) to transmit sensor data to the edge computing appliance, together with an IoT gateway that can help connect to the Internet where available and exchange raw and processed data and model updates as required.
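As one way to picture how such a design might behave, here is a small Python sketch. The sensor names, bandwidth tiers and summarization rule are assumptions made purely for illustration; they stand in for the onboard sensors, edge AI appliance and IoT gateway described above.

```python
# Illustrative sketch of the shipboard edge/core split described above.
# Bandwidth figures, sensor names and the summarization step are assumptions
# for the sake of the example, not a reference design.

import json
import random
from typing import Dict, List


def read_sensors() -> Dict[str, float]:
    # Stand-in for the onboard IoT sensors feeding maritime telemetry.
    return {
        "engine_temp_c": random.uniform(70, 110),
        "fuel_rate_lph": random.uniform(100, 400),
        "hull_vibration_g": random.uniform(0.0, 1.5),
    }


def infer_on_edge(sample: Dict[str, float]) -> str:
    # On-device inference: flag anomalies immediately, even when offline.
    return "ALERT" if sample["engine_temp_c"] > 105 else "OK"


def available_bandwidth_kbps() -> int:
    # Near shore: cellular (high). Offshore: satellite (low) or nothing (0).
    return random.choice([0, 64, 5000])


def sync_to_core(readings: List[Dict[str, float]]) -> List[Dict[str, float]]:
    """Send raw data when bandwidth allows; otherwise send only a summary."""
    bw = available_bandwidth_kbps()
    if bw == 0:
        return readings                      # no connectivity: keep buffering
    if bw < 1000:
        # Satellite link: average each metric and upload only the summary.
        summary = {k: sum(r[k] for r in readings) / len(readings)
                   for k in readings[0]}
        print("satellite link: uploading summary", json.dumps(summary))
    else:
        print(f"shore link: uploading {len(readings)} raw readings")
    return []                                # buffer drained


buffer: List[Dict[str, float]] = []
for _ in range(10):                          # one telemetry cycle per iteration
    sample = read_sensors()
    status = infer_on_edge(sample)           # real-time decision at the edge
    buffer.append(sample)
buffer = sync_to_core(buffer)                # core sync, bandwidth permitting
```

The key design choice mirrored here is that inference never waits on the link: the ship always gets its real-time answer, and the core receives either raw data or a summary depending on what the connection can carry.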

You can examine similar approaches with other such edge computing applications, each with their own unique requirements: smart factories that require real-time processing that supports production line efficiency; remote healthcare and telemedicine where high-precision medical procedures depend on real-time data processing and complex decision-making at the edge; traffic management for smart cities that have adaptive traffic lights for congestion management; and many other such cases. 

New use cases are constantly being added. New product innovations are unlocking new sources of value. And all of this is creating new opportunity.

The edge is engineered for specificity and can accommodate breaks in connectivity, limited bandwidth and real-time decision-making. Core systems, on the other hand, operate holistically, focusing on long-term storage, model training and interconnectivity across the enterprise.

But over time, GenAI is blurring the lines between the edge and core. Large language models are no longer limited to the core. Smaller derivative models are deployed to the edge for mission-critical responses, eliminating the need for the massive computational power of a centralized system. It’s the kind of evolution that warrants a new question: What belongs at the edge, and what belongs at the core?

This isn’t just a technical inquiry but a strategic, operational and financial one. Organizations must rethink how they build their IT infrastructure — as well as the technological investments and governance strategies that will set them up to innovate in pursuit of ROI for years to come.

It is clear that there is no one-size-fits-all solution. But understanding the core principles of hybrid design allows for a common underlying approach, or framework, for arriving at the best-fit solution for a given use case; a simple sketch of how these steps might play out follows the list below. In other words:

  • First, start by understanding and prioritizing the use cases that matter most to your organization, those that represent the main opportunity to deliver return on assets (RoA), including through AI.
  • Then, understand the essential requirements that support these use cases, in terms of performance, latency, data volumes, security, cost and so on.
  • Next, map out your data: where it is located, who is responsible for it and how it is used. This will also help you understand the impacts of data gravity.
  • Leverage the power of the edge to overcome limitations driven by the use case or the challenges of data gravity.
  • Apply the core principles of hybrid cloud design to arrive at the architecture best suited for the execution and evolution of the use case. Hybrid cloud can leverage the edge, optimize the core and bridge effectively across them.
  • Finally, and very importantly, invest in data governance, such as data access controls, data quality management and data retention. This is critical to managing data effectively and ensuring that the RoA is delivered as promised.
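To show how these steps might translate into even a rough first pass, here is a short Python sketch. The requirement fields, thresholds and the three-way answer are all invented for illustration; real placement decisions will weigh many more factors, including security, cost and data gravity.

```python
# A deliberately simple sketch of how the framework above might become a
# first-pass placement heuristic. Thresholds and fields are illustrative only.

from dataclasses import dataclass


@dataclass
class UseCase:
    name: str
    max_latency_ms: float        # how quickly a decision must be made
    daily_data_gb: float         # volume generated at the source
    reliable_connectivity: bool  # can we count on the link to the core?
    needs_fleet_learning: bool   # does value come from aggregating across sites?


def suggest_placement(uc: UseCase) -> str:
    # Hard real-time needs or flaky links push work toward the edge;
    # fleet-wide learning or heavy aggregation pulls work toward the core.
    needs_edge = uc.max_latency_ms < 50 or not uc.reliable_connectivity
    needs_core = uc.needs_fleet_learning or uc.daily_data_gb > 100
    if needs_edge and needs_core:
        return "hybrid: decide at the edge, aggregate and retrain at the core"
    if needs_edge:
        return "edge-heavy"
    return "core-heavy"


cases = [
    UseCase("autonomous braking", 5, 500, False, True),
    UseCase("quarterly sales reporting", 60000, 2, True, False),
]
for uc in cases:
    print(f"{uc.name}: {suggest_placement(uc)}")
```

Even a crude heuristic like this forces the right conversation: it makes the use case, not the infrastructure, the unit of decision-making.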

That’s a responsible strategy. And it’s your ticket to the middle lane, which you may well soon find was the express lane all along.

Simon Ninan

Senior Vice President of Business Strategy, Hitachi Vantara

Simon Ninan is Senior Vice President of Business Strategy for Hitachi Vantara. He develops and drives aligned business strategy, maximizes customer and stakeholder value, and promotes growth and innovation while advancing market leadership.