It was always about data movement

Originally a post on LinkedIn, March 9, 2026

For decades, we have been obsessed with performance.

In the 80s, we wanted faster dial-up. The faster the modem, the faster we thought things would get. It didn’t really matter whether anyone could clearly articulate what those faster results were supposed to produce. The assumption was simple: more speed meant better outcomes.

By the late 90s, the conversation shifted to networks. Cable modems began appearing, and suddenly, bandwidth became the new obsession. If we could just make the network faster, we believed everything running on top of it would automatically improve. Those of us working in IT would occasionally ask the obvious question, “Faster results for what?” but the answer rarely mattered. The real objective was simply making things fast enough that nothing broke and nobody called us at 2 am.

Then the early 2000s arrived, and the obsession moved again, this time to backup.

Entire engineering efforts were focused on squeezing more throughput out of the pipeline. We tuned buffer sizes, upgraded NICs and switches, replaced tape drives, and optimized every stage of the data path. We moved from DLT to LTO. Software vendors multiplexed and interleaved streams to keep tape drives from shoe-shining. By the mid-2000s, disk was beginning to replace tape as the tier-one backup landing zone.

All the while, we wanted faster processors, faster hard drives, faster SSDs. Faster, faster, faster. We rarely stopped to consider the limitations physics placed on us as long as the label said it was faster.

At the same time, the world of supercomputing measured everything through performance benchmarks. If the speeds and feeds beat the other system on a benchmark chart, everyone declared victory.

But if you step back and look at all of these eras together, a pattern becomes pretty obvious.

None of it was really about performance.

It was always about data movement.

And it still is.

Performance metrics are easy to market. Data movement is harder to explain, but it is the thing that actually determines whether systems work. When data moves efficiently through a system, results follow. When it doesn’t, no amount of raw compute power or impressive benchmark numbers can fix the problem.
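To make that concrete, here is a minimal back-of-envelope sketch in Python. The rates are hypothetical numbers chosen for illustration, not measurements, but the arithmetic shows the point: if the data path cannot feed an accelerator at the rate it can consume, the slowest stage caps how busy the compute can ever be.

```python
# Back-of-envelope sketch (illustrative numbers, not measurements):
# compute sits idle whenever the data path delivers slower than it can consume.

def effective_utilization(consume_rate_gbps: float, deliver_rate_gbps: float) -> float:
    """Fraction of time the accelerator can actually do useful work."""
    return min(1.0, deliver_rate_gbps / consume_rate_gbps)

# Hypothetical example: an accelerator that could ingest 20 GB/s,
# fed by a pipeline (storage -> network -> host -> device) that sustains 3 GB/s.
gpu_ingest = 20.0   # GB/s the accelerator could consume
pipeline   = 3.0    # GB/s the data path actually delivers

util = effective_utilization(gpu_ingest, pipeline)
print(f"Effective utilization: {util:.0%}")  # -> 15%; the other 85% is idle time
```

Buying a faster accelerator in that scenario only lowers the utilization number further. The fix is in the data path, not the compute.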

Artificial intelligence is not changing that reality.

If anything, it is making the underlying problem far more visible than it has ever been.

I have been thinking about this quite a bit lately. Maybe it’s the upcoming #GTC2026 event. Maybe it’s the same tired messaging about idle GPUs. Or maybe it’s simply the realities many of us have been seeing since around 2020 finally colliding in the perfect storm we now call AI.

Whatever the reason, stay tuned.
