As IT leaders, you have all come to accept certain certainties in life: Death, taxes and… data transfer fees from public cloud providers. Though no longer from Google, which recently pledged to eliminate data transfer fees for its cloud services.

Also known as egress fees, these are the sums cloud providers charge customers to move data to other clouds, on premises or anywhere but where it’s currently hosted. Data transfer fees can range from 5 cents to 9 cents per gigabyte, which can grow to tens of thousands of dollars for organizations trafficking in petabytes. Generally, fees vary based on where data is being transferred to and from as well as how it is moved.
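To put those per-gigabyte rates in perspective, here is a back-of-the-envelope estimate using the 5-to-9-cents range above (illustrative only; actual provider pricing is tiered and varies by destination):

```python
# Back-of-the-envelope egress cost estimate.
# Rates are illustrative, drawn from the 5-9 cents/GB range cited above;
# real cloud pricing is tiered and varies by region and destination.

def egress_cost(gigabytes: float, rate_per_gb: float) -> float:
    """Return the data transfer (egress) fee in dollars."""
    return gigabytes * rate_per_gb

PETABYTE_GB = 1_000_000  # 1 PB expressed in GB (decimal units)

low = egress_cost(PETABYTE_GB, 0.05)   # $50,000
high = egress_cost(PETABYTE_GB, 0.09)  # $90,000
print(f"Moving 1 PB out of a cloud costs roughly ${low:,.0f} to ${high:,.0f}")
```

A single petabyte, in other words, lands squarely in the "tens of thousands of dollars" territory the article describes.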

Whether such fees are exorbitant depends on how much data an enterprise wishes to move: for some they are a rounding error, while others find them prohibitively expensive. Regardless, public cloud providers have long been maligned for making it technically cumbersome or financially infeasible for customers to exit.

Naturally, IT leaders view egress fees as moats keeping them behind the castle walls. Or mousetraps. More roach motels than Hotel California. Metaphors will vary for seasoned CIOs versed in exiting proprietary software platforms.

It’s No Longer Cloud Vs. On Premises

You can believe Google’s tactical move to punt egress fees is a thinly veiled effort to keep antitrust wolves at bay or grab precious market share. You might also speculate whether Amazon Web Services and Microsoft will follow suit.

Either way, IT leaders should be asking a more salient question: Is it time to reassess where you place your applications?

For most of the past decade, the choice was simple—cloud or on premises. Despite a dalliance with cloud repatriation last year, IT leaders’ calculus has shifted. Employee requirements have helped diversify the market for compute and storage.

Emerging trends point to a decentralization of IT. In the burgeoning ubiquitous computing phenomenon, applications are increasingly served by both legacy and new on-premises systems, multiple public clouds, microclouds, and colo facilities.

Ubiquitous computing hands off naturally to edge computing and fog computing. At the edge, data is processed at or near the source of its generation rather than being sent to a centralized server; ideally, such local processing reduces latency and bandwidth usage. Sensor-laden beer breweries, for example, are awash in edge compute capabilities.

The fog extends the edge to include additional layers of compute nodes between the edge and centralized systems and cloud platforms, supporting data storage, processing and capacity along its journey. Across the edge and through the fog, your smartphone may turn on your lights while your smartwatch may turn off your oven.

These paradigm shifts are already busting traditional datacenter boundaries.

Workload Placement Is Evolving

What does this mean for IT leaders? It’s all about the applications and data.

Today, most IT leaders support workloads and data across myriad locations due to the accumulation of applications employees use. Yet as applications become more distributed across multiple clouds and on-premises systems, they generate more data, and the resulting data gravity makes them harder to move.

Accordingly, organizations must move compute capabilities closer to where that data is generated. That old chestnut about real estate? Location, location, location? Nothing is more apropos, as it pertains to the placement of applications.

Organizations must be ready to run any workload anywhere users wish to access it, while weighing performance, latency, security, cost and other factors.

No workload embodies this more than generative AI, whose large language models require large amounts of compute processing. While most employees access GenAI services from computers and laptops today, more will expect to access them from mobile devices as technology advancements continue to shrink models.

From the core to the edge and everywhere in between, the silicon and infrastructure to support these applications, which may include software agents and ambient displays, must be diverse, flexible and capable of running anywhere. Ubiquitous computing indeed.

Multicloud-by-design Is the Future

To support critical business workloads, more IT leaders are cultivating strategies that place application workloads wherever it makes the best sense to run them.

IT leaders must architect infrastructure that supports data needs both within and outside the organization – from datacenters and colos to public clouds and the edge. With such multicloud estates, IT departments can run hundreds or even thousands of applications.

Such an architectural shift is more iterative than seismic, when you stop to think about it. The reality is that most organizations have been running applications in a variety of places based on business needs, developer preferences and shadow IT—an ad-hoc, happenstance multicloud.

It’s time for IT leaders to architect for the future, one whose tendrils have extended beyond the corporate datacenter and cloud platforms. The path forward is an intentional, multicloud-by-design strategy.

Dell Technologies’ APEX portfolio of solutions is designed to support your multicloud-by-design strategy and is delivered as-a-Service via a pay-as-you-go consumption model. The APEX portfolio also helps you run your preferred storage in public clouds, while also enabling you to run your favorite public cloud services on premises.

Learn more about how Dell APEX can help you allocate workloads across your multicloud estate.