The Truth About Edge Computing


Back when the cloud was the next big thing, skeptics questioned its reliability, its durability and, above all, its security. Over time, each concern has been addressed and largely resolved. The wisdom of off-premises computing is now practically a given.

But cloud computing isn’t a religious issue. We can believe that moving critical applications and mission-critical data off local gear is strategically smart, safe and cost-effective, and still acknowledge that growing pains have tested, and will continue to test, the model. With the cloud’s maturity comes some degree of ossification and even inefficiency.

The introduction of edge computing

Over the years, I’ve sought to debunk myths and hype around cloud computing’s flavors of the month: public, private, hybrid, fog, etc. They all taste great. They’re all less filling. My point has been that terminology too often masks an intention to fix things that aren’t broken, to repackage and sell things that already exist and work well, and to find alternatives to solutions that have proven themselves eminently capable of enhancing business processes.   

As Upton Sinclair memorably put it, “It is difficult to get a man to understand something when his salary depends on his not understanding it.” Because the tendency in technology is to tease the “next next big thing,” the temptation to apply a bear hug to the latest and greatest can be hard to resist, whether or not we fully know what we’re embracing.

That’s where we are with edge computing. Before this bit of jargon fully morphs into a way of doing business, IT consumers, IT professionals and IT pundits all need to understand what it is substantively and where it lapses into change for change’s sake.

How edge computing can help

Unlike some of its vaporous predecessors in the IT realm, edge computing is legit. Edge strives to get business rules, and at least some data, closer to the user. While it’s easy enough to be bamboozled by buzzwords, the decision to push middleware closer to the user is, in some cases, a wise one. In a way, it’s cloud computing with a caveat, or an asterisk. “On-premises cloud” sounds like an oxymoron, but in a dispersed physical environment – say, a corporate campus with a data center serving satellite offices – edge computing makes sense.

Many of us naturally want local equipment to perform locally and thereby spare us from chasing around the internet. But that’s a pretty application-specific approach, and it’s not at all clear that every application needs to go in that direction. It’s important to ask whose ox is being gored in this scenario. Who has a vested interest in staying on premises or returning to the older model in a significant way?

Might that someone be old-line equipment manufacturers? Apart from the rationale of being closer to the user in some limited instances, it would seem that the biggest champions of edge computing are those who lost the most when workloads moved to the cloud. Where is the genuine demand for a methodology that brings servers back onto customers’ premises?

Today, some large players seek to commoditize the market. They want their service to be a commodity because it’s easier to compete simply on price than to educate the user community. So it’s fair to ask: does edge computing solve a real problem, or is it a backdoor through which the old guard can re-establish its hegemony?

So remember Occam’s razor: The simplest answer is often the right one. To avoid falling off the edge, opt for local computing only where you must, and the cloud everywhere else.
