Jeff Ready, CEO, Scale Computing: On the evolution, use cases, and tailwinds of edge

James is editor in chief of TechForge Media, with a passion for how technologies influence business and several Mobile World Congress events under his belt. James has interviewed a variety of leading figures in his career, from former Mafia boss Michael Franzese, to Steve Wozniak, and Jean Michel Jarre. James can be found tweeting at @James_T_Bourne.

There is a long-standing joke around edge computing and driverless cars. Back in 2019, Mike Fay, then of Dell but now at Amazon Web Services (AWS), told an audience at the IoT Tech Expo event: “If an autonomous car is driving and a deer runs out, you can’t send those calculations to the cloud. If you do, you’re having venison for dinner.”

The need for lower latency to enable the types of computation required for edge use cases has never been more apparent. Jeff Ready, co-founder and CEO of Scale Computing, explains the rationale while also using the connected car theme.

“It’s a very practical reason of physics,” he tells Edge Computing News. “If somehow that car were connected via fiber optic cable, and a perfect packet of data went 2000 miles from New York to San Francisco and back again, that round trip at the speed of light takes 80 milliseconds. If you’re driving 70 miles an hour, you’ve gone 14 feet, you’ve already hit the animal.

“It doesn’t work until somebody goes faster than the speed of light – there’s no solution.”
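For readers who want to sanity-check the physics, the sketch below runs the back-of-the-envelope numbers. It is purely illustrative: it assumes light in glass fibre travels at roughly two-thirds of its vacuum speed over a perfectly straight 2,000-mile path, so it comes in below the figures Ready quotes, which presumably also absorb real-world routing, queuing and processing overhead. Either way the conclusion is the same: the car has covered several feet before any answer could possibly come back.

```python
# Back-of-the-envelope check of cloud round-trip latency vs. a moving car.
# Illustrative constants only - not figures from Scale Computing.

MILES_TO_METRES = 1609.34
C_VACUUM = 299_792_458                  # speed of light in a vacuum, m/s
C_FIBRE = C_VACUUM * 2 / 3              # light in glass fibre travels at roughly 2/3 c

ONE_WAY_MILES = 2000                    # rough New York-San Francisco distance from the quote
round_trip_metres = 2 * ONE_WAY_MILES * MILES_TO_METRES

# Best-case propagation delay: straight fibre, no routing, queuing or processing.
ideal_rtt_s = round_trip_metres / C_FIBRE
print(f"Ideal fibre round trip: {ideal_rtt_s * 1000:.0f} ms")        # roughly 32 ms

# Distance a car doing 70 mph covers while waiting on an 80 ms observed round trip.
car_speed_mps = 70 * MILES_TO_METRES / 3600                          # roughly 31.3 m/s
wait_s = 0.080
print(f"Car travels {wait_s * car_speed_mps / 0.3048:.1f} ft in {wait_s * 1000:.0f} ms")  # roughly 8 ft
```

The exact figures matter less than the shape of the result: even a physically ideal round trip leaves the decision too late, which is the argument for keeping that inference at the edge.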

Not every industry carries such high stakes, however. Ready notes one customer who uses high-definition cameras to examine cans on a manufacturing line. “There’s a pneumatic arm which pokes out the bad cans – we’ve all seen video of this sort of thing,” he says. “But it’s the same problem – the cans are moving so fast. If you had to go to the cloud and back to analyse an image that was captured, the can’s already gone.

“The edge is a natural complement, an offshoot companion, however you want to think of it, to the cloud,” Ready adds. “You run the applications where they make sense, and they are often interoperating with each other.”

Scale Computing has plenty of experience when it comes to assessing the progression from cloud to edge. The company is best known for its HC3 hyperconvergence line, launched in 2012 after Scale Computing had adopted KVM as its hypervisor. Google Compute Engine launched in 2013 with a KVM-based hypervisor, while Apple dropped VMware in favour of KVM two years later.

When it comes to edge, and the HC3 Edge series, the key, Ready asserts, is machine intelligence: infrastructure which can be deployed quickly, managed remotely, and trusted to heal itself.

The company was well aware of where the puck was going – and so too, it turned out, were its customers. “We knew that customers were demanding more and more of this automation – even more of those large enterprises that had the 100 IT guys and saw value in the automation,” explains Ready.

“I won’t say we’re the only ones, but there’s not many people who are trying to use machine intelligence across the entirety of infrastructure – servers, storage, networking, virtualisation, those core components.

“So these customers, some very large, highly distributed enterprise customers, folks that are in the retail space where you might have thousands of stores, or transportation. Or ocean-going vessels; you’ve got hundreds of oil tankers out at sea, with infrastructure on them, and obviously they don’t have an IT admin on board. They found us.”

While Ready admits to not being ‘clairvoyant’ with regard to this opportunity, he was right about how enterprises would adopt the cloud. “There was a moment in time where more than a few folks were out saying everything’s going to the cloud – and I always thought no, that’s not true,” he says.

“The cloud is important, but like the edge is a fancy way of saying outside the data centre, the cloud is a fancy term that means somebody else’s data centre. It’s not magic. There’s a data centre, there’s apps there, and some apps make sense to run in the cloud, some apps make sense to not run in the cloud.

“What’s put some real tailwinds behind the edge, I think, and why it’s now becoming so big, is that there’s other trends which expose more of the applications that make sense to run locally,” Ready adds. “There’s a lot going on with video – from basic security, to doing AI-type analytics on video, and you’re doing that processing – well that’s a lot of data. Sometimes you’ve got to make decisions in real time.”

Another element to consider, Ready notes, is that many companies were using edge technology without realising it. “I’ve seen this in retail in particular, although it’s true of many locations that are remote,” he says. “Managing remote locations, historically, is a huge pain. There is stuff running on-site, even if it’s a dedicated server to host cash registers, or to host the video cameras. If one of those servers dies, the application dies with it.”

Security is a key case in point, with so many companies having been publicly compromised in recent years. If you’ve got servers in remote locations, they are less likely to be fully up to date. “In retail – they know that’s bad, so they want to consolidate applications that are going to run, what I would say at the edge, they would say in their store,” says Ready.

So now they’re faced with the problem of a disparate stack: bare-metal servers, modern containerised applications, and legacy applications running on virtual machines which won’t move to containers for some time. These companies don’t want a different set of infrastructure for each use case, because that fragmentation was the original problem to begin with.

“So they start looking to consolidate infrastructure – and all of a sudden, what the customers are looking for is something that looks and behaves like a data centre,” says Ready. “It’s one pool of resources that you’re going to put your applications on; there’s redundancy and high availability, but they need it to run at the edge.”

Ready is speaking at the Edge Computing Expo event on February 9-10, and much of his talk will relate to real-world use cases similar to the above. “The power of edge computing – there’s a lot of talk around the latency, but the reality is there’s a lot of use cases for edge computing that are happening today that are solving real problems today,” he says.

“So what I’d like the audience to be able to see is – here are real world examples, not theoretical, but people who are actually implementing edge computing in some way, and the impact it’s having on their business. They may have a different use case, but they can see how this now applies.”

Picture credit: Scale Computing/YouTube

Want to learn more about topics like this from thought leaders in the space? Find out more about the Edge Computing Expo, a brand new, innovative event and conference exploring the edge computing ecosystem.

