
Examples of edge computing: how cloud architects are benefiting from cloud-to-edge architecture

Read these stories of cloud and enterprise architects as they build a playbook of great ways to take advantage of edge architecture.

Does your organization have an edge computing strategy? (28 votes tallied)

  Yes: 14 votes
  No: 4 votes
  Not yet, but I'm working on one now: 7 votes
  I'm not sure: 3 votes

In 2019, the value of the global edge computing market was $3.5 billion. By 2027, it's projected to hit $43.4 billion, according to an edge computing stats and predictions review from The Enterprisers Project.

But the real question is whether organizations are actually adopting its technology, and if so, how?

The how matters: edge computing spans a wide variety of architectures designed with different goals in mind. Here are a few innovative use cases and architectural considerations for edge computing, as told by the software and systems experts designing these new architectures.

Edge computing for faster data processing

When someone thinks of the latest cloud computing innovation trends, they're probably not thinking of the retail or fast-food industry. However, one very famous fast-food chain, Chick-fil-A, has combined edge computing with Kubernetes container orchestration and GitOps to maintain its long-standing reputation as an industry leader in service and hospitality.

The secret behind the chain's speed and proficiency is a Kubernetes cluster running in every store. Sean Drucker, an executive technologist at Chick-fil-A, says the chain serves a sandwich every 16 seconds at its peak. With well over 2,000 restaurants nationwide, that level of operational efficiency is achieved by hundreds of Internet of Things (IoT) devices reporting operational data back to the business. The fryers, grills, and refrigeration systems used for the food send telemetry to a three-node cluster with only 8GB of RAM per node, which processes the data to keep every station stocked and working at maximum capacity. These "little data centers," as Drucker calls them, drastically increase the speed of those insights and offer a great lesson in edge computing architecture.
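To make the pattern concrete, here is a minimal sketch of the idea behind those "little data centers": raw equipment telemetry is aggregated on the edge node, and only compact summaries would ever need to leave the store. This is an illustration, not Chick-fil-A's actual implementation; the device names and temperature threshold are hypothetical.

```python
from collections import defaultdict
from statistics import mean

class EdgeAggregator:
    """Aggregates raw equipment telemetry locally so only
    summarized insights leave the store."""

    def __init__(self, temp_limit_f: float):
        self.temp_limit_f = temp_limit_f
        self.readings = defaultdict(list)  # device -> temperature samples

    def ingest(self, device: str, temp_f: float) -> None:
        # Raw samples stay on the edge node; nothing is sent upstream yet.
        self.readings[device].append(temp_f)

    def summarize(self) -> dict:
        # Produce a compact per-device summary -- the only payload
        # that would be forwarded to the cloud.
        return {
            device: {
                "avg_f": round(mean(samples), 1),
                "over_limit": max(samples) > self.temp_limit_f,
            }
            for device, samples in self.readings.items()
        }

# Hypothetical devices in one store
agg = EdgeAggregator(temp_limit_f=350.0)
agg.ingest("fryer-1", 345.0)
agg.ingest("fryer-1", 352.5)
agg.ingest("grill-1", 300.0)
print(agg.summarize())
```

The design choice this illustrates is data reduction at the edge: the cluster turns a stream of raw sensor samples into small, actionable summaries, which is why insights arrive quickly even over a restaurant's modest network link.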

Edge artificial intelligence (AI): A risk-prevention tool versus a security concern

Juan Pablo, director of engineering at Artify, a web-based design editor, says he sees the potential of edge computing, though his company is still in the discussion phase of trying to understand just how edge fits the needs of their smaller organization.

"The fast rise of edge computing has shown that it has great potential to be used in any industry, but it's been difficult to implement it in real life due to logistics and infrastructure problems...However, we've been investigating and analyzing edge computing trends because it will be a very useful tool for data analysis," says Juan Pablo. "Now, security concerns involving AI technologies can generate a deep debate considering their pros and cons. Nevertheless, I like to see edge computing as a risk-prevention tool rather than as a technological menace. For example, by combining edge systems and new AI models, like DALL·E (an OpenAI program), new visual platforms can be created to identify real-time risk situations, such as industrial accidents, work accidents, or even car accidents. If AI visual models can understand the risk of a situation after seeing it through video cameras and communicate it to an edge network, high-risk situations can be prevented or attended to on time."

What Juan Pablo is describing is a symbiotic cloud-to-edge architecture in which AI inference runs on edge systems to improve processing speed and effectiveness. One of the benefits is real-time risk mitigation: key insights arrive faster because data no longer has to travel back and forth to a central cloud for processing. This kind of architecture is well suited to troubleshooting issues that can lead to system failures and need to be addressed quickly by the respective IT personnel. A few other real-life applications of edge AI are voice-recognition systems like Siri and Alexa and highly tailored edge-monitoring solutions for oil rigs that stream data.
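The core of that architecture can be sketched in a few lines: run inference on-site and send only high-risk alerts upstream, so raw video never crosses the network. This is a minimal illustration, not a real system; `classify` stands in for a local vision model, `alert` for the cloud notification transport, and the threshold is an assumed value.

```python
from typing import Callable

RISK_THRESHOLD = 0.8  # assumed confidence cutoff

def monitor(frames, classify: Callable[[bytes], float], alert) -> int:
    """Run inference at the edge and forward only high-risk events.

    `classify` stands in for a local vision model (e.g. a quantized
    network on the edge node); `alert` is the callback that would
    notify the upstream cloud service. Returns the alert count.
    """
    alerts = 0
    for ts, frame in frames:
        risk = classify(frame)          # inference happens on-site
        if risk >= RISK_THRESHOLD:      # only alerts cross the network
            alert({"timestamp": ts, "risk": round(risk, 2)})
            alerts += 1
    return alerts

# Stubbed frames and model scores for illustration
frames = [(0, b"f0"), (1, b"f1"), (2, b"f2")]
fake_scores = {b"f0": 0.1, b"f1": 0.92, b"f2": 0.4}
sent = []
n = monitor(frames, classify=fake_scores.get, alert=sent.append)
print(n, sent)  # one high-risk frame is forwarded; raw frames stay local
```

The point of the sketch is the bandwidth and latency win Juan Pablo alludes to: the edge node discards low-risk frames itself, and the cloud only ever sees small alert payloads.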

What about internal security concerns about implementing edge? "Edge computing, when managed by an effective DevSecOps program, doesn't pose any greater risk than other architectures," says Peter Clay, a former Deloitte & Touche chief information security officer (CISO) and CEO and founder of Cyberopz.

Peter identifies four key considerations to keep in mind when implementing your edge strategy:

  1. If a web server gets compromised, drop it and bring up a new one.
  2. Increase security monitoring to prevent surprises.
  3. Don't push mission-critical processes and services to the edge.
  4. Enable the IT team to preconfigure devices for testing prior to deployment.
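The first consideration, treating a compromised server as disposable, is the immutable-infrastructure pattern: never repair a suspect node in place, just drop it and provision a fresh one from a known-good image. Here is a toy model of that idea; the `Fleet` class and `edge-N` naming are illustrative, not part of any real tooling.

```python
import itertools

class Fleet:
    """Toy model of consideration 1: a compromised edge server is
    dropped and replaced from a known-good image, never repaired
    in place."""

    def __init__(self, size: int):
        self._ids = itertools.count(1)
        self.servers = {self._new() for _ in range(size)}

    def _new(self) -> str:
        # Stands in for provisioning from a pre-built, tested image
        # (consideration 4: preconfigure and test before deployment).
        return f"edge-{next(self._ids)}"

    def replace(self, server: str) -> str:
        # Drop the suspect node entirely, then bring up a fresh one.
        self.servers.discard(server)
        fresh = self._new()
        self.servers.add(fresh)
        return fresh

fleet = Fleet(size=3)                # edge-1, edge-2, edge-3
fresh = fleet.replace("edge-2")      # compromised node is discarded
print("edge-2" in fleet.servers, fresh in fleet.servers)  # False True
```

Because the replacement comes from a preconfigured image rather than a patched live host, the fleet size stays constant and no compromised state survives the swap.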

Wrap up

Clearly, organizations are taking edge computing seriously. There are many ways to realize the advantages of computing capabilities housed at the environment's perimeter, local to the work that needs to be done—just look at the effectiveness of Chick-fil-A's implementation.

Edge computing will continue to grow over the next decade and become an even more integral part of the cloud computing architecture.

Does your organization have a strategy for edge? Let us know in the poll above.

Topics: Edge computing, Cloud, DevOps

Marjorie Freeman

Marjorie is the Community Manager for Enable Architect.

