This post was inspired by a Compiler episode on the same topic. Compiler is a Red Hat podcast about all the things you’ve always wanted to know about what moves tech forward, straight from the people who know it best. Check it out!

Let’s say you’re playing your favorite online multiplayer game when—all of a sudden—things start getting choppy. Your character isn’t moving, other players start freezing up, maybe the in-game music isn’t even coming through. 

Lag.  

This dreaded three-letter word is one of the biggest and most frustrating problems in modern online gaming. It happens when we have to wait for data to come our way—and that wait can stretch from a few milliseconds to whole seconds.

But what if there was a way to vastly reduce all lag in online gaming? With edge computing, that very well may be possible. In this post, we dive into what edge computing is, how data in online gaming traditionally travels, and why edge computing can solve the challenges of modern online gaming. 

What is edge computing?

Edge computing takes place at or near the physical location of either the user or the source of the data. 

By placing computing services closer to these locations (the “edge”), users benefit from faster, more reliable services while companies benefit from the flexibility of hybrid cloud computing. 

Edge computing is one way that a company can use and distribute a common pool of resources across a large number of locations. Here are some important edge computing terms to know:

  • Edge device: Any piece of hardware that processes data outside of the cloud.

  • Edge servers: Servers that process data at an edge location.

  • Edge node: Any cluster of edge servers where edge computing takes place.

Why lag exists in modern gaming

To understand how edge computing works in online play, we first need to understand the typical cloud gaming data path. For example, let’s take the experience of playing an online first-person shooter: a data packet’s journey begins the second you press a button on your controller to fire at another player.

Before the input data can be processed in the cloud, it first has to navigate through your home network and Internet Service Provider (ISP). Once the data reaches the cloud network infrastructure from the ISP, further delay is added by graphics rendering and video encoding. Finally, the data has to travel all the way back to you—and by that time, the data packet has taken maybe hundreds of milliseconds to complete its journey. 
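The round trip described above is essentially a sum of per-hop delays. Here's a minimal sketch of that arithmetic—every latency figure is an illustrative assumption, not a measurement:

```python
# Illustrative breakdown of a cloud-gaming round trip.
# All latency figures below are made-up examples for demonstration.
HOPS_MS = {
    "home network + ISP (upstream)": 15,
    "ISP to cloud datacenter": 40,
    "game simulation": 10,
    "graphics rendering": 16,
    "video encoding": 8,
    "cloud back to the player (downstream)": 55,
}

def round_trip_ms(hops):
    """Total time from button press to the frame showing the result."""
    return sum(hops.values())

print(round_trip_ms(HOPS_MS))  # prints 144
```

Even with modest numbers at each hop, the total easily passes a tenth of a second before you see your shot land.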

Hundreds of milliseconds may not sound like a lot of time, but it can make a huge difference when you’re playing something like a first-person shooter. Take our example as a case in point: Thanks to the lag, you fire your shots a bit later than you intended—and the player you were aiming at gets away.

Bummer! 

Hopefully this helps you understand why lag in online gaming exists: Data packets may move incredibly fast (roughly two-thirds the speed of light), but since cloud servers are often so far away from where the data originates (the user), it can take a few extra milliseconds for the packets to complete their journey to and from the server. In other words, distance is the chief factor in determining latency.
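You can see the distance factor with some quick back-of-the-envelope math. The sketch below assumes signals travel at roughly two-thirds the speed of light (as noted above), and the two distances are hypothetical examples:

```python
# One-way propagation delay at roughly two-thirds the speed of light.
SPEED_OF_LIGHT_KM_S = 300_000                     # ~3e5 km/s in vacuum
SIGNAL_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3   # ~200,000 km/s

def propagation_delay_ms(distance_km):
    """Milliseconds for a packet to cover distance_km one way."""
    return distance_km / SIGNAL_SPEED_KM_S * 1000

# A faraway datacenter vs. a hypothetical nearby edge node:
print(propagation_delay_ms(2000))  # roughly 10 ms one way
print(propagation_delay_ms(50))    # roughly 0.25 ms one way
```

Double those numbers for the round trip, and the gap between a distant datacenter and a nearby edge node adds up on every single input you send.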

How edge computing solves the challenges of modern gaming

When we're talking about edge computing, we’re shortening the distance the data has to travel. In other words, instead of having compute resources centralized at large, faraway datacenters, we’re pushing compute power close to the network edge—closer to you, the player. 

But how does this all work? With the help of edge infrastructure.

“Edge computing infrastructure” by NoMore201 is licensed under CC BY-SA 4.0

Edge infrastructure is a vital part of the network edge. It refers to micro-datacenters located close to the areas they serve, which act as gateways to the cloud and can store and process data faster than a distant large-scale datacenter.

The end result for gamers? Increased speed—and less lag! 

The same thing applies to mobile gamers as well. When a telecommunications service provider moves mobile workloads closer to the end user, it can be considered a new kind of mobile architecture. This architecture is called mobile edge computing or multi-access edge computing (MEC). MEC means co-locating edge devices with existing mobile network infrastructure—whether in a nearby building or even on the cell tower itself. 

So, let’s say you're playing an online game on your mobile phone. Instead of your phone communicating wirelessly with the closest cell tower and then being routed all over the map by an ISP, your input can be processed right at the cell tower by a microserver and the result sent straight back to you.

In this case, the microserver handles anything that doesn't need to go back to the main server, or that can be synced with it asynchronously over time. When things are less busy, that data can be pushed back upstream.
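Here's a toy sketch of that "sync later" idea: the edge node answers the player immediately and queues non-urgent updates to push upstream during a quiet moment. All class and method names are hypothetical:

```python
from collections import deque

class EdgeNode:
    """Toy model of an edge node that responds to players immediately
    and defers non-critical updates to the central server."""

    def __init__(self):
        self.pending_upstream = deque()  # updates waiting for a quiet moment

    def handle_input(self, player_input):
        # Process locally -- the player gets a response right away.
        result = f"processed:{player_input}"
        # Queue a record to sync upstream later, asynchronously.
        self.pending_upstream.append(result)
        return result

    def sync_when_idle(self, central_server):
        """Flush queued updates upstream, e.g. during off-peak hours."""
        while self.pending_upstream:
            central_server.append(self.pending_upstream.popleft())

node = EdgeNode()
node.handle_input("fire")
node.handle_input("move_left")

central = []          # stand-in for the main game server
node.sync_when_idle(central)
print(central)        # prints ['processed:fire', 'processed:move_left']
```

The key design point is that the player-facing response never waits on the round trip to the central server; only the bookkeeping does.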

Another tangible example of edge computing in video games is real-time strategy games, where an AI opponent runs on your computer. When your computer processes everything locally, there's an upper limit to what that AI can do.

If you offload that to a server in some type of edge infrastructure, however, all of a sudden you can play that same real-time strategy game against a very sophisticated AI. 

Neat!

Learn more about edge computing

In this post, we outlined what edge computing is and why video games are a compelling use case for it. To learn more about edge computing, including how Red Hat’s broad portfolio can help you implement and manage edge infrastructure, check out our edge resources.

About the author

Bill Cozens is a recent UNC-Chapel Hill grad interning as an Associate Blog Editor for the Red Hat Blog.
