Local Break Out (LBO) and Its Role in Edge Cloud

As 5G infrastructure is built out, LBO (which has been around for years) has a new, crucial function.

It’s been a part of wireless networks for years, including as a component of 4G LTE. Recently, however, with the rise of edge computing, Local Break Out has become a topic of serious discussion in the data center market.

LBO is the part of the wireless network that lets selected streams of incoming data be diverted, like a fork in a river or a tap on a beer barrel, away from the network’s own addresses. It was proposed by communications engineers back in 2012 as a way to let roaming devices such as smartphones have their data routed to them by service providers other than the ones they subscribe to. At the time it was also under consideration by the European Parliament [PDF] as a kind of platform for a competitive market in roaming communication, a way for many networks to serve a single customer with voice and data.
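
To make the fork-in-the-river idea concrete, here is a minimal, purely illustrative Python sketch of the kind of decision a breakout point makes: flows bound for selected destination prefixes exit locally at the visited network, while everything else is hauled back to the home network. The prefixes and labels are hypothetical, not drawn from any 3GPP specification.

```python
from ipaddress import ip_address, ip_network

# Illustrative only: destination prefixes a visited network might serve
# directly from a local exit, rather than tunneling back to the home core.
LOCAL_BREAKOUT_PREFIXES = [
    ip_network("203.0.113.0/24"),   # hypothetical local CDN cache
    ip_network("198.51.100.0/24"),  # hypothetical nearby cloud on-ramp
]

def route_decision(dst_ip: str) -> str:
    """Return where a subscriber flow should exit: locally or via the home core."""
    dst = ip_address(dst_ip)
    if any(dst in prefix for prefix in LOCAL_BREAKOUT_PREFIXES):
        return "local-breakout"   # divert at the visited packet gateway
    return "home-routed"          # default: haul back to the home network

print(route_decision("203.0.113.42"))  # -> local-breakout
print(route_decision("192.0.2.7"))     # -> home-routed
```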

It became part of the 3GPP standards organization’s Release 8 of the global standard for wireless infrastructure – baked into the network. The standard is presently on Release 16, the second stage of 5G deployment, with Release 17 due to be finalized in early 2021.

With 5G turning up the volume on data throughput, LBO is becoming a kind of all-purpose socket for edge computing platforms to tap into the wireless network and connect to services and data providers upstream. One of the things this does is enable smaller platforms to provide data center services that are competitive, at least in some regards, with colocation and public cloud giants by leveraging wireless networks’ fiber optic links as their interconnecting fabrics.

Gateway Arch

“3GPP has defined the roles of home and visited networks while sharing the responsibility of providing the user experience or data service,” remarked Tom Nadeau, regarded by many as a “father of SDN” and presently a technical director at Red Hat. “The visited network is responsible for building its network infrastructure independently of the home network operator.”

LBO was designed to enable a shorter path between a wireless user and a roaming data network, leveraging the existing wireless packet gateway in the network being visited as the routing agent. In practice, however, LBO was expanded for the home network’s purposes, shortening the distance between the packet gateway and a nearby access point to the internet, or to a cloud service provider. In a note to DCK, Nadeau explained that 3GPP’s architecture for LBO is altogether agnostic about the infrastructure of either the remote visited service provider network that connects to it or the home network from which a connection may be made.

“The 3GPP SBA (Service Based Architecture) refers to a 3GPP-defined model of how [network] functions are built and chained to deliver mobile services,” Nadeau wrote. “These ‘services’ are built using software components called network functions.” The functions can run on a cloud platform such as Red Hat’s OpenStack Platform, he pointed out.

“LBO is an option to ‘short-circuit’ traffic at the local POP [point-of-presence] where the data plane exists to deliver edge services. S/PGW-U [the converged serving gateway and packet data network gateway in 5G, on the user plane of SDN] can be placed as a network function on OpenStack at the edge to handle Local Break Out.”
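
As a rough sketch of the placement Nadeau describes, the snippet below uses the openstacksdk Python client to boot a user-plane gateway VM at an edge OpenStack site. The cloud name, image, flavor, and network identifiers are placeholders, and in practice a deployment like this would be driven by an orchestrator rather than a hand-run script.

```python
# Hedged sketch: booting a user-plane gateway (the S/PGW-U role Nadeau
# describes) as a VM at an edge OpenStack site using openstacksdk.
# All IDs below are placeholders, not real values.
import openstack

conn = openstack.connect(cloud="edge-pop-01")  # named entry in clouds.yaml

server = conn.compute.create_server(
    name="spgw-u-edge",
    image_id="<user-plane-gateway-image-uuid>",
    flavor_id="<dpdk-capable-flavor-uuid>",
    networks=[
        {"uuid": "<radio-side-network-uuid>"},      # traffic arriving from the RAN
        {"uuid": "<sgi-breakout-network-uuid>"},    # local exit toward internet/cloud
    ],
)
server = conn.compute.wait_for_server(server)
print(server.status)
```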

In other words, no single part of the roaming data connection requires a peculiar type of server or unique network appliance. It’s an anything-to-anything connection. And at the point where just about anything comes together, why not build an entirely new data services market?

“At the edge, there’s this concept of an edge exchange,” explained Alex Marcham, technical marketing manager for the edge data center provider Vapor IO. An edge exchange is similar to a provider-neutral Internet Exchange: where an Internet Exchange gives networks more options for direct connectivity to ISPs and CDNs, an edge exchange gives operators similar options for outside connectivity at the edges of their networks. Conceivably, it could be a way for Amazon Web Services or another cloud provider to offer low-latency connectivity between its cloud platform and subscribing private facilities.

“In those cases, if you were to have an LBO connection from your packet gateway to one or more of those edge exchange sites you could potentially exchange data between, say, a Verizon and an Amazon without having to take your original long path back to a hyperscale facility,” Marcham said. “In that case, once you leave the packet gateway on that SGi interface, you are routable IP. You go into an edge exchange which has the same components as a traditional network exchange, in the form of a meet-me room, and the various letters of agreement that control those network connections through that MMR.”
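
A toy illustration of the choice Marcham describes: once traffic is routable IP past the packet gateway, an operator can prefer a cross-connect at a nearby edge exchange over the long haul to a hyperscale region. The paths and latency figures below are invented for illustration.

```python
# Hypothetical paths a flow could take after leaving the packet gateway.
PATHS = [
    {"via": "edge-exchange-mmr", "peer": "cloud-provider-edge", "rtt_ms": 4},
    {"via": "backbone",          "peer": "cloud-region-east",   "rtt_ms": 38},
]

def pick_path(paths):
    """Prefer the lowest round-trip-time path among those currently available."""
    available = [p for p in paths if p.get("rtt_ms") is not None]
    return min(available, key=lambda p: p["rtt_ms"])

print(pick_path(PATHS))  # -> the edge-exchange cross-connect in this toy example
```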

The Big Tent Pole

There’s an argument to be made, Marcham believes, for the development of an edge services ecosystem. It wouldn’t supplant the current cloud services-based economy, but rather supplement it with lower-latency options on a premium service tier.

In one scenario, for example, a cloud service provider could offer API access to an AI service, perhaps for rapidly processing photo or video data and identifying its contents. Today, a round-trip transaction involving a cloud data center in one of its regional availability zones would take too long for a near-real-time application. But on a premium tier, where the connectivity is arranged through an edge exchange, users could be linked to the same service over the much faster fiber optic network that wireless operators use for backhaul. LBO would provide both the on-ramp and the off-ramp into that fiber network.
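
A hypothetical client for that premium tier might try the edge-hosted endpoint first, within a tight latency budget, and fall back to the regional cloud endpoint if the edge path isn’t available. Both URLs below are made up for the sketch.

```python
# Illustrative client: try a hypothetical edge-hosted inference endpoint first,
# then fall back to the regional cloud endpoint. Neither URL is real.
import json
import urllib.request

EDGE_URL = "https://edge-pop.example.net/v1/identify"          # reached via LBO + edge exchange
REGIONAL_URL = "https://cloud-region.example.com/v1/identify"  # traditional long-haul path

def identify(payload: dict, edge_timeout_s: float = 0.05) -> dict:
    body = json.dumps(payload).encode()
    for url, timeout in ((EDGE_URL, edge_timeout_s), (REGIONAL_URL, 2.0)):
        req = urllib.request.Request(
            url, data=body, headers={"Content-Type": "application/json"}
        )
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                return json.load(resp)
        except OSError:
            continue  # unreachable or over the latency budget; try the next endpoint
    raise RuntimeError("no inference endpoint reachable")
```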

There’s a business to be forged there, believes Vapor IO’s chief marketing officer Matt Trifiro. Just whose business it will be, however, remains to be determined.

“Offering the operators the ability to deploy white-box hardware in facilities that they didn’t have to build and don’t have to carry on their balance sheets provides massive economic benefits, as well as massive time-to-market benefits,” he said. Conceivably, as much as one-third of the cost of operating an edge data center could be wiped off an operator’s balance sheet, and risk could be offloaded to a partner.

“If I’m Verizon, if I’m Amazon, I’ve got to put something there,” continued Trifiro. “And somebody’s already solved the networking and the colocation problem. It’s an order of magnitude more efficient and cheaper to use that than to build my own.”

“Customers are hoping to capitalize on the opportunities that edge computing can bring to their organizations,” noted Red Hat’s Nadeau. “While edge computing and Mobile Edge Computing are established concepts, what’s different are the new technologies that are making it easier to deploy and operate resources at large scale across broadly distributed locations, while at the same time enabling a new wave of applications and workloads that businesses can use to create differentiation, drive new business, or lower costs.”

In 1904, an electrical engineer named Harvey Hubbell happened upon the concept that electricity could become a viable industry if the network were to adopt standard sockets. Well over a century later, engineers believe, edge exchanges could become a similarly viable, world-changing industry, as long as everyone agreed on how to plug in.


