Open Compute Project: Redefining Open Source for the Data Center

OCP has expanded the meaning of "open source" beyond software, applying the concept to data center hardware and facilities to solve many of the same problems open source software is meant to solve.

Microsoft's custom cloud servers, open sourced through the Open Compute Project, as seen at the OCP Summit 2017. Photo: Yevgeniy Sverdlik

When the Open Compute Project (OCP) was founded about a decade ago, it began redefining the term "open source." Until then, the term had primarily been applied to software, although to a lesser degree it had been applied to hardware as well.

At OCP, the term "open" applies to specifications covering all aspects of a data center -- from servers and network switches to power and cooling.

While this means OCP often uses the term differently than the open source software community does, sometimes the usage is exactly the same. That's because OCP shares some of the same concerns as the open source software community, such as avoiding vendor lock-in and staying vendor-agnostic.

"I like to think of vendor agnosticism or vendor neutrality as the slippery slope that took me all the way to open source," Rob Coyle, the co-lead of the Open Compute Project's OCP Ready program, told DCK. "When someone sees the value in a supply chain that can be made resilient by having multiple sources for a piece of equipment that can meet the same performance characteristics, the next logical step for me is going full open source."

On August 17, Coyle will give a presentation at Data Center World in Orlando called "Vendor Neutrality and Open Source: Optimizing the Data Center Design Process."

What Is OCP?

The Open Compute Project started life in 2009 as an internal project at Facebook called "Project Freedom." Two years later, Intel, Goldman Sachs, Rackspace, and Sun Microsystems co-founder Andy Bechtolsheim joined Facebook to form the Open Compute Project, tasked with sharing designs of data center products along with best practices.

"Basically, they were going to consume so much server and hardware technology about 10 years ago that they had to hand out all their design files so that companies could build it for them," Coyle said. "That's evolved to [include] the entire data center ecosystem, including facilities, heating and cooling, and many other things outside of software."

From there, designs originally intended to meet the needs of hyperscalers have been adapted for smaller data centers run by colocation operators and even for on-premises facilities.

"We see these technologies from the hyperscalars trickle down, and it's been happening for long before the Open Compute Project existed," he said. "We're seeing that trend continue and maybe even accelerate, with Facebook, Microsoft, and LinkedIn largely using the open source OCP technology, and we're seeing colocation facilities join our OCP Ready program to tell the marketplace that their facilities are ready to accept open hardware."

OCP Ready is a self-assessment program that allows data center operators to check their facilities against OCP guidelines, which cover everything from floor weight ratings to doorway heights and cooling provisioning.

The guidelines include OCP's 21-inch Open Rack design, which can accommodate power densities of up to 40kW per rack. (The Open Rack is wider than, and not compatible with, Open19, another open source rack specification that originated at LinkedIn and is now a Linux Foundation project.)

Coyle said that although "the 19-inch standard rack is the prevailing approach," the OCP racks do offer advantages.

"We have no connection points at the rear of our rack," he said. "As long as there's sufficient airflow, you can take one of our racks and push it almost all the way up against a wall and you can have access to all required cabling at the front of front of our racks."

Data center operators using the racks can also take advantage of a recycling program for hyperscaler equipment that OCP participates in with companies such as ITRenew.

"You can have hardware that has been refreshed and repurposed that is relatively very new at a much cheaper rate than you pay purchasing servers brand new from a traditional vendor," Coyle said. "A lot of the technology that companies like Facebook purchase [is bought] 18 months before it's available on the open market. They're using it for about three years, so when it comes onto the secondary market, somebody can essentially be buying technology that's only 18 months old at a much lower initial cost.

Also with "a much lower total cost of ownership," he added, "because of technologies like onboard battery storage, so you don't require a central UPS. You could eliminate that from the design of your data center, and in a smaller on-prem that could be substantial cost savings."

Vendor Lock-in and COVID-19

We asked Coyle for a brief summation of the presentation he'd be giving at this year's Data Center World.

"It's an overview of the current market conditions," he said. "How the last year or two with the pandemic has really accelerated the need for vendor agnosticism, how the next level of that is to take things open source, and some examples of how that's worked out really well."

"The last year, at least in North America and the United States, where our GDP was cut in half, we built millions of square feet of data centers while the supply chain was severely constrained," he added. "If you were married to one manufacturer or vendor based on your specification or other limitations that you had built into the design of your data center, I believe you were at a significant disadvantage.

"Now, as the economy reboots and the rest of the world comes back online, and other industries start consuming these raw materials that we run our data centers with, it's only going to get harder" he said. "We have to think in a new way and avoid these legacy ways of thinking to protect our supply chains and keep these projects on schedule and on budget."

This year's Data Center World will be held August 16-19 at the Orange County Convention Center in Orlando, Florida.


About the Author

Christine Hall

Freelance author

Christine Hall has been a journalist since 1971. In 2001 she began writing a weekly consumer computer column and began covering IT full time in 2002, focusing on Linux and open source software. Since 2010 she's published and edited the website FOSS Force. Follow her on Twitter: @BrideOfLinux.
