How Did GitLab Scale Up for the Slashdot Effect? Point and Click (June 2018)

Microsoft's acquisition of its competitor GitHub pushed more than 100,000 repositories to migrate to GitLab within 24 hours. Here's how GitLab's infrastructure team coped.

A GitLab's-eye view of its CI autoprocessing software already running on GCP. (Image: GitLab)

Hiding beneath Monday's headlines about Microsoft shelling out $7.5 billion to buy the open source software repository GitHub was an illustration of how much things have changed in the last 10 years or so. It appears that the Slashdot Effect is no longer as worrisome as it once was -- and not just because Slashdot.org is no longer a force to be reckoned with.

The "Slashdot Effect" -- for those of you who are new to the farm -- was a phrase once used to describe a sudden onslaught of traffic that was too great for a site's infrastructure to handle. The term came about because in the early days of the 21st century, a mere mention on the website Slashdot, or "/." as it was usually denoted, could send enough traffic to a site to make even big, well-prepared sites crash and burn.

Beginning last Friday, when rumors began to circulate that Microsoft might be preparing to purchase GitHub, competitor GitLab experienced something of a Slashdot Effect of its own, as Microsoft-wary open source developers started a mass exodus from GitHub. Evidently, an ample number of open source devs still want nothing to do with Microsoft, even four years after CEO Satya Nadella took the reins in Redmond and declared a new world order in which Microsoft "loved" Linux and all other things open source.

If you don't get it, don't try. It's a Hatfields and McCoys thing.

Although exact figures for the added traffic are unavailable, the company said on Friday that it was seeing "a 10x increase" in migrations. "We've seen over 100,000 repositories being migrated in 24 hours," Sid Sijbrandij, co-founder and CEO at GitLab, told Data Center Knowledge in an interview.

GitLab's infrastructure took this without breaking stride, although we did notice a brief hiccup on Monday morning, shortly after Microsoft's announcement, when live migration data disappeared for a while from the real-time graphs on GitLab's site.

So what kind of strategy did GitLab's tech team come up with to keep their servers up and running? It was as simple as a couple of clicks to fire up some extra virtual machines in the cloud.

"There were rumors before, so we talked about it as a team, and our production engineering team was standing by," Sijbrandij said. "What they did was put our static website behind the CDN, which was a good thing, and then during the day they scaled a lot of different things up."
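GitLab hasn't published the exact steps its production engineers took, but on Google Cloud Platform the kind of point-and-click scaling Sijbrandij describes maps roughly onto two operations: enabling Cloud CDN on the load balancer serving the static site, and resizing a managed instance group of VMs. A minimal sketch, with all resource names made up for illustration:

```shell
# Hypothetical sketch -- the backend-service and instance-group names
# below are invented; GitLab's actual setup is not public.

# Put static assets behind Cloud CDN by enabling it on an existing
# HTTP(S) load-balancer backend service:
gcloud compute backend-services update static-site-backend \
    --global --enable-cdn

# Scale a managed instance group of frontend VMs up to absorb the
# traffic spike; scaling back down later is the same command with
# a smaller size:
gcloud compute instance-groups managed resize web-frontend-group \
    --zone us-east1-b --size 40
```

Because both operations act on pooled cloud resources rather than physical hardware, they take effect in minutes, which is what makes the "couple of clicks" framing literal rather than hyperbole.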

GitLab didn't have to worry about provisioning any additional servers for its data centers, because all its IT infrastructure is located in public clouds. Until recently, it all lived on a single cloud, Azure [irony noted], but the Microsoft/GitHub brouhaha caught GitLab in the middle of its own migration, away from Azure to Google Cloud Platform.

It turns out that servers, data centers, colo space and the like aren't the only things GitLab is doing without.

"We don't even own an office," said Sijbrandij. "We're not renting one either. We're a remote-only company, so we have 270 people in 270 locations in 40 countries."

Which means that, given the economics of public clouds, GitLab is about as pay-as-you-go as a company can be. Pretty frugal for a company that was already hosting 4 million projects before the recent spike.

The company also isn't running any containers in-house -- it's almost all VMs -- although it offers extensive Kubernetes and container functions to its customers. The need to adopt containers in its own infrastructure is partly the reason behind its move to GCP.

"We believe Kubernetes is the future," Sijbrandij said. "We've already added the ability to add a Kubernetes cluster to your projects at GitLab and make sure that all the compute happens there. We want to make sure that we're using what our customers are using, so we want to make sure that we run GitLab.com on top of Kubernetes. We think Google, the authors of Kubernetes, has the best service for Kubernetes, so we wanted to move to Google."

Sijbrandij said that when the company was founded in 2011, it initially used AWS, then moved to Azure in 2015. "We got a big credit, being a startup, to move there," he said.

Like all examples of the Slashdot Effect, the mass migration that has put GitLab in the news won't last forever. When it's done, the company can scale down just as easily as it scaled up -- with a few clicks. That's the norm in the age of the public cloud, where VMs, containers, and even bare-metal servers are routinely deployed or removed on the fly.

Even though this traffic spike is only temporary, Sijbrandij sees it as the dawn of a new era for GitLab.

"I think the core of the message is that what's happening over the last few days is not just because Microsoft bought GitHub," he said. "It's because GitLab is graduating from being a code hosting application to being this complete single application for the DevOps lifecycle. I think the developments of the last few days are shining the spotlight on that. That's not going away; that's going to be a sustained change."


About the Authors

Christine Hall

Freelance author

Christine Hall has been a journalist since 1971. In 2001 she began writing a weekly consumer computer column and began covering IT full time in 2002, focusing on Linux and open source software. Since 2010 she's published and edited the website FOSS Force. Follow her on Twitter: @BrideOfLinux.
