Google Launches Private Docker Container Registry Service

Offers to take private registry management and security off corporate IT's shoulders

Data Center Knowledge

January 24, 2015

Docker Founder and CTO Solomon Hykes (Photo: Docker)

As enterprises test the waters to see how well Docker containers work for their developers and the applications they build, security becomes an ever more crucial point to address.

The predominant way container images have been stored has been a public registry accessible over the Internet and maintained by Docker, the company. That’s not good enough for security- and compliance-sensitive enterprise users, and several alternatives have emerged.

New York startup Quay.io started out offering a hosted private registry for Docker containers. In August 2014, another startup, CoreOS, acquired Quay.io and quickly introduced a version of the registry users could deploy in their own data centers, behind their own firewalls.

In December, Docker, a company built around a piece of open source technology, previewed a private container image registry as its first commercial product. Docker Hub Enterprise can be hosted on premises or in private or public clouds.

Google, which has used containers to run its own infrastructure for years and has been a big supporter of Docker since the project's early days, today took things a step further. The company announced a private container registry service running on Google Cloud Platform. The aim is to give users a secure private registry without the pain of securing and managing it themselves.

Online retailer zulily is one of the first users of the Google Container Registry. “Private registries help, but they need valid certificates, authentication and firewalls, backups, and monitoring,” Steve Reed, principal engineer at zulily, said in a statement. “Google's container registry provides us with a complete Docker registry that we integrate into our development and deployment workflow with little effort.”

The service hosts a user's private images in Google Cloud Storage under the user's Google Cloud Platform project. This way, the only people who can access the images are those with access to the project. Users can push and pull images through the Google Cloud SDK command line.
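The article does not show the exact commands, but a minimal sketch of that push-and-pull workflow, assuming the beta-era Cloud SDK docker wrapper and using your-project-id and my-image as placeholder names, might look roughly like this:

    # Tag a locally built image into the project's gcr.io namespace
    docker tag my-image gcr.io/your-project-id/my-image

    # Push it to the private registry via the Cloud SDK's docker wrapper
    gcloud preview docker push gcr.io/your-project-id/my-image

    # Pull it on any machine whose credentials grant access to the project
    gcloud preview docker pull gcr.io/your-project-id/my-image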

Images are automatically encrypted before being written to disk. They are cached in Google’s data centers and can be deployed to Google Container Engine clusters or container-optimized VMs available on Google Compute Engine.

The service is currently in beta and free of charge for now. The only things users pay for are the Google Cloud Storage capacity used to store the images and the network resources they consume.

