Tech Giants Join Linux Foundation's Connected-Cities Efforts

The recently announced Urban Computing Foundation might be a sleeper that will have a big positive impact on data center operators.

China Mobile's smart city display at Mobile World Congress 2019 in Barcelona (Photo: David Ramos/Getty Images)

Data center operators might be well advised to pay attention to a new project the Linux Foundation announced in May. Called the Urban Computing Foundation, the project has a mandate to accelerate the development of open source software designed for so-called "connected cities." According to the Linux Foundation, that includes software for mobility, safety, road infrastructure, traffic congestion, and energy consumption.

The project’s initial contributors include developers from well-known tech companies (Facebook, Google, IBM, and Uber), from a couple of startups (transportation-centered Interline Technologies and blockchain-based StreetCred Labs), as well as from MIT’s Senseable City Lab and the University of California San Diego.

Why would tech giants like Facebook, Google, and IBM be interested in jumping on the connected-cities bandwagon? The answer is simple: the cities that are already getting connected are generating enough data to make the bandwidth being used by major social networks seem trivial by comparison.

In February we talked with Simon Crosby, co-founder and CTO of Swim, which develops an open source edge-deployed platform for analyzing streaming data on the fly. He offered some impressive numbers on the data generated by traffic-monitoring cameras at busy intersections, a major component of the connected-cities concept.

"Just the little City of Palo Alto [California] is many times bigger than Twitter," he said. "Four terabytes per day from Palo Alto; in Las Vegas, it's close to 100 terabytes of data per day."
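To put those daily volumes in perspective for capacity planning, they can be converted into average sustained ingest rates. The sketch below is a back-of-the-envelope calculation using only the figures Crosby cited (4 TB/day and roughly 100 TB/day); the function name and the use of decimal terabytes are assumptions for illustration:

```python
def sustained_gbps(tb_per_day: float) -> float:
    """Convert a daily data volume in terabytes (decimal, 1 TB = 1e12 bytes)
    to the average sustained rate in gigabits per second."""
    bits_per_day = tb_per_day * 1e12 * 8   # terabytes -> bits
    return bits_per_day / 86_400 / 1e9     # spread over 86,400 seconds, in Gbit/s

print(f"Palo Alto:  {sustained_gbps(4):.2f} Gbit/s")    # ~0.37 Gbit/s
print(f"Las Vegas:  {sustained_gbps(100):.2f} Gbit/s")  # ~9.26 Gbit/s
```

Even averaged over a full day, the Las Vegas figure works out to a sustained multi-gigabit stream, which helps explain why most of that data is analyzed at the edge rather than hauled back raw.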

Traffic cameras help cities relieve rush-hour congestion and clear paths for emergency vehicles, and much of the data they generate is processed at the edge. The results of that analysis, however, must still be sent back to a data center for further processing and distribution. In Swim's case, after the data is analyzed at the camera location, "we deliver that to an API, currently hosted in Microsoft Azure, and delivered by a partner of ours, Tidalwave, who markets those insights."

The marketing of data collected by connected cities is an important consideration, because it suggests that the cost of installing and maintaining connected-city systems might not come out of taxpayers' pockets but from companies that can benefit from the data.

This is underlined by the fact that the initial software being hosted by the Urban Computing Foundation is Kepler.gl, a geospatial analysis tool created by Uber, a company that certainly benefits from real-time traffic information to keep its vehicles on the move instead of stuck in traffic.

"As a founding participant with the Urban Computing Foundation, Uber is honored to contribute Kepler.gl as the initiative’s first official project,” Uber's data visualization lead, Travis Gorkin, said in a statement. "Technologies like Kepler.gl have the capacity to advance urban planning by helping policymakers and local governments gain critical insights and better understand data about their cities."

The Linux Foundation said a technical advisory board, "which includes a variety of technical and IP stakeholders in the urban computing space," has been created and is developing an open governance model for the project. The advisory board will also manage the review and curation process under which new software projects are accepted.

About the Authors

Christine Hall

Freelance author

Christine Hall has been a journalist since 1971. In 2001 she began writing a weekly consumer computer column and began covering IT full time in 2002, focusing on Linux and open source software. Since 2010 she's published and edited the website FOSS Force. Follow her on Twitter: @BrideOfLinux.

Data Center Knowledge

Data Center Knowledge, a sister site to ITPro Today, is a leading online source of daily news and analysis about the data center industry. Areas of coverage include power and cooling technology, processor and server architecture, networks, storage, the colocation industry, data center company stocks, cloud, the modern hyper-scale data center space, edge computing, infrastructure for machine learning, and virtual and augmented reality. Each month, hundreds of thousands of data center professionals (C-level, business, IT and facilities decision-makers) turn to DCK to help them develop data center strategies and/or design, build and manage world-class data centers. These buyers and decision-makers rely on DCK as a trusted source of breaking news and expertise on these specialized facilities.
