MLPerf Is Changing the AI Hardware Performance Conversation. Here's How

What’s improving machine learning? Is it competition, or is it something else?

Nvidia's DGX-2 supercomputer on display at GTC 2018 (Photo: Yevgeniy Sverdlik)

What should “performance” mean in the context of machine learning (ML)? With more supercomputers and cloud computing clusters supporting snap-judgment decisions every minute, artificial intelligence (AI) engineers are finding that it's just as important to improve how deep learning algorithms deliver their results as it is to improve the performance of the processors and accelerators that produce them.

Since May 2018, researchers from Baidu, Google, Harvard, Stanford, and UC Berkeley have been developing the MLPerf benchmark. It's a tool that measures the time consumed in training a machine learning model to the point where its inferences (its estimates or predictions) reach a target accuracy, such as 75.9 percent for image classification. ML system builders and architects have been invited to use MLPerf to gauge the accuracy and performance of their systems, and perhaps along the way to do a little bragging about them.
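MLPerf itself publishes full reference implementations for each workload; the sketch below is only a hypothetical illustration of the time-to-train idea in Python, with train_one_epoch and evaluate as stand-in callables rather than anything from the actual benchmark:

import time

TARGET_ACCURACY = 0.759  # quality target, e.g. top-1 accuracy for image classification

def time_to_train(model, train_one_epoch, evaluate):
    """Run training epochs until evaluate() reports the target accuracy,
    returning the epoch count and total wall-clock seconds consumed."""
    start = time.perf_counter()
    epochs = 0
    while True:
        epochs += 1
        train_one_epoch(model)
        if evaluate(model) >= TARGET_ACCURACY:
            return epochs, time.perf_counter() - start

# Dummy stand-ins so the sketch runs: accuracy improves a bit each epoch.
state = {"acc": 0.70}
train = lambda m: state.update(acc=state["acc"] + 0.01)
score = lambda m: state["acc"]
print(time_to_train(None, train, score))

Because the clock runs until the quality bar is met, a system that grinds through epochs faster but converges more slowly can still post a worse score.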

However, in June 2019 MLPerf was expanded, incorporating a new testing category into what has suddenly become a suite of performance analysis tools. With the addition of an inference benchmark, MLPerf 0.5 can also clock the time a trained model consumes post-training in applying what it has learned to reach conclusions.
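Here, too, the real benchmark defines formal test scenarios (single-stream, offline, and so on); as a rough, hypothetical sketch of what "clocking" inference means, the following Python times per-query latency and overall throughput for any trained predict callable:

import statistics
import time

def measure_inference(predict, queries):
    """Time each query through a trained predict() callable and report
    per-query latency along with overall throughput."""
    latencies = []
    start = time.perf_counter()
    for query in queries:
        t0 = time.perf_counter()
        predict(query)
        latencies.append(time.perf_counter() - t0)
    total = time.perf_counter() - start
    return {
        "throughput_qps": len(latencies) / total,
        "median_latency_ms": statistics.median(latencies) * 1000,
        "worst_latency_ms": max(latencies) * 1000,
    }

# Dummy model that takes roughly a millisecond per prediction.
print(measure_inference(lambda q: time.sleep(0.001), range(100)))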

Time-to-Solution

Not every person involved in the operation of an ML system will be a data scientist, says Ramesh Radhakrishnan, technology strategist at Dell EMC and a contributor to the MLPerf development group. Without a complete understanding of what’s going on under the hood, the mechanism of ML can be completely opaque. A simple measurement, such as the total number of reasonably correct predictions, can go a long way toward giving everyone involved in the ML management process a basic competency.
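The article doesn't prescribe a formula, but a tally of "reasonably correct predictions" reduces to plain accuracy, as in this minimal sketch:

def accuracy(predictions, labels):
    """Fraction of predictions that match their ground-truth labels."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

# Three of four predictions match: accuracy is 0.75.
print(accuracy(["cat", "dog", "cat", "bird"],
               ["cat", "dog", "dog", "bird"]))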


About the Authors

Scott Fulton III

Contributor

Scott M. Fulton, III is a 39-year veteran technology journalist, author, analyst, and content strategist, the latter of which means he thought almost too carefully about the order in which those roles should appear. Decisions like these, he’ll tell you, should be data-driven. His work has appeared in The New Stack since 2014, and in various receptacles and bins since the 1980s.
