Learn the Benefits and Drawbacks of Edge Computing

This exciting technology can extend compute power to places and scenarios that were previously out of reach – but it carries some very real risks.

Stuart Burns

December 21, 2018


Edge computing can cover anything from a tiny single-board system to a rack of mountable appliances. Put simply, data collection happens locally, and only the useful, aggregated data is uploaded to the cloud for further computation or decision making. Good examples include remote device diagnostics and telemetry: the data these devices log is fed back to manufacturers or support teams to help them troubleshoot, improve reliability, and even identify issues likely to occur in the future so they can be remediated before they cause unscheduled downtime or a reduction in service.

The unifying factor is that, in all these scenarios, the connection to the cloud infrastructure as we know it is network limited, whether in capacity, cost or, sometimes, reliability.
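To make the pattern concrete, here is a minimal Python sketch of the "aggregate locally, upload only the summary" approach described above. Every name in it (the sensor function, the aggregation window, the upload step) is a hypothetical illustration rather than anything taken from the report.

```python
import random
import statistics
import time

# Hypothetical sketch of the edge pattern: sample telemetry locally,
# aggregate on the device, and send only the compact summary upstream.

def read_sensor():
    """Stand-in for a local telemetry reading (e.g. a temperature probe)."""
    return 20.0 + random.random() * 5.0

def collect_window(samples=60, interval_s=1.0):
    """Collect raw readings locally; the raw data never leaves the device."""
    readings = []
    for _ in range(samples):
        readings.append(read_sensor())
        time.sleep(interval_s)
    return readings

def summarize(readings):
    """Reduce a window of raw data to the aggregated values worth uploading."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "min": min(readings),
        "max": max(readings),
    }

def upload(summary):
    """Placeholder for the cloud upload; a real device would POST this summary
    to the manufacturer's or support team's telemetry endpoint when a
    connection is available."""
    print("uploading summary:", summary)

if __name__ == "__main__":
    window = collect_window(samples=5, interval_s=0.1)  # short window for demo
    upload(summarize(window))
```

The point of the sketch is the shape of the workflow, not the specifics: the expensive, bandwidth-hungry raw data stays at the edge, and only the small aggregate crosses the constrained network link.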

This report outlines the challenges and risks of edge computing, identifies environmental factors, steps through correct data-collection processes, then offers recommendations on how to reduce ongoing support costs. For anyone considering expanding their IT operations to include remote devices and the Internet of Things, this primer is an excellent backgrounder on what you'll need to know before computing on the edge.

 
