What does it take to succeed in IT in the era of Cloud?
Part 1 of a 3-part series
March 11, 2016
IT has always been a tricky career field. Many people think of the rapid rate of change in IT systems, technologies, and employment as something that started with the advent of cloud-based services, but you can clearly trace the same concerns of today’s IT administrator back for 40 or more years, to a time when mainframes in glass-walled datacenters were the normal way that enterprises obtained their business computing. As the market-dominant technologies in use have changed, so has the set of skills required to stay in the game. One of the most common concerns I hear when I speak or write about Office 365, Azure, and other cloud services is simple: “How do I upgrade my skills to stay relevant as the things I now manage move to the cloud?”
Two of my former coworkers from Summit 7 Systems, Office 365 MVP Ben Curry and SharePoint guru Brian Laws, came up with an answer. Rather than the traditional mix of skills we’re all familiar with, they argue for recognition of a new breed of “cloud pro” in place of the tried-and-true IT pro. This is a valuable model, one worth exploring to see how it fits with trends in the labor market, the technology world, and the other forces that shape work opportunities in the IT field.
How we got to now
Back in the old days, computing was necessarily specialized to an extreme degree. There was a sharp distinction between the programmers who wrote application software, the systems programmers who developed tools and operating systems, the systems architects or analysts, the hardware engineers responsible for keeping the mighty beasts running, the operations staff (often responsible for mundane tasks such as changing tapes or cleaning out punched-card machines, but also for job scheduling and user management), and so on. There was a degree of mobility between these fields, but not as much as you might think: the skills each job required often had little overlap, and the large variety of hardware and software in use meant that many people stuck with the same platform or technology for most of their careers. This was the first stage of what we now think of as the IT pro role: highly specialized, task-specific workers whose skills were mostly developed through on-the-job training.
The emergence of ever-smaller computers led to a growing democratization of IT work. This started with the advent of dedicated word processors from companies such as Wang, Lanier, and Exxon. These systems didn’t require (and most couldn’t accept) programming, didn’t need design services from systems architects, and could usually be maintained by someone with little or no IT-related skill. It didn’t take long for general-purpose computing to follow the same path. First came minicomputers suitable for branch and regional offices, or for smaller companies; those led to departmental and workgroup computers and, thanks to aggressive exploitation of Moore’s Law, eventually to more, and more powerful, computers on desks throughout the world. (At the same time, the parallel “bottom-up” path that brought us home computers accelerated this transition, but the consumerization of corporate IT is a topic for another article.)
During this period, the typical career path for an IT pro was to start in some sort of support capacity, perhaps on a help desk or doing end-user hardware support, then move up the ranks to technologies, and positions, of increasing responsibility. This path neatly matched the trajectory of most organizations’ IT deployments: they started with a small number of centralized IT resources, but as computers, network connectivity, email, and file/print sharing became less expensive and less complex, those resources spread to smaller and smaller organizations. Let’s consider this the second phase of the IT pro role: more skill generalization, but still mostly self-taught or job-based learning.
By the early- to mid-1990s, several trends had become clear:
Moore’s Law powered an incredible expansion of computing capacity, putting more computing devices, with far more storage and power, into more places faster than almost anyone had expected. Pick any 10-year span from the history of enterprise IT and compare its advancements to those in any other field over the same period; it’s astonishing how much, and how fast, computing and storage technologies developed.
This accelerating technical capacity meant that more and more tasks, services, and processes could, and would, be computerized or automated. At the same time, local and wide-area networks, and the gradual opening of the Internet to commercial use, meant that individual organizations’ networks were becoming steadily more complex and capable. The result was increased demand for people across every layer of the enterprise IT world, handling everything from an executive’s printer problems up to wide-area network and connectivity issues.
The demand for skilled administrators, developers, and architects led employers to seek a standardized way to measure skills and competence. This in turn led major IT vendors to offer certification and training programs, such as Microsoft’s original Microsoft Certified Systems Engineer (MCSE) program and Novell’s Certified NetWare Engineer (CNE), to provide that measurement.
Both IT pros and employers flocked to these certification programs. An untold number of people studied their way into jobs by cramming for and passing the certification exams, getting hired on the strength of having “MCSE” or “CNE” after their name, and then frantically learning everything they could on the job. This was not necessarily a bad thing: the vendor certification standards ensured that most candidates learned something about the subject matter, even if many held little more than “paper” credentials. Throughout the late 1990s and into the early 2000s, having your MCSE (or the equivalent from another vendor) was almost a guaranteed foot in the door.
The continued broadening of the IT world led to a sort of re-specialization, almost a return to the old mainframe days: you could be, for example, a Microsoft Certified Database Administrator (MCDBA) and be very successful in your job despite not knowing much about Windows in general, TCP/IP, and so on. This was natural and expected as new systems and products were introduced and then grew in complexity. Planning an enterprise Exchange 2003 deployment for a 10,000-seat company, for instance, could be at least as complex a task as planning an Active Directory deployment for the same company.
The complexity and scope of IT operations and management continued to increase, and the broader market responded by placing high salary premiums on highly skilled positions. The vendor and training markets, in turn, poured out classes, books, videos, and other learning materials to help people chase those high-skill, high-dollar positions.
But the whole world of IT was about to make a huge shift, one we'll discuss in our next post. Subscribe to IT Pro's newsletter to get it sent right to your inbox.