How Microsoft Can Fix Windows 8, Part 2: Virtualize Compatibility
When it comes to backwards compatibility, Windows is all things to all customers. But it's time to put legacy technologies on the back burner and turn Windows into a truly modern, componentized system that can be more easily optimized for a coming generation of PCs and PC-like devices.
December 18, 2010
While there are many ways to compare and contrast Apple and Microsoft, one of the starkest differences is how the two companies handle backwards compatibility. Where Apple is (perhaps overly) aggressive about removing legacy technology deadwood from its products, Microsoft values backwards compatibility above all else, and often leaves out-of-date technology in its own products, especially Windows, for far too long.
Microsoft's approach is seen as more customer-centric in the sense that it allows customers to more easily upgrade to new product versions as they appear. But there are downsides to endlessly supporting ancient technologies, and security is only the most obvious one. Windows is far more complex, more bloated, and harder to support as a result of this decision. And it makes it harder for Microsoft to make major strides when improving the OS.
Witness the debacle that was Windows Vista: There, Microsoft wanted to create a solid new foundation for Windows, one that would last a decade, but it had to do so within the technical and user experience confines of all the Windows versions that came before it. The result was a mess.
Windows 7, by comparison, gives the appearance of simplicity while retaining the same technical and user experience cores of its predecessor. The reason Windows 7 is seen as such a huge success where Vista was not is mostly perception, however, not reality. Under the covers, it's the same stew.
But imagine how stable, safe, and high-performance Windows could be if Microsoft simply abandoned all of the legacy technologies that have been holding it back for decades. Now stop imagining, since Microsoft will never do such a thing, and the point of this series isn't to engage in pipe dreams but rather to look at real solutions--general as they may be--that the software giant could actually implement.
Fortunately for Microsoft and all of its diverse customers, such a solution exists. We think of it as "virtualization," though as is the case with user state "virtualization," it's not important whether the technologies in question are truly virtualized. It's only important that they be cut off from the core, foundational parts of Windows and used only when absolutely necessary.
Thanks to the work that occurred in Vista, Windows is already componentized, though I'd warn you against believing that this means the OS is neatly compartmentalized into the clean, isolated blocks you see in Microsoft technical diagrams. But a lot of work has been done there to make Windows as componentized as possible, and this work can be seen most clearly in products like Windows Embedded Standard 7 and even Server Core in Windows Server 2008 R2.
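You can get a rough, concrete sense of this componentization from Windows 7 itself, which already describes much of the OS as a set of named, individually serviceable features. As an illustration only--this is existing servicing plumbing, not a claim about Microsoft's plans--the built-in DISM tool can enumerate those features and switch one off from an elevated command prompt (exact feature names vary by edition):

    dism /online /get-features
    dism /online /disable-feature /featurename:TelnetClient /norestart

That's servicing rather than the deeper architectural isolation I'm arguing for here, but it shows that the component boundaries already exist and are already exposed to administrators.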
In fact, using the Embedded version of Windows as a starting point going forward is one way to achieve the goal I've identified here. It's also a way to arrive at a smaller Windows core that would work well on iPad-like tablets and other non-traditional PC form factors. (This version would also require a simpler, device-like, touch-based user interface as an option, but that's another story.)
Sandboxing is another approach, and here Microsoft could look to its Windows Phone OS as a guide. There's also true virtualization, in the form of Hyper-V, where entire legacy systems could be bundled with a newer OS for compatibility reasons.
However it's implemented, the model we're looking for is the Classic environment Apple provided to early Mac OS X users, giving them a way to run older applications inside a separate, isolated environment that emulated legacy versions of Mac OS. As Mac OS X evolved, Classic was deprecated, removed as a default part of the install, and then finally removed altogether. Aggressive, yes. But the result is a Mac OS X that is lean and well designed. Most important, it is devoid of legacy (at least Mac OS legacy) deadwood.
From the user's perspective, this new modular approach would result in tiered experiences. Low-end, device-like PCs--like iPad-type tablets, Media Center-based set-top boxes, certain netbooks, and so on--would simply get the foundational pieces, or roughly what is now Windows Embedded. These systems would require a simpler new UI, compatibility with all Windows drivers, and compatibility only with modern Windows applications.
Move up the chain, and traditional desktop PCs and notebooks would add the traditional Windows user interface and the option for backwards compatibility--however it's implemented--though disabled by default. This perhaps deserves its own article, but I'll at least note here that Microsoft is taking a very short-term approach with its multiple product edition (SKU) model, which customers despise and are confused by. Again, we need to be pragmatic--Microsoft isn't going to go from several SKUs down to a single, Mac OS X-like SKU--but I think it's reasonable for there to be only a small number of SKUs, SKUs that people would understand, like Home, Business, and Ultimate (which is Home + Business). And Home should be that low-end SKU with the new, touch-based UI. That's all we need on the client.
Take it a step further and you arrive at the Server versions as well. These would build off of the same core as the client versions but would of course include the server-oriented software modules and optimizations that are central to that market.
Ultimately, what I'm talking about here isn't really all that radical, unless of course your concern is running an application that was written in 1998. In fact, it's just an extension of the current approach, albeit a far more aggressive one. And far more aggressive is exactly what Microsoft needs in Windows 8.