.NET's Seismic Shift
You might not realize it, but the .NET shift is going to be a major seismic event. Use this overview to recognize the shift’s leading indicators and prepare for the changes it will bring.
July 23, 2002
Prepare your analytical applications for earthshaking changes
A major geological change is in progress. You've felt the tremors, but you might not appreciate the magnitude of the upcoming event. Pay attention to the warning signs because the .NET shift is coming, and you're probably developing analytical applications today that will be obsolete tomorrow.
When designing an application, every software developer needs to decide how much time to spend planning for the future. Every time you develop an application, you have to choose how flexible to make the application so that you can adapt it to changing needs in subsequent revisions. If you want to make the application infinitely flexible, it will take an eternity to develop. If you solve only the business problem you face today, you might end up rewriting the whole application when the next problem crops up. My strategy is to build infrastructure for changes you can anticipate and use modular programming techniques to make isolating future problems easier.
You can prepare for .NET's upcoming changes now as you develop your current applications. Watch for these major changes over the next 12 to 18 months:
.NET will be king of intercomponent communications, and COM will gradually go away.
XML will be the format for storage and communications.
XML for Analysis and ADOMD.NET will be the analytical programming interfaces of choice.
The final .NET shift will come with the commercial release of the next version of SQL Server, code-named Yukon. Preparing for these upcoming changes makes sense—and is less risky than continuing your current programming habits.
.NET Rules
Software development organizations have taken a while to realize what .NET's true benefits are. What's made this understanding particularly difficult is that .NET encompasses a collection of technologies, including XML, ASP.NET, C#, Visual Studio .NET, Simple Object Access Protocol (SOAP), and the Common Language Runtime (CLR). None of these technologies is particularly revolutionary by itself. But the sum of these technologies—and the programming approach that these technologies espouse—is what makes the .NET shift truly earthshaking. One piece of .NET brings most of the benefits: the .NET Framework. Built into this collection of programming interfaces is the ability to control Internet technologies such as XML, SOAP, Dynamic HTML (DHTML), and more.
The .NET Framework is language-independent, so Visual Basic .NET, C#, and Visual C++ .NET can all use the same interfaces. Language independence means that .NET has its own standard for intercomponent communications. Without .NET, when you want to access a Visual Basic (VB) software component from a C++ component, you use the Component Object Model (COM). With .NET, COM isn't necessary. A common concern is whether .NET will make COM obsolete or will coexist with COM indefinitely. I believe that developers will use COM less frequently, but COM won't become obsolete for a while because of the vast amount of COM-based software in use. Even Microsoft will take some time to retrofit its existing applications to .NET. Microsoft recognizes that its customers can't simply rewrite everything to .NET. Every software release needs compelling new benefits to win customers, and the work of making the .NET Framework part of an existing system includes rewriting existing code. Rewriting code for its own sake isn't compelling, but exploiting the capabilities of .NET to bring new features to your customers is.
The fact that .NET isn't compatible with COM is a roadblock to seamless adoption of .NET. You can call a COM application from .NET, but a native .NET application isn't necessarily compatible with an older COM version of the same application unless the .NET application was written to support COM. For example, if Microsoft changed its Office applications to .NET, existing third-party COM add-ins such as Hyperion's Essbase Spreadsheet, Business Objects' BusinessQuery, or Business Reporter (which my company, ProClarity, produces) probably wouldn't be compatible. So, should a company release a version of its product that's half .NET and half COM? Should it develop a whole new product based on .NET? If so, how will the company make the new product COM-compatible? Every Windows software-development company will have to weigh these questions to decide how to move to .NET.
XML for Storage and Communications
XML is a proven technology. If you're not already using XML to develop new applications, you should be. My company is on its third generation of XML-based products, and I can't imagine going back to pre-XML technology. XML is an open-architecture format, which means it's easily integrated with other applications. Previous methods were platform-specific (e.g., limited to Windows or Intel platforms). And XML is a self-documenting, extensible format for storing or communicating data (i.e., passing structured data between software components such as Web services).
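To see what "self-documenting" means in practice, here's a minimal sketch in Python (the element and attribute names are invented for illustration): a record stored as XML carries its own field names, so any consumer can interpret it without an out-of-band schema or fixed byte layout.

```python
import xml.etree.ElementTree as ET

# Build a small, self-describing record; every value is labeled by its tag.
order = ET.Element("order", id="1001")
ET.SubElement(order, "customer").text = "Acme Corp"
ET.SubElement(order, "total", currency="USD").text = "149.95"

doc = ET.tostring(order, encoding="unicode")
print(doc)

# A consumer can navigate by name rather than by byte offset:
parsed = ET.fromstring(doc)
print(parsed.find("customer").text)
```

Because the structure travels with the data, the same document can be read by any platform with an XML parser, which is exactly what makes the format open-architecture.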
XML is verbose, so many developers have resisted using it. XML can use more than twice the storage space that binary data uses. XML requires parsing (interpretation), so reading it is slower than reading binary data. However, these limitations have largely been mitigated by compression and easily available, high-performance parsers. Compression techniques significantly reduce the size of XML when you transmit it over a network. Examples of currently available parsing technologies are Microsoft's COM-based parser, Microsoft XML Core Services (MSXML), and the public-domain Simple API for XML (SAX), an event-driven parsing interface with freely available implementations. The .NET Framework also has a high-performance native XML parser. Because XML is a simple, standard, cross-platform, open-architecture format, its benefits far outweigh its limitations.
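The compression point is easy to demonstrate. In this quick Python sketch (zlib standing in for whatever compression your transport uses), most of XML's size penalty turns out to be repeated, highly compressible markup:

```python
import zlib

# A verbose XML payload: the same tags repeat for every row.
rows = "".join(
    f"<row><id>{i}</id><value>{i * 2}</value></row>" for i in range(1000)
)
xml_doc = f"<table>{rows}</table>".encode("utf-8")

compressed = zlib.compress(xml_doc)
print(len(xml_doc), len(compressed))  # the repeated markup compresses well
```

On a payload like this, the compressed form is a small fraction of the original, which is why network transmission of XML is less costly than the raw document size suggests.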
Choosing XML for Analysis and ADOMD.NET
Analysis application developers have to choose which programming interface to use and suffer the consequences—or reap the benefits—of that decision. Some third-party interfaces exist. For example, ProClarity's Analytic Platform and AlphaBlox's AlphaBlox are both third-party interface options. But the leading programming interface choices are ADO MD, XML for Analysis, and ADOMD.NET.
ADO MD, the programming interface that Microsoft introduced with SQL Server 7.0, is an extension of ADO that supports OLAP Services' and Analysis Services' multidimensional capabilities. Like ADO, ADO MD is a COM interface built on OLE DB for OLAP. Supplied on the client by PivotTable Service, OLE DB for OLAP is a lower-level data-access interface that communicates with the database server by using a proprietary binary protocol. If you use ADO MD, you have to install Microsoft Data Access Components (MDAC) on the client PC. MDAC is large (more than 10MB) and frequently requires a reboot during installation, so it isn't Internet friendly. (An Internet-friendly component is one that you can quickly install across the Internet without prompts so that it doesn't interrupt your Internet browsing experience.) However, ADO MD is fast. Because OLE DB for OLAP caches cube data on the client and understands MDX queries, ADO MD can respond to queries without a round-trip to the server.
The emerging .NET approach, XML for Analysis, helps solve the deployment problems of MDAC but introduces new performance challenges because no caching happens on the client. XML for Analysis is a communications protocol based on HTTP and XML; it provides the client application the same functions as ADO MD does. The primary advantage of using XML for Analysis is that it doesn't require you to install client-side components. Thus, you can develop an XML for Analysis client application for platforms that MDAC doesn't support. For example, you could develop a Macintosh, Linux, or Pocket PC application that communicates through XML for Analysis.
Unfortunately, like XML, XML for Analysis is verbose (e.g., a query result can be very large) and requires a server round-trip. Also, XML for Analysis is stateless, so the client application has to provide additional information for every request (e.g., what database the query pertains to, data source properties). Another drawback to using XML for Analysis is that Analysis Services doesn't natively support it, so you have to use the middle-tier XML for Analysis implementation that Microsoft released in the past year, which doesn't have adequate support for different Internet security models. Microsoft will likely remove this limitation when it releases a version of Analysis Services that includes native support for XML for Analysis. For more information about how this protocol bridges the two worlds of client/server and .NET, see "XML for Analysis," April 2001, InstantDoc ID 19846.
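To make the statelessness concrete, here's a Python sketch of the kind of SOAP envelope an XML for Analysis Execute call carries. The catalog, data-source string, and cube name are invented for illustration, and the envelope follows the general shape of the XML for Analysis specification—check the spec for the exact elements your provider expects. Notice that the catalog and data-source properties must travel with every single request, because the server remembers nothing between calls:

```python
def build_execute_request(mdx: str, catalog: str, data_source: str) -> str:
    # Because XML for Analysis is stateless, context that a stateful API
    # would remember (which database, which data source) must be restated
    # in the <Properties> element of every request.
    return f"""<SOAP-ENV:Envelope
  xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
  <SOAP-ENV:Body>
    <Execute xmlns="urn:schemas-microsoft-com:xml-analysis">
      <Command><Statement>{mdx}</Statement></Command>
      <Properties><PropertyList>
        <DataSourceInfo>{data_source}</DataSourceInfo>
        <Catalog>{catalog}</Catalog>
        <Format>Multidimensional</Format>
      </PropertyList></Properties>
    </Execute>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>"""

req = build_execute_request(
    "SELECT Measures.MEMBERS ON COLUMNS FROM Sales",  # hypothetical cube
    catalog="FoodMart 2000",
    data_source="Provider=MSOLAP;Data Source=local",
)
print(req)
```

The repetition of context on every call is the price of a protocol that any HTTP-capable client can speak—and it's also part of why each request is more expensive than a cached, stateful ADO MD call.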
A hybrid of ADO MD and XML for Analysis, ADOMD.NET is a .NET interface that has almost all the same objects, methods, and properties that ADO MD has. So if you're used to ADO MD, you'll have no problem using ADOMD.NET. The significant difference between the two interfaces is that ADOMD.NET uses XML for Analysis to communicate with Analysis Services, whereas ADO MD uses OLE DB. This difference is crucial because it means that with ADOMD.NET, you get no caching or MDX intelligence on the client. Every request to ADOMD.NET will turn into an XML for Analysis request to the server. The most serious limitation of ADOMD.NET is that it isn't yet generally available. By the time you read this article, ADOMD.NET should be in beta testing; Microsoft expects general availability in fall 2002.
How Can You Prepare?
Decision time is now. Many factors go into your organization's plans for the .NET future, and you'll want to choose carefully to be sure the technologies you work with will serve your organization now and as changes occur. Here are some questions you might consider:
Is the application you're developing new or an enhancement to an older application?
What's the application's expected lifetime?
Is backward compatibility important for your customers?
Is multiplatform support important?
Do your users need to use the application when they aren't connected to the server?
How time-critical is the development of the application?
How much do you know about XML and .NET?
After answering these questions, you'll understand more clearly the trade-offs associated with using XML for Analysis or another analysis API.
Here are some recommendations to help you develop an action plan. First, make sure you understand and use XML. All emerging Internet technologies, including .NET, are based on XML. If you don't understand XML, you'll be lost. Second, if you're developing a new application or a major new portion for an old application, consider using one of the .NET languages. You'll be in a better position to take advantage of .NET technologies, you'll be more productive, and you'll produce a better result.
Third, think about building in some metadata caching to make the switch to ADOMD.NET easier. Consider the effect that client-side caching will have on your application's design. If you're using ADO MD today, you're getting excellent response time from metadata queries. Metadata queries are simple requests for lists of metadata such as dimensions, levels, members, and measures. Because ADO MD caches this metadata, you can make frequent requests at time-critical points in your application with little or no cost in performance. However, if and when you switch to ADOMD.NET, metadata requests will cause server round-trips that might significantly increase processing time and cause an application error such as a timeout.
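One way to build in that metadata caching, sketched here in Python (the fetch callable and key names are placeholders for whatever ADO MD, ADOMD.NET, or XML for Analysis request your application actually makes): memoize each metadata list the first time it's requested, so repeated time-critical lookups never trigger another round-trip.

```python
class MetadataCache:
    """Memoize metadata lists (dimensions, levels, members, measures)
    so that only the first request pays for a server round-trip."""

    def __init__(self, fetch):
        self._fetch = fetch  # callable that performs the real server request
        self._cache = {}

    def get(self, kind, *args):
        key = (kind, args)
        if key not in self._cache:
            self._cache[key] = self._fetch(kind, *args)  # round-trip happens here
        return self._cache[key]

# Hypothetical stand-in for a real metadata request:
calls = []
def fetch_from_server(kind, *args):
    calls.append(kind)
    return [f"{kind}-A", f"{kind}-B"]

cache = MetadataCache(fetch_from_server)
cache.get("dimensions")
cache.get("dimensions")   # served from the cache; no second round-trip
print(len(calls))
```

Isolating the caching behind one small class like this is exactly the kind of modular seam that makes the later switch from ADO MD to ADOMD.NET a local change rather than an application-wide rewrite.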
For example, imagine that in your application you want to implement a fly-over hint (the little label that appears when you move your mouse over an icon on a toolbar, for example) that displays metadata such as member properties for a dimension member. In such a case, you might have to rework the program code that supports the hint when you switch to XML for Analysis or ADOMD.NET. Safer alternatives are to cache part of the metadata or to avoid this type of time-critical request. If caching sounds like too much to tackle, you can include a request for member properties in the MDX of each query you perform. Then, the member properties for each dimension member in the query result will be available for a fly-over hint.
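The member-properties-in-the-query alternative uses MDX's DIMENSION PROPERTIES clause on an axis. The cube, dimension, and property names in this Python sketch are hypothetical, but the clause itself is standard MDX; the point is that the hint data rides along with the query result instead of requiring a separate metadata round-trip:

```python
def with_member_properties(axis_set: str, properties: list) -> str:
    """Append a DIMENSION PROPERTIES clause (standard MDX) to an axis set
    so member properties come back with the query result itself."""
    return f"{axis_set} DIMENSION PROPERTIES {', '.join(properties)}"

# Hypothetical dimension and property names:
axis = with_member_properties(
    "[Product].[Product Name].MEMBERS",
    ["[Product].[Product Name].[Color]"],
)
mdx = f"SELECT {{[Measures].[Sales]}} ON COLUMNS, {axis} ON ROWS FROM [Sales]"
print(mdx)
```

Every cell set returned by a query built this way already contains the member properties, so the fly-over hint can be filled in from data you have on hand.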
Shake Up Your Routine
If you had some advance warning that a major earthquake was about to occur, wouldn't you be irresponsible if you didn't use that information to prepare? Because you know .NET is coming and can anticipate some of the changes it will bring, you should take this information into account as you develop your next analytical application. You might avoid a bigger shake-up down the road.