Mimecast’s Peter Bauer on Exchange in the Cloud, e-Discovery, and Data Protection
Mimecast's CEO Peter Bauer discusses trends in the ever-changing Exchange market, such as Exchange in the cloud, and provides details about how Mimecast addresses data protection needs, e-discovery and data retention policies, and data residency restrictions.
April 17, 2012
The importance of messaging systems such as Microsoft Exchange has greatly increased as companies have begun to use their email systems for in-depth collaboration. The Exchange community has been constantly evolving to keep up with the increasing demands on email, resulting in Exchange in the cloud via Office 365, strict data retention requirements, and extensive data protection needs. At the Exchange Connections conference in March, I had the opportunity to sit down with Mimecast’s CEO and cofounder, Peter Bauer, to discuss how Mimecast addresses the ever-changing needs of the Exchange community.
Megan Keller: Peter, can you tell me a bit about your background and Mimecast’s background?
Peter Bauer: I started out life in the tech business as a Microsoft certified systems engineer focused on Microsoft Mail 3.2, which was the product before Exchange 4. I lived and worked in South Africa. Then I started a business that did a lot of application development on top of the earlier versions of Exchange and helped Microsoft in South Africa bump Lotus Notes out of a lot of enterprise opportunities because our business helped [Microsoft] showcase the workflow and the forms and the application development capabilities using the Microsoft stack.
Through that experience, I saw how important email was becoming to corporate communications and corporate workflows. When I moved to the UK in 2002, I met my cofounder at Mimecast, who’s also a South African guy -- I sold my first company to a public company in South Africa in the late ’90s. He and I spent a lot of time talking about how all of the point solutions that exist around an email system create a very complex environment to manage and to run. And I think that most vendors at the time had been thinking about the specific problems they were trying to solve, like long-term email archiving, either for storage or for compliance reasons, antispam and antivirus type services from a security management point of view, high availability technology to make email systems resilient or rapid to recover, disclaimers and brand management inside of email, and encryption. So each one of them was solving all of these problems individually and bringing out products to solve these problems, but in doing so creating a bigger problem, which was complexity. So we looked at it and thought “Okay, it’s a bit of a geeky idea, but could we create a single piece of software that could do all of these things and really be a one-stop-shop, ideal companion to a Microsoft Exchange or a corporate email system environment?”
And as if picking fights with all the major point solution vendors wasn’t a foolhardy enough idea, our next big idea was could we build this as a large, multi-tenant Software as a Service infrastructure. We felt that it could be a lot more accessible and easy to adopt if you didn’t have to check something out and buy something new, but you could simply connect up to it and use whichever services you needed relative to whatever your own deployment or existing estate looked like. And over time, you could trend toward using more and more of our stuff as you felt you wanted to retire things.
So over the past nine years we’ve converged this functionality and continued to mature the platform so that it meets or exceeds best-of-breed status in each of those areas. The combination of that rich, converged functionality and the cloud-based delivery model extending on-premises environments has become a very popular service for people running Exchange. So today we have over 5,000 companies that use it -- it’s about 1.3 million end users total -- we store terabytes of data for companies, and we’ve been able to expand into three regions -- North America, obviously we started in the UK and South Africa, but we really have clients all over the world.
Keller: Do you mind giving some details about what you discussed in your keynote?
Bauer: It’s interesting. I remember when I signed up for MCSE courses in 1996, the structured database, the relational database, was the core value of an enterprise environment. And I fully intended to become a SQL -- what was it, 4.3 or something at the time -- person, and through some scheduling error I ended up on the mail track quite begrudgingly. It’s funny how things work, but what’s turned out is that SQL Server and relational databases and structured data sit at the core of line-of-business applications. But the longer term value sits inside the unstructured databases that companies run. Try as we may to get unstructured data into and around lots of other types of systems in the business, from intranets and collaborative stuff, we’ve seen how all roads lead back to email because it is frictionless and ubiquitous. Although it’s very unstructured, it’s so convenient. So you end up using email for even the most sophisticated collaboration exercises or activities, and it’s a very interesting database as a result. It’s not perfect; there’s lots of frustration and heat pointed at email in terms of information overload, but it’s there and it’s not going away. I think people in 2011 and 2012 are starting to realize that intranets or collaboration environments aren’t going to save us. We have got to figure out how to be smarter with email and how to take the email experience forward, as opposed to pretending it’s something else and trying to move into a new collaborative paradigm in another environment.
Keller: So at the core your keynote is really about getting IT pros ready for the cloud, right?
Bauer: I think it’s looking at and perhaps even dispelling some of the myths about the cloud. I’ve spent a lot of time thinking about the evolution from mainframes to client server because I turned up in IT just as that shift was happening. And I remember going in to pitch Windows NT–based systems and being laughed out of the room by the UNIX crowd who said “That’s not secure, that’s not mature enough, that’s not enterprise ready.” They were prepared to have it as a file and print server somewhere; this certainly wasn’t a platform to run any applications on.
Obviously, nowadays much more of the world is computerized, so the stakes are higher; the way that change happens really matters. It’s quite interesting to see certain approaches and certain strategies playing out again. An example is something I remember from the client server world: terminal emulation. So you have your mainframe systems and you have all of these back-end things. You present inside your Windows machine a terminal emulation box and you persuade your user that they’ve got a Windows-like experience going on there. Or you write thin front ends that don’t integrate into the mainframe to do stuff, to create the facade of a client server experience, and you’re trying to get it to go as far as it can in that new world. But it’s not client server computing.
In the same way now in the cloud world, we have a lot of client server legacy technology and a strong interest in making it seem like it’s part of this brave new cloud world. The world doesn’t understand yet, but it’s not cloud computing at all. It’s pseudo cloud computing and no one wants to call it pseudo cloud computing, so they come up with other names for it like private cloud. But it’s not part of the future; it’s a temporary [solution] where people are trying to either make money or do the best with what they’ve got. But I think people shouldn’t be fooled -- the majority of mainframe systems haven’t survived; they were part of the solution, but they had to be replaced and rearchitected. In the same way, running virtual machines with lots of stuff built for the client server world and fronting it with cloud wrappers isn’t cloud computing. You can have a cloud-like experience, but don’t overestimate what you can get in terms of long-term benefit.
Keller: Everyone is wondering what will happen to their job with cloud computing.
Bauer: What are the opportunities that are emerging and what are the opportunities that are remaining? Having looked at some of the problems and aspirations for cloud computing and some of the mainstay motherhood and apple pie truths of how IT just has to deliver certain things, when you clear all the myths of the cloud, what are the things that still have to be thought about and solved by the people we have in the room? People can use this just to start their own dialogue in their heads about how their own careers might shift or evolve. And it’s not going to happen suddenly. I mean for some people it will -- “Sorry mate, we’re moving to the cloud and we don’t need four of you administrators. You got here first, so you can stay; the three of you clear out.” There will be abrupt things like that for some people, but they’ll just end up doing similar things somewhere else and then the evolution will happen over time. But you don’t want to be caught napping by it.
Keller: How does Microsoft’s push with Office 365 integrate or compete with Mimecast’s products?
Bauer: First, it integrates and extends in a couple of areas. We add a belts and braces layer of high availability, we add additional layers of security, and we add independent storage options and opportunities. But I think what’s perhaps more relevant is that email has always been a bit of a work in progress for most companies. They’re always at some stage of migration or upgrade or trying to solve the next issue around email. What’s quite common is that you’ve got companies that are in a bit of a hybrid scenario between perhaps Office 365 and various versions of Exchange. They may exist in that state temporarily while they migrate to something or they may exist in that state permanently because that’s the configuration that they want. A sophisticated email service point that can competently handle routing, consistent policy control, and long-term storage of data means you don’t have to shift too much stuff around between these systems. And a kind of pivot point with really rich capabilities that integrates well into those environments -- ties into things like Active Directory, fits in with Outlook and Windows Mobile, and maintains that user experience -- means that companies can exist in a much more comfortable state while all of this change is going on.
While there’s a lot of interest and enthusiasm around Office 365, a majority of companies are still running Exchange on premises -- the majority of them are still running Exchange 2003 and 2007 on premises -- so we’ve done a lot of work with Microsoft and some of the partners, like Binary Tree, to enhance the migration experience to help people get up to Exchange 2010 in most cases. And to do some of the heavy lifting in the cloud so that they can have a lean, mean local Exchange experience that they can migrate and that they can pivot more reliably and cost effectively. So we think that the majority of the market -- and I’m talking largely about the mid-market -- is interested in running a high-performance in-house Exchange environment and leveraging a service like Mimecast to do a whole bunch of the heavy-lifting peripheral stuff as a unified offering in the cloud, and getting the best of both worlds in a sense.
Keller: How does Mimecast ensure that data is protected and meets data residency restrictions?
Bauer: We’ve put a lot of work into that. I suppose there are two pieces: There’s the physical management of the data and the physical location of the data, and then the logical governance stuff that you wrap around it in that particular physical environment.
On the physical environment side, we’ve implemented hosting facilities in specific jurisdictions that the markets we focus on care about. We have facilities in the UK, we have facilities in North America, we have facilities in South Africa, and in a few other niche markets where there are particularly jurisdiction-sensitive requirements.
The second thing is then how do you manage those environments. For example, we just got our ISO certification, so there’s a lot of business process management and governance over that. Then within the application itself, auditing and logging how things are accessed and role-based permissioning over who can do what type of stuff. I think this is often stuff that is not fully thought through by new cloud entrants. We have some of the benefit of being relatively old as a cloud company, nine years old, and particularly with real depth in the legal sector -- we have about 800 law firms that use our system.
If you’re going to have a significant portion of the legal community who is really saying to you, “Here, take our data and keep it for us for 10 years and give us high-speed access to it,” it becomes very important that you meet all of the governance and policy and audit requirements that they would have had over their LAN-based systems. An aspect of the shift from client server to the cloud is that the initial set of cloud computing stuff is inherently going to be less mature and less feature rich, in the same way that the early stuff in the client server world was much less mature, much less stable, and less feature rich. Some years had to be spent rebuilding and recreating things in these new architectures.
Keller: How is Mimecast addressing the latest e-discovery practices?
Bauer: We act as a supporting function in e-discovery contexts. We have the data and we help surface the appropriate data for export into sophisticated e-discovery suites. A lot of the common man’s e-discovery stuff you can do right within our system. If you’ve got very complex, sophisticated, high-stakes cases, then you may have a suite of e-discovery specific stuff that’s got all of the super high-tech artificial intelligence that will also suck in other data types from other locations and other things and manage it. So we inherently have lightweight e-discovery capabilities, but we can easily interface into heavyweight things when the use case requires it.
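The lightweight, in-system e-discovery Bauer describes essentially amounts to filtering an archive on custodian, date range, and keywords before any export to a heavier suite. A minimal sketch of that idea follows; the `Message` schema and field names are purely illustrative, not Mimecast’s actual data model:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Message:
    sender: str  # custodian's address (hypothetical field)
    sent: date
    body: str

def ediscovery_search(messages, custodian, start, end, keyword):
    """Return messages from one custodian, within a date range,
    containing a keyword -- the common in-system case."""
    return [
        m for m in messages
        if m.sender == custodian
        and start <= m.sent <= end
        and keyword.lower() in m.body.lower()
    ]
```

A result set like this would then be handed off (exported) to a dedicated e-discovery suite for the high-stakes analysis the interview mentions.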
We focus on things like making sure the data is actually there for an e-discovery case. By the time you start looking for it, how do you feel confident that you’ve actually got it? So we soak up PST data, ensure that retention policies are applied and enforced through our system and that stuff hasn’t been deleted, and manage retention policies based on end-user filing behavior -- end users dragging stuff into folders -- assigning different retention policies and enforcing those all the way through.
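Folder-driven retention of the kind described can be sketched in a few lines. The folder names and retention periods below are hypothetical stand-ins; a real system would load admin-defined policies rather than hard-coded values:

```python
from datetime import datetime, timedelta

# Hypothetical folder-to-retention mapping (in years).
RETENTION_YEARS = {
    "Contracts": 10,
    "HR": 7,
    "Inbox": 3,  # default bucket
}

def retention_expiry(received: datetime, folder: str) -> datetime:
    """Date before which a message must not be purged."""
    years = RETENTION_YEARS.get(folder, RETENTION_YEARS["Inbox"])
    # Approximate a year as 365 days for this sketch.
    return received + timedelta(days=365 * years)

def purgeable(received: datetime, folder: str, now: datetime) -> bool:
    """A message may be purged only after its retention period lapses."""
    return now >= retention_expiry(received, folder)
```

The point of enforcing this in the archive rather than the client is that dragging a message into a folder changes its policy, but deletion is still blocked until the policy says otherwise.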
Another key issue is preventing or avoiding data loss. If you’ve got to retain data for many, many years, and you’ve got to be able to search and access it very quickly, you’ve got to keep it online. It’s not like you get a batch of [data]; it’s a stream. Every second of the day is the start of a new 10-year retention period. You can’t go to the top law firm in the world and say "Look, I’m sorry, we had a fire in the data center and all of the data up to 2009 is gone. Have you got a copy?" And you also can’t say, "Well it’s gone temporarily because we had a fire in the data center and we are busy recovering it from backup." I don’t know if you’ve ever transferred a terabyte of data, but recovering it takes days. And while it’s happening you’ve got a massive stream of new data coming in.
So you can never be in a situation where you’re trying to run primary, secondary, backup, and recovery environments. You have to run multi-master data repositories that are geographically dispersed. We’ve never had a data center go up in smoke, but it could happen and we have to have an architecture that recovers from that by itself. Your multi-master configuration means that any one of your copies of the data can return that data to whatever the end-user environment is -- it may be an end-user search on a mobile or inside Outlook or an e-discovery request through our admin interface. There are lots of themes in supporting e-discovery and some of them are quite basic, like making sure you actually have the data.
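The multi-master read path Bauer describes -- any surviving replica can serve any request -- can be sketched as follows. The replica names and the simulated fetch are hypothetical, not Mimecast’s actual architecture:

```python
# Sketch of a multi-master read: each geographically dispersed replica
# holds a full copy, so a read succeeds as long as any replica is up.
REPLICAS = ["us-east", "eu-west", "za-north"]  # hypothetical sites

class ReplicaUnavailable(Exception):
    pass

def fetch_from(replica: str, message_id: str) -> str:
    """Stand-in for a network call to one replica's data store."""
    # Simulate one site being down; a real system would make an RPC here.
    if replica == "us-east":
        raise ReplicaUnavailable(replica)
    return f"message {message_id} served from {replica}"

def read_message(message_id: str) -> str:
    """Try each replica in turn; any copy can satisfy the read."""
    errors = []
    for replica in REPLICAS:
        try:
            return fetch_from(replica, message_id)
        except ReplicaUnavailable as exc:
            errors.append(str(exc))
    raise RuntimeError(f"all replicas unavailable: {errors}")
```

Contrast this with a primary/backup design, where losing the primary means a recovery window; here the failed site is simply skipped and rebuilt in the background.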