Test Driving Database and Developer Consultants

Need some ideas on how to make sure you're getting the best consultants for your needs? I have a few ideas to add to my article earlier this year on the same topic. Because you responded so enthusiastically to that discussion, I wanted to continue it with you.

DevPro Staff

June 18, 2009

5 Min Read

Need some ideas on how to make sure you're getting the best consultants for your needs? I recently had an experience that suggests a great way to solve this problem.

SQL Server Consulting and Twitter?

I love SQL Server. As such, I have TweetDeck set up to scan for anything anyone might say (or tweet) about SQL Server. A few weeks ago, someone (unknown to me) tweeted that they were looking for a SQL Server Guru to help with some performance problems. Since I'm a SQL Server Consultant who specializes in helping with performance problems, I tweeted back with a link to my site, and told them to give me a call for a free initial consultation.

Long ago I determined that offering a free consultation would accomplish two primary things. First and foremost, it would give potential clients a way to 'try out' my skills and expertise without worrying about any up-front commitments. Second, it would give me a sense of their problems and environment while also letting me decide not to work with them if I spotted any nasty warning signs.

So, to make a long story short: a call was arranged, and after talking with this potential client about their performance problems for about an hour, I was brought in to help.

Little did I know what I was getting into.

Taking Consultants for a Test Drive

Turns out that this client had also hired some other consultants to work on the exact same problem. Better yet, they didn't tell any of us what they were up to. Why? Because they knew that their performance problems were likely more than skin deep.

Normally I can whip performance problems into shape pretty easily by tackling the common issues I see over and over in performance tuning work. In this case, things weren't as simple as I expected. I won't go into the details of what I found here, but suffice it to say, I ended up with a lot more than I initially anticipated.

So I let this client know exactly what I had found. Doing so took a couple of lengthy emails over a few days and a phone call or two to go over the findings in greater detail. But in the process I was able to give them the raw performance numbers I was seeing, along with a description of what SQL Server was likely doing internally to produce the kinds of problems we were encountering. The big thing I wanted to communicate, though, was that whereas I had initially told them they hopefully just needed a tune-up, I was now of the opinion that they were going to need some minor surgery - and I had numbers and data to back that up.
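To give you a rough idea of the kind of raw numbers I'm talking about (this is just an illustrative sketch, not the exact diagnostics I ran for this client), a snapshot of the server's cumulative wait statistics is a common starting point:

-- Illustrative sketch: list the top cumulative wait types recorded since the
-- last SQL Server restart, skipping a few well-known benign background waits.
SELECT TOP (10)
    wait_type,
    waiting_tasks_count,
    wait_time_ms,
    signal_wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP',
                        N'SQLTRACE_BUFFER_FLUSH', N'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC;

Captured before and after a change, numbers like these are what turn 'it feels slow' into evidence a client can actually act on.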

It was at about this point that the client let me know what was up - how they had actually had a couple of consultants working on this problem against different instances of their environment, and how we had all come to pretty much the same conclusion.

Of course, I'm happy to report that my approach to explaining what was going on, along with my philosophy of 'teaching them to fish' rather than merely handing them a 'fish' (the solution, in this case), was well received, and that they'd like me to continue working with them. But what struck me most about the entire experience was that they were a bit apologetic about how they'd 'test-driven' a number of consultants against their problem to find the best fit. In some ways, they couldn't help feeling that they had pulled a bit of a dirty trick. In my mind, what they had done should be repeated everywhere.

In a Previous Briefing...

I touched upon the topic of consulting in a previous briefing as well. In addition to mentioning how cleaning up after expensive consultants left me realizing that I could do better, I lamented that companies frequently don't do enough screening of consultants before bringing them on board. Case in point: how many companies ask to see the resumes of the actual consultants who will be walking in the door? Yet how many of those same companies would be equally lax in evaluating a new hire?

As a consultant, I'm therefore always on the lookout for ways to improve client success while raising the reputation of consultants in general, which is why I've always advocated increased scrutiny of consultants before bringing them on board (especially since so many of them are brought on in such critical capacities). This client, however, took things to a whole new level by actually having a number of different consultants work the same problem as a test case - to see how things would pan out. And while some consultants might balk at the idea, I personally love it.

The only drawback, of course, is that it's a bit of an expensive way to evaluate potential help. But if you're going to undertake a long-term project, the truth is that NOT spending a bit of money up front to 'test out' your consultants can cost much more in the long run. So the approach of testing out consultants really does have some merit - even if the cost seems like a problem at first blush. Likewise, if you know that you're going to need external help for a while, or will need it down the road, you can always take things slowly - just as you might if you were looking for a new full-time employee.
