I've been thinking about virtualization a lot recently. I became a fan while using VMware for development testing at BindView and then Symantec. I became a cheerleader for virtual machines at BP, once again for development testing. There's just something about the idea of a machine in a known good state as the starting point for a test that I really dig. I also liked having several operating systems at my disposal so I could test my application on various versions for inconsistencies.
I asked some folks who run data centers about their use of virtualization, and the ones who used it said they did so to save money. "We get higher utilization out of our existing servers" was the core idea I heard repeatedly. The folks who said they didn't use virtualization said it was too expensive. Hmm: one group cites cost savings as a reason to adopt something, while another cites expense as a reason to avoid the very same thing. Something isn't right here.
A phone call with my father, an Economics professor currently in New Zealand (which I'm told is somewhere near Australia), got me thinking about the paradox in a new light.
To help test my theory I created a short survey with the help of Brent Ozar, SQL guru and virtualization expert at Quest Software. It should only take 5-10 minutes, and it's applicable whether or not your organization is using virtualization. If you work in a data center, please head over to http://bit.ly/Ukhv to help out. If you know anyone else who does, please forward the link on.
After I collect the responses I'll post some of the results (participants can get the full report) along with the revelation that kicked this whole thing off. Thanks for your help!