The upcoming start to the Summer Olympics got me thinking about … technology. If that sounds a bit random, bear with me.
By and large, sports have rules to make
comparisons fair. Nearly all sports – with the possible exception of rhythmic gymnastics – have an objective basis for comparison. Judges, referees, linesmen, and timekeepers are given clear criteria that they monitor throughout the contest. Competitors who are faster, higher, or stronger win.
“No one ever got fired for buying IBM”
I’ve spent a considerable part of my career managing IT operations. A couple of key skills I foster in my teams are the ability to assess vendor claims and to perform product bakeoffs. The goal is to fairly assess all viable candidates while avoiding analysis paralysis. Even so, the admonition “don’t do something stupid that gets you fired” is an ever-present concern.
In the days of box-centric IT infrastructure, the process of comparing products was reasonably straightforward. It went something like this:
- Establish the overall technology goals
- Determine which products can help reach those goals
- Solicit and receive evaluation systems from candidate vendors
- Get pulled into other, unrelated activities
- Return the loaner gear without ever unboxing or testing it
- End up buying from the vendor with the best price / sales pitch / internal connections
As information technology has evolved, though, the evaluation process has gotten fuzzier. Systems have become more complex. Now we have to choose between physical and virtual, discrete and converged (and even hyper-converged!), proprietary and open, on-premises and cloud. Where do you even begin?
Some time ago, I found myself trying to justify paying for VMware’s hypervisor rather than relying on free alternatives. Each time I’d cite a feature that I felt set VMware apart, someone would say “I think Xen has that”, or “doesn’t KVM do something similar?” Before long, I began to question my own knowledge.
What was true? How could I find out quickly? Option A was to go to each vendor’s site and try to divine the truth – while steadfastly ignoring the pop-up windows inviting me to chat, download a free trial, or take a survey. Option B was to do a Google search for “hypervisor comparison matrix” and see where it led me. Option B won out, and I soon found myself on the WhatMatrix site.
… and the angels sang.
Here was exactly what I’d been looking for. WhatMatrix’s virtualization comparison enabled me to select and evaluate the products I’d been considering – as well as others I hadn’t thought of or even known about – across a variety of dimensions and criteria. They even did the buzzword disambiguation (e.g., vMotion versus Live Migration) for me. Now I could quickly winnow the field and focus my efforts. I could even print out a report to show to my warring clans of stakeholders. This truly was a better mousetrap.
Paying it Forward
As fate would have it, I later found myself needing a similar comparison of cloud management platforms (CMPs). So I reached out to the WhatMatrix administrators and asked if they had or were planning a CMP comparison. They enthusiastically thanked me for volunteering, and I found myself part of the WhatMatrix community. Soon I was developing criteria, soliciting vendor participation, evaluating submissions, and engaging in peer discussions.
I can honestly say that with WhatMatrix, what you see is what you get. They create common rating criteria and provide objective comparisons. Each submission is reviewed by multiple WhatMatrix contributors. Participants strive to be as fair as possible – even when evaluating their own products.
In short, WhatMatrix makes evaluating technology like judging a sport. Let the games begin!
Ephraim Baron – CMP Category Consultant (Cloud Management Platforms)