Evaluate CTMS software like an IT pro – blog series: part 1
In this blog series we share our 10 years of experience in how to evaluate CTMS software. Today, part 1: "Why feature-based scorecards don't work, and the solution to the problem."
In a well-functioning world, the CTMS is the backbone of clinical research, providing the means to keep research on track and key stakeholders informed. It's meant to increase efficiency while cutting down on duplicative work. Why, then, are so many CTMS customers disappointed with their choice in the months after the sale?
Houston, we have a problem
What no one is talking about is the heavy burden on the team evaluating the CTMS, and their relative inexperience with technology. All they know is that what they have right now isn't working. As a result, they're tasked with going out into the huge world of software, all the while being pummeled with meaningless messages and slogans.
Remember, we're asking folks with scientific or clinical backgrounds to suddenly become IT experts. That's one heck of an uncomfortable stretch for many. The most common fear? That they won't even recognize the questions to ask during the evaluation process.
If you're in the market for a new CTMS, you'll have noticed that many vendors make similar claims. "Flexible," "Integrated," and "Streamlined Clinical Operations" are just three of the rather ambiguous promises that appear frequently. Which one is most flexible? Most integrated? Most able to streamline your clinical operations?
Feature-based scorecards don't work
Adding to that confusion is the advent of the feature-based scorecard. Usually implemented by the procurement department in an effort to compare like with like, it does seem like a good idea. The trouble is, most products now have most features. What is impossible to tell from this kind of scorecard is how each feature works in practice. Is it intuitive? Is it clumsy? Does it match your business process, or does it create more work for you in the long term? How many clicks does it take to accomplish what you want to do? Starting with a feature-based evaluation can therefore lead you down a path of false comparisons, with the deciding factor sometimes being an obscure, seldom-used feature that is unique to a particular product.
While we cannot tell you which system is best for your business, what we can do is give you an airtight process to help you evaluate the choices more efficiently. We can also identify the most common traps you might fall into, as well as the key questions to ask which will help you to peel back the shiny packaging and messages to get to the heart of the system. Our competitors are going to hate this, but quite frankly we really don’t mind that part.
Learn why “clicks count” and whether or not your vendor passes the “Pizza Test”
In this series we’ll provide you with a non-traditional yet more effective method for CTMS evaluation. You will learn:
- The most critical questions to ask vendors at the beginning of your search, which will immediately allow you to narrow down your vendor pool.
- An efficient 4-step process to apply to your remaining vendor candidates that will help you truly know whether the software fits your business.
- What the vendor is really thinking when it comes to contract negotiation.
The new scorecard
Part two of our series will provide you with far better insights than a scorecard by focusing on the how rather than the what. By following a proven process that examines exactly how a solution would work in your organization, and by asking the right questions, you can find the product and vendor that fits your organization perfectly, without having to be an IT expert.
Up next: The most important questions to ask in order to start eliminating potential vendors
All articles in the blog series
- Part 1: Why feature-based scorecards don't work, and the solution to the problem
- Part 2: The killer criterion that immediately narrows down your list
- Part 3: Are they going to march to your beat or will you have to march to theirs?
- Part 4: Bells and whistles, the monitor visiting process, and how to evaluate the user interface
- Part 5: The nuts and bolts of the pricing game