
Implementation with your Population: How to be Confident your Digital Health Program will Succeed

After 7 years in tech pitching his own digital health business, Scott Taylor provides his insider's cheat sheet for catching out digital health vendors. In this last of 3 articles, Scott covers understanding “Relevance To Your Population.”


"One of these things is not like the others."

With their seminal Sesame Street song, the Muppets were clearly describing the translation of health research to real-world populations. Obviously.

“If you do a clinical trial on the Thunderbirds, and then tell me that I’ll get the same outcomes when implemented with the Sesame Street community, I’m gonna tell you you’re crazy,” said Dr Big Bird. “Puppets and Muppets are not the same.”

The same is true when evaluating whether a digital health program is right for your population. A salesperson will present results that show excellent outcomes from using their product, but those results will be with a very specific group of people under very specific conditions. So how do you determine whether the same results are going to be achieved with your members?

The answer is to ask targeted questions that cut through the sales spin. Don’t allow the salesperson to get away with vague, reassuring answers. Put them on the spot and make them answer with facts, examples and clear logic.

To establish how relevant a program is to your population, ask these three questions of the vendor:

  1. How did you translate your results to my population?

  2. What is your evidence for successful adoption with my population?

  3. What kind of implementation support and experience do you have with my population?

By the end of this article you’ll be armed with the knowledge necessary to critique how relevant a program is to your population, and ultimately pick digital winners from losers. 



Question 1: How did you translate your results to my population?

You ask this question to make the vendor's assumptions explicit, so you can challenge their logic. Also, for the fun of watching them gulp rapidly and stare at you in terror.

To convince you to sign up, the vendor will present you with evidence of outcomes (see the previous article on establishing “quality evidence base”) from previous programs. These could be:

  • Financial claims: “You’ll see a $3,500 per member per annum reduction in hospitalization costs for COPD-related events.”

  • Return-on-investment claims: “Our programs deliver a 3:1 ROI.”

  • Health claims: “After 12 months, patients achieve a 22% average reduction in LDL cholesterol.”

  • Quality measures: “You’ll see sustained improvements in 4 key HEDIS metrics.”

They’ll then probably tell you that you’ll see the same outcomes in your population, and quickly slither onto the next topic before you have a chance to ask too many questions. They don’t want you to think about their claims too hard, because they will be based on use cases that are different to your population’s.

As an example, let’s say you’re interested in a motivational fitness app for your members. You’re promised “an average weight loss of 14lbs in the most overweight users, and a corresponding 8% lift in mood measures, which correlates with a 4-day annual improvement in absenteeism rates!” Sounds great, right?

But let’s say that data is derived from a study of 250 Apple employees in Cupertino CA, and you’ll be rolling it out with 6000 Medicaid members in rural Texas. Between the Apple employees and your Medicaid members you’ll see big differences in technology adoption, health literacy and motivation, average age, educational levels, English-as-a-second-language, income stability, access to fresh food and fitness facilities, and so on.

Still confident in those amazing results?

To be clear, none of this is a problem. No one expects that a vendor has tested their solution with every population type. But what is a problem is when they aren’t forthcoming about these demographic and socioeconomic differences and the assumptions they used to bridge that gap. What you want to hear is something like:

“... We used smartphone and household internet adoption rates (citation) as a proxy for tech literacy. Based on this we estimated a ~30% lower enrollment rate in your population. We also used education levels (citation) as a proxy for health literacy, and adjusted down based on the 12-year age difference between the two populations. It’s difficult to be exact, but you can probably expect to see a 1-2 day improvement in absenteeism per member rather than 4 days.”

If you hear anything like this level of transparency in a sales pitch, grab a pen and sign that contract immediately. And then ask them to teach ethics to your firstborn child.
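That kind of back-of-envelope translation is just multiplication of the trial result by adjustment factors. Here is a minimal sketch; every number in it is an illustrative assumption, not a figure from any real program:

```python
# Back-of-envelope translation of trial outcomes to a new population.
# All numbers are illustrative assumptions, not real program data.

trial_absenteeism_gain_days = 4.0  # improvement observed in the trial population

# Hypothetical downward adjustments, each backed by a cited proxy:
tech_literacy_factor = 0.70    # ~30% lower enrollment (smartphone/internet adoption proxy)
health_literacy_factor = 0.55  # discounted for education levels and a 12-year age gap

expected_gain = (trial_absenteeism_gain_days
                 * tech_literacy_factor
                 * health_literacy_factor)

print(f"Expected absenteeism improvement: ~{expected_gain:.1f} days per member per year")
```

The precise factors are debatable, and that is the point: a vendor who shows you their factors and the proxies behind them gives you something you can interrogate, rather than a bare promise of "4 days."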

One of these things is not like the other
Demographic, socioeconomic and species differences matter when evaluating program success


Question 2: What is your evidence for successful adoption with my population?

My mom always said I should’ve been a politician. “Scott, when you get an idea in your head you’ll say the same thing over and over and over again until I want to scream ‘I GET IT.’”

So here I go again about real-world evidence. There is a huge gulf between talking about how something could be done in theory, and having it done by real people with their messy lives. In this case it's not enough for a vendor to talk in theory about how they will modify their product. You want to hear real-world evidence of successful adoption with people who look and think like your population.

Consider the difference between these two statements:

  1. “We know there are accessibility challenges with Medicaid populations, so for your program we will update the app to make the text bigger.”

  2. “30% of the members in our Florida Medicaid program struggled to see the app clearly. We were able to enroll members but they quickly dropped out. We ran workshops and realized that we had too much text, and the color contrast was low. So we created new illustrations and updated the brand colors, and have got that 30% down to 4%.”

This logic works across all aspects of your population’s demographics:

  • “Your population will have low tech literacy. We ran a program with an elderly population in Jacksonville where we leveraged their children to assist with app download and setup.”

  • “Your members won’t be motivated by educational materials. We had a low-health literacy group - smokers with a median age of 54 - who felt like no one cared and were only motivated by the compassion of a coach.”

  • “We’ll provide a $5-a-month phone data card for your population, because we’ve seen concerns around data usage really hold back adoption in similar populations.”

Push vendors for evidence of successful adoption and modifications that were made to the program to better suit the population. A vendor who can provide this type of evidence really inspires confidence in their ability to be successful with your population.

Expectation versus Reality
When it comes to program implementation, what you are promised can often be very different to what you get


Question 3: What kind of implementation support and experience do you have with my population?

OK, so you’re confident your digital health salesperson knows what they’re talking about when it comes to modifying the product for your population. The thing is, it’s not just the product that needs to be tailored: marketing and outreach, program design and evaluation need to be tailored too.

Let’s start with marketing. Marketing’s main job is to get your members to enroll. That means they have to 1) have a compelling message that speaks to your members’ problems, and 2) get that message out where your members will hear it.

OK, message first. Top-of-mind problems for your Texas Medicaid members are probably financial stability, being time-poor and worrying about getting food on the table. They don’t need to read “See your new yoga-trim body in just 6 weeks!” Rather, “Life is tough, we want to help by giving a little back” is probably going to resonate better.

Similarly, how you get the message out is important. Text messages or an endorsement from someone they trust (e.g. PCP) might work better for this group than sending out an email. You want to hear your vendor talk about their experiences with different channels and knowing what works from experience.

The same applies to the enrollment process; it needs to be tailored to your population too. If you’re targeting 70-year-olds in nursing facilities, you don’t want to hear that the vendor will be running ads on TikTok. Better to use an iPad with a visiting nurse or a receptionist who can help create their profile.

You get the idea. Other areas of implementation you want to hear being tailored include:

  • Program design: This covers adding or removing features based on your population’s needs. Things like surveys, coaching, web- or app-enrollment, liaising with HR or care teams and so on

  • Evaluation: There will be key metrics that are relevant to your program that need to be tailored. How will they capture the data for your HEDIS measures? How will they prove improved diabetes control? What will the monthly report show?

  • Plan to Pivot: No program is an immediate success. You want to hear how the vendor learns from a pilot and iterates the program, based on what they have done in the past

All this grilling is to ensure the vendor you choose has experience working with your population and that they understand the challenges your population may face… (get ready for it)… based on real-world experience. Mom will be proud.


Be Confident to Implement

By asking the right questions, you can ensure that the digital health vendor you choose understands your population, has a track record of success, and can provide the necessary implementation support to deploy the program effectively.

  1. How did you translate your results to my population?

  2. What is your evidence for successful adoption with my population?

  3. What kind of implementation support and experience do you have with my population?

These three questions are designed to give you confidence that a program will actually succeed. Or they might raise a giant red flag that your vendor has no idea what they’re talking about. In which case, press the trapdoor button, open the shark tank, and stroke your white cat.

Up Next - Behavior Science and Human Motivation

This article concludes the 3-part series on “An Insider’s Cheat Sheet For Digital Health.” The goal of the series was to give you the tools you need to dig into a vendor’s sales pitch, teaching you how to uncover “Genuine Engagement Metrics,” a “Quality Evidence Base” and now “Relevance to your Population.”

For our next series we’ll be turning to Scott’s Co-Founder Hugo Rourke. Hugo will delve into his favorite topic - behavior science and human motivation - with his series “Get Your Behavioral Science Degree in 5 Easy Reads.”


About the author

Scott Taylor is Co-Founder and CEO of Perx Health, a digital health company changing the way health plans engage with their high-risk members.

Perx enrolls a higher proportion of members, interacts with them more frequently and keeps them engaged for longer than any other digital health program. They achieve this by tailoring behavioral motivation strategies to the individual, ensuring the completion of 90% of critical daily care tasks like medication adherence, physical therapy and attending appointments.

Perx Health has already helped over 30,000 patients achieve better health outcomes and partnered with over a dozen healthcare organizations. Email us to learn more. We're always happy to chat.
