How Do You Track the Success of a Texting Program?

Getting program feedback is easier than you think. You don't need a complex research design or advanced statistics to gather valuable information from users about the efficacy of your program.

Evaluating your program can help you determine its quality through user satisfaction ratings and utilization. You can also track how many people act on the information provided in the text messages.

What Types of Evaluation Questions Can You Answer?

  • Were your program objectives met?
  • Will you need to improve and modify the overall structure of the program?
  • What is the overall impact of the program?
  • What resources will you need to address the program's weaknesses?

What Methods Can You Use to Evaluate Your Program?

There are a variety of methods that can be used to evaluate your program. A short survey, focus groups, and direct output metrics are a few ways you can collect data for your program evaluation.

You may find that, with the proper incentive, you can ask enrollees to take a brief survey to answer more specific questions. Most web-based systems allow you to track opt-ins, opt-outs, and other metrics over the course of your project. You should also consider the effectiveness of different marketing media on enrollment.
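
Once your system can export its data, these counts can be pulled together with a short script. The sketch below is only an illustration and is not tied to any particular texting platform: the file name (subscriber_events.csv) and the column names (event, marketing_source) are assumptions about what such an export might contain.

    # Minimal sketch: summarize opt-ins and opt-outs from a hypothetical
    # event export, and break enrollments down by marketing source.
    import csv
    from collections import Counter

    opt_ins = Counter()   # enrollments, grouped by how the subscriber heard about the program
    opt_outs = 0

    with open("subscriber_events.csv", newline="") as f:
        for row in csv.DictReader(f):          # assumed columns: event, marketing_source
            if row["event"] == "opt_in":
                opt_ins[row["marketing_source"]] += 1
            elif row["event"] == "opt_out":
                opt_outs += 1

    total = sum(opt_ins.values())
    print(f"Total opt-ins: {total}, opt-outs: {opt_outs}")
    for source, count in opt_ins.most_common():
        print(f"  {source}: {count} ({count / total:.0%})")

A breakdown like this also gives you a quick read on which marketing media are driving enrollment.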

Sample Survey Questions

The questions below are examples that can be used on pre- and post-test surveys; a simple way to tally the responses is sketched after the list.

  • How did you hear about this program?
  • Was this program useful?
  • Would you recommend this program to a friend?
  • What would you change about this program?
  • Did this program encourage you to behave differently?
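
Tallying the answers does not require specialized software. As a rough illustration only, the sketch below counts hypothetical responses to the "Was this program useful?" question and reports the share for each answer; the response values are made up.

    # Minimal sketch: tally responses to a single survey question.
    from collections import Counter

    # Hypothetical answers collected from a post-program survey.
    responses = [
        "Very useful", "Somewhat useful", "Very useful",
        "Not useful", "Very useful", "Somewhat useful",
    ]

    counts = Counter(responses)
    total = len(responses)
    for answer, n in counts.most_common():
        print(f"{answer}: {n} of {total} ({n / total:.0%})")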

Which Metrics Should You Consider During Evaluation?

  • The number of subscribers
  • How long the subscribers stayed in your program
  • Whether or not they responded to texts/surveys (see the sketch after this list)
  • Peaks in enrollment following specific marketing pushes
  • Location and other demographic descriptors associated with enrollment
  • The cost of your program relative to other programs
  • The amount of time spent by staff implementing the program
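
Several of these metrics, such as subscriber counts, retention, and response rates, can be computed from per-subscriber records in a few lines. The sketch below is a rough illustration; the field names and sample dates are hypothetical stand-ins for whatever your system exports.

    # Minimal sketch: retention and response-rate metrics from per-subscriber
    # records. Field names and sample data are hypothetical.
    from datetime import date

    subscribers = [
        {"enrolled": date(2023, 1, 5), "opted_out": date(2023, 3, 1), "responded": True},
        {"enrolled": date(2023, 1, 9), "opted_out": None,             "responded": False},
        {"enrolled": date(2023, 2, 2), "opted_out": None,             "responded": True},
    ]

    program_end = date(2023, 6, 30)   # use today's date for an ongoing program

    days_enrolled = [((s["opted_out"] or program_end) - s["enrolled"]).days
                     for s in subscribers]
    response_rate = sum(s["responded"] for s in subscribers) / len(subscribers)

    print(f"Subscribers: {len(subscribers)}")
    print(f"Average days enrolled: {sum(days_enrolled) / len(days_enrolled):.0f}")
    print(f"Responded to at least one text/survey: {response_rate:.0%}")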

Evaluation Resources

For more information on collecting data and evaluating a program, explore the resources below:

Keep going to review text messaging Best Practices.

Real World Example - Results of Evaluation of Second Dose Study

Investigators at Public Health - Seattle & King County (PHSKC) developed a pilot text message program that could be used in an emergency situation. They also wanted to know whether the community would be willing to sign up for a text message program offered by the health department. During a mass flu vaccination exercise, PHSKC asked parents of children who needed two doses of flu vaccine if they would like to receive a text message reminding them to return for a second dose of vaccine.

Learn About a Successful Pilot Program

[Video: Second Dose. Watch this short video about Public Health - Seattle & King County's mass vaccination pilot program.]

Results:

  • In the first year of the pilot program, 84% of parents whose children needed two doses of vaccine opted in to receive text messages.
  • In the second year of the pilot program, 95% of eligible parents opted in to receive text messages.

This information suggested that public health audiences are interested in receiving reminders via text message, and that programs can start small and then expand.

Real World Example - Results of Evaluation of Employee Emergency Texting Program

Public Health - Seattle & King County (PHSKC) designed an emergency texting program that allowed employees to receive texts during emergency situations. The program was tested during a 2012 snowstorm. Fifteen messages were sent over the course of five days, notifying employees of late workday starts and site closures and providing commuting reminders. In the week following the snowstorm, more than 180 employees responded to an online survey evaluating employee satisfaction with the texting program.

Results:

  • A majority of employees (63%) thought the text messages were very relevant and helpful, 20% thought they were fairly relevant and helpful, and 12% thought they were somewhat relevant and helpful. Only 5.4% of those surveyed thought the texts were annoying.
  • With respect to the number of messages sent, 83% of survey respondents thought PHSKC sent about the right number of texts, 15% felt too few were sent, and only 2% thought PHSKC sent too many.

This information suggested that, by and large, the emergency texting program was useful and appreciated by employees. As a result of the evaluation findings, PHSKC program administrators continued to send only emergency messages, and only when relevant. They also tried to provide customized information when possible.