Feb 10, 2015
Let’s say you happen to glance over the shoulder of a friend who is on TeamSnap (and we all hopefully have lots of friends on TeamSnap). You notice that their screen looks a bit different from what you are used to seeing. It isn’t just that their team information is different; the entire screen is different. Whoa Nellie! What is happening here? Is my friend on some sort of VIP list where they get access to special TeamSnap screens?
The reason for these differences is most often what we in the industry call A/B testing. In other words, we show some customers version A of a screen, and we show other customers version B.
Let’s back up and take you inside TeamSnap. Each month, we receive thousands and thousands of support inquiries from customers. Customers send us help requests, awesome stories, compliments, complaints, jokes and lots of suggestions. Lots and lots of suggestions. Our product managers sift through the suggestions and pull out the best ones.
We then go to work implementing the suggestion. Product managers, graphic designers, software developers and quality assurance folks create what will hopefully be an improvement to how TeamSnap works.
When we are ready to roll out the change, we prefer to be cautious and really validate that the “improvement” is indeed better. This is where that A/B testing comes in. We show some customers TeamSnap exactly the way it is today. These customers are the baseline. We show other customers the new (and hopefully improved) version of TeamSnap. We then measure the new version against the baseline to see which one produces better results. In a lot of ways, it is similar to the way pharmaceutical companies test a new medicine against a placebo in a clinical trial.
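For the technically curious, here is a minimal sketch of how customers might get sorted into the two groups. This is purely illustrative (the function and experiment names are hypothetical, not TeamSnap’s actual code), but it shows the key idea: hashing a stable user ID keeps each customer in the same bucket, so your friend sees the same screen every time they log in.

```python
import hashlib

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user into 'A' (baseline) or 'B' (new version)."""
    # Hash the user ID together with the experiment name, so each
    # experiment gets its own independent 50/50 split of customers.
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a number from 0 to 99
    return "A" if bucket < 50 else "B"

# The same user always lands in the same bucket for a given experiment.
print(assign_variant("user-12345", "new-team-setup"))
```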
Let’s say we had a new and improved way to set up your team on TeamSnap. Our goal might be to allow you to finish the set-up process faster than you could with the old version (who couldn’t use a few extra minutes in their day?). We would measure how long it takes customers to set up their team with the new process versus the old one. After some period of time, often two to three weeks (long enough to collect meaningful data), we determine a winner. If the new process is indeed better, we make it available to all customers going forward. If the old process is better, we go back to the drawing board to figure out why this awesome customer suggestion didn’t pan out in the real world.
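To make the “determine a winner” step concrete, here is a small sketch of the kind of comparison involved. The numbers are invented, and we are not saying which statistical tools TeamSnap actually uses; a two-sample t-test is simply one common way to check that a difference in average setup time is unlikely to be random noise.

```python
from statistics import mean
from scipy.stats import ttest_ind  # two-sample t-test

# Hypothetical team-setup times in minutes, gathered during the test window.
baseline_times = [12.5, 9.8, 14.1, 11.0, 13.3, 10.7, 12.9]  # version A (old flow)
variant_times = [8.9, 10.2, 7.5, 9.4, 8.1, 9.9, 7.8]        # version B (new flow)

print(f"A average: {mean(baseline_times):.1f} min")
print(f"B average: {mean(variant_times):.1f} min")

# A small p-value (say, below 0.05) suggests the improvement is real,
# not just luck in which customers happened to land in each group.
t_stat, p_value = ttest_ind(variant_times, baseline_times)
print(f"p-value: {p_value:.4f}")
```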
This is just one of the many ways we use customer feedback, technology, and data to improve the experience on TeamSnap.
Ken is TeamSnap’s Chief Growth Officer, responsible for getting all the coaches and managers who are still wrangling their teams the old-fashioned way to use TeamSnap. Ken and his sales and marketing teams use a variety of tools, from paid ads and partnerships to email, blogs and social networking, to spread the word about the great work TeamSnap is doing. Ken lives in Boulder with his wife and two boys. When he is not training for a triathlon or coaching, Ken can be found at the C.U. basketball games or baking.