Once we launch an app or website, our work is just beginning. We need to learn from everything we put into the market.
Stop me if you’ve heard this one. You did everything “right”. You talked to customers to understand their needs. You interviewed stakeholders and confirmed the business objectives. Based on your “discovery” work, you are confident that the solution you developed is rock solid.
Everything looks good. The design is on point and aligns with the latest trends. Copy tone is consistent with the brand and all is well in website land.
Except it isn’t.
As the weeks go by, the results start to appear in your GA account. Engagement is lower than expected. The customers that you thought would discover the site and flock to it simply aren’t. “What’s going on?” you ask. “Doesn’t the intended audience understand how great my website is?”
Let’s face it: launching an app or website isn’t the end of the story. It’s really the beginning.
Even the most informed team can learn a lot by observing how their product is being used in the wild. A funny thing happens when real people start using a product. They behave in ways the personas didn’t anticipate. They use the “wrong” features. They use devices we’ve never seen (seriously, look in your analytics and count how many devices show up that you’ve never heard of).
Analytics, user testing, and customer feedback are vital in determining where the sharp edges are and how best to address them. If customers aren’t pathing through a site the way you expected or aren’t using a key feature, analytics will tell you that. User testing and customer feedback will tell you why.
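As a concrete illustration of the “analytics tells you that” half, here is a minimal sketch of a funnel drop-off calculation, the kind of report that flags where visitors leave an expected path. The step names and visitor counts are hypothetical example data, not pulled from any real analytics account.

```python
# Minimal funnel drop-off sketch: given the expected path through a site,
# report what percentage of visitors is lost at each transition.
# Step names and counts below are hypothetical example data.

def funnel_dropoff(steps):
    """Given ordered (step_name, visitor_count) pairs, return the
    percentage of visitors lost at each step-to-step transition."""
    report = []
    for (name_a, count_a), (name_b, count_b) in zip(steps, steps[1:]):
        lost = count_a - count_b
        pct = 100.0 * lost / count_a if count_a else 0.0
        report.append((f"{name_a} -> {name_b}", round(pct, 1)))
    return report

expected_path = [
    ("Landing page", 10_000),
    ("Product page", 4_200),
    ("Add to cart", 900),
    ("Checkout", 310),
]

for transition, pct_lost in funnel_dropoff(expected_path):
    print(f"{transition}: {pct_lost}% drop-off")
```

A steep drop at one transition tells you *where* to point your user testing; the testing itself tells you *why* people leave there.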
Without customer feedback and validation during key phases of the project, ultimately we’re guessing. What’s worse, if you’re not monitoring and optimizing after launch, you’re costing yourself (or your client) brand affinity, enrollments, or sales. We can lean on tried-and-true best practices and user trends, but without customer feedback your project is a lot like a sixth-grade science fair project gone wrong. We have hypotheses that we think will work. We gather data and draw conclusions. What we miss are the observations, which leaves incomplete information feeding those conclusions. Conclusions without observations are really just a new set of hypotheses. It becomes a familiar cycle: wash, rinse, repeat.
A far better approach is to involve customers however you can throughout the lifecycle of a project, not just the discovery phase. Have architecture questions? Perform a card sort. Need feedback on wireframes? Do a low-fi prototype test. Have copy written? Probe on comprehension and reading level. Visual design? Do a hi-fi prototype test. Every time you get customer feedback, you have an opportunity to refine the approach. This lets you release a solution you can have confidence in once it’s in front of the masses.
After you launch, the real fun begins. Listen to customer feedback and review the monthly SEO and analytics reports to see what is working well and what isn’t. Take all of this information and determine your optimization plan. Rather than tackling everything at once, create a minimum viable optimization (lower effort, higher return) approach to help prioritize the changes that will have the biggest impact. Focus on the key areas of the experience where you can move the needle the most. If there are multiple approaches for an optimization, you can always perform an A/B test to see which option works best in the real world.
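When an A/B test is part of the optimization plan, the raw conversion numbers need a significance check before you declare a winner. Below is a rough sketch of a two-proportion z-test using only the standard library; the conversion counts are hypothetical example numbers, and a production setup would typically use an experimentation platform or a stats library instead.

```python
import math

# Rough sketch of comparing two variants' conversion rates with a
# two-proportion z-test. The counts below are hypothetical examples.

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z statistic comparing conversion rates of B vs. A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant B converts at 5.5% vs. A's 4.8%, ~10k sessions each.
z = two_proportion_z(480, 10_000, 550, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 roughly means significant at the 95% level
```

The point isn’t the statistics per se; it’s that “which option works best in the real world” should be a measured answer, not a gut call on a few days of noisy traffic.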
Go forth, learn and optimize!