We're always looking for new ways to help customers learn about their audience. We believe that the more users know about their audience, the more empowered they are to make decisions that help them grow. Customer insights are especially crucial for newer businesses trying to find product-market fit, and one of the best ways to find that fit is by talking to customers to better understand their needs.
This tension shaped our team's mission: help new businesses grow by collecting customer feedback.
Through a series of surveys, we gathered feedback from our own customers to learn how they managed this pain point. The irony of surveying people about surveys wasn't lost on us, so we were especially deliberate about each step. We discovered a segment of users who wanted to collect feedback but couldn't; barriers such as time, knowledge, and budget stood in their way. As we dug deeper, we learned which methods they had previously tried, or hoped to try, which gave us a place to focus. Our research pointed to surveys as the most promising tool.
After brainstorming our options, we outlined opportunities to create an initial proof of concept for testing the viability of an entry-level survey feature.
Experience themes we focused on:
We didn’t want the tool to get in the way of users learning about their audience, so we built in flexibility for adding, removing, and updating questions, aiming to make the experience feel as simple as typing into a doc. Rather than focusing on complex use cases, we opted for simple ways to handle conditional questions, specifically in cases where the user wants to capture more information. For instance, an "Other" option automatically reveals a text box once selected.
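That conditional reveal can be sketched in a few lines. This is a hypothetical illustration, not Mailchimp's actual implementation; the function name and field identifiers are made up.

```python
def visible_fields(selected_choices):
    """Return which input fields to render for a multiple-choice question.

    Hypothetical sketch: selecting the special "Other" choice reveals a
    free-text box so the respondent can elaborate.
    """
    fields = ["choices"]
    if "Other" in selected_choices:
        fields.append("other_text")  # reveal the text box
    return fields

print(visible_fields([]))         # nothing selected: just the choice list
print(visible_fields(["Other"]))  # "Other" selected: text box revealed
```

The point of keeping the rule this small is that the respondent never sees configuration; the extra field simply appears when it's relevant.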
We weren't sure whether people would struggle to phrase their questions, so we spent some time discussing ways to make the initial empty state less intimidating. In the end, though, we decided to hold off on building additional guidance until we saw whether users really needed the assistance. Instead, we opted for a lighter route, relying on our knowledge base articles to help them in the short term.
Since these surveys are online pages, they can be shared anywhere with anyone. People can link to them from an email, social post, or website. To help users maintain brand consistency across those touchpoints, we surfaced simple styles, such as colors and logo, to make the page more familiar to the audience completing the survey.
In addition to the styles, we felt the language within the survey was just as important. Rather than creating one-size-fits-all messaging, we left elements such as the survey description, button label, and confirmation message customizable. This lets users match their brand's voice and tone, and, more importantly, lets international users tailor the fields to their audience's native language.
Once a survey collects responses, we encourage users to explore the results to see what they can learn. Rather than just listing the top answer choices, users can interact with their data to spot trends without writing a single query or spreadsheet function. By clicking on any answer, they can see how many people chose it, who specifically responded, and the rest of each person's survey responses. This helps users narrow down segments of their survey audience without having to think much about the process.
By segmenting responses, users can start to identify new learnings or key segments within their audience. Recalling our own experiences conducting surveys, this kind of slicing and segmenting took us a decent amount of time in a spreadsheet. We already knew time and knowledge were barriers for most people, so we wanted to make this process as easy as possible. Once they have a group of people, or a single person, they want to reach out to, they can tag the known contacts and send an email, either to follow up for more information or to offer them something special.
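As a rough sketch of that drill-down, assume each response records who answered and their full set of answers. The data shape, field names, and `segment` function below are illustrative assumptions, not the product's internals.

```python
# Sample responses: each one pairs a respondent with all of their answers.
responses = [
    {"respondent": "ana@example.com",  "answers": {"plan": "Pro",  "nps": 9}},
    {"respondent": "ben@example.com",  "answers": {"plan": "Free", "nps": 6}},
    {"respondent": "cara@example.com", "answers": {"plan": "Pro",  "nps": 7}},
]

def segment(responses, question, answer):
    """Respondents who gave `answer` to `question`, with all their answers."""
    return [r for r in responses if r["answers"].get(question) == answer]

# Clicking the "Pro" answer would surface this segment:
pro_users = segment(responses, "plan", "Pro")
print(len(pro_users))                        # how many people responded
print([r["respondent"] for r in pro_users])  # who specifically responded
```

The same one-line filter replaces what would otherwise be a round of sorting and formula-writing in a spreadsheet, which is exactly the barrier described above.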
Within six weeks, we identified an opportunity space, defined the key issues that blocked users, built a proof of concept, and released a feature to measure viability. We're monitoring activation rates while also collecting user feedback through linked surveys built with Mailchimp Surveys (keeping it meta).
Days after the release, a particular piece of feedback showed users hitting friction: they were unsure of what to do once their survey was enabled. This surprised us because we assumed their behavior would mirror publishing a landing page or a signup form: copy the URL, then share the link in their preferred channels.
After assessing more of the feedback, we realized our assumption was wrong. Rather than introducing a new step in the flow, we made a simple change to reveal next steps once users turned on their survey. Shortly after adding these actions, we saw share links posted to social media asking followers for feedback.
I'm always humbly reminded that the challenges we anticipate people encountering usually aren't the ones they actually hit. Early on, we worried users would struggle to decide what questions to ask. We even explored concepts for surfacing ready-made questions to help them out. However, we decided to hold off on the idea and see where they actually got stuck before building something that didn't address a real problem.
From observing feedback, usage data, and the surveys created, users seem to have a good grasp of what they want to ask their customers, invalidating our initial assumption. It's a great reminder to start small, ship quickly, and observe every data point when making useful improvements for customers.