At Atlassian, we’re always looking for new ways to get meaningful customer feedback, trying new methods to understand our customers and learn what we can do better. At Summit, our annual user conference in September, we reached out to customers in a new way: the Test Lab.

User testing, Summit-style

Summit brought together over 2,100 customers from all kinds of backgrounds: developers, project managers, technical writers, designers, product owners, and admins. Finding target audiences for user testing enterprise software can be tough, but at Summit we had everyone in one place. We wanted to collect actionable customer feedback and empower as many customers as possible to voice their thoughts on early- and late-stage features. We knew more customers would be in attendance than we could ever talk to individually or run through in-person usability tests.

To meet our goals, we tried a radical approach: unmoderated usability tests in a kiosk-style format, in addition to in-person facilitation.

We designed a lab where customers could pop by, chat with a volunteer, and give usability feedback on features in development. When the crowd got too big, we supplemented the in-person moderation with unmoderated tests in our kiosk-style format, using a third-party tool to present guided tasks and capture screen and audio recordings. We gathered recordings of customers talking through each exercise, sharing their confusions, hesitations, and comments. These self-guided tests meant our facilitators could accommodate many more customers and still check in with people periodically to make sure important feedback was heard.

Customers participating in the Test Lab

This combination of in-person facilitation and self-guided usability tests gave a diverse customer audience the chance to collaborate with us on new features in development. It also let us showcase products and features built by people across the company who couldn’t attend the conference. With offices around the globe, we were able to capture results, edit prototypes and scripts remotely, and adjust quickly. Some of our script authors could analyze videos streaming in on the day of the conference, rather than waiting for us to export resources from 12+ computers.

The results are in!

When all was said and done, we collected over 200 valuable customer feedback videos from both the third-party tool and QuickTime recordings. We gathered far more data than if we had facilitated every interview in person, and more customers got to collaborate with us. While the quality of an unmoderated interview isn’t as high as with in-person facilitation, we found incredible value in the results. We collected actionable, meaningful feedback across 16 different usability tests; among other things, we validated late-stage features like Jira Agile reports and learned we were using the wrong iconography for a high-profile campaign. We’re now considering hosting a booth at our Roadtrips and other events like Getting Git Right, so we never miss a chance to gather customer feedback.

Special thanks to our amazing customers for bearing with us as we tried a new experiment, and to the team for providing marvelous support!

Has your team tried a novel approach to user testing? Leave a comment and let us know how it went!