At Atlassian, we pride ourselves on the quality of our products. With the recent launch of Atlassian Bonfire, our new tool for agile and exploratory manual testing, I wanted to talk a little about our development and testing process, and how Bonfire helps solve some of the challenges with testing in our agile development teams.
Challenge #1 – Culture of automated testing
Like many other software startups, Atlassian started out with a strong emphasis on engineering. True to the teachings of Agile, we invested heavily in test automation and continuous integration (to the extent that we created our own continuous integration server – Bamboo). From unit and functional tests to integration tests, Atlassian’s products had thousands of tests running on our build farm.
However, we found that a purely automated approach to testing wasn’t sufficient. As our products grew – with richer user interfaces and larger support matrices – we needed a dedicated QA function.
QA stands for Quality Assistance, not Quality Assurance. QA at Atlassian was introduced not as a replacement for the automated testing that we do, but rather to provide additional value where our automated testing didn’t reach. What we wanted to avoid was a throw-it-over-the-wall mentality, where developers developed and “testers” tested. At Atlassian, quality is everyone’s responsibility – not just QA and developers, but also product managers and marketing.
Bonfire is the enabler that allows everyone on the team to take more responsibility for quality. At Atlassian, we religiously dogfood our own products in development. With Bonfire, we now have a whole company’s worth of testers exercising our products every day, who can pick up defects and issues with a single click.
Challenge #2 – Bottlenecks
With an agile process and two-week iterations, one of the biggest challenges with testing is bottlenecks. Without a clear “phase” for testing, stories would typically pile up towards the end of iterations as developers completed their work, leaving no time for sufficient manual testing before the iteration ended.
The most important factor in reducing bottlenecks is time – which is why we’ve designed Bonfire with a focus on speeding up test execution: quickly raising bugs, capturing and annotating screenshots, and using features like dynamic variables to automate mundane data entry.
Similar to our technical writing team’s blitz approach to updating screenshots, we also regularly run “blitz” testing sessions within Atlassian to test our products. Bonfire’s sessions and templates fit right in here, providing a tool to track the testing done and to standardize the defects raised.
We also addressed bottlenecks by being extremely selective about what we manually tested. We focused on areas where testing provided the most value – and these were typically not the regression tests that can be executed automatically. Rather, our testing is more exploratory in nature.
Challenge #3 – Tracking
With the emphasis on exploratory manual testing, we don’t produce any up-front requirement documents or test scripts to drive our testing. Therefore, it was important that we find a way to keep managers abreast of test progress, answering questions like: “What have we tested?” or “Can we ship the product?”
This is where Bonfire’s test sessions come in. Test sessions allow our QA teams to plan and keep track of the testing effort for our stories – which stories have been tested, what was tested within each story, and where the testing stands.
Have you got Bonfire yet?
If you’re an agile development team (or looking to be one), and you face some of the challenges we have when it comes to quality and testing, then Bonfire could be your answer.