In my previous blog about 20% Time, I gave a summary of Atlassian’s “20% Experiment” but left readers hanging as to the future of 20% Time at Atlassian. Well, I’m glad to end that suspense now and officially announce that Atlassian is continuing with 20% Time, rather than keeping it under the “experiment” label.
We get a lot of questions about the ‘nuts and bolts’ of how we’ve implemented 20% Time, so I’m happy to share the details.
The Goal of 20% Time
To encourage innovation in products, development techniques and the Atlassian development ecosystem.
The key here is that innovation is not guaranteed. We can’t mandate that every 20% Time project must lead to a new feature since we don’t want people to be afraid of failure. Rather, we aim to create an environment where innovation is encouraged and where ‘trying’ is as much a cause to celebrate as ‘succeeding’.
We have also stated a broad scope for 20% Time that includes work on productivity enhancements and external activities (e.g. working on open source projects) — but only where there is a beneficial link back to Atlassian. After all, 20% Time is a part of our development effort, not a charity. (We have the Atlassian Foundation for that!)
Tracking effort vs Tracking projects
The next major decision we had to make was how closely to monitor 20% Time. The Management team want to track “return on investment” so they can justify the expense of 20% Time. Developers, on the other hand, want freedom to innovate without having to justify everything they do. As one person suggested:
Reporting against specific 20% projects can lead to a decrease in the innovation factor. 20% projects, by their very nature, will have a lot of false starts, dead-ends, and time spent that doesn’t directly contribute to a production product (and that is how it should be). If you start tracking and reporting on individuals and how much time they are spending on a project, they might start picking safer, less innovative projects, so they don’t have a lot of false starts and dead-ends showing up against their name in the reports.
These two viewpoints lead to some friction. For example, we have two sign-off points to ensure projects don’t consume too much time without foreseeable benefit:
- Any project that has consumed more than 5 days of developer time requires the sign-off of three supporters. These supporters should be developers who are not involved in the project, but who believe the project is both viable and a good idea.
- Any project that has consumed more than 10 days of developer time requires sign-off from a Founder.
Enforcing these rules would mean tracking time against each project, but this runs contrary to giving developers the freedom to innovate without a “big brother” watching over them.
After vigorous debate amongst management, it was decided to trust that our developers would apply their time wisely. Therefore, we are tracking 20% Time at a high level within our normal time-tracking system, but not at the individual project level. Developers are responsible for getting appropriate sign-off and updating the status of their own projects, the results of which are displayed on a dashboard on our corporate wiki.
(For those interested, we track each 20% project as an individual ‘issue’ within Jira, then display those projects as a dashboard on our Confluence wiki.)
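To make the Jira-backed tracking concrete, here is a minimal sketch of how a dashboard or report script could pull 20% Time projects out of Jira via its REST search endpoint. The project key (“TWENTY”), the instance URL, and the status value are all hypothetical stand-ins, not Atlassian’s actual configuration:

```python
# Hypothetical sketch: querying Jira's REST search API for 20% Time projects.
# The project key "TWENTY", base URL, and status value are assumptions.
import json
import urllib.parse
import urllib.request

BASE_URL = "https://jira.example.com"  # assumed Jira instance URL


def build_search_url(project_key="TWENTY", status="In Progress"):
    """Build a Jira REST search URL for 20% Time projects in a given status."""
    jql = f'project = {project_key} AND status = "{status}" ORDER BY updated DESC'
    query = urllib.parse.urlencode({"jql": jql, "fields": "summary,assignee,status"})
    return f"{BASE_URL}/rest/api/2/search?{query}"


def fetch_projects(url):
    """Fetch matching issues and return the parsed JSON payload."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


print(build_search_url())
```

The same query could just as easily feed a Confluence dashboard macro as a standalone script; the point is simply that each project lives as one Jira issue, so status rollups come for free.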
It’s also worth mentioning that our quarterly ShipIt Days also contribute to 20% Time but are tracked separately. They are specific days set aside for innovation in a time-limited, competition-style atmosphere.
None of these decisions, however, overcomes our biggest problem — that scheduling 20% Time can be difficult and can seem unfair. That will be our ongoing challenge as we continue with 20% Time.