Privacy is a hot topic — and one that sparks fear in many a developer’s heart. How can businesses stay compliant with new and fast-changing regulations? What do those regulations tell us about the shifting needs of customers? And where does the cloud fit into privacy conversations? We asked our Head of Privacy to give us some insights.

You’ve practiced law for over 10 years now. When did you first start working on privacy issues, and how has the conversation around privacy changed over time?

Erika: I started dabbling in data privacy-related issues about six years ago. At the time, data use and security were really afterthoughts, and parties felt pretty comfortable if they had some baseline language in the contract. It was “tick the box” and everyone moved on.

That really started to change as companies started to make money off secondary uses of data. All of a sudden, everyone was really interested in what you were doing with their data. This had always been true of banks and hospitals, but the conversation has moved in a more mainstream direction. People now understand that some data — even in isolation — may give access to other data. And more data is more value.

New regulations, like the GDPR, mean all of our customers have to review how we handle and secure data. It’s no longer a passive or optional conversation. It’s a legal requirement. If service providers can’t satisfy those standards, by law, a customer cannot put certain data (read: most data) into that service.

It’s a real game-changer and becomes a hard line in negotiations. I tell our teams all the time: Investing in these requirements is an investment in our business. If you don’t come to the table with good solutions, you’ll get phased out — no matter how good your user experience or design is.

If you don’t come to the table with good solutions, you’ll get phased out — no matter how good your user experience or design is.

These days, the conversation is much more focused on the specifics of how data will be used, where it's stored, for how long, who you're sharing it with, and so on. And those are questions you can't answer from a distance. You have to be in the trenches with the teams designing how the data will move and helping them understand where to put barriers in place. You can't make promises to a customer that aren't designed into the product and practices first. The reality of product capabilities and operations has to lead the negotiation — not the other way around.

You have a unique perspective on new data privacy laws. Can you tell us more about how you see them?

Erika: I think data privacy laws are, fundamentally, a trade issue. That's how we need to think about these things moving forward. Governments understand that the data of their citizens and residents has value, and they're seeking to either control or retain that value when data moves outside their borders, much like a tangible good subject to export controls.

I think data privacy laws are, fundamentally, a trade issue.

You can look at almost every single privacy law on the books to date, and they all follow the same trend: give more rights to individuals. Make penalties heavier. Broaden the scope of data that's protected.

But they all have different motives, right? The EU has the GDPR. The US has the California Consumer Privacy Act and soon probably some version of federal legislation. China has the Cybersecurity Law. They all do the same things. They all have the same level of protection. But they’re not exercised for the same reasons.

The EU is very interested in fundamental rights. If there’s a marginalized class, they are going to move in and protect them.

The US is very concerned about consumer rights. We’re actually pretty okay with people using data, provided you’re giving them the economic opportunity to make an informed decision. Maybe you use that person’s data, but you pay them for it. That’s a different framework.

But China is only going to use its privacy laws to further its national sovereignty interests, right? It’s not about the consumer. It’s not about the individual. It’s about how government can further nationalistic interests by asserting a level of control over the data of its citizens. You’ve got to filter the law through the cultural and geographic context it’s being applied to, and then fit it to your business within that. And this makes it a trade issue, right? Because these laws can effectively raise or lower the value of data and the costs associated with collecting it as a service provider.

What does this all mean for Atlassian products? How does our tech prioritize privacy?

Erika: Atlassian has a somewhat unique model where we have both individuals and companies on our platform.
Sometimes individuals own their accounts independently, and other times the company owns their account (in a more traditional Enterprise model). So we have to think a lot about what data is appropriate for the individual to control versus what data should be controlled by the administrator. I think our teams have done a fantastic job of parsing that in a way that makes sense for our cross-collaborative tools.

This year, users will see a lot more control over what profile elements they share in what spaces and how they can delete their own personal data from our platform. Similarly, admins will have a lot more control in-product over data they are responsible for from a regulatory standpoint.

We really strive to bring that functionality into the product rather than sending our customers through a support channel or to documentation to make meaningful choices about their data. It shouldn’t be that hard. Openness and choice around privacy really drive those designs, and I think we will lean into that a lot more in the next year as we build out various levels of Cloud offerings.

How do you and your team make decisions about how to handle data privacy in Atlassian products?

Erika: We always try to filter requirements through the lens of the customer, asking not just “what’s required by law?” but also “what do customers expect?” Can we build it better? Can we make it easier? Can we meet them in-product at the point that specific choices or conversations are most meaningful?

Because of this, my team collaborates quite a bit with Atlassian’s developer talent. Our developers really care about our customers, and they come up with way better solutions than the law requires. We spend a lot of time unpacking the intention of the law, how it applies to Atlassian, and how other companies do things (and how we might want to be different).

Our developers really care about our customers, and they come up with way better solutions than the law requires.

For example, one of the decisions we made early on in this whole process was that we were not going to have our customers manage their data choices with us and with each third-party add-on. The whole point of the add-on structure is to make our platform extensible. It makes no sense to send our customers to chase their legal rights with 50 or 60 vendors that they may not even realize they’ve installed.

The way we constructed our APIs, we said, “okay, app vendors, if a customer makes a choice about their data, they’re going to do it centrally within the platform, and we expect you to honor that choice — whether it’s the way that they share their profile information, the permissions they grant, or how they can delete specific information or even deactivate their account.”
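A minimal sketch of that idea: the customer makes a data choice once, centrally, and each add-on is expected to apply it to its own records. The event names, payload fields, and in-memory "store" below are illustrative assumptions for the sake of the example, not Atlassian's actual API.

```python
# Hedged sketch: an add-on honoring a data choice made centrally in the
# platform. Event types and payload shape are hypothetical, not an
# actual Atlassian API contract.

def handle_platform_event(event, store):
    """Apply a centrally made data choice to the add-on's own records."""
    account_id = event["accountId"]
    if event["type"] == "account-deactivated":
        # Mark the account so the add-on stops surfacing its data.
        store.setdefault("deactivated", set()).add(account_id)
    elif event["type"] == "personal-data-erasure":
        # Remove everything this add-on holds for the account.
        store.get("profiles", {}).pop(account_id, None)
    return store

# Example: the add-on's records before and after an erasure event.
store = {"profiles": {"abc-123": {"name": "Example User"}}}
store = handle_platform_event(
    {"type": "personal-data-erasure", "accountId": "abc-123"}, store
)
print(store["profiles"])  # prints {} once the account is erased
```

The design point is that the choice travels from the platform outward: the add-on never asks the customer separately, it just reacts to the platform's event.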

Where do cloud-based solutions fit into the privacy conversation? Can using hosted cloud applications actually make privacy protection easier?

Erika: Yes! With the cloud, there’s the power of scale, efficiency, assurance, and innovation. When you have everything standardized in the same environments and same technology, that means we can all innovate better and faster on how to protect it.

The cloud eliminates the gap between companies that can afford to protect their data in-house and companies that can't, because we have more resources working on these problems and maintaining solutions. It's more efficient and effective, for the benefit of all cloud customers. The cloud makes overhead, operational, and maintenance costs much cheaper for everyone.

The cloud eliminates the gap between companies that can afford to protect their data in house and companies that can’t.

Much the same way data centers have taken off because of the cost of on-prem hosting, secure, privacy-minded cloud offerings can make consistent protections and controls easier and more scalable for companies of all sizes.

What can businesses do to prioritize privacy within their organizations and keep up with regulations in the coming years?

Erika: Incubating a privacy- and security-minded approach is critical to staying ahead of the law. In the future, I think we’ll see consolidations of data choices above application layers. Much like when you change addresses, you can go to one point of contact (the post office) to have all your mail forwarded. I expect to see data custodians and management solutions that will help people identify where their data is and apply principle-level choices that will then be honored at the application layer.

Specific things businesses should do include developing (or purchasing) software with the data lifecycle in mind, managing data responsibly throughout its lifecycle, extending clear information and choice to customers consistently, automating to remove human error where possible, and using your IT department’s skills to scale across applications that handle data at multiple points.

We hear your path to privacy law was a unique one. Can you tell us about how a love for art led you to a love for the ongoing legal challenges and questions of the internet age?

Erika: Yes. I actually have an art history degree and worked for Sotheby's for a while. I ended up working with a lot of lawyers, and in my exposure to that, I thought: these people seem pretty sharp, and it sounds like they have a fun job. Maybe I'll go to law school and be an art lawyer.

When I started law school, it was early in Obama’s campaign, and there was a picture of Obama that was re-created as a red and blue watercolor print that became his classic campaign poster. The original image was an Associated Press photo, and the photographer said “I want credit for that. It’s my photo.” But the campaign said, “No, we changed it. We made a whole new work of art. You can’t claim it.”

Those kinds of questions really drew me in. There are a lot of areas of law that were written before technology advanced, even before the internet, and those laws are not really set up to handle concepts we see emerging as a direct result of technology and innovation. Before, there was a boundary of exchange that was clear. It was “this is mine and I’m giving it to you and you know it’s mine.” Now we transform things so easily. We share information so easily. That has all been blown open, and you see the seams of these legal concepts really starting to show wear and tear as they’re stretched.
