
WhatsApp Co-Founder Says Encryption Is the Solution to Tech’s Trust Problem

WhatsApp co-founder Brian Acton says he hopes his work at privacy non-profit Signal Foundation will help usher in a new era of free expression. Acton speaks openly about how users should navigate privacy in a tech era where trust has been tarnished, and responds to calls to break up big tech.

Read an edited transcript below, or listen to the full interview on the First Contact podcast.

We as a people should be demanding more transparency. We should be asking where our data is, how is it stored, and getting more things talked about in the open.


Laurie Segall: So you leave WhatsApp, and the next thing you’re doing is creating this foundation, for Signal. Explain…

Brian Acton: I wanted to continue, and I think what happened in my tenure at WhatsApp is that a sort of crystallization around the mission started to happen, and I started to see more alignment with what Signal was becoming, you know, this alignment around data privacy and information security, building products and technologies and services in support of that.

It was built with an open-source philosophy. It was built with a nonprofit philosophy. So here, you know, here was this guy Brian Acton who could sort of inject money into this organization and help it become a first-class nonprofit, whereas before it was kind of semi-bootstrapped or, you know, shoestringed along, so to speak. So I saw it as an opportunity to further the mission and make it my life’s work.

Brian Acton: So at the top level you have the foundation… it’s set up as an umbrella foundation. And I wanted to do that with the hope and the expectation of having other products and services. The flagship product today is Signal Messenger. It’s very similar to WhatsApp. It’s very similar to Telegram. You sign up with your telephone number as your username. Once you get into the system you can message people, you know, if they have Signal installed as well, or there’s an SMS bridge, but long and short is it’s a telephone-number messaging system, highly encrypted by default. So it has a much stronger position than some other products, because more of it is actually encrypted than in any other product.

Laurie Segall: Where do you see that going?

Brian Acton: Open-ended. I don’t know if I even have a first candidate because I’m still focused on the messenger and making sure that the messenger stands on its own two feet. But I’d love to see a stronger position in email. I’d love to see stronger positions in payments. I’d love to see stronger positions around storage, around identity. You know, I mean, Google and Facebook have this login system and it’s all sort of in-the-clear data that you’re giving to them. I mean, it shouldn’t necessarily be that way.

Laurie Segall: We’ve spent our lives sharing our data on Facebook. We’ve put ourselves out there; we’ve been asked to, and we have. Why do you think encryption is the future of communications? Part one was open communication and the open web, and now you even have Zuckerberg, everyone, saying, okay, private, private, private is the future. So, explain to folks why.

Brian Acton: Well, I mean, there’s one dimension which is the private part, and then there’s another dimension which is “trust me, it’s private.” I think where Signal really pushes it is the latter, not the former. I think everyone is really starting to understand the value of privacy. But we take it a step further in the sense that you shouldn’t have to trust us to know it’s private, right? And that’s the stance that we take with Signal: we don’t want to force you to trust us. We don’t want to be some mega-corporation, Fortune 500 company saying, oh, just trust us, we’ll do the right thing. You know, Target was a trusted company, and then they leaked all those Visa credit card numbers. Right? That shit happens. Right? Trust us, and we won’t make any mistakes. Like, those mistakes happen all the time.

Laurie Segall: It certainly seems like this idea of, like, just trust us, is over, or that we shouldn’t anymore.

Brian Acton: No, we still trust so many companies.

Laurie Segall: But should we?

Brian Acton: I think that… we as a people should be demanding more transparency. We should be asking where our data is, how is it stored, and getting more things talked about in the open.

Laurie Segall: But how do we even do that? Think about the terms of service, right? No one understands what that means. We can’t even really fully wrap our heads around what’s happening.

Brian Acton: I think we’re all wrapping our heads around this, right? And then we see some companies, you know, sort of do a better job at it. I certainly, when I was at Facebook and WhatsApp, saw the terms of service getting lawyered to death, right? And it was painful, right? I mean, Jan and I wanted the simplest terms of service we could make.

Laurie Segall: Which would be?

Brian Acton: Like, we’re not gonna put ads in the product. We’re, you know, we’re not gonna sell your data. We’re not gonna do this. Right? There are challenges and struggles that manifest in writing a terms of service and conveying it in such a way that the user understands it. It’s not, like, a slam dunk, oh, you know, let’s just regulate the shit out of this, because peeling the onion, right, you start to cry. You’re like, oh, crap, what about this case, what about this case, what about this case? And people wave their sort of magic wand, they’re like, oh, well, regulation’s gonna solve this. And it’s like, regulation’s gonna make everyone cry.

Laurie Segall: If you think it’s not regulation, not breaking up Facebook, not breaking up the tech companies, then what?

Brian Acton: I think… we are, as a population, evolving. I think we educate and teach people. You end up, indirectly, creating nonprofit watchdog groups that point out bad behavior. And then I think as consumers, we can vote with our dollars. We can choose the services that we trust, right? And the services that provide more transparency into that trust, right, and demand that. It’s a slow needle to move. Right? And it takes a lot of user education. I mean, we as a population are just new to the internet in general; it’s only 20, 25 years old, right? I mean, it’s ungodly to think of how short a time that is.