Algorithms need their electric car

“I’m sure that’s important to someone, but I don’t have anything to hide. Why should I care about data privacy?”

I frequently talk about how the practice of building algorithms on top of telemetry data is stripping us of our agency. If a system only looks at what I do, and never cares to understand WHY I do it, why should I trust it to decide where my attention should go?

But there’s another widespread problem with our algorithms’ addiction to telemetry data: privacy. Our current attention-guidance systems cannot operate without telemetry data; if they stop collecting it, they die. If Facebook doesn’t track your clicks, your likes, and your scrolling speed, your news feed will slowly atrophy and eventually revert to a random collection of stuff you don’t care about. The days when a “chronological feed” could work are well behind us. We, “users”, simply generate too much content.

Since current platforms are bound to collect this data, they will inevitably adopt a strong stance in the privacy debate. Any proposal that would shut down their ability to tap into that resource is an existential threat to them.

And as long as these platforms are the only option, an existential threat to them means we would all lose a tool we have become quite dependent on. This is reminiscent of the debate around fossil fuels: we, the people, want less fossil fuel extraction; less fossil fuel is an existential threat to car manufacturers; fewer cars is an unacceptable outcome for us…

In a sense, the best way to advance the debate on privacy is the same as with car manufacturing. We have to innovate. We could invent algorithms that perform equally well using less telemetry data (e.g., cars that consume less gas), or we could invent algorithms that perform equally well on a totally different type of data source (e.g., electric cars).
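To make that second path concrete, here is a minimal sketch in Python contrasting the two kinds of data source: a ranker fed by behavioral telemetry versus one fed only by interests the reader has explicitly declared. Everything in it (the Item type, both ranking functions, the scoring weights) is a hypothetical illustration, not how any real platform works.

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    topics: set[str]
    clicks: int = 0           # behavioral telemetry: how often it was clicked
    dwell_seconds: float = 0.0  # behavioral telemetry: time spent on it

def rank_by_telemetry(items: list[Item]) -> list[Item]:
    # The status quo: score purely on observed behavior.
    # Needs continuous telemetry collection to keep working.
    return sorted(items, key=lambda i: i.clicks + 0.1 * i.dwell_seconds,
                  reverse=True)

def rank_by_stated_interests(items: list[Item],
                             interests: set[str]) -> list[Item]:
    # The "electric car" alternative: score on overlap with interests
    # the reader has explicitly declared. No tracking of behavior at all.
    return sorted(items, key=lambda i: len(i.topics & interests),
                  reverse=True)

feed = [
    Item("Celebrity gossip", {"gossip"}, clicks=900, dwell_seconds=40),
    Item("Intro to differential privacy", {"privacy", "math"},
         clicks=12, dwell_seconds=300),
    Item("Local climate policy explainer", {"climate", "policy"},
         clicks=30, dwell_seconds=120),
]

# Telemetry rewards whatever already captured attention;
# stated interests follow what the reader says they want.
print([i.title for i in rank_by_telemetry(feed)])
print([i.title for i in rank_by_stated_interests(feed, {"privacy", "climate"})])
```

Note the design point: the second function never asks what the user did, only what they said. A system built that way has nothing to lose from restrictions on behavioral tracking.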

In many ways, Waverly is meant to be the electric car of the world of algorithms: first and foremost, to give people their agency back… but its impact on our ability to build better privacy-preserving algorithms is also important.

(This post was inspired by that article, which was brought to me via Waverly.)