
Looking at privacy tech through the lens of sex/drugs education

3 years ago


Privacy-preserving (and digital identity) tech needs to become a lot more like sex education or drug education campaigns. This thought came to me during #RWOT11 @WebOfTrustInfo. A lot of digital identity and privacy messaging is fear-based, focused on how Bad Things Can Happen With Your Data.
Many people *don’t* care about zero-knowledge proofs or private, secure identity because…it’s complex and boring (for many audiences). So preaching that people should Really Care About Tracking doesn’t fly, because a lot of people *like* ads.
I do myself. Normally I’m armed with 3 different ad/tracking blockers, but if I’m buying something expensive, I’ll turn them off and *browse with intent*, hoping to get a targeted ad with discounts. Or I’ll add something to a shopping cart, hoping to get an email with a coupon.
Preaching to people that they should care about how their data gets shared and tracked and resold is obviously noble, but it’s very similar to preaching abstinence as a measure to prevent STIs, or saying “don’t do drugs” to teenagers.
Instead, a lot of successful campaigns in those spaces acknowledged that people will engage in risky behaviour. Concertgoers will do drugs…so offer them testing. People will engage in risky sexual behaviour…so offer them condoms or the morning after pill.
People *will* do “risky” things with their data, online personas, and reputation, mainly because our current data and identity sharing methods are geared towards risky, trackable sharing by default. How can we offer methods of “privacy harm reduction” in a way that’s not preachy?
A good example of where privacy tech has been successful is VPNs. They are everywhere: on YouTube, podcasts, online…and while some of that marketing is about hiding browsing activity, most VPN players *really* made an impact by offering the ability to watch geoblocked streaming content.
And sure, the other reason is VPNs are an extremely lucrative business with fat margins. But on both fronts, *people* and *companies* had simple-to-understand value propositions without being too preachy.
It’s why I find the #Web5 pitch compelling as something that gets people outside of the identity conference thought bubble excited. As articulated by @RuffTimo here rufftimo.medium.com/web3-web5-ssi-3870c298c7b4 Also by me on the same topic: twitter.com/ankurb/status/1535667405539577856?s=46&t=bxNx7P0WlEKPgvaBb8R1ow
I *also* find @discoxyz’s analogy of a “data backpack 🎒🪩” a good example of explaining the concept in a fun way, while still having serious interop and usage for more “serious” use cases.
(I say “serious” and “fun” in the fashion my friends who don’t care about digital identity would)
An example of “harm reduction” privacy tech I can think of is DuckDuckGo’s Email Protection service, which offers randomised email addresses to use online and strips trackers out of incoming mail. See more in our research in Top 5 Trends in Decentralized Identity: cheqd.io/blog/top-5-trends-in-decentralised-self-sovereign-identity-and-privacy-preserving-technology-in-web-3.0-2022
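To make the mechanism concrete, here is a minimal Python sketch of the two ideas behind that kind of service: generating a throwaway random alias per signup, and stripping well-known tracking parameters out of links. This is purely illustrative and *not* DuckDuckGo’s actual implementation; the domain name and the list of tracking parameters are my own assumptions.

```python
import secrets
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

# Common tracking query parameters (illustrative, not exhaustive).
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "fbclid", "gclid"}

def random_alias(domain: str = "example.com") -> str:
    """Generate a random, disposable email alias — one per signup,
    so leaks and resale can be traced back to a single service."""
    return f"{secrets.token_hex(4)}@{domain}"

def strip_tracking(url: str) -> str:
    """Drop well-known tracking query parameters from a link."""
    parts = urlparse(url)
    clean = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(clean)))

alias = random_alias()
cleaned = strip_tracking("https://shop.test/item?id=1&utm_source=newsletter")
```

The harm-reduction framing is the point: the user still hands out an email address and still clicks marketing links, but each alias is unlinkable to the others and the tracking breadcrumbs are quietly removed.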
That’s the food for thought I’m mulling over at #RWOT11, among other topics. When people engage in “risky behaviour” with their data, what are the harm reduction techniques that will make them safer, without necessarily pitching the blanket idea of “more privacy”?

Ankur Banerjee 🆔

@ankurb

CTO/cofounder @cheqd_io & @creds_xyz. Co-chair of Technical SteerCo @DecentralizedID. Ex @FinTechLabLDN, @inside_r3, @Accenture, @StackTravel.