Your Inbox Is Spying on You
[Photo: Damon Winter/The New York Times]
Charlie Warzel, Opinion writer at large
| Call it the Five Stages of Privacy Erosion. |
1. Tech Company builds popular product.
2. Product is exposed in the press for doing something shady behind the scenes.
3. Tech Company apologizes/clarifies/signals a fix.
4. Brief phase of collective rejoicing and moving on.
5. It's revealed (usually by the same people who exposed it) that Product was never really fixed.
That's the rough trajectory of two privacy stories from just the past week. The first is an update to a story I wrote last month about Google quietly monitoring and storing all your purchases across sites like Amazon. When CNBC reported the story, Google assured concerned users that they could delete their purchase history. A follow-up report from CNBC suggests that, contrary to Google's claims, the fix doesn't actually remove the purchase history. The company, according to the report, "is looking into it."
Then there's Superhuman, the exclusive Silicon Valley start-up for power emailers. At the end of last month, a former Twitter engineer, Mike Davidson, wrote a blog post detailing the ways in which Superhuman violated the privacy of its users by tracking every time their emails were viewed by recipients, all by default, with little room to opt out. The backlash prompted a response from Superhuman's C.E.O., Rahul Vohra, who promised to reconsider many of the tracking features. "When we built Superhuman," he wrote, "we focused only on the needs of our customers. We did not consider potential bad actors."
| You can probably guess what happened next. On Monday, Davidson wrote a second blog post, praising the company for responding but calling the fix superficial. "You can still see exactly when and how many times someone has opened your email, complete with multiple timestamps — you just can't see the location anymore," he wrote. "That, to me, is not sufficient. 'A little less creepy' is still creepy." |
Davidson's extremely detailed posts (which are worth reading in full) get at a core issue of the privacy debate: none of this invasive technology happens by accident. Our privacy crisis is a crisis of design. Take that telling line from Vohra, Superhuman's C.E.O., which is less than a week old and has already aged poorly: "We did not consider potential bad actors." But as Davidson goes on to explain, Superhuman did receive negative feedback about email tracking; it just didn't listen. "We did not consider" doesn't mean the company was unaware; it means the company didn't act on the feedback it received.
This line from Vohra's apology offers a clue as to why. "If one of us creates something new, and that innovation becomes popular, then market dynamics will pull us all in that direction," he wrote. It's worth noting because it's an argument I've heard frequently from ad tech executives and tech companies in my reporting for The Privacy Project: this couldn't be wrong, because it's the industry standard. But, as Davidson rightly notes, "just because technology is being used unethically by others does not mean you should use it unethically yourself."
(I want to pause here to offer an email-tracking disclosure and some clarification. Tracking is a tricky subject, and it isn't inherently nefarious. This newsletter tracks things like how many times each edition is opened and which links are clicked, which helps us improve it. But like all privacy issues, it's a matter of transparency and expectations. When it comes to marketing emails and newsletters, which often come from corporate entities, there's more of an expectation that open rates might be tracked. In Superhuman's case, as Davidson notes, the tracking takes place with every personal email sent, which is far more likely to violate the expectation of privacy.)
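For readers curious how open tracking actually works under the hood: the common technique is a "tracking pixel," a unique, invisible one-pixel image embedded in each outgoing message. When the recipient's mail client fetches that image, the sender's server logs the request, revealing when (and how many times) the email was opened. The sketch below is a simplified illustration of that mechanism, not Superhuman's actual implementation; the URL and function names are hypothetical.

```python
import uuid
from datetime import datetime, timezone

# token -> list of timestamps at which the tracking image was fetched
open_log = {}

def make_tracking_pixel(base_url: str) -> tuple[str, str]:
    """Create a unique token and the invisible <img> tag to embed in an email."""
    token = uuid.uuid4().hex
    html = f'<img src="{base_url}/open/{token}.gif" width="1" height="1" alt="">'
    return token, html

def record_open(token: str) -> None:
    """Called server-side whenever the pixel URL is requested,
    i.e., whenever the recipient's mail client renders the email."""
    open_log.setdefault(token, []).append(datetime.now(timezone.utc))

# Hypothetical usage: embed the pixel, then log each simulated open.
token, snippet = make_tracking_pixel("https://tracker.example.com")
record_open(token)  # recipient opens the email once...
record_open(token)  # ...and again, producing a second timestamp
```

Because each message gets its own token, every open event is tied to a specific recipient and timestamped individually; this is why a recipient who merely views an email, without clicking anything, still shows up in the sender's logs. Blocking remote images in a mail client defeats this technique.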
Protecting privacy is often about adding friction to the mechanisms that threaten it. But that's antithetical to the ethos of Silicon Valley, where innovation is all about simplifying. And so privacy, like the possibility of bad actors, frequently goes unconsidered during the initial design, when the foundation of a product is built.
| It's why genuine change in the digital privacy realm is so hard to come by. When flaws are exposed, superficial solutions are common because they don't threaten the core of the product. And when invasive tools are baked deep into the infrastructure of our technologies, there's no easy fix. |
It's understandable: it's easier to add a coat of paint to a house than to fix its crumbling foundation. But if it's left too long, chances are the whole thing's coming down.