The Fat Tech Cat Diet

This got me thinking about the things that live inside my phone.

Like much of the world, I seem to live in a permanent state of vexation about technology, privacy, and how to survive in a world where so many access points are guarded by hungry, algorithm-crunching data trolls. This is not a new anxiety for me; I’ve blogged here before about some of the privacy concerns of smart cities, the privacy choices made by Apple in the past, and even the benevolent hackers protecting us. But technology moves on, and the fat cats of the tech sector seem to be getting fatter on a steady diet of user data.

Well this certainly doesn’t inspire confidence.

This topic has been especially hot on my mind this summer because, in an unexpected move (precipitated by this), I switched from an iPhone to a phone running the Android operating system, and I have been questioning the implications for my privacy ever since.

I mean, wow, I’ve had to click “I Agree” to a whole lot of things the last month.

But as it (shockingly) turns out, it doesn’t quite matter whether I agree or not.  A report from the Associated Press revealed that Google apps store a time-stamped record of your locations, even when you specifically turn off location tracking.  So, for example, even if your “Location History” is turned off, every time your Gmail app pings a tower, the time and location are saved in your history.  That’s frustrating.

So maybe you think it’s smarter to use an iPhone and stay away from Google’s proprietary apps.  Well, I have bad news for you.  Apple, which has always differentiated itself by loudly proclaiming its commitment to locking your private data inside your phone so that it can’t be touched even by Apple’s own engineers, has a new (as of iOS 10) privacy scheme.  And the math around it doesn’t look very good.  The scheme is called differential privacy, and in a nutshell, Apple now sends usage data from your phone back to the mothership, but mixes in enough noise that (in theory) the data could never be tied back to you personally.  It’s an opt-in scheme, and when you agree to it, you are agreeing to a privacy-loss budget of epsilon per day.  That is, epsilon is supposed to be an upper bound on how much the data harvested each day can reveal about you personally.
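To give a flavor of what that noise-mixing looks like, here is a tiny Python sketch of randomized response, the textbook warm-up mechanism for differential privacy. To be clear, this is not Apple’s actual implementation (theirs involves fancier sketches of things like emoji and word counts); it’s just the simplest illustration of how one knob, epsilon, trades your individual deniability against the accuracy of the aggregate statistics.

```python
import math
import random

def randomized_response(true_bit, epsilon):
    """Report a yes/no bit with epsilon-differential privacy.

    With probability e^eps / (e^eps + 1) we report the truth;
    otherwise we flip the bit. Smaller epsilon means more noise,
    which means stronger deniability for the individual.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return true_bit if random.random() < p_truth else 1 - true_bit

def estimate_population_rate(noisy_reports, epsilon):
    """The aggregator can still back out the population-wide rate,
    even though no single report can be taken at face value."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(noisy_reports) / len(noisy_reports)
    return (observed - (1 - p)) / (2 * p - 1)

# Example: 100,000 people, 30% of whom have some embarrassing trait.
random.seed(0)
truth = [1 if random.random() < 0.3 else 0 for _ in range(100_000)]
reports = [randomized_response(bit, epsilon=1.0) for bit in truth]
print(estimate_population_rate(reports, epsilon=1.0))  # comes out close to 0.3
```

The point is that epsilon is a dial: turn it way down and your individual answer is basically a coin flip; turn it way up and the “noise” is mostly a formality.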

However, mathematicians have shown that even for a fixed epsilon, the amount of privacy being lost is not really something to be proud of; in fact, because you spend a fresh budget every single day, they show that the total privacy you can lose over time is effectively unbounded.  A post from Andy Greenberg at Wired gives a good rundown of the research on the shortcomings of the scheme.
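The basic issue is that differential privacy guarantees compose: roughly speaking, the losses add up. If you spend epsilon every day, then after n days the worst-case loss is about n times epsilon, and nothing in the scheme caps n. Purely as an illustration (the actual per-day figures are exactly what the research Greenberg covers tries to pin down, and the number below is hypothetical), the arithmetic looks like this:

```python
# Hypothetical per-day privacy-loss budget, chosen only to make
# the arithmetic concrete -- not an official Apple figure.
daily_epsilon = 14

for days in (1, 7, 30, 365):
    # Sequential composition: worst-case losses simply add.
    print(f"{days:>3} days -> total epsilon ~ {days * daily_epsilon}")

# There is no lifetime cap, which is the sense in which the loss is unbounded.
```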

What also really bugs me about this is how nonchalant Apple apparently is about dealing with the criticism.  I mean, I’m torn.  On the one hand, Google is standing there out in the open with its grabby robot hands taking all of my data, and I can’t stop them.  On the other hand, Apple is, in some sense, doing the same thing but just pretending it isn’t.

Cathy O’Neil, the longtime blogger and now frequent contributor to Bloomberg, has done a lot of writing about big tech companies and their questionable algorithmic practices.  Recently, O’Neil wrote about a set of proposals from Senator Mark Warner regarding data privacy: specifically, what the government might do to limit who gets to access your data and what they get to do with it.  For the algorithms that have big control over your life, Warner recommends a system of algorithm auditing by humans (because, in case you missed it, algorithmic bias is a very real thing).  Most recently, she wrote about what Zuckerberg and his fellow cats can do to rein in the powerful and dangerous beast they’ve created.

What motivates your personal decisions on privacy?  Do you think about what operating system you use and does it vex you every day?  Let me know in the comments, or as usual I’ll be tweeting from my underground bunker @extremefriday.
