Google’s Android Problem: Security is Hard

I’m a big fan of choice, freedom, and flexibility in mobile devices. I think that our most personal of computers simply must be accessible and hackable. Android is the embodiment of personal choice. And yet, I am very bearish on Android. Why? It’s security, of course.

In the past few weeks there have been new additions to the endless parade of Android security exploits, including Dirty COW, a kernel privilege-escalation bug rooted in a copy-on-write race condition, and Gooligan, malware that steals Google account authentication tokens. While there has been a lot of recent noise about security, the problem has plagued Android since its inception.

Use Android for banking? Probably a bad idea. Want to download apps from outside the Play Store? Develop your own apps and sideload them? Both probably bad ideas. Sure, Apple has had plenty of iOS security issues. But there are a few meaningful differences that make Apple’s issues much less of a concern.

First off, people with Apple devices both can and do upgrade their operating systems. iOS 10, the current state-of-the-art and supported Apple OS, can be run on Apple phones as old as the prehistoric iPhone 5 (which, having been released in late 2012, is now over four years old). On Android, only the very latest version gets security patches for the big-name issues (and basically nobody is running that version). If you want to support more than 60% of active Android devices, you have to go all the way back to KitKat, which was released in 2013. You can’t use the latest attack mitigations, you can’t use Google’s newest security features, and you can’t trust that your users will be running a remotely recent operating system.
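In practice, targeting a KitKat-era floor means gating newer platform security features at runtime. Here is a minimal sketch of that pattern in plain Java; on a real device the check would read `android.os.Build.VERSION.SDK_INT`, which is stubbed here as an ordinary parameter so the example is self-contained, and the method names are illustrative rather than a real Android API.

```java
// Sketch of the runtime version gating an Android app needs when it must
// support devices back to KitKat (API 19). On a real device you would read
// android.os.Build.VERSION.SDK_INT; here it is passed in as a plain int so
// the sketch compiles without the Android SDK. Illustrative names only.
public class VersionGate {
    // API levels for the releases discussed in the text.
    static final int KITKAT = 19;       // Android 4.4 (2013)
    static final int MARSHMALLOW = 23;  // Android 6.0, added runtime permissions

    // Can we rely on runtime permission prompts, or must we assume every
    // permission was granted (or denied) in bulk at install time?
    static boolean hasRuntimePermissions(int sdkInt) {
        return sdkInt >= MARSHMALLOW;
    }

    public static void main(String[] args) {
        int sdkInt = KITKAT; // pretend we're on a 2013-era device
        if (hasRuntimePermissions(sdkInt)) {
            System.out.println("request permissions at first use");
        } else {
            System.out.println("permissions fixed at install; degrade gracefully");
        }
    }
}
```

Every security feature newer than your minimum API level forces a branch like this, and the old branch, the one almost all of your users hit, gets none of the newer protections.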

Additionally, Google’s fixes for these systemic bugs (like Dirty COW) generally only land in the most recent version, which (see above) nobody’s using. On iOS, on the other hand, more than 60% of active users are using the latest version of the OS (which, keep in mind, is only three months old). Another 30% are using the immediately prior version. If you want to write apps that use the latest features, security mitigations, and the like, it’s much easier to do so on iOS. Within a month (by October 7), more than 70% of iPhone users of the popular Audiobooks app had already installed iOS 10.

But, still, why do Google’s devices seem to suffer much nastier exploits? The operating system deficiency alone can’t explain the virulence and headline-grabbing numbers of people affected. My hypothesis is that it’s the advertising integration Google practices. In my experience, Android apps carry far more advertising than iOS apps, and the ads seem to have nearly unfettered access to the system. I can’t count how many ads have popped up and suddenly dropped me into the Play Store, or sent push-notification-style reminders to pester me about something.

This gets at the much bigger problem that Google still, somehow, in the current year, lets malicious code into their ad networks. This is a huge problem on the desktop as well (and why it’s basically negligence to use a computer without an adblocker). It seems like something that would be really quite easy to fix, if only literally anybody at Google cared. However, scummy ads make money, and Google gets to keep said money even if they terminate the scummy ad after it’s run for a while, so the incentives aren’t exactly aligned with making the user experience safe and high-quality.

Android by itself can’t make any money for Google. It requires tons of engineers churning out SessionFactoryImplBeans (relax, it’s just a Java joke) by the ton. Android exists as a vehicle to drive people to monetized Google web services and to install apps that will display monetized ads. It’s done well at that, driving $31 billion of revenue (and $22 billion of profit), according to Oracle, which is hardly a neutral source, but I’d imagine the numbers are in the ballpark.

Because of this, the incentive for Google is to be as secure as necessary to keep nerds (and CIOs) happy, but not so secure that it undermines their ability to monetize the platform. Additionally, the experimental, “free-spirited” nature of many Android users (who are often more technically savvy than iOS users) may lead them to disable what security features do exist so that they can sideload their own applications, change themes, and otherwise modify their OS experience.

While Apple is extremely hostile to this, to the point of starting a security bounty program to buy jailbreaks before they hit the public, Android tacitly supports it, with a huge community of people who build their own versions of the OS (which may be insecure) and who constantly theme, tweak, and otherwise modify it. Google can’t get control of the ecosystem without locking it down, which they are (unpopularly) trying to do.

In my next post in this series, I’ll discuss Google’s lockdown, what Google can do to make their ecosystem more secure, and how they can regain the trust of their users. What do you think about Google’s insecurity? Do you agree with my hypothesis, or do you think I’m overreacting? Let me know in a comment below, or if you really want to talk, try writing a post. Include the hashtag #ParanoidAndroid so I don’t miss it.
