Kenneth van Wyk: Why mobile apps beat Web apps for privacy

Internet communications are prey to surveillance, but you can better shield them

Yet another excellent resource, Groklaw, is shuttering its services as a consequence of what I'll call the ongoing "Surveillance Wars." Rather than debate the politics of surveillance, I want to again make the case for hardening our software tools and making them more resilient to attack, regardless of where the attack comes from.

One thing to consider is making more use of mobile apps as opposed to Web apps. Here's the thing: Surveillance typically targets data in transit, and that is something mobile devices and their apps can do a very decent job of protecting.

In my June column, I talked about safeguarding privacy through email security, stressing key management above all -- specifically, doing our own key management and not trusting any external service to do it for us.

But we have many other means of communication beyond email. With all of them, it's still vital to consider key management, but things can be slightly different.

Most Internet communications rely on Transport Layer Security (TLS) or its predecessor, Secure Sockets Layer (SSL). In either case, the actual key used to encrypt data in transit is generated by the SSL subsystem. Unless you write your own SSL library, you're probably going to use one that is open and respected in the crypto community -- OpenSSL, Bouncy Castle and the like.

Now, you certainly can use OpenSSL in an iOS or Android application, and you certainly can use Bouncy Castle to build an Android application.

But that's not enough.

When an SSL/TLS session is initiated, the server presents its SSL certificate to the client. In traditional SSL, that certificate is checked for two things: Does the name in the certificate match the hostname we looked up via DNS, and is the certificate signed by a trusted root certificate authority (CA)?

There are a couple of problems with doing only that. First, DNS is not trustworthy, so part of that trust validation rests on something that can't bear the weight. Second, a certificate signed by any trusted CA is good enough for SSL, but it shouldn't be good enough for us.

That's where certificate pinning comes in -- a topic I've mentioned here before. And that's where the mobile part of the argument creeps in.

In traditional Web apps, the browser (and underlying system libraries) handles the SSL connections. Some browsers, notably Google's Chrome, use certificate pinning to verify the SSL certificates they use for their own purposes, such as connections to Google servers to check "safe browsing" URLs and so on. But when a browser is rendering the HTML of a Web app and that Web app refers to an SSL-encrypted page (HTTPS), the browser lacks the context to do certificate pinning.

The reason is that to enforce certificate pinning, you have to know in advance which CA you expect to have signed the certificate of every server you connect to. That's beyond the reasonable scope of a Web browser -- but it is entirely feasible for most mobile apps.

With a mobile app that connects to its own servers, the developers know exactly which servers -- and which SSL certificates -- to expect, so defining and pinning them is straightforward.

Furthermore, Objective-C on iOS and Java on Android are quite capable of doing the local processing needed to implement certificate pinning. That's a lot more feasible than trying to implement certificate pinning in, say, JavaScript for a Web app.
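To make that concrete, here is a minimal sketch in Java of the check at the heart of pinning: hash the server certificate's public key and compare it against a value baked into the app at build time. The class and method names here are illustrative, not from any particular library or app.

```java
import java.security.MessageDigest;
import java.security.cert.X509Certificate;
import java.util.Base64;
import java.util.Set;

public class CertPinner {

    // Compute the pin string ("sha256/<base64 digest>") for a DER-encoded
    // public key (the SubjectPublicKeyInfo bytes of a certificate).
    public static String pinOf(byte[] spkiDer) {
        try {
            byte[] digest = MessageDigest.getInstance("SHA-256").digest(spkiDer);
            return "sha256/" + Base64.getEncoder().encodeToString(digest);
        } catch (java.security.NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is always available
        }
    }

    // After normal chain validation has passed, additionally require that the
    // server certificate's public key matches one of the expected pins.
    public static boolean matchesPin(X509Certificate leaf, Set<String> expectedPins) {
        return expectedPins.contains(pinOf(leaf.getPublicKey().getEncoded()));
    }

    public static void main(String[] args) {
        // Demo on stand-in bytes; a real app would pin its server's actual key.
        System.out.println(pinOf("example-spki-bytes".getBytes()));
    }
}
```

In a real app, a check like `matchesPin` would run inside a custom `X509TrustManager` (or the platform's equivalent hook) after the standard certificate validation, rejecting the connection on a mismatch even when the chain is otherwise valid.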

So why does all of this matter? In order to have confidence in the security of our communications, we need to ensure mutual authentication between the end points. SSL/TLS, in the absence of pinning, can no longer be fully trusted to provide that, since numerous attacks -- including compromises of trusted CAs -- have eroded that trust model. But when we pin our certificates, we can have a far higher degree of confidence in our communication security.

Will it prevent eavesdroppers from seeing our private communications? There are no guarantees, but it sure raises the bar substantially. For our highest security needs, if we manage our own SSL certificates and pin them in our mobile applications, there's a fighting chance our communications will remain private.

That sounds like a lot of extra work. Is it worth it? That depends on the business your app is carrying out.

What are the downsides (other than more work)? One side effect of pinning is that it doesn't work through many network proxy tools that intercept SSL connections in order to inspect their users' SSL traffic. Many companies deploy such proxies, particularly in highly regulated environments, and apps that pin their certificates will generally not work on those networks.

With more than 20 years in the information security field, Kenneth van Wyk has worked at Carnegie Mellon University's CERT/CC, the U.S. Department of Defense, Para-Protect and others. He has published two books on information security and is working on a third. He is the president and principal consultant at KRvW Associates LLC in Alexandria, Va.

