No single facility is as important to Internet security as SSL. When it's used properly, breaking or even bypassing it is impractical. Not everyone uses it properly.
Data presented at the recent Internet Measurement Conference in Barcelona shows that SSL security could be a lot worse, though significant problems remain. Admins who keep up with best practices are in good shape.
Last week I reported on how Microsoft will deprecate the use of SHA-1 in certificates after 2016, and require SHA-2 from that point. With 2016 over two years away, this constitutes a call to action for the entire Internet, as well as cover for other infrastructure vendors to start advancing similar requirements.
It didn't take me long to realize that Microsoft itself wasn't using SHA-2 all that much. I asked how far they were along in the migration and got this statement from a Microsoft spokesperson:
Microsoft has been signing code with SHA-2 since last year. The company is also working on a new Certificate Authority that will issue SHA-2 SSL certificates for Microsoft's services. Microsoft will move over to SHA-2 certificates after the Certificate Authority enters production over the next year. We recommend customers begin the process of updating to SHA-2 as we continue to help protect the integrity of the Windows platform and Windows customers.
The Barcelona paper, Analysis of the HTTPS Certificate Ecosystem by Zakir Durumeric, James Kasten, Michael Bailey, and J. Alex Halderman of the Department of Electrical Engineering and Computer Science at the University of Michigan, reports the results of 14 months of scanning the Public Key Infrastructure (PKI) of the Internet. The project is the latest in a series inspired by the EFF SSL Observatory, a 2010 collaboration between the EFF and Jesse Burns of iSEC Partners. The researchers scanned the Internet for SSL certificates and collected data on them and on the certificate authorities that issued most of them.
The Observatory project demonstrated a number of interesting facts: There are a lot of certificate authorities out there, almost 1,500 trusted by Microsoft or Firefox. There are a lot of bad practices (e.g. over 300,000 valid leaf certificates signed by a single GoDaddy key) and a lot of invalid nonsense (e.g. certificates authenticating "localhost" and 192.168.1.2).
Everyone take a moment now to thank the EFF and Burns for this important work, which helps us all to protect our own privacy. Also take a moment to consider whether such work, when done by the government, would be considered "surveillance." In any case, the Observatory data set is now over three years old, and not very useful.
Not only is the Michigan data set more current, but the researchers exclude self-signed certificates from their analysis. Such certificates are inherently untrustworthy, and their inclusion only clouds a fair analysis of the practices of certificate authorities.
I was a bit surprised and pleased to see, in the U. Michigan paper, that only about 1% of certificates observed used hashes that are weak by current standards (MD5 and earlier). I was also surprised to see just how rare SHA-2 is in the real world (about 0.25%), but I shouldn't have been, given that Microsoft itself is only getting on with it now.
SHA-2, it is worth noting, is a name given to a collection of hash algorithms with a variety of digest sizes: SHA-224, SHA-256, SHA-384, SHA-512. SHA-256 seems to be the only one with any real uptake. SHA-3, a completely new algorithm, also provides for a variety of digest sizes.
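The family relationship is easy to see in code. Here is a minimal Python sketch, using only the standard hashlib module, that prints the digest size behind each SHA-2 name:

```python
import hashlib

# Each SHA-2 variant is named for its digest size in bits.
for name in ("sha224", "sha256", "sha384", "sha512"):
    bits = hashlib.new(name).digest_size * 8
    print(f"{name}: {bits}-bit digest")
```

A SHA-256 digest, for example, is 256 bits, or 64 hexadecimal characters when printed.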
Nearly every certificate authority has a little over two years to stop issuing SHA-1 certificates and start issuing SHA-2. Almost nobody is issuing SHA-2 now, and major changes like these are usually slow to come. But there's no good excuse for taking more than two years.
The real problem isn't issuing the certificates, but ensuring that the software which interprets them handles SHA-2 properly. This is one of the many reasons why, when you are writing a program that implements cryptography, you are almost certainly better off using an established library than trying to do it yourself. The two libraries that matter most in the world are the Crypto library built into Windows and OpenSSL, which is used by nearly everyone else. Both have supported SHA-2 for a while, and will probably support SHA-3 soon.
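As a quick illustration of leaning on an established library rather than rolling your own: Python's hashlib delegates hashing to the system crypto library (OpenSSL on most platforms), so a short sketch like this reports which SHA-2 variants the underlying library provides:

```python
import hashlib
import ssl

# hashlib is backed by the system crypto library (typically OpenSSL),
# so SHA-2 support comes from the library, not from code you wrote.
print(ssl.OPENSSL_VERSION)

sha2 = {"sha224", "sha256", "sha384", "sha512"}
missing = sha2 - hashlib.algorithms_available
print("SHA-2 support:", "complete" if not missing else f"missing: {sorted(missing)}")
```

An application built this way picks up new algorithms when the library is updated, with no cryptographic code of its own to audit.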
The Michigan study finds plenty of errors and bad practices, but as a percentage of the overall certificate population they seem to be declining. This tells me that what remains comes from a few stubborn and/or lazy holdouts, while new issuers have eliminated the really sloppy practices.
The trends in root key size (see figures 4 and 5 below from the paper) show that CAs are moving too slowly towards more secure practice.
1024-bit keys have been considered insecure for some time, but root certificates have long expiration dates, so you would expect a slow decline. The sudden uptick in April 2013 of 1024-bit keys is disturbing, but it's in parallel with an increase in 2048-bit keys, so it could be a single anomalous event.
The real problem is that almost half of browser-trusted leaf certificates are still signed with a 1024-bit root. Combine that with the fact that over 70% of those 1024-bit roots expire after 2016, a target date set by NIST for ending use of such keys, and you have a recipe for stagnation.
Microsoft has already set 2048-bit keys as a requirement for their root certificate program. In fact, their technical requirements state that "CAs who issue RSA 1024-bit certificates of any type with expirations beyond 31 December 2013 do so at their own risk. RSA 1024 may be blocked from operation in Windows without notice in the near future. Microsoft will remove from distribution most RSA 1024 root certificates in 2014." Fortunately, this means that the incidence of 1024-bit keys should decline precipitously soon, as only a fool would issue a new one.
The Michigan study was done before Microsoft's SHA-1 deprecation announcement and the authors don't seem aware of the key size rule changes. They express concern about the persistence of 1024-bit keys in the PKI, but 2014 should be the critical year for this problem.