Is HTTPS Dangerous? HTTPS is incredibly useful but there may be critical flaws making it dangerous
By: +David Herron; Date: February 26, 2018
HTTPS encrypts our website connections and it authenticates website identity. Both are excellent reassurances for website users, to give us a feeling of safety online. The video attached here makes a bold claim that HTTPS is actually dangerous. Let's take a look at that claim.
The video (see below) makes several claims.
|Claim|Verdict|Assessment|
|---|---|---|
|Must be renewed|Inconvenient|SSL certificates must expire, therefore the real issue is the manual renewal process|
|Fake certificates|Not quite|Generally the process for validating SSL certificates is robust, but bugs exist everywhere and in a few cases bogus SSL certificates have been issued|
|NSA hashing algorithms|Tin foil hat|The NSA and other US government agencies developed critical algorithms for these uses, and possibly put in back-doors|
|Complex = vulnerable|Maybe|Hard to say for real|
TL;DR The guy doing this video is one I've often thought makes mountains out of molehills. In this case he shows some information saying the NSA developed several of the algorithms behind HTTPS, concludes that therefore the NSA can snoop HTTPS traffic, and therefore that HTTPS cannot satisfy the very purpose for which it was developed.
There's enough of a grain of truth in that to be concerned. But not to the degree this guy claims.
HTTPS/SSL certificates must be renewed
Problem: Because certificates are renewed manually, a website owner can forget to renew the certificate, or can stop maintaining the site altogether, and the site will appear to be dangerous even when it's providing useful information. Sites with expired certificates are flagged by web browsers as risky. But what if it's a perfectly safe site that merely has an expired certificate?
The certificates do need to expire; the problem is the manual renewal process.
Most certificate authorities do not have an automated process of certificate renewal. Since website maintainers are forced to use manual processes, they can forget to renew the certificate one year and have an embarrassing couple of days scrambling to fix the problem.
Some certificate authorities do offer automated certificate renewal.
I've posted a tutorial showing how to automate Let's Encrypt certificate renewal in a Docker container: Deploying an Express app with HTTPS support in Docker using Lets Encrypt SSL
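As a sketch of the kind of monitoring that catches this before the embarrassing scramble: Python's `ssl.getpeercert()` reports a certificate's `notAfter` field as a string, and a small helper (hypothetical, shown here with a hard-coded date rather than a live connection) can turn that into days remaining:

```python
import datetime

def days_until_expiry(not_after, now=None):
    """Days remaining on a certificate, given the notAfter field in the
    format Python's ssl.getpeercert() returns, e.g. "Jun 01 12:00:00 2030 GMT"."""
    expires = datetime.datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    now = now or datetime.datetime.utcnow()
    return (expires - now).days

# A monitoring cron job might alert when this drops below, say, 14 days,
# rather than letting the browser warning page be the first notice.
```

In a real deployment you'd feed this from `ssl.getpeercert()` on a live connection to your own site; the point is that expiry is trivially machine-checkable, so there's no excuse for being surprised by it.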
Fake SSL certificates
Problem: Supposedly it's easy to create a fake SSL certificate and therefore stand up a fake, unvalidated HTTPS site.
I don't understand this claim unless it means the following.
It's easy to create self-signed certificates that aren't validated by any higher authority. Such a self-created self-signed certificate can say anything you want it to say, including claiming to be the SSL certificate for any website you like.
The PKI infrastructure used for legitimate HTTPS/SSL certificates relies on a hierarchy of certificate authorities. The browser makers have a list of trusted Root CAs, and any certificate issued by a CA certified by one of those Root CAs is deemed trustworthy.
But - that doesn't stop someone from using a self-signed certificate on their HTTPS website. If they do the browser will notice it's not certified by a trusted Root CA, and put up a big warning to the user.
But - what if a legitimate trusted CA can be tricked into issuing a certificate for a website the requester doesn't control?
But - what if someone hacks into a legitimate trusted CA and forces it to produce bogus certificates?
In the case of a bogus certificate signed by a trusted CA, someone could stand up a website claiming to be the site named in the bogus certificate, and browsers would show no warning at all.
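To make the trust-chain idea concrete, here's a toy Python model of the walk a browser performs. The CA names are invented, and real validation also verifies signatures, expiry dates, revocation status, and that the certificate matches the site's hostname; this models only the issuer links:

```python
# Toy model of the trust-chain walk a browser performs. CA names are
# hypothetical; real validation also checks signatures, expiry,
# revocation, and hostname matching.
def chain_to_trusted_root(subject, issued_by, trusted_roots):
    """issued_by maps a certificate's subject to the name of its issuer."""
    seen = set()
    while subject not in trusted_roots:
        if subject in seen or subject not in issued_by:
            return False  # self-signed loop, cycle, or unknown issuer
        seen.add(subject)
        subject = issued_by[subject]
    return True

issued_by = {
    "example.com": "Intermediate CA",              # leaf, signed by an intermediate
    "Intermediate CA": "Trusted Root",             # intermediate, signed by a root
    "self-signed.example": "self-signed.example",  # signs itself
}
roots = {"Trusted Root"}
```

`chain_to_trusted_root("example.com", issued_by, roots)` returns True, while the self-signed entry returns False - which is exactly the situation where a browser throws up its big warning page.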
This happened in 2011 when DigiNotar was hacked: http://www.zdnet.com/article/fake-ssl-certificates-pirate-web-sites/
In 2014, Netcraft said they found dozens of fake SSL certificates masquerading as real websites. https://news.netcraft.com/archives/2014/02/12/fake-ssl-certificates-deployed-across-the-internet.html
The Netcraft article also talks of bugs in SSL validation libraries. An ingenious attacker might craft a bogus certificate that exploits such a flaw, making it look legitimate.
In 2013, Google announced having detected fake SSL certificates issued by an "Intermediate CA", ANSSI. Those fake certificates claimed authenticity over several Google domains. Google automatically changed the certificate validation logic in Google Chrome, then warned the other browser makers of the problem. Google later relaxed the block, accepting ANSSI-issued certificates for certain domains but not others. https://security.googleblog.com/2013/12/further-improving-digital-certificate.html
In 2017, Google announced it would "distrust" certificates issued by Chinese CA's WoSign and StartCom. http://www.zdnet.com/article/google-guillotine-falls-on-certificate-authorities-wosign-startcom/
In 2015 it was discovered that Symantec had issued "Extended Validation" (EV) certificates for some Google domains, and further auditing found the company had issued thousands of bogus SSL certificates. In 2017, Google and other browser makers announced a plan to completely "distrust" Symantec's CA operation. http://www.tomshardware.com/news/google-chrome-distrusts-symantec-certificates,33973.html
Symantec and its resellers are responsible for about a third of the Web's SSL certificates, so that's a big step to take. https://www.pcworld.com/article/3184660/security/to-punish-symantec-google-may-distrust-a-third-of-the-webs-ssl-certificates.html And https://techcrunch.com/2017/03/27/google-is-fighting-with-symantec-over-encrypting-the-internet/
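One defense that caught several of these incidents is certificate pinning: the client refuses a certificate unless its fingerprint matches a value obtained out-of-band, no matter who signed it. Chrome's built-in pins for Google domains are reportedly how the DigiNotar and ANSSI mis-issuances were spotted. A minimal Python sketch (the certificate bytes here are placeholders, not real DER data):

```python
import hashlib

# Certificate pinning sketch: accept a certificate only if the SHA-256
# fingerprint of its DER bytes matches a pin we obtained out-of-band.
# A bogus certificate from a compromised CA fails this check even
# though it chains to a trusted root.
def fingerprint(der_bytes):
    return hashlib.sha256(der_bytes).hexdigest()

def pin_ok(der_bytes, pinned_fingerprints):
    return fingerprint(der_bytes) in pinned_fingerprints
```

The trade-off is operational: rotate your certificate without updating the pins and you lock your own users out, which is why pinning is used sparingly.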
Secure Hash Algorithm (SHA) developed by the NSA
Problem: HTTPS relies on SHA-1 in several ways. That algorithm was developed by the NSA. The same NSA that has been illegally breaking into computers all over the world, even inside the USA, so that NSA spies can spy on everything.
Given what's been revealed about the NSA how can we trust that the NSA did not install a backdoor via the critical algorithms?
In part the purpose for adopting HTTPS everywhere is so that NOBODY (including the NSA) can eavesdrop on our Internet connections. But, the NSA designed critical algorithms and presumably designed-in a backdoor.
By the way, a designed-in backdoor can be found by a miscreant. Especially since NSA's own hacking toolkit got stolen and released on the Internet.
SHA-2, also developed by the NSA, is meant to replace SHA-1.
SHA-3, another alternative, was standardized by NIST (technically part of the Commerce Department) after a public competition won by the independently designed Keccak algorithm. But we can assume complicity with the NSA, especially as NIST was required by law to cooperate with the NSA.
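For what it's worth, all three hash families are available in Python's hashlib, which makes it easy to see that they are distinct algorithms rather than variations on one design (the per-algorithm comments summarize their provenance):

```python
import hashlib

msg = b"abc"
digests = {
    "SHA-1": hashlib.sha1(msg).hexdigest(),        # NSA design; practical collisions demonstrated in 2017
    "SHA-256": hashlib.sha256(msg).hexdigest(),    # SHA-2 family, also an NSA design
    "SHA3-256": hashlib.sha3_256(msg).hexdigest(), # Keccak, standardized by NIST via open competition
}
```

Note that a hash of the same input under each algorithm produces completely unrelated output, so distrust of one family doesn't automatically taint the others.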
SP800-90A is NIST's Recommendation for Random Number Generation Using Deterministic Random Bit Generators. Encryption algorithms require random numbers, but computers aren't capable of generating properly random numbers; instead they're faked using algorithms that produce pseudo-random output. That is a security issue, because predictable output makes encryption keys somewhat predictable. Notoriously, SP800-90A originally included the Dual_EC_DRBG generator, which is widely believed to contain an NSA back-door and which NIST later withdrew from the standard.
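A quick Python illustration of why predictable randomness is fatal: a deterministic generator seeded with a guessable value (here Python's Mersenne Twister, which is not cryptographic to begin with) lets an attacker reproduce the victim's "random" key exactly, while an OS-backed CSPRNG offers no seed to guess:

```python
import random
import secrets

# If an attacker can predict the seed (or the DRBG has a back-door),
# every "random" key is reproducible.
random.seed(1234)                    # guessable seed
victim_key = random.getrandbits(128)

random.seed(1234)                    # attacker repeats the computation
attacker_guess = random.getrandbits(128)
# victim_key == attacker_guess: the "random" key was fully predictable.

# A CSPRNG fed by OS entropy (what TLS stacks should use) cannot be
# replayed this way:
session_key = secrets.token_bytes(16)
```

This is exactly the failure mode a back-doored DRBG creates: the output looks random to everyone except the party who knows the hidden state.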
More complexity means more vulnerabilities
Problem: The more complex a system, the more potential for breakages. That truism seems very real.
Does that mean HTTPS, by being more complex than HTTP, is more vulnerable? It's hard to say whether the claim actually holds in the case of HTTPS.