The founding principle of the web was openness – the ability for anyone to contribute and consume information, wherever they were and on whatever machine.
However, as it has grown from an information-sharing system into a platform for financial transactions, various methods have been adopted to retrofit a degree of security onto the web.
One of these methods is the system of secure sockets layer (SSL) certificates, which signify that a website is to be trusted. Sites that have been issued an SSL certificate can use the https:// prefix; most web browsers will warn users if a website does not have a genuine certificate.
To get a security certificate, the website operator must apply to what is known as a certificate authority (CA). These are private organisations whose job it is to verify that the website is legitimate.
Under the certification model, the decision of whether or not to trust a website is devolved to the CAs. The model assumes that the CA and the certificates it issues can themselves be trusted.
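In practice, that trust decision is made automatically by the browser or other client software during the secure handshake. The following minimal Python sketch (an illustration rather than anything from the reporting here; the hostname is a placeholder) shows the check in action: the standard library’s default context verifies that the server’s certificate chains back to a CA the operating system already trusts, and refuses the connection otherwise.

```python
import socket
import ssl

hostname = "example.com"  # placeholder host, for illustration only

# The default context loads the CA certificates the system already trusts.
context = ssl.create_default_context()

with socket.create_connection((hostname, 443)) as sock:
    # The handshake raises ssl.SSLCertVerificationError if the server's
    # certificate does not chain to a trusted CA, or was not issued for
    # this hostname.
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()
        print("Certificate issued by:", cert["issuer"])
```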
However, two CAs have been successfully hacked this year – US-based CA Comodo in March and Dutch CA DigiNotar in July – leading to fraudulent certificates being issued. It is believed that the same hacker, operating in Iran, was behind both attacks.
With a fraudulent certificate, hackers can trick users into thinking a malicious website is legitimate. Users might then be fooled into entering their credit card details or passwords.
Mikko Hyppönen, head of research at security firm F-Secure, has described the recent spate of breaches at CAs as a nightmare scenario. “You have to trust the companies selling these certificates,” he told the Wall Street Journal earlier this year. “If we can’t, then all bets are off.”
Other experts are less worried. Marcus Ranum, CSO of security company Tenable, says that the certification system was never meant to provide real security, merely to “appear to be good enough that unsophisticated end-users would trust it without understanding its flaws”.
Whether or not there is a structural flaw at the root of the certification system, recent events demonstrate once again that IT security requires constant vigilance. What can be trusted today will almost certainly be compromised in the future.
Avivah Litan, security and privacy analyst for Gartner, says extended validation certificates may be the solution:
The problem is not with the technology; it is a fundamental problem with business processes. The processes need to be scrutinised, tightened up and treated like Fort Knox. It’s not all hopeless, but it’s very difficult to know who you’re dealing with in such a global system.
Extended validation certificates [which require more stringent verification checks] haven’t been hacked yet, so one solution is for people to trust only those. All the big financial services websites, such as Bank of America, are making this move. The most practical solution for now is for more companies to adopt those standards, and for customers to know that’s what they need to look for.
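At a technical level, an extended validation certificate is distinguished by a CA-specific policy identifier in its certificatePolicies extension, which browsers compare against a built-in list of known EV identifiers. A minimal sketch using the open-source Python cryptography library shows how those identifiers can be read from a certificate; the file name is a placeholder.

```python
from cryptography import x509
from cryptography.x509.oid import ExtensionOID

# Placeholder file name; any PEM-encoded certificate will do.
with open("server_cert.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

# Print the policy OIDs from the certificatePolicies extension; browsers
# recognise EV certificates by matching these against a built-in list.
# Raises x509.ExtensionNotFound if the certificate carries no policies.
policies = cert.extensions.get_extension_for_oid(
    ExtensionOID.CERTIFICATE_POLICIES
).value
for policy in policies:
    print(policy.policy_identifier.dotted_string)
```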
Jeff Hudson, CEO at IT security firm Venafi, argues that exposing the flaws in the SSL certification system is a natural part of the evolution of the web:
In the evolution of networks and the Internet, we’ve always just assumed that these third-party trust providers are not subject to compromise. Well, just like anything else, they are, and now we’ve woken up.
We’ve got to have multiple sources of trust so that when something goes wrong with one of them we can switch to a backup. Recent events are just a natural part of the evolution of the web. People are all up in arms saying it’s broken, but it’s not; it’s just naturally evolving to a place where we have to develop the capability to recognise quickly when our sources of trust are no longer good.