June 1, 2015 | By Shane Schick

Going online might just mean learning to live in fear: a research study found website vulnerabilities in 86 percent of the 30,000 websites it analyzed.

The “2015 Website Security Statistics Report,” produced by California-based WhiteHat Security, went on to say that 56 percent of the online properties it studied contained multiple website vulnerabilities. Although 61 percent of the issues had already been resolved, WhiteHat suggested that many of them went neglected or unnoticed for months.

As CSO noted, there were significant differences in the degree of website vulnerabilities across industry sectors. Public administration organizations ranked as the most vulnerable sector, even though it could be argued they are likely to be handling extremely sensitive citizen information. Likewise, despite the awareness that followed major security breaches at Home Depot and Target, retail organizations ranked second-worst in the number of vulnerabilities. In follow-up interviews, WhiteHat determined that organizations focused on compliance with industry regulations tended to be safer than those that were more concerned with risk reduction.

Help Net Security, meanwhile, described the numbers from the report as a nasty surprise, given the wave of attacks that struck organizations throughout 2014 and the first half of this year. Although the scope of website vulnerabilities varied, one of the most frequently cited issues in the research was insufficient transport layer protection. For example, secure sockets layer (SSL) might be used to protect the authentication process but not the rest of an application, so data traveling between the user and the system is left exposed elsewhere. That’s when cybercriminals can potentially harvest passwords or other personal information.
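As a rough illustration of that gap, the sketch below (a hypothetical Flask application invented for this example, not anything drawn from the report) shows how a site might enforce HTTPS on every request rather than only at login, so session cookies and form data are never sent in the clear.

```python
# Illustrative sketch only: a hypothetical Flask app that applies transport
# security everywhere, not just on the login page. All names and settings
# here are assumptions for the example, not taken from the WhiteHat report.
from flask import Flask, redirect, request

app = Flask(__name__)
app.config.update(
    SESSION_COOKIE_SECURE=True,    # send the session cookie only over HTTPS
    SESSION_COOKIE_HTTPONLY=True,  # keep client-side scripts from reading it
)

@app.before_request
def force_https():
    # Redirect any plain-HTTP request to its HTTPS equivalent.
    if not request.is_secure:
        return redirect(request.url.replace("http://", "https://", 1), code=301)

@app.after_request
def add_hsts_header(response):
    # Tell browsers to use HTTPS for all future requests to this site.
    response.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
    return response

@app.route("/login", methods=["POST"])
def login():
    # Credentials posted here now always travel over an encrypted channel,
    # and so does every other route, not just this one.
    return "ok"
```

The point is less the specific framework than the pattern: encryption applied consistently across the whole application rather than only around the password form, which is exactly the inconsistency the report calls out.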

Experts told eWEEK that hackers are not giving up on the opportunity to make money off their victims, so patching the various website vulnerabilities is not necessarily the answer. What may be more important is how quickly organizations can fix their sites and, if necessary, notify any customers or visitors whose data may have been exposed. Although the study showed overall improvement in time to remediation, there were still a number of sectors that probably need to do better.

The IT industry often talks about best practices, or common ways organizations do something well, but in an interview with SC Magazine, WhiteHat Security said there aren’t any real best practices to minimize website vulnerabilities. The only thing the firm could suggest is to make sure those developing websites work hand-in-hand with computer security professionals, who can make as many fixes as possible along the way. It may not make the Web error-free, but it might make things a little safer when we’re surfing it.
