Despite improvements in code quality, web servers remain at high risk of being hacked, according to a new paper from researchers who use honeypot technologies to examine how hackers work.
The Honeynet Project, which provides real systems for unwitting hackers to attack, says web applications remain vulnerable for a host of reasons. These include poor-quality code; the fact that attacks can be performed using PHP and shell scripts, which are generally easier to write than buffer-overflow exploits; and the emergence of search engines as hacking tools. What's more, web servers can be a gold mine for hackers, in that they have higher-bandwidth connections than most desktops and often link to an organisation's databases.
The group's findings are outlined in a paper titled "Know Your Enemy: Web Application Threats". Researchers involved in honeynet projects in Chicago, Germany and New Zealand collaborated on the paper.
The report states: "Web applications commonly face a unique set of vulnerabilities due to their access by browsers, their integration with databases and the high exposure of related web servers. The modern web server setup commonly presents multiple applications running on one host and available via a single port, creating a large surface area for attack."
Code injection, remote code-inclusion, SQL injection and cross-site scripting are cited as common attack modes. Search, spider and IP-based scanning are typical discovery techniques used by hackers seeking vulnerable applications.
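To illustrate the first of these attack modes, the sketch below shows a minimal SQL injection against a hypothetical login check, contrasted with the parameterised version that defeats it. The table, credentials and function names are invented for illustration and do not come from the Honeynet paper.

```python
import sqlite3

# Hypothetical users table for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_vulnerable(name, password):
    # UNSAFE: attacker-controlled input is concatenated into the SQL string,
    # so the input can rewrite the query itself.
    query = ("SELECT * FROM users WHERE name = '%s' AND password = '%s'"
             % (name, password))
    return conn.execute(query).fetchall()

def login_safe(name, password):
    # SAFE: placeholders keep the input as data, never as SQL syntax.
    query = "SELECT * FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchall()

# The classic payload turns the WHERE clause into a tautology:
#   ... AND password = '' OR '1'='1'
payload = "' OR '1'='1"
print(len(login_vulnerable("alice", payload)))  # 1 row: authentication bypassed
print(len(login_safe("alice", payload)))        # 0 rows: payload treated as data
```

The same principle — keeping attacker input out of the code path — underlies defences against the other injection-style attacks the paper lists.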
Hackers attempt to disguise their identities using proxy servers, the Google Translate service, onion routers and other systems, the researchers write.
Attackers may break in to run phishing scams, send email or blog spam, recruit machines into botnets, host files, or simply deface a site.
"By becoming a tool for an attacker to inflict harm on other systems, a site may be opening itself up to liability issues if it has not been paying sufficient attention to security," the report states. "For example, if a machine is joined to a botnet, it may be a participant in a denial-of-service attack against an external site or may be used to recruit other machines into the botnet."
While the researchers said more of the same is in store for organisations using web servers and applications, they did offer security recommendations. These include keeping an inventory of the applications on web servers and maintaining patch levels for them, as well as correctly configuring the servers themselves. Network- and host-based intrusion-detection systems can also help, the researchers said.