Ulcer Du Jour: Enterprise Database Patch Processes


When most people think of software patching, the Microsoft Patch Tuesday process springs to mind as the most widely followed exercise of its kind in the IT industry.

Most IT managers also understand the importance of tracking patches and updates from widely used application vendors such as Adobe, Mozilla, AutoDesk, and NetApp.

Patching and updating enterprise databases, however, quickly leads to a shadowy corner-case area where, despite laudable efforts by the software vendor community, no vendor has developed a completely satisfactory patch and update process.

The issues surrounding database patching and updating merit more concern for a couple of reasons:

• Targeted, financially motivated malware attacks are increasing. Attackers are after data. And where is data stored? Databases! Couple this with organisations externalising more services and delivering increasingly valuable, revenue-generating services over the web, and it’s easy to see why financially motivated hackers are burning more midnight oil attacking and filching information from databases.

• Meanwhile, most organisations have very little visibility and control over their databases. At a recent conference, I gave a presentation on vulnerability management (which is more than scanning and patching). I asked the audience of several hundred how many were concerned with their database security initiatives or felt that databases were a vulnerability management weak spot. Only two percent raised their hands to say yes. After the presentation I asked the question again, and 90 percent of the audience responded in the affirmative.

Most organisations have become fairly adept at handling scheduled Microsoft patch updates and can extend that process to non-Microsoft applications. When it comes to enterprise databases, however, many organisations run one, two, or more years behind on patches and updates. That is troubling in a threat environment increasingly focused on data theft, with hackers having the same relationship with databases that Bonnie and Clyde had with banks.

To its credit, Oracle has been providing critical patch updates (CPUs) on a regular basis. Unfortunately, because of the size and structure of database installations, Oracle patches tend to be fairly large bundles of multiple fixes spanning multiple products. Oracle has consolidated its security information on a dedicated webpage, and while that information strives for thoroughness, it should be apparent at a glance how complex keeping Oracle installations up to date can be.


Dealing with a Tsunami of Patch Content

The changes Oracle has made to its security processes are a good thing, but personally I think bundling hundreds of vulnerability fixes is going to cause problems. What happens if the bundle breaks one or more components? How will the bundle be tested?

What mechanisms do organisations have in place to deploy such a large bundle, especially to mobile computing devices or remote offices? Will this increase or decrease the likelihood that organisations will address the CPU in a timely fashion?

I know that database administrators are justifiably ultra-conservative and prefer bundled patching, which allows for less frequent QA cycles and patch updates. Only recently have organisations even realised the need to patch Oracle in response to potential security threats, and many database administrators continue to believe that their perimeter shields them from attack.

To date, there have been no broad automated attacks targeting Oracle, but that will not last. When a targeted attack eventually strikes, the bundled patch approach will not allow for rapid patching of the targeted components, although Oracle does state it can provide a critical patch if needed.

The real issue with a bundle of 101 vulnerabilities spanning 12 different components is that some within an organisation might argue that QA testing will take four to six months, or longer. If every critical patch update addresses that many vulnerabilities, organisations can quickly find themselves two or three update cycles behind.
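Since Oracle's CPUs ship on a quarterly cadence (January, April, July, October), the lag described above is easy to quantify. The sketch below is illustrative only; the mid-month release date is an approximation, not Oracle's actual schedule:

```python
from datetime import date

# Illustrative assumption: quarterly CPU releases in Jan, Apr, Jul, Oct.
CPU_MONTHS = (1, 4, 7, 10)

def cpu_cycles_behind(last_applied: date, today: date) -> int:
    """Count quarterly CPU releases issued after the last applied update."""
    missed = 0
    for year in range(last_applied.year, today.year + 1):
        for month in CPU_MONTHS:
            release = date(year, month, 15)  # approximate release date
            if last_applied < release <= today:
                missed += 1
    return missed

# A database last patched in January and audited the following January has
# missed the April, July, and October cycles.
print(cpu_cycles_behind(date(2007, 1, 16), date(2008, 1, 10)))  # 3
```

Three missed cycles after a single skipped year shows how quickly a "we'll test it next quarter" policy compounds.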

For organisations taking a cautious approach to patching for logistical or change-management reasons, Oracle does not provide enough information on potential workarounds or other mitigation techniques. Either way, organisations that patch diligently every 90 days and those that run behind can both expose themselves to attack despite everyone's best intentions.

We are headed towards very significant database security issues, and organisations need to take them seriously before their reputation, their bottom line, their shareholder value, and ultimately their entire business are severely impacted.

Step one is to acknowledge that data security is important and that database systems need to be included as part of an organisation’s vulnerability and threat management programs. Step two is to take action.
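A minimal sketch of step one: fold database hosts into the same inventory the vulnerability management program already tracks, and flag any host behind the current CPU. Host names and patch labels here are hypothetical; the labels are formatted so that lexicographic order matches release order:

```python
# Hypothetical label for the most recent quarterly critical patch update.
LATEST_CPU = "CPU-2008-01"

# Hypothetical inventory: database host -> last applied CPU label.
inventory = {
    "db-finance-01": "CPU-2008-01",
    "db-hr-02": "CPU-2007-04",
    "db-crm-03": "CPU-2006-10",
}

def stale_hosts(inventory, latest):
    """Return hosts whose last applied CPU label predates the latest release."""
    return sorted(host for host, cpu in inventory.items() if cpu < latest)

print(stale_hosts(inventory, LATEST_CPU))  # ['db-crm-03', 'db-hr-02']
```

Even a crude list like this turns "we should look at database security" into a concrete, reviewable backlog, which is what step two (taking action) requires.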
