By Elad Erez, Chief Innovation Officer, Imperva
Is there a day that goes by when you don’t read a news headline about a mega-breach impacting millions of people?
It’s an unlikely scenario, particularly at a time when the volume of data breaches is rising by an astonishing 30 percent annually. Imperva Research Labs estimates that another 40 billion records will be compromised by the end of 2021. That’s billions of pieces of data, much of it sensitive or personally identifiable, that cybercriminals will be able to exploit in the future.
These figures raise the question: is there something fundamentally wrong with the way organizations handle the security of data? Considering that the volume of compromised records grows 224 percent annually, I think the answer has to be “yes”.
One symptom of this rampant, global epidemic is the prevalence of vulnerabilities in internal databases worldwide, the systems that often store an organization’s most sensitive data.
Your Sensitive Data Sits on Foundations of Sand
Drawing on insights from a proprietary database scanning tool Imperva introduced five years ago, Imperva Research Labs conducted a study spanning 27,000 on-premises databases worldwide, the largest analysis of its kind ever completed.
The findings from this study paint a grim picture. Nearly half (46%) of internal databases contain at least one unpatched vulnerability with a Common Vulnerabilities and Exposures (CVE) identifier, with the average database having an eye-popping 26 vulnerabilities. Nor are these minor issues: more than half (56%) are ranked ‘High’ or ‘Critical’ severity under National Institute of Standards and Technology (NIST) guidelines.
These vulnerabilities are easy for hackers to find with scanning tools that are widely available on the internet, and they’re just as easy to exploit. A simple search of a repository like ExploitDB can turn up the proof-of-concept (POC) code needed to carry out debilitating attacks such as privilege escalation, authentication bypass and remote code execution, allowing intruders to exfiltrate the data, hold it for ransom or move laterally through the network.
These findings indicate that nearly one in two organizations worldwide is consistently failing to patch and update its database software in a timely manner. In fact, our research finds that some of these CVEs have gone untouched for three or even five years!
The reality is that half of today’s organizations are potentially sitting on a serious vulnerability that could result in financial loss, reputational damage and regulatory penalties. And despite the growing adoption of cloud infrastructure and cloud databases, an estimated 50% of data is still stored on-premises. This is not a problem that will fix itself; a different approach to securing data is essential.
So, What Do You Do Now?
Addressing a problem of this scale is a monumental challenge. Businesses must understand that patches and security tools for endpoints and applications are important, but not enough to significantly reduce the risk of a data breach.
The answer to this challenge is to maintain a patch management program, especially for data-centric assets, and, above all, to employ data security solutions that protect the data itself.
There are five primary steps every security team can follow to bolster the security of their on-prem databases:
- Discover and classify data: Organizations tend to assume they know where all their data repositories are and what data they hold. Automate discovery and classification so there are no blind spots and no assumptions about where sensitive data resides (a minimal sketch of this step follows the list).
- Assess for and patch vulnerabilities: Verify that every database you run has the latest security patches applied (the second sketch below shows a simple version check).
- Audit for visibility, protect for prevention: Gain visibility into who is accessing the data, when, from where, and what data they have accessed. Enable policies designed to block database exploitation, or create custom policies that restrict access.
- Conduct periodic assessments: Databases change, and so do their configurations. Make sure you understand how they change, and whether your database administrator (DBA) has mistakenly misconfigured them or simply isn’t following best practices.
- Identify data access anomalies: Databases are heavily accessed resources, and it is almost impossible to spot anomalies manually. Deploy anomaly detection tools to catch instances where an application account starts to behave like a human user, or where your DBA’s access looks like data misuse or exfiltration (the third sketch below illustrates the core idea).
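To make the first step concrete, here is a minimal sketch of automated discovery and classification against a PostgreSQL database. It assumes the psycopg2 driver and a connection string supplied in a PG_DSN environment variable; the column keywords and value patterns are illustrative examples, not a production-grade classifier.

```python
# Minimal sketch: flag likely-sensitive columns in a PostgreSQL database.
# PG_DSN, the keyword list and the regexes are all illustrative assumptions.
import os
import re

import psycopg2

# Column names that often hold personal or sensitive data.
SENSITIVE_NAMES = re.compile(r"(ssn|social|email|phone|dob|birth|passport|card)", re.I)
# Value patterns tested against sampled rows (email address, US SSN).
VALUE_PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[a-z]{2,}", re.I),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def discover(dsn: str) -> None:
    conn = psycopg2.connect(dsn)
    with conn, conn.cursor() as cur:
        # Enumerate every user-defined column rather than trusting a manual inventory.
        cur.execute(
            """SELECT table_schema, table_name, column_name
               FROM information_schema.columns
               WHERE table_schema NOT IN ('pg_catalog', 'information_schema')"""
        )
        for schema, table, column in cur.fetchall():
            hits = set()
            if SENSITIVE_NAMES.search(column):
                hits.add("name-match")
            # Sample a handful of live values and test them against known patterns.
            with conn.cursor() as sample:
                sample.execute(f'SELECT "{column}" FROM "{schema}"."{table}" LIMIT 20')
                for (value,) in sample.fetchall():
                    for label, pattern in VALUE_PATTERNS.items():
                        if value is not None and pattern.search(str(value)):
                            hits.add(label)
            if hits:
                print(f"{schema}.{table}.{column}: {', '.join(sorted(hits))}")

if __name__ == "__main__":
    discover(os.environ["PG_DSN"])
```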
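For the second step, even a simple check can reveal which servers have fallen below your approved patch level. This sketch again assumes PostgreSQL and psycopg2; the minimum version is a hypothetical placeholder, so consult your vendor’s security advisories for the real floor.

```python
# Minimal sketch: flag PostgreSQL servers running below an approved patch level.
# MINIMUM_PATCHED is a hypothetical floor, not a real advisory value.
import os

import psycopg2

MINIMUM_PATCHED = (14, 9)  # hypothetical approved floor for the 14.x line

def check_patch_level(dsn: str) -> None:
    conn = psycopg2.connect(dsn)
    try:
        with conn.cursor() as cur:
            cur.execute("SHOW server_version")
            raw = cur.fetchone()[0]  # e.g. '14.2 (Debian 14.2-1.pgdg110+1)'
            major, minor = (int(p) for p in raw.split()[0].split(".")[:2])
            if (major, minor) < MINIMUM_PATCHED:
                floor = ".".join(map(str, MINIMUM_PATCHED))
                print(f"UNPATCHED: server reports {raw}, approved floor is {floor}")
    finally:
        conn.close()

if __name__ == "__main__":
    check_patch_level(os.environ["PG_DSN"])
```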
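And for the fifth step, the core idea behind anomaly detection is comparing each account’s activity against its own historical baseline. The self-contained sketch below applies a leave-one-out three-sigma test to sample audit-log data; a real deployment would rely on purpose-built tooling, but the principle is the same.

```python
# Self-contained sketch: flag accounts whose daily read volume jumps far above
# their own history. The log tuples and 3-sigma threshold are illustrative.
from collections import defaultdict
from statistics import mean, stdev

# (account, day, rows_read), e.g. parsed from database audit logs.
AUDIT_LOG = [
    ("app_user", 1, 1200), ("app_user", 2, 1150), ("app_user", 3, 1300),
    ("app_user", 4, 1250), ("app_user", 5, 980_000),   # sudden bulk read
    ("dba_admin", 1, 40), ("dba_admin", 2, 55), ("dba_admin", 3, 40),
    ("dba_admin", 4, 40), ("dba_admin", 5, 52),
]

def flag_anomalies(log, sigma=3.0):
    per_account = defaultdict(list)
    for account, day, rows in log:
        per_account[account].append((day, rows))
    for account, samples in per_account.items():
        for i, (day, rows) in enumerate(samples):
            # Build the baseline from the account's *other* days, so a single
            # huge spike cannot hide itself by inflating its own average.
            others = [r for j, (_, r) in enumerate(samples) if j != i]
            if len(others) < 2:
                continue
            baseline, spread = mean(others), stdev(others)
            if spread and rows > baseline + sigma * spread:
                yield account, day, rows

for account, day, rows in flag_anomalies(AUDIT_LOG):
    print(f"ALERT: {account} read {rows} rows on day {day}")
```

Computing the baseline from each account’s other days is deliberate: including the suspect day in its own average lets a single massive spike drag the threshold up and slip past undetected.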
Will You Accept It or Change It?
Security must start with the data itself. If the data isn’t secure, vulnerable web applications and databases become easy attack pathways. The focus must be on gaining full visibility over all of the data, regardless of where it lives, and knowing who is accessing it and from where.
Beyond that, it’s about adopting a unified approach in which data, and every path to it, is protected cohesively. Standalone data and application security tools lack the context needed to disrupt multi-vector attacks; each defensive layer must work in concert with the others to safeguard critical data.