Sponsored Content

Billions of Compromised Records and Counting: Why the Application Layer is Still the Front Door for Data Breaches

Security teams should brace for an unsettling and unprecedented year, as we’re on pace to see 40 billion records compromised by the end of 2021. Imperva’s Terry Ray explains what security teams need to do to bolster their defenses.

By Terry Ray, SVP and Fellow, Imperva

Each year, the number of data breaches grows by 30% while the number of records compromised increases by an average of 224%. 2021 is far from over, but we’re already on pace for another record-setting year. In fact, Imperva research finds that more records were compromised in January alone than in all of 2017.

The volume of cyber attacks and compromised records continues to grow despite increased spending on cybersecurity, as businesses try to mitigate the financial and reputational damage a breach can inflict on their organization. So, why are successful attacks soaring if companies have more defenses in place?

Attackers continue to evolve their tactics to get access to sensitive data. They’re using more sophisticated methods that evade traditional perimeter or endpoint solutions. In fact, research finds that nearly 50% of data breaches over the past several years originated at the web application layer. While this isn’t a new trend, attackers continue to use SQL injection (SQLi) or remote code execution (RCE) to exploit vulnerabilities in web applications that are connected to an organization’s data stores.
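To illustrate the SQLi pattern described here, consider a minimal, generic sketch in Python using the built-in sqlite3 module. The table and payload are hypothetical and purely illustrative; they are not drawn from Imperva's products or research. The difference between the two lookups below is the whole story: one builds SQL by string concatenation, the other passes user input as a bound parameter.

```python
# Minimal SQLi sketch using Python's built-in sqlite3 module.
# Table and field names are hypothetical, for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")
conn.execute("INSERT INTO users VALUES ('bob', 'bob@example.com')")

def lookup_vulnerable(username):
    # DANGEROUS: user input is concatenated into the SQL string,
    # so crafted input can change the meaning of the query.
    query = "SELECT email FROM users WHERE username = '" + username + "'"
    return conn.execute(query).fetchall()

def lookup_safe(username):
    # Parameterized query: the driver treats the input strictly as data.
    return conn.execute(
        "SELECT email FROM users WHERE username = ?", (username,)
    ).fetchall()

payload = "x' OR '1'='1"           # classic injection payload
print(lookup_vulnerable(payload))  # returns every row in the table
print(lookup_safe(payload))        # returns nothing: no such username
```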

Organizations need to rethink their security posture and the tools in their tech stack. Protecting the web application layer has been a focus of the industry for years, and yet attackers are still finding ways in. In part, there’s an operational issue to solve. When application and data security are managed by separate teams, it creates gaps that savvy, motivated attackers are able to exploit.

Security teams should brace for an unsettling and unprecedented year, as we’re on pace to see 40 billion records compromised by the end of 2021 — more than double the total of last year — according to Imperva Research Labs.

Time for a Change in Mindset: Protect All Paths to the Data

Security incidents like Sunburst and the Accellion breach underscore the evolving nature of attacks and the widespread impact these events have downstream on customers, partners and the global economy.

Attacks are soaring due to the evolving nature of application development and IT infrastructure. Monolithic applications have decomposed into a sea of APIs, microservices and serverless functions. Modern DevOps practices enable faster development and more cutting-edge digital services for end users. While this improves the user experience and application efficiency, it also expands the threat landscape in new and complex ways. Each part of the software development lifecycle interacts with a variety of data stores to enable real-time results and improved user functionality, setting up potential data security hazards.

For security teams, the challenge is gaining real-time visibility into these ephemeral workloads, which are harder to monitor. Further, the volume of first- and third-party APIs with access to organizational data is multiplying by the second, creating additional risk of a data breach. In fact, the number of new API vulnerabilities grew by 4% in 2020, with sensitive data exposure ranking as the most common vulnerability. With that context, it's clear why data breaches originating at the application layer are a growing trend. Without effective layers of security to monitor or block malicious activity from edge to application or API, and through to the data store, how can organizations possibly keep up?
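As a concrete, hedged illustration of sensitive data exposure at the API layer, the sketch below contrasts a handler that serializes an internal record wholesale with one that returns only an explicit allowlist of fields. The record and field names are hypothetical and not tied to any specific framework or product.

```python
# Sensitive data exposure sketch: returning a raw internal record vs.
# an explicit allowlist of fields. All names here are hypothetical.
import json

user_record = {
    "id": 1001,
    "username": "alice",
    "email": "alice@example.com",
    "password_hash": "$2b$12$...",   # should never leave the data store
    "ssn": "123-45-6789",            # regulated, sensitive field
}

def exposed_response(record):
    # Risky: dumps the internal record as-is, leaking whatever the store holds.
    return json.dumps(record)

PUBLIC_FIELDS = {"id", "username", "email"}

def filtered_response(record):
    # Safer: only fields on the allowlist ever reach the API boundary.
    return json.dumps({k: v for k, v in record.items() if k in PUBLIC_FIELDS})

print(exposed_response(user_record))   # includes password_hash and ssn
print(filtered_response(user_record))  # only id, username, email
```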

Vulnerable Software Supply Chains are an Attacker’s Playground

As web application environments become more complex, any system can be a pathway for external attackers, or even insiders, to achieve their primary goal: getting access to sensitive data.

Maintaining a fully patched database and using native security features (such as data encryption and defined lists of users and authorizations) might seem like enough to protect stored data. While that's a good place to start, security teams should not become complacent. These controls are not sufficient to defend against sophisticated attacks that start at the application layer. SQLi attacks can evade these defenses to read, modify or delete records in the underlying databases. In some cases, they can even reach the operating systems of the servers hosting the database services.
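To see why a patched, encrypted database alone doesn't close this gap, consider the following minimal sketch (again using sqlite3 purely as a stand-in, with hypothetical table names): an injected UNION query rides the application's own connection, so whatever the application account is legitimately allowed to read, the attacker can read too.

```python
# Why DB-native controls miss application-layer SQLi: the injection inherits
# the application account's legitimate grants. Names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, price TEXT)")
conn.execute("INSERT INTO products VALUES ('widget', '9.99')")
conn.execute("CREATE TABLE payment_cards (holder TEXT, pan TEXT)")
conn.execute("INSERT INTO payment_cards VALUES ('alice', '4111111111111111')")

def search_products(term):
    # The application legitimately queries both tables, so database grants
    # allow this connection to read payment_cards; the injection reuses that.
    query = "SELECT name, price FROM products WHERE name = '" + term + "'"
    return conn.execute(query).fetchall()

payload = "nothing' UNION SELECT holder, pan FROM payment_cards --"
print(search_products(payload))  # card data exfiltrated via the product search
```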

Attackers are motivated to access sensitive information, regardless of its composition and structure. That's why organizations need to implement security for all their data stores, with a focus on protecting the data itself, not just the applications and networks that surround and interact with them. Whether for proactive prevention, post-incident response, or both, it's vital to understand where the data lives, whether it's classified, and whether the right access controls are in place, and to ensure that strong tools for auditing and anomaly detection are deployed.
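As one hedged example of what auditing and anomaly detection can look like in its simplest form, the sketch below keeps a per-query baseline of rows returned and flags any execution that wildly exceeds it. The fingerprinting and threshold are simplifying assumptions for illustration, not a description of any particular product.

```python
# Simplistic anomaly-detection sketch: track how many rows each query
# normally returns and alert on large deviations. Threshold is assumed.
from collections import defaultdict

baselines = defaultdict(list)   # query fingerprint -> recent row counts
SPIKE_FACTOR = 10               # hypothetical threshold for this sketch

def audit(fingerprint, rows_returned):
    history = baselines[fingerprint]
    if history:
        typical = sum(history) / len(history)
        if typical > 0 and rows_returned > SPIKE_FACTOR * typical:
            print(f"ALERT: {fingerprint} returned {rows_returned} rows "
                  f"(typical ~{typical:.0f}) -- possible bulk data access")
    history.append(rows_returned)

# Normal traffic: a user-lookup query usually returns a single row...
for _ in range(50):
    audit("SELECT email FROM users WHERE username = ?", 1)

# ...then one execution suddenly returns the whole table.
audit("SELECT email FROM users WHERE username = ?", 250_000)
```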

With all this in mind, a traditional approach to security won't work: you need visibility across the environment and an understanding of the lateral movement that happens along the cyber kill chain. Only a holistic, unified approach, one that prioritizes the security of the data itself, can keep your organization out of the headlines.
