When deploying Data Loss Prevention, so much depends on a strong start.
Crafting the right policies early on creates business confidence in the tool and builds momentum for your data protection program.
I’ve worked with DLP for several years – both at Symantec and, prior to that, for Symantec clients. In this blog post, I want to share some tips on how to get started in a way that inspires confidence, and outline some unconventional use cases.
Build business confidence
Strong business engagement is the foundation of a successful data protection program.
Any new program should start with the business units most highly engaged in the security program, with policies co-designed to protect their most valuable data.
An issue that commonly arises during early phases of DLP deployments is writing too many or overly broad rules before your program has reached maturity. Once you’re aware of the power and possibilities of the platform, it’s only natural to want to make as much use of it as possible.
You have to resist that temptation.
What you don’t want is a volume of alerts and false positives too large to effectively manage. The risk is that the noise erodes business confidence in the effectiveness of the program.
In the initial phases of deployment, I have typically focused our efforts on creating policies in two areas: (i) data that needs to be protected due to legal and regulatory requirements, such as credit card and social security numbers, and (ii) intellectual property, such as source code. It takes some discipline to keep other minor use cases – often demanded by other teams – at bay while you get these basics sorted. As a starting point, leverage your organization’s existing data classification standard to understand what the business considers the most important categories of data to be protected.
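To make the two starting areas concrete, here is a minimal sketch of what a pattern-based detection rule for regulated data looks like under the hood. This is purely illustrative – it is not Symantec DLP's rule syntax – and the regexes are deliberately simple; a real policy would handle more formats and validation:

```python
import re

# Illustrative only: a toy content scanner for two common regulated data
# types. Real DLP policies use richer, validated detection than bare regex.

SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")        # e.g. 123-45-6789
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")      # loose card-number shape

def luhn_valid(number: str) -> bool:
    """Luhn checksum, used to cut false positives on card-like numbers."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def scan(text: str) -> list[str]:
    """Return a list of findings for SSN-like and Luhn-valid card-like strings."""
    findings = [f"SSN: {m}" for m in SSN_RE.findall(text)]
    findings += [f"Card: {m.strip()}" for m in CARD_RE.findall(text)
                 if luhn_valid(m)]
    return findings
```

Note the Luhn check on candidate card numbers: even in a toy example, a validation step like this is what separates a usable rule from one that drowns responders in false positives.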
Another key to building confidence in your DLP deployment is to establish at the outset the right resourcing and processes to triage and respond to alerts. We discuss this in more detail later in this post.
It’s also worth spending time to develop your reporting framework upfront. Your reporting needs to continually demonstrate the value of the program. Think beyond reporting the number of data incidents detected or prevented, which is what the tool will natively deliver you. Focus on outcomes – how has DLP driven changes and improvements to business processes? That’s what your peers and executives want to hear.
When starting out, using simple keyword matching policies can generate value quickly. From that point you can look to features like Exact Data Matching and Indexed Document Matching to help limit the number of false positives.
Exact Data Matching works by indexing a structured data source, for example, a database of employee records. A fingerprint is created for that data source and linked to a DLP policy that detects for it. Indexed Document Matching works in a similar way but involves indexing specific documents.
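The core idea behind Exact Data Matching can be sketched in a few lines: index the sensitive values from a structured source as one-way fingerprints, then check outbound content against that index. This is a conceptual illustration only – the field names, normalization, and token-level matching granularity here are my assumptions, not Symantec's actual fingerprinting format:

```python
import hashlib

def normalize(value: str) -> str:
    """Canonicalize a value before hashing (lowercase, strip whitespace)."""
    return "".join(value.lower().split())

def build_index(records: list[dict], fields: tuple[str, ...]) -> set[str]:
    """Fingerprint selected fields of each record; only hashes are stored,
    so the index itself does not expose the sensitive source data."""
    index = set()
    for rec in records:
        for field in fields:
            digest = hashlib.sha256(normalize(rec[field]).encode()).hexdigest()
            index.add(digest)
    return index

def matches(text: str, index: set[str]) -> list[str]:
    """Return tokens in outbound text whose fingerprint appears in the index."""
    return [tok for tok in text.split()
            if hashlib.sha256(normalize(tok).encode()).hexdigest() in index]
```

The key property is that the detection matches exact values from your own data set rather than generic patterns, which is why EDM and IDM are so effective at reducing false positives compared with keyword rules.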
But DLP is not a set-and-forget solution. After the initial rollout phase, continuous tuning is key.
As you get a more complete understanding of your data and the impact of specific rules and policies, it’s important to continually modify and refine rules to make them more precise, and to introduce new rules that cover a broader range of identified risks and use cases.
Block or Monitor?
DLP plays a key role in protecting data, but it’s not a panacea for every data loss scenario.
At Symantec, we get the most value from using DLP to prevent accidental or negligent data loss, which often accounts for the larger share of data leakage in organizations. An ideal use case is detecting when an employee sends an email containing sensitive information to someone outside the organization – which is more often than not unintentional (i.e. “fat finger” errors in email address fields).
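The “fat finger” scenario above reduces to a simple combination of two checks: is any recipient external, and does the content trip a sensitivity rule? A minimal sketch, where the domain list and the `is_sensitive()` predicate are placeholders for your own policy logic:

```python
# Assumed corporate domains for illustration only.
INTERNAL_DOMAINS = {"example.com"}

def is_sensitive(body: str) -> bool:
    """Placeholder: in practice this would be a real DLP detection rule."""
    return "confidential" in body.lower()

def external_recipients(recipients: list[str]) -> list[str]:
    """Return recipients whose domain is not on the internal list."""
    return [r for r in recipients
            if r.rsplit("@", 1)[-1].lower() not in INTERNAL_DOMAINS]

def should_alert(recipients: list[str], body: str) -> bool:
    """Alert only when sensitive content is heading to an external address."""
    return bool(external_recipients(recipients)) and is_sensitive(body)
```

Requiring both conditions is what keeps this rule quiet: sensitive content circulating internally, or routine content going externally, generates no alert.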
Another key consideration when configuring DLP is whether to block activity that breaches a DLP policy, or simply to monitor it. In obviously high-risk scenarios – for example, an identified insider threat – it makes sense to activate blocking rules using Endpoint or Network Prevent. At the same time, configuring DLP to block activity requires highly precise policies to avoid disrupting legitimate business activity and triggering user complaints. In large organizations it may require a service (either an individual or a team) be assigned to respond to these complaints and, where necessary, unblock legitimate activity in a timely fashion.
At Symantec, we typically configure any new DLP policies in monitor mode, at least initially. This gives us visibility of processes or behaviors in the organization that could result in data loss, without adversely impacting the business. We are also biased towards configuring Symantec DLP to notify users about the risks of a data transfer, so that they can decide whether to proceed based on their own assessment of the risk.
Another key question to ask is how you will manage DLP alerts in a timely manner. It’s vital to establish a well-thought-out process that meets your risk, compliance and regulatory obligations.
At Symantec, we established an alert triage model that outlines clear standards for triaging DLP alerts, including response and resolution timeframes in line with regulatory requirements such as GDPR, and we clearly laid out the roles and responsibilities for responders.
We set different resolution timeframes based on whether an alert is a suspected false positive, appears to relate to the employee’s own data, or is suspected as being a data incident. Alerts relating to customer or personal information trigger an engagement with our privacy team for further investigation.
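The triage model above – category-specific resolution windows plus routing of personal-data alerts to the privacy team – can be expressed as a small lookup. The categories and timeframes below are illustrative assumptions, not our actual SLAs:

```python
from datetime import timedelta

# Assumed categories and resolution windows for illustration only.
RESOLUTION_SLA = {
    "suspected_false_positive": timedelta(days=14),
    "own_data":                 timedelta(days=7),
    "suspected_incident":       timedelta(hours=24),
}

def triage(category: str, involves_personal_data: bool) -> dict:
    """Map an alert to a resolution deadline and a responder queue.
    Alerts touching personal data are escalated to the privacy team."""
    route = "privacy_team" if involves_personal_data else "dlp_compliance"
    return {"resolve_within": RESOLUTION_SLA[category], "route": route}
```

However you implement it, the point is that the timeframes are decided up front against your regulatory obligations (e.g. GDPR notification windows), not improvised alert by alert.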
We’ve defined simple categories of alerts in Symantec DLP like ‘New’, ‘Under Review’ and ‘Resolved’. Too many alerts in the “New” or “Under Review” state make a good case for us to consider how we’re resourcing the triage function or how we might refine our policies to reduce false positives.
We’ve considered a number of options for who should manage the triage function. Should it be within our Security Operations Centre (SOC), set up as a separate compliance function in our security team, or should it sit with responders in the business units? We weighed up the pros and cons of each.
Ultimately, we found that responding to alerts and continuous tuning didn’t suit the skills required of the modern SOC analyst – which increasingly pivot toward proactive hunting of threats. We felt that a centralized compliance function for DLP alerts worked most efficiently for our organization and drove the continuous improvement we wanted from the program.
Our response processes for any DLP alerts relating to personal information are also integrated with our global privacy team. This team takes the lead role in managing major privacy incidents and ensuring our compliance with key data protection regulations such as GDPR. Naming standards offer a simple way to streamline this process. We distinctly label any DLP policy that deals with personally identifiable information, which helps our responders know that an alert potentially constitutes a disclosure under GDPR.
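A naming standard like this pays off because routing becomes mechanical. A minimal sketch – the "PII-" prefix is an assumed convention for illustration; use whatever label your own standard defines:

```python
# Assumed naming convention: policies handling personal data carry a "PII-" prefix.
PII_PREFIX = "PII-"

def needs_privacy_review(policy_name: str) -> bool:
    """True when the policy's name marks it as handling personal data."""
    return policy_name.upper().startswith(PII_PREFIX)

def route_alert(policy_name: str) -> str:
    """Send PII-policy alerts to the privacy workflow, everything else to triage."""
    return "privacy_team" if needs_privacy_review(policy_name) else "triage_queue"
```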
Awareness in Disguise
Finally, and somewhat unconventionally, we’ve come to see value in Symantec DLP as a security awareness tool.
First, DLP alerts provide a rich source of data on events that might lead to data loss. Our awareness team uses this data to identify and refine training needs.
But more importantly, a core principle of our education programs is that learning experiences be contextual to a user’s workflow. We want to teach staff in the moment.
Symantec DLP supports this by identifying high-risk behavior at the very moment that it occurs. We operate on the principle that most staff want to do the right thing and are simply trying to be productive. When an employee accidentally sends an email containing sensitive data to an external recipient, the tool can not only be configured to detect this behavior but also to automatically send the employee a message that enrolls them in in-the-moment training about data classification and handling. This creates a powerful, contextual learning experience.
We’re looking at other opportunities to integrate awareness messaging into our rules and policies, rather than just using DLP to generate alerts for the triage team to process.
DLP is highly versatile, but its success as a security solution rests on building the right processes to manage alerts, tuning continuously, and demonstrating its value back to the business.