For many organisations, the introduction of the Protection of Personal Information Act has placed a spotlight on data security.
For many CIOs and CISOs this has led to investments in data encryption.
Ensuring that sensitive data is encrypted is one way to reduce the risk of a breach, particularly from an external source.
Yet data encryption by itself is not sufficient to manage risk. Encryption is a blunt instrument: it either protects all the data or allows access to all of it.
PoPIA Condition 7 – Security Safeguards covers the requirements for data protection under the Act. It requires that organisations must secure the integrity and confidentiality of personal information by applying appropriate and reasonable organisational and technical measures.
But Condition 7 cannot be implemented in isolation. As discussed in What is data privacy? the context of personal data is critical to ensuring protection. My bank manager may require access to my credit history, but the teller does not.
More and more frequently, we are picking up reports of internal abuses of protected data. Most recently, Absa advised that an employee had “unlawfully made selected customer data available to a small number of external parties.” They are not alone. Internal threats – the illegitimate use of data by authorised personnel – are a key risk that must be managed.
Data privacy requires a multipronged data protection approach
Data is typically stored in relational database tables that are structured according to subject matter. Giving access to a table shares all data, whether sensitive or not.
Let’s look at a real-world example.
An organisation we have been working with is implementing a new enterprise data warehouse as an enabler for self-service BI. They have defined policies for both classifying and accessing sensitive data, and have taken steps to identify and classify sensitive data elements.
Their challenges have come in building the data warehouse in such a way as to restrict access to various levels of sensitive data. Sensitive data is spread throughout the warehouse.
Encryption cannot help.
One approach is to (try to) design the warehouse so that sensitive data is isolated into dedicated views. This means creating multiple views of the same data, each exposing only data at a given classification level. As one can imagine, this is non-trivial.
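To see why this multiplies quickly, here is a minimal Python sketch of the views approach. The table name, columns, and classification levels are illustrative assumptions, not details from the organisation in the example:

```python
# Hypothetical sketch: generating one view of the same table per
# clearance level. Columns and classifications are illustrative.

COLUMNS = {
    "customer_id": "public",
    "name": "confidential",
    "phone": "confidential",
    "credit_score": "restricted",
}

# Ordered from least to most privileged; each level sees its own
# columns plus everything below it.
LEVELS = ["public", "confidential", "restricted"]

def view_sql(level: str, table: str = "customer") -> str:
    """Build a CREATE VIEW statement exposing only permitted columns."""
    allowed = [col for col, cls in COLUMNS.items()
               if LEVELS.index(cls) <= LEVELS.index(level)]
    return f"CREATE VIEW {table}_{level} AS SELECT {', '.join(allowed)} FROM {table};"

for level in LEVELS:
    print(view_sql(level))
```

Even in this toy case, every table needs a view per level, and any finer distinction (by role, geography, or purpose) multiplies the number of views again, which is why the approach becomes unmanageable at warehouse scale.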
There has to be a better way.
Dynamic access management provides role-based access at an attribute level
What does this mean?
- Policies define access. Leveraging data governance principles, we can identify sensitive data in context and apply different access permissions based on factors like the role of the person accessing the data, the location of the person accessing the data, and the classification of individual data elements.
- Visibility of each attribute is dynamically adjusted. Rather than (simply) encrypting all data, we can apply dynamic tokens or masks to individual attributes. For example, one user may see a complete telephone number, another may see only the last 4 digits, and a third may not see the telephone number at all. This nuanced approach allows us to limit access to data according to the processing requirement, minimising the risk of abuse.
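The telephone number example above can be sketched in a few lines of Python. The roles and policy rules are illustrative assumptions for this sketch, not the API of any particular access management product:

```python
# Hypothetical sketch of attribute-level dynamic masking.
# Roles and policy rules are illustrative, not a product API.

def mask_phone(value: str, rule: str) -> str:
    """Apply a masking rule to a telephone number."""
    if rule == "full":
        return value                                  # complete number
    if rule == "last4":
        return "*" * (len(value) - 4) + value[-4:]    # only last 4 digits
    return "****"                                     # attribute hidden

# Policy: which masking rule each role gets for the phone attribute.
POLICY = {
    "fraud_analyst": "full",
    "support_agent": "last4",
    "marketing": "none",
}

def render_phone(role: str, value: str) -> str:
    """Resolve the caller's role against the policy, defaulting to hidden."""
    return mask_phone(value, POLICY.get(role, "none"))

print(render_phone("fraud_analyst", "0821234567"))  # 0821234567
print(render_phone("support_agent", "0821234567"))  # ******4567
print(render_phone("marketing", "0821234567"))      # ****
```

The point of the sketch is that the policy, not the table design, decides what each role sees: the underlying data is stored once, and visibility is resolved per attribute at access time.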
Key to the value is the ability to apply policies and access rules across multiple systems and attributes automatically.
In our example above, by applying dynamic access management we can share one data set across multiple users and geographies with a single design.
When we only have a hammer, every problem looks like a nail. Dynamic access management frees analytics teams to focus on value, without adding risk.