CSA AI Controls Matrix (AICM Atlas)
DSP · Data Security and Privacy Lifecycle Management
DSP-04 · Cloud & AI Related

Data Classification

Specification

Classify data according to its type and sensitivity level.

Threat coverage

Model manipulation
Data poisoning
Sensitive data disclosure
Model theft
Model/Service Failure
Insecure supply chain
Insecure apps/plugins
Denial of Service
Loss of governance

Architectural relevance

Physical infrastructure
Network
Compute
Storage
Application
Data

Lifecycle

Preparation

Data collection, Data storage

Development

Design, Guardrails

Evaluation

Evaluation

Deployment

Orchestration, AI Services supply chain

Delivery

Operations, Maintenance, Continuous monitoring

Retirement

Data deletion, Archiving

Ownership / SSRM

PI (Physical Infrastructure)

Owned by the Customer (AIC)

The Customer (AIC) is responsible and accountable for the design, development, implementation, and enforcement of the control to mitigate security, privacy, or compliance risks associated with the Large Language Model (LLM)/GenAI technology services or products they consume.

Model

Owned by the Customer (AIC)

The Customer (AIC) is responsible and accountable for the design, development, implementation, and enforcement of the control to mitigate security, privacy, or compliance risks associated with the Large Language Model (LLM)/GenAI technology services or products they consume.

Orchestrated

Owned by the Customer (AIC)

The Customer (AIC) is responsible and accountable for the design, development, implementation, and enforcement of the control to mitigate security, privacy, or compliance risks associated with the Large Language Model (LLM)/GenAI technology services or products they consume.

Application

Owned by the Customer (AIC)

The Customer (AIC) is responsible and accountable for the design, development, implementation, and enforcement of the control to mitigate security, privacy, or compliance risks associated with the Large Language Model (LLM)/GenAI technology services or products they consume.

Implementation guidelines

[All Actors]
1. Define and apply a data-classification scheme to every data object the actor stores or processes.

2. Attach and preserve classification metadata on data objects and flows and ensure that metadata follows the data through its lifecycle.

3. Apply appropriate protection controls, according to the assigned classification.

4. Maintain a classification register and review it at least annually or after material change, updating controls when data sensitivity or regulatory scope changes.

[Shared among: MP, OSP, CSP]
1. Classify and maintain lineage for all training, test, and fine-tuning datasets and model artifacts.
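The guidelines above can be sketched in code. The following is a minimal, hypothetical illustration (the sensitivity levels, control names, and field layout are assumptions, not part of the AICM specification): a classification tag that travels with a data object (guidelines 1–2), and a check that the protections applied match those required by the assigned classification (guideline 3).

```python
from dataclasses import dataclass, field
from enum import Enum

class Sensitivity(Enum):
    """Hypothetical four-level sensitivity scheme."""
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    RESTRICTED = 4

@dataclass
class ClassificationTag:
    """Classification metadata preserved on a data object through its lifecycle."""
    data_type: str            # e.g. "training-dataset", "model-artifact"
    sensitivity: Sensitivity
    owner: str
    classified_on: str        # ISO date of the last classification review
    lineage: list = field(default_factory=list)  # upstream dataset/model IDs

# Hypothetical mapping of sensitivity level to required protection controls.
REQUIRED_CONTROLS = {
    Sensitivity.PUBLIC: set(),
    Sensitivity.INTERNAL: {"access-control"},
    Sensitivity.CONFIDENTIAL: {"access-control", "encryption-at-rest"},
    Sensitivity.RESTRICTED: {"access-control", "encryption-at-rest",
                             "encryption-in-transit", "audit-logging"},
}

def missing_controls(tag: ClassificationTag, applied: set) -> set:
    """Return controls required by the tag's sensitivity but not yet applied."""
    return REQUIRED_CONTROLS[tag.sensitivity] - applied
```

In practice the tag would be stored as object metadata (e.g. storage-layer labels), so that the classification follows the data across copies, transformations, and lifecycle stages.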

Auditing guidelines

1. Examine the CSP’s policy, procedures, and technical requirements for classifying data. Establish that this process and key controls comply with the CSP’s data privacy and security policy. Establish whether the CSP has documented the roles and responsibilities for this process.

2. Establish if the CSP’s data classification matrix is aligned with the CSP’s data classification requirements in terms of data type and sensitivity level.

3. Select a sample of data to confirm that each item has been classified appropriately.

4. Examine the measure(s) that evaluate this process and determine if they address the implementation of the process/control requirement(s) as stipulated. Verify that technical measures such as labeling, tagging, and access controls are used to enforce data classification.

5. Verify that data classification criteria are based on the organization's specific needs and regulatory requirements.

6. Verify that data classification processes include regular reviews and updates to reflect changes in data types and sensitivity levels.

7. Identify how the CSP evaluates third-party data classification practices and if appropriate risk levels are assigned to each.

8. Examine infrastructure mechanisms that support data classification, such as metadata services, tagging capabilities, or labeling frameworks implemented at the storage layer.

9. Verify that storage resources can be segregated based on data classification levels to enable appropriate isolation of data with different sensitivity requirements.

10. Review access control mechanisms at the infrastructure level that enforce permissions based on data classification, ensuring they are adequately implemented and effective.

11. Assess capabilities for enforcing storage policies (e.g., encryption, retention, geographical restrictions) based on classification levels.

12. Verify that infrastructure logging and monitoring can track and report on activities related to differently classified data, particularly for highly sensitive categories.
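Parts of these audit steps can be automated. The sketch below is a hypothetical illustration of steps 3 and 6, under assumed conventions (a register of `(object_id, classification_label, last_review_date)` tuples and a 12-month review cycle, neither of which the AICM prescribes): it flags sampled objects that are unclassified or overdue for review.

```python
from datetime import date

def audit_register(register, today, max_age_days=365):
    """Flag register entries that are unclassified or past their review window.

    register: iterable of (object_id, classification_label, last_review_date)
              tuples; classification_label is None if no label is attached.
    """
    findings = []
    for obj_id, label, last_review in register:
        if label is None:
            findings.append((obj_id, "unclassified"))
        elif (today - last_review).days > max_age_days:
            findings.append((obj_id, "review-overdue"))
    return findings
```

A real audit would pull the sample from the storage layer's metadata service rather than a hand-built register, and would cross-check the applied technical controls (step 4) against each item's classification.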

Standards mappings

ISO 42001 · No Gap
42001: A.4.3 Data Resources
42001: A.2.3 Alignment with other organizational policies
42001: A.5.2 AI system impact assessment process
42001: A.7.3 Acquisition of data
27001: A.5.12 - Classification of information
27002: 5.12 - Classification of information
Addendum

N/A

EU AI Act · Partial Gap
Article 10 (2)
Article 11
Addendum

Classify data according to its type and sensitivity level. While the EU AI Act requires data documentation, it does not mandate a specific classification scheme.

NIST AI 600-1 · Full Gap
No Mapping
Addendum

NIST AI 600-1 does not cover the DSP-04 topic of data classification.

BSI AIC4 · No Gap
COS-08
AM-06
Addendum

N/A

AI-CAIQ questions (1)

DSP-04.1

Is data classified according to its type and sensitivity level?