Knowledge Access Control - Need to Know
Specification
Define policies and procedures for "need to know" access to knowledge, information, and data, both within the organization and in the context of the AI system, to be applied when regulating access to resources.
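The "need to know" principle grants access only when a subject's provisioned purpose matches an approved purpose for the resource. A minimal sketch of such an access decision follows; all names (Resource, AccessRequest, is_need_to_know) are illustrative and not part of the AICM control text.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Resource:
    name: str
    # Business purposes for which access to this resource is justified
    approved_purposes: frozenset

@dataclass(frozen=True)
class AccessRequest:
    subject: str
    resource: Resource
    purpose: str                  # declared purpose of this access
    granted_purposes: frozenset   # purposes the subject is provisioned for

def is_need_to_know(req: AccessRequest) -> bool:
    """Grant access only when the declared purpose is both provisioned
    for the subject and approved for the resource ("need to know")."""
    return (req.purpose in req.granted_purposes
            and req.purpose in req.resource.approved_purposes)

# Example: a data scientist may read the training set for model tuning,
# but not for an unrelated purpose.
training_set = Resource("llm-training-set-v3", frozenset({"model-tuning"}))
ok = AccessRequest("alice", training_set, "model-tuning",
                   frozenset({"model-tuning"}))
bad = AccessRequest("alice", training_set, "marketing-analytics",
                    frozenset({"model-tuning"}))
assert is_need_to_know(ok) is True
assert is_need_to_know(bad) is False
```

In practice this decision would be delegated to the platform's RBAC/ABAC engine; the sketch only shows the shape of the purpose-binding check the policy must encode.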
Threat coverage
Architectural relevance
Lifecycle
Data collection, Data curation, Data storage, Resource provisioning
Design, Supply Chain
Validation/Red Teaming, Evaluation, Re-evaluation
Orchestration, AI Services supply chain, AI applications
Operations, Maintenance, Continuous monitoring, Continuous improvement
Archiving, Data deletion, Model disposal
Ownership / SSRM
PI
Shared across the supply chain
Shared control ownership refers to responsibilities and activities related to LLM security that are distributed across multiple stakeholders within the AI supply chain, including the Cloud Service Provider (CSP), Model Provider (MP), Orchestrated Service Provider (OSP), Application Provider (AP), and Customer (AIC). These controls require coordinated actions, communication, and governance across all involved parties to ensure their effectiveness.
Model
Shared Cloud Service Provider-Model Provider (Shared CSP-MP)
The CSP and MP are jointly responsible and accountable for the design, development, implementation, and enforcement of the control to mitigate security, privacy, or compliance risks associated with Large Language Model (LLM)/GenAI technologies in the context of the services or products they develop and offer.
Orchestrated
Shared Model Provider-Orchestrated Service Provider (Shared MP-OSP)
The MP and OSP are jointly responsible and accountable for the design, development, implementation, and enforcement of the control to mitigate security, privacy, or compliance risks associated with Large Language Model (LLM)/GenAI technologies in the context of the services or products they develop and offer.
Application
Shared Application Provider-AI Customer (Shared AP-AIC)
The AP and AIC are jointly responsible and accountable for the design, development, implementation, and enforcement of the control to mitigate security, privacy, or compliance risks associated with Large Language Model (LLM)/GenAI technologies in the context of the services or products they offer and consume.
Implementation guidelines
Auditing guidelines
1. Verify platform-level RBAC or ABAC policies are configured to enforce knowledge access limitations.
2. Check tenant-level isolation controls on the data lake, object storage, and model registry.
3. Confirm logging and monitoring of access attempts to AI knowledge resources such as datasets and features.
4. Validate that cloud-native services (e.g., Vertex AI, SageMaker) restrict access to metadata and pipeline artifacts.
5. Ensure policies align with the CSP customer's "least privilege" configuration templates for AI workloads.
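An auditor checking items 1 and 5 typically scans access policies for grants that are broader than "need to know". The sketch below uses a simplified, hypothetical IAM-style policy shape (not any specific CSP's schema) and flags statements that allow every principal or every resource.

```python
# Hypothetical audit sketch: scan simplified IAM-style policy statements
# for grants that violate least privilege (wildcard principals/resources).

def find_overbroad_grants(policies):
    """Return (policy_id, reason) pairs for allow-statements that grant
    access to every principal or every resource."""
    findings = []
    for policy in policies:
        for stmt in policy.get("statements", []):
            if stmt.get("effect") != "allow":
                continue
            if "*" in stmt.get("principals", []):
                findings.append((policy["id"], "wildcard principal"))
            if "*" in stmt.get("resources", []):
                findings.append((policy["id"], "wildcard resource"))
    return findings

policies = [
    {"id": "model-registry-read",
     "statements": [{"effect": "allow",
                     "principals": ["group:ml-engineers"],
                     "resources": ["registry/prod-llm"]}]},
    {"id": "dataset-open",
     "statements": [{"effect": "allow",
                     "principals": ["*"],
                     "resources": ["datalake/training/*"]}]},
]
print(find_overbroad_grants(policies))  # [('dataset-open', 'wildcard principal')]
```

A real audit would query the CSP's policy API instead of static documents and would also cross-check the findings against the access logs collected under item 3.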
Standards mappings
ISO/IEC 42001 B.3.2 - AI roles and responsibilities
ISO/IEC 27001 A.8.3 - Information access restriction
Addendum
N/A
Article 9, Article 11, Article 15, Article 16, Article 29
Addendum
The AICM IAM-17 control is partially covered by the EU AI Act: there is strong alignment on risk, security, and incident management, but only partial coverage of quality management and post-market monitoring.
No Mapping
Addendum
The NIST AI 600-1 standard makes no explicit or implicit reference to the requirements set by this AICM control.
C4 DM-01, C4 DM-02, C5 PSS-08
Addendum
N/A
AI-CAIQ questions (1)
Are policies and procedures defined for "need to know" access to knowledge, information, and data, both within the organization and in the context of the AI system, to be applied when regulating access to resources?