
3.4 Data View


The Data View describes how data is stored, classified, protected, and managed throughout its lifecycle. It addresses the concerns of data architects, DBAs, compliance officers, and privacy teams.

Minimum

Document all data stores used by the solution, including business data, logs, caches, and temporary stores:

| Data Name | Store Technology | Authoritative? | Retention Period | Data Size | Classification | Personal Data? | Encryption Level | Key Management |
|---|---|---|---|---|---|---|---|---|
| [name] | [e.g., PostgreSQL, S3, Redis] | Yes / No | [period] | [size] | Public / Internal / Restricted / Highly Restricted | Yes / No | None / Storage / Container / Application | [method] |

Guidance

For each data store, document:

  • Data Store Technology - The specific technology (e.g., SQL Server, S3 Bucket, Azure Blob)
  • Authoritative Data Store - Is this the master/authoritative store of the data?
  • Retention Period - How long is data retained before deletion?
  • Security Classification - Highest classification of data in the store
  • Personal Data - Does it contain PII or sensitive personal data (SPI)?
  • Encryption - Level of encryption applied (storage, container, application)
  • Key Management - How encryption keys are managed (vendor-managed, HSM, custom)
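
The inventory above can be linted mechanically. A minimal Python sketch, assuming the enumerations from the table; the `DataStore` field names are illustrative, not a prescribed schema:

```python
from dataclasses import dataclass

# Illustrative enumerations taken from the inventory table above.
CLASSIFICATIONS = {"Public", "Internal", "Restricted", "Highly Restricted"}
ENCRYPTION_LEVELS = {"None", "Storage", "Container", "Application"}

@dataclass
class DataStore:
    """One row of the data store inventory."""
    name: str
    technology: str          # e.g. "PostgreSQL", "S3", "Redis"
    authoritative: bool
    retention_days: int
    classification: str
    personal_data: bool      # PII / SPI present?
    encryption: str
    key_management: str      # e.g. "vendor-managed", "HSM"

    def validate(self) -> list[str]:
        """Return policy findings; an empty list means the row passes."""
        findings = []
        if self.classification not in CLASSIFICATIONS:
            findings.append(f"{self.name}: unknown classification {self.classification!r}")
        if self.encryption not in ENCRYPTION_LEVELS:
            findings.append(f"{self.name}: unknown encryption level {self.encryption!r}")
        # Personal data held without encryption is an immediate red flag.
        if self.personal_data and self.encryption == "None":
            findings.append(f"{self.name}: personal data stored without encryption")
        return findings

orders = DataStore("orders", "PostgreSQL", True, 2555, "Restricted",
                   True, "Storage", "vendor-managed")
print(orders.validate())  # []
```

Running the check across every row turns the table from documentation into an enforceable policy gate.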
Recommended

If specific storage systems need provisioning:

| Attribute | Detail |
|---|---|
| Storage Product | [vendor / product] |
| Storage Size | [TB] |
| Storage Type | SAN / NAS / Object / Block / File |
| Storage Protocol | NFS / SMB / iSCSI / Other |
| Replication | Synchronous / Asynchronous / Snapshot / None |
| Minimum RPO | [recovery point objective] |
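
The replication choice largely fixes the worst-case RPO, so the two rows can be cross-checked. A hedged sketch: the minute values below are placeholder assumptions, not vendor figures; real bounds come from measured replication lag and the snapshot schedule.

```python
# Indicative worst-case recovery point windows per replication mode.
# These numbers are illustrative assumptions, not vendor guarantees.
WORST_CASE_RPO_MINUTES = {
    "Synchronous": 0,        # no committed data loss
    "Asynchronous": 15,      # bounded by replication lag
    "Snapshot": 24 * 60,     # bounded by the snapshot interval
    "None": None,            # no replica: RPO is effectively unbounded
}

def meets_rpo(replication: str, required_rpo_minutes: int) -> bool:
    """True if the chosen replication mode can satisfy the required RPO."""
    worst = WORST_CASE_RPO_MINUTES[replication]
    return worst is not None and worst <= required_rpo_minutes

print(meets_rpo("Synchronous", 5))  # True
print(meets_rpo("Snapshot", 60))    # False
```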
Recommended

Summarise the data classification profile of the solution:

| Classification Level | Data Types | Handling Requirements |
|---|---|---|
| Public | […] | Open access |
| Internal / Corporate | […] | Internal access controls |
| Restricted | […] | Encrypted, access-controlled, audited |
| Highly Restricted | […] | Encrypted, strict access, enhanced monitoring |
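
Because a store inherits the highest classification of anything it holds (the "most restrictive wins" rule noted in the guidance above), the lookup can be encoded directly. A small sketch using the levels and handling text from the table:

```python
# Classification levels ordered least to most restrictive, with the
# handling requirements from the classification table.
LEVELS = ["Public", "Internal / Corporate", "Restricted", "Highly Restricted"]
HANDLING = {
    "Public": "Open access",
    "Internal / Corporate": "Internal access controls",
    "Restricted": "Encrypted, access-controlled, audited",
    "Highly Restricted": "Encrypted, strict access, enhanced monitoring",
}

def store_classification(data_items: list[str]) -> str:
    """A store inherits the highest classification of any data it holds."""
    return max(data_items, key=LEVELS.index)

mixed = store_classification(["Public", "Restricted", "Internal / Corporate"])
print(mixed, "->", HANDLING[mixed])  # Restricted -> Encrypted, access-controlled, audited
```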
Recommended

Describe how data moves through its lifecycle:

| Stage | Description | Controls |
|---|---|---|
| Creation / Ingestion | How data enters the solution | [validation, classification] |
| Processing | How data is transformed or used | [access controls, audit] |
| Storage | How data is stored at rest | [encryption, backup] |
| Sharing / Transfer | How data is shared or moved | [encryption in transit, authorisation] |
| Archival | How data is archived | [archive storage, retrieval SLA] |
| Deletion / Purging | How data is securely removed | [secure deletion method, schedule] |
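
The stage/controls table can double as a completeness check for the design. A sketch; the stage keys are illustrative and the control names mirror the table:

```python
# Expected control themes per lifecycle stage, mirroring the table above.
REQUIRED = {
    "ingestion": {"validation", "classification"},
    "storage": {"encryption", "backup"},
    "transfer": {"encryption in transit", "authorisation"},
    "deletion": {"secure deletion method", "schedule"},
}

def lifecycle_gaps(documented: dict[str, set[str]]) -> dict[str, set[str]]:
    """Return the controls each stage still needs to document."""
    return {stage: needed - documented.get(stage, set())
            for stage, needed in REQUIRED.items()
            if needed - documented.get(stage, set())}

gaps = lifecycle_gaps({
    "ingestion": {"validation", "classification"},
    "storage": {"encryption"},  # backup not yet documented
})
print(gaps)
```

An empty result means every stage has its expected controls documented; anything returned is a gap to close before review.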
Recommended
Record any privacy or data protection assessments completed for the solution:

| Assessment Type | ID | Status | Link |
|---|---|---|---|
| [PIA / DPIA / LIA] | [ID] | [status] | [link] |

Indicate how production data is handled in non-production (test) environments:

| Approach | Selected |
|---|---|
| Only Public/Internal data is used | [ ] |
| Restricted/Highly Restricted attributes are deleted first | [ ] |
| Sensitive data is masked (describe method below) | [ ] |
| Sensitive production data is used (provide justification below) | [ ] |
| Production data is not used for testing | [ ] |

[Additional details on data masking, access controls, duration, and destruction]
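
As one illustration of a masking method (an assumption, not a mandated technique): keyed pseudonymisation keeps values joinable across tables while making them irreversible inside the test environment, provided the key stays outside it. The key name and record fields below are invented for the example.

```python
import hashlib
import hmac

# The secret must live outside the test environment (e.g. in the masking
# pipeline's key store) so masked values cannot be reversed there.
SECRET_KEY = b"rotate-me-outside-test-env"  # illustrative placeholder

def mask(value: str) -> str:
    """Deterministic pseudonym: same input -> same token, so joins still work."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

record = {"customer_id": "C-1042", "email": "jane@example.com", "order_total": 42.50}
masked = {**record, "email": mask(record["email"])}
print(masked["email"] != record["email"])  # True; non-sensitive fields left intact
```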

Does the solution include specific data integrity controls beyond standard hardware, protocol, and access controls?

  • Yes - [describe integrity controls]
  • No
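
Where integrity controls beyond standard hardware and protocol checks are needed, one common application-level pattern is to record a digest at write time and re-verify it on read. A generic sketch, not tied to any particular product:

```python
import hashlib

def digest(payload: bytes) -> str:
    """SHA-256 digest recorded alongside the stored object."""
    return hashlib.sha256(payload).hexdigest()

def verify(payload: bytes, recorded_digest: str) -> bool:
    """Detects silent corruption or tampering since the digest was recorded."""
    return digest(payload) == recorded_digest

original = b"ledger entry 7781"
d = digest(original)
print(verify(original, d))              # True
print(verify(b"ledger entry 7782", d))  # False: payload changed
```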

Does the solution allow data to be downloaded to or stored on end-user devices?

  • Yes - [describe data protection measures]
  • No
Recommended
Document any transfers of data to external parties:

| Destination | Data Type | Classification | Transfer Method | Protection |
|---|---|---|---|---|
| [third party] | [data description] | [classification] | [method] | [encryption, contracts] |

Are there data sovereignty or residency requirements?

  • Yes - [describe requirements and how they are met]
  • No
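
Residency requirements can be checked against the data store inventory. A sketch in which the classification-to-region mapping is a placeholder for whatever the regulation or contract actually demands:

```python
# Approved storage regions per classification; the region lists are
# illustrative placeholders, not a recommendation.
ALLOWED_REGIONS = {
    "Restricted": {"eu-west-1", "eu-central-1"},
    "Highly Restricted": {"eu-central-1"},
}

def residency_violations(stores: list[dict]) -> list[str]:
    """Flag stores whose region is outside the allowed set for their class."""
    out = []
    for s in stores:
        allowed = ALLOWED_REGIONS.get(s["classification"])
        if allowed and s["region"] not in allowed:
            out.append(f"{s['name']} ({s['classification']}) in {s['region']}")
    return out

print(residency_violations([
    {"name": "orders", "classification": "Restricted", "region": "eu-west-1"},
    {"name": "profiles", "classification": "Highly Restricted", "region": "us-east-1"},
]))  # ['profiles (Highly Restricted) in us-east-1']
```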
Recommended

Data has a carbon cost — to store, to move, and to back up. Capture how the design minimises that cost. Detail belongs in Section 4.5; decisions belong here.

| Question | Response |
|---|---|
| Are retention periods set to the minimum the regulator and business actually need? | Yes / No — [per data store] |
| Is older data tiered to cold/archive storage (S3 Glacier, Azure Archive, etc.)? | Yes / No — [tiering policy] |
| Are unused or duplicate replicas of data identified and removed? | Yes / No — [review cadence] |
| Is compression applied to reduce storage and transfer volume? | Yes / No — [formats] |
| Is cross-region or cross-cloud replication justified by an actual recovery requirement? | Yes / No — [justification] |
| Are large data transfers scheduled to off-peak / low-carbon-intensity windows where feasible? | Yes / No / Not applicable |
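
To make the compression question concrete: repetitive log-style data typically compresses by an order of magnitude. A quick illustration with synthetic data (real ratios depend entirely on the payload):

```python
import gzip

# Synthetic, highly repetitive log data stands in for raw cold logs.
log_lines = (
    "2024-01-01T00:00:00Z INFO request ok path=/api/orders status=200\n" * 10_000
).encode()

compressed = gzip.compress(log_lines)
ratio = len(compressed) / len(log_lines)
print(f"{len(log_lines)} -> {len(compressed)} bytes ({ratio:.1%})")
```

The same data, stored compressed, occupies a fraction of the bytes for as long as it is retained, which compounds directly into storage cost and carbon.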

Why this matters

Every byte stored consumes electricity for as long as it exists. Two patterns drive most of the waste: over-retention (keeping data “just in case” for years past the regulator’s requirement) and uncompressed cold data (raw logs, raw snapshots, full-fidelity replicas of data that’s never read). The fix is policy-led: classify, set retention, automate expiry, tier the rest.
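
The policy-led fix above can be sketched as a simple age-based decision per object; the 90-day archive threshold is an illustrative assumption, and in practice the rule would be expressed as a storage lifecycle policy rather than application code:

```python
from datetime import date

# Illustrative tiering threshold; real values come from access patterns.
ARCHIVE_AFTER_DAYS = 90

def action_for(obj_created: date, retention_days: int, today: date) -> str:
    """Classify, set retention, automate expiry, tier the rest."""
    age = (today - obj_created).days
    if age >= retention_days:
        return "delete"   # past retention: automate expiry
    if age >= ARCHIVE_AFTER_DAYS:
        return "archive"  # rarely read: move to cold storage
    return "keep"         # hot data stays on primary storage

today = date(2024, 6, 1)
print(action_for(date(2024, 5, 20), 365, today))  # keep
print(action_for(date(2024, 1, 1), 365, today))   # archive
print(action_for(date(2020, 1, 1), 365, today))   # delete
```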

Scoring Guidance

| Score | What This Looks Like |
|---|---|
| 1 | Data stores listed but classification and retention not defined |
| 3 | All data stores classified, retention and encryption specified, PII/SPI identified |
| 5 | All of the above plus data sovereignty addressed, data transfers documented with encryption, data integrity controls evidenced, retention/compression aligned to sustainability goals |

Quality Attribute Cross-References:

  • 4.2 Reliability - Data backup and recovery strategies
  • 4.4 Cost - Storage costs, data transfer costs
  • 4.5 Sustainability - Data volume minimisation, efficient storage formats