Privacy by Design for IT Projects

Build with foresight. Anticipate risk. Avoid costly missteps.

GDPR Article 25 requires Privacy by Design and by default; success means managing compliance risk from the outset.

  • Identify privacy risks early to avoid costly rework or delayed approvals

  • Streamline compliance approvals and stakeholder alignment

  • Create clear, defensible records that stand up to audits and stakeholder scrutiny

A strategic foundation for clarity, control, and compliance

Privacy by Design is not optional. Under GDPR Article 25, compliance must be built into systems from the outset, not added later.

Most organisations get this wrong. Systems are designed first, risks are discovered later, and compliance becomes reactive, expensive, and incomplete. By that point, key decisions around data use, automation, and architecture are already locked in. This is where real exposure sits, especially with AI and large-scale processing. What looks like a product decision quickly becomes a legal risk.

A structured Privacy by Design approach forces those decisions to be made early, with clarity and control. Without it, you are not managing compliance; you are inheriting risk.

Our Approach

1

Scope

Collaborative consultation to define project scope and identify data flows, technical components, and key stakeholders.

2

Assess

Evaluate privacy risks and required safeguards using DPIA-aligned methodology and GDPR principles.

3

Design

Provide guidance on implementing privacy controls, retention logic, and role-based access management.

4

Support

Offer continuous guidance throughout development and post-launch to ensure privacy stays embedded as the project evolves.
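The design-stage controls in step 3, retention logic and role-based access, can be sketched in code. A minimal illustration: the record categories, retention periods, and role map below are invented for the example, and real values always depend on the processing purpose and legal basis.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention rules per record category. The periods are
# illustrative only; actual values must come from a documented policy.
RETENTION_PERIODS = {
    "application_data": timedelta(days=180),
    "support_tickets": timedelta(days=365),
}

# Hypothetical role-based access map: which roles may read which category.
ROLE_ACCESS = {
    "hr_officer": {"application_data"},
    "support_agent": {"support_tickets"},
}

def is_expired(category: str, created_at: datetime) -> bool:
    """Return True once a record has exceeded its retention period."""
    return datetime.now(timezone.utc) - created_at > RETENTION_PERIODS[category]

def can_read(role: str, category: str) -> bool:
    """Role-based access check: deny by default, allow only listed categories."""
    return category in ROLE_ACCESS.get(role, set())
```

The point of the sketch is the shape of the control: access is denied by default and retention is checked by the system itself, which is exactly the kind of configuration that later serves as accountability evidence.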

The Result: Confidence Through Clarity and Readiness

Streamlined, Privacy-First Operations

Privacy strategies integrated directly into workflows, enhancing efficiency, reducing overhead, and aligning stakeholders across legal, unions, and Works Councils.

Built-In Risk and Accountability

Proactive identification and mitigation of data protection risks with strong audit trails and clear documentation for regulatory and internal oversight.

Ethical, Compliant System Design

Practical, defensible safeguards embedded in system architecture to support lawful, fair, and transparent data handling.


Frequently Asked Questions

  • When does Privacy by Design become mandatory?

    Privacy by Design is not optional under the GDPR.

    Article 25 requires it both when determining how processing will happen and when the processing itself takes place. This means it becomes mandatory as soon as you:

    • Define requirements

    • Design system architecture

    • Select vendors

    • Begin processing personal data

    For high-risk processing, Article 35 adds another layer. A DPIA must be conducted before processing begins, which effectively forces a Privacy by Design approach.

    In reality, the moment a system touches personal data, Privacy by Design is already expected.

  • Can Privacy by Design be retrofitted to an existing system?

    Yes, but not perfectly.

    GDPR allows organisations to consider “state of the art,” cost, and risk when implementing measures (Article 25, Article 32). This enables a risk-based, incremental approach.

    In practice, this usually means:

    • Reducing data collection and retention

    • Strengthening access controls and monitoring

    • Adding compensating controls such as pseudonymisation layers

    You may not reach an ideal architecture, but you can reach a defensible and documented level of risk.
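As one illustration of such a compensating control, a pseudonymisation layer can replace direct identifiers with keyed tokens before data reaches analytics or test environments. A minimal sketch using Python's standard library; the key value and the identifier field are invented for the example, and in practice the key must be generated securely and stored separately from the pseudonymised data:

```python
import hmac
import hashlib

def pseudonymise(identifier: str, key: bytes) -> str:
    """Replace a direct identifier with a keyed, non-reversible token.

    With HMAC-SHA-256, the same identifier always maps to the same token,
    so records stay linkable for analysis, but the original value cannot
    be recovered from the token without the separately held key.
    """
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Example: in production the key would come from a secrets manager,
# never from source code.
key = b"example-key-for-illustration-only"
token_a = pseudonymise("jane.doe@example.com", key)
token_b = pseudonymise("jane.doe@example.com", key)
assert token_a == token_b      # deterministic: records remain linkable
assert "jane" not in token_a   # the identifier is no longer visible
```

Note that pseudonymised data remains personal data under the GDPR; the technique reduces exposure if the dataset alone is compromised, it does not remove the data from scope.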

  • Can we rely on vendor claims of “GDPR compliance”?

    No. Vendor claims of compliance are not sufficient on their own.

    Recital 78 and Article 25 make it clear that controllers must select solutions that enable compliance. This means:

    • Requiring detailed technical and organisational measures

    • Embedding privacy requirements into contracts

    • Requesting audit rights and documentation

    Certifications and codes of conduct (Articles 40–42) can support this, but they are not enough on their own.

    If a vendor cannot demonstrate transparency, it is a compliance and risk issue, not just a procurement inconvenience.

  • How do we demonstrate that Privacy by Design is actually in place?

    Accountability under Article 5(2) requires evidence, not just intent.

    Strong evidence includes:

    • System configurations enforcing data minimisation and restricted access

    • Logs, monitoring, and security controls in operation

    • Completed DPIAs and documented design decisions

    • Audit results or certifications

    The key question is simple:
    Can you show how privacy is built into the system itself?
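One concrete form of that evidence is a system that emits structured audit records for every access decision. A minimal sketch, where the field names and JSON shape are assumptions rather than a prescribed format:

```python
import json
from datetime import datetime, timezone

def log_access(user: str, resource: str, allowed: bool) -> str:
    """Emit a structured audit record for a single access decision."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "resource": resource,
        "decision": "allow" if allowed else "deny",
    }
    return json.dumps(record)
```

Records like these, retained and monitored, are the kind of operational logs the evidence list above refers to: they show the control working, not just documented.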

  • Can Privacy by Design fail after a system goes live?

    Yes, and it often does.

    Common failure points include:

    • Poor data quality or biased datasets

    • Systems being used in new contexts

    • Users bypassing controls

    • Lack of ongoing monitoring and updates

    GDPR expects measures to remain effective over time (Article 24, Article 32).

    Having controls is not enough.
    They must work in practice and be maintained.