Introduction
This document helps you determine whether you need a Data Protection Impact Assessment (DPIA). One may be required if your processing meets several of the criteria below and your target audience lives in Europe. Read on to find out more.
Last updated: April 23rd, 2026
What is a DPIA?
The GDPR introduces the concept of a Data Protection Impact Assessment (DPIA). A DPIA is a process designed to describe the processing, assess its necessity and proportionality, and help manage the risks that the processing of personal data poses to the rights and freedoms of natural persons, by assessing those risks and determining the measures to address them.
DPIAs are important tools for accountability, as they help controllers not only to comply with requirements of the GDPR, but also to demonstrate that appropriate measures have been taken to ensure compliance with the regulation. In other words, a DPIA is a process for building and demonstrating compliance.
Goal
Our purpose is to ensure that privacy risks are minimised while still allowing the aims of a controller's project to be met wherever possible.
This document provides the relevant considerations for a controller deciding whether to create a DPIA prior to implementing the services of UNLESS (the processor). If a DPIA is needed, this document contains all processor-side information required to perform the assessment.
Assumptions
This document assumes that the reader is familiar with GDPR terms and concepts. At a minimum, make sure you grasp the difference between the controller (the UNLESS customer) and the processor (the UNLESS service).
Source
This document follows the Guidelines on Data Protection Impact Assessment (DPIA) for determining whether processing is "likely to result in a high risk" under Regulation (EU) 2016/679 (the GDPR). The guidelines were originally issued by the Article 29 Working Party and have since been endorsed by the European Data Protection Board (EDPB), which replaced the Working Party under Article 68 GDPR.
The impact of UNLESS services and the associated terminology are defined in the Data Processing Addendum (DPA) of Unless. The purpose of the DPA is to reflect the parties' agreement on the processing of personal data in accordance with data protection legislation.
Related regulation
Beyond the GDPR, deployments of AI-enabled features may interact with the EU AI Act (Regulation (EU) 2024/1689). Where a use case qualifies as high-risk under Annex III of the EU AI Act, the controller may additionally need to conduct a Fundamental Rights Impact Assessment under Article 27 of that regulation. A DPIA and a FRIA cover overlapping but distinct ground, and both may need to run in parallel.
DPIA criteria
In order to provide a more concrete set of processing operations that require a DPIA due to their inherent high risk, and taking into account the relevant GDPR requirements, the following nine criteria should be considered. As a rule of thumb, processing that meets two or more of these criteria is likely to require a DPIA; in some cases, meeting a single criterion is sufficient.
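The screening step can be sketched as a simple checklist tally. This is an illustrative aid only, not legal advice: the criterion identifiers below are invented for the sketch, and the two-or-more threshold follows the rule of thumb in the WP248 guidelines.

```python
# Illustrative DPIA screening checklist over the nine WP248/EDPB criteria.
# Criterion identifiers are hypothetical labels, not an UNLESS API.

CRITERIA = {
    "evaluation_or_scoring",
    "automated_decision_making_with_legal_effect",
    "systematic_monitoring",
    "sensitive_or_highly_personal_data",
    "large_scale_processing",
    "matching_or_combining_datasets",
    "vulnerable_data_subjects",
    "innovative_technology",
    "prevents_exercising_rights_or_services",
}

def dpia_recommended(applicable: set) -> bool:
    """Rule of thumb from the guidelines: meeting two or more criteria
    usually means a DPIA is required; one may suffice in specific cases."""
    unknown = applicable - CRITERIA
    if unknown:
        raise ValueError(f"unknown criteria: {sorted(unknown)}")
    return len(applicable) >= 2

# Example: behavioural profiling combined with a CRM integration.
print(dpia_recommended({"evaluation_or_scoring",
                        "matching_or_combining_datasets"}))  # True
```

A single applicable criterion returns False under this threshold, but the guidelines stress that a controller may still decide to carry out a DPIA in that case.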
Evaluation or scoring
Evaluation or scoring, including profiling and predicting, especially from "aspects concerning the data subject's performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements".
Examples of this could include a financial institution that screens its customers against a credit reference database or against an anti-money laundering and counter-terrorist financing (AML/CTF) or fraud database, or a biotechnology company offering genetic tests directly to consumers in order to assess and predict the disease/health risks, or a company building behavioural or marketing profiles based on usage or navigation on its website.
Does this require a DPIA?
"Evaluation or scoring" is not relevant by default, but may become relevant due to the controller's actions. While UNLESS offers user profiles and audience membership based on usage or navigation on the controller's website, their impact is limited. By default, these profiles are not identifiable and are not used outside the scope of the controller's website. More importantly, applying scoring with legal or similar effects without consent or a legitimate purpose is not allowed under the law or the Unless terms and conditions.
Automated decision-making with legal or similar significant effect
Automated decision-making with legal or similar significant effect: processing that aims at taking decisions on data subjects producing "legal effects concerning the natural person" or which "similarly significantly affects the natural person". For example, the processing may lead to the exclusion of, or discrimination against, individuals. Processing with little or no effect on individuals does not match this specific criterion.
Does this require a DPIA?
This situation does not require a DPIA. Imposing legal or similar effects without consent or a legitimate purpose is not allowed under the law or the Unless terms and conditions. Moreover, processing with little or no effect on individuals does not match this specific criterion.
Systematic monitoring
Systematic monitoring: processing used to observe, monitor or control data subjects, including data collected through networks or "a systematic monitoring of a publicly accessible area". This type of monitoring is a criterion because the personal data may be collected in circumstances where data subjects may not be aware of who is collecting their data and how they will be used. Additionally, it may be impossible for individuals to avoid being subject to such processing in public (or publicly accessible) space(s).
Does this require a DPIA?
This is not applicable when using Unless. End User monitoring is limited to request-level analytics with suppressed IP addresses and pseudonymous identifiers, which falls outside the scope of "systematic monitoring" under Article 35(3)(c) GDPR.
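To illustrate what request-level analytics with suppressed IP addresses and pseudonymous identifiers can look like in practice, here is a generic sketch. This is not the actual UNLESS implementation; the function names and the truncate-plus-salted-hash approach are common techniques shown for orientation only.

```python
# Generic sketch of IP suppression and pseudonymous identifiers;
# not the actual UNLESS implementation.
import hashlib
import secrets

def suppress_ip(ip: str) -> str:
    """Zero the last octet of an IPv4 address before storage."""
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError("expected an IPv4 address")
    return ".".join(octets[:3] + ["0"])

def pseudonymous_id(visitor_key: str, salt: bytes) -> str:
    """Derive a stable pseudonym from a visitor key and a secret salt.
    Without the salt, the pseudonym cannot be linked back to the key."""
    return hashlib.sha256(salt + visitor_key.encode()).hexdigest()[:16]

salt = secrets.token_bytes(16)
print(suppress_ip("203.0.113.77"))  # 203.0.113.0
print(pseudonymous_id("visitor-cookie-abc", salt))  # 16 hex characters
```

Because the salt stays secret on the processor side, the stored identifier is pseudonymous rather than anonymous, which matches the GDPR distinction the criterion turns on.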
Sensitive data or data of a highly personal nature
Sensitive data or data of a highly personal nature: this includes special categories of personal data (for example information about individuals' political opinions), as well as personal data relating to criminal convictions or offences. An example would be a general hospital keeping patients' medical records or a private investigator keeping offenders' details.
Beyond these provisions of the GDPR, some categories of data can be considered as increasing the possible risk to the rights and freedoms of individuals. These personal data are considered as sensitive (as this term is commonly understood) because they are linked to household and private activities (such as electronic communications whose confidentiality should be protected), or because they impact the exercise of a fundamental right (such as location data whose collection questions the freedom of movement) or because their violation clearly involves serious impacts in the data subject's daily life (such as financial data that might be used for payment fraud).
In this regard, whether the data has already been made publicly available by the data subject or by third parties may be relevant. The fact that personal data is publicly available may be considered as a factor in the assessment if the data was expected to be further used for certain purposes. This criterion may also include data such as personal documents, emails, diaries, notes from e-readers equipped with note-taking features, and very personal information contained in life-logging applications.
Does this require a DPIA?
When using Unless, this criterion is not relevant by default, but may become relevant due to the controller's actions. Sensitive data is not at issue as long as the controller does not activate an integration with third-party data sources that contain such information, and as long as End Users are not encouraged to submit sensitive data through free-text AI inputs. We advise against both.
Data processed on a large scale
Data processed on a large scale: the GDPR does not define what constitutes large-scale. The following factors may be considered when determining whether the processing is carried out on a large scale:
- the number of data subjects concerned, either as a specific number or as a proportion of the relevant population;
- the volume of data and/or the range of different data items being processed;
- the duration, or permanence, of the data processing activity;
- the geographical extent of the processing activity.
Does this require a DPIA?
When using Unless, this is not relevant as long as the reach of the controller's website does not exceed a scale that can be deemed a large proportion of the population of a country or part of the world.
Matching or combining datasets
Matching or combining datasets, for example originating from two or more data processing operations performed for different purposes and/or by different data controllers in a way that would exceed the reasonable expectations of the data subject.
Does this require a DPIA?
When using Unless, matching or combining datasets is not relevant by default, but may become relevant due to the controller's actions. It is only applicable if the controller initiates integrations with third-party data sources, such as a CRM. In that case, the integration should be addressed in a privacy statement.
Data concerning vulnerable data subjects
Data concerning vulnerable data subjects: the processing of this type of data is a criterion because of the increased power imbalance between the data subjects and the data controller, meaning the individuals may be unable to easily consent to, or oppose, the processing of their data, or exercise their rights. Vulnerable data subjects may include children (they can be considered as not able to knowingly and thoughtfully oppose or consent to the processing of their data), employees, more vulnerable segments of the population requiring special protection (mentally ill persons, asylum seekers, or the elderly, patients, etc.), and in any case where an imbalance in the relationship between the position of the data subject and the controller can be identified.
Does this require a DPIA?
This situation does not require a DPIA. In general, Unless does not allow components to be offered to people who cannot knowingly oppose or consent to the processing. If the actions of the controller result in an imbalance between the position of the data subject and that of the controller, this may be considered "misleading use" of the service, which is not allowed under the Unless terms and conditions.
Innovative use or applying new technological or organisational solutions
Innovative use or applying new technological or organisational solutions, like combining use of fingerprint and face recognition for improved physical access control, etc. The GDPR makes it clear that the use of a new technology, defined in "accordance with the achieved state of technological knowledge", can trigger the need to carry out a DPIA. This is because the use of such technology can involve novel forms of data collection and usage, possibly with a high risk to individuals' rights and freedoms. Indeed, the personal and social consequences of the deployment of a new technology may be unknown. A DPIA will help the data controller to understand and to treat such risks.
Does this require a DPIA?
This situation does not require a DPIA by default. UNLESS relies on commercially available AI models hosted within its cloud sub-processors; these are not considered unprecedented technologies. However, deploying the Services in a novel context (for example, decisions affecting access to essential services, or large-scale automated profiling of End Users) may still trigger this criterion, in which case a DPIA is recommended.
When the processing prevents data subjects from exercising a right
When the processing in itself "prevents data subjects from exercising a right or using a service or a contract". This includes processing operations that aim at allowing, modifying or refusing data subjects' access to a service or entry into a contract. An example is a bank screening its customers against a credit reference database in order to decide whether to offer them a loan.
Does this require a DPIA?
This situation does not require a DPIA. Preventing visitors from exercising a right is not allowed under the law or the Unless terms and conditions.
UNLESS processing operations mapped to DPIA criteria
For your convenience, this chapter lists the processing operations that Unless conducts and shows the potentially relevant criteria. Here, "relevant" means that the criterion may play a role, depending on the controller's use case.
Documenting of publicly available request data
There are no relevant criteria applicable. The standard GDPR rules for the collection of website statistics apply: the controller must either have a legitimate purpose or ask the visitor for consent ("cookie policy").
Build actionable visitor segments in real time
Evaluation or scoring may be relevant. We enable our customers to create user profiles and audience memberships based on usage or navigation on the controller's website. However, these profiles are either pseudonymised and used only within the scope of a controller's website, or based on data already available within a product where the user must log in.
Adjust individual user experience for personalisation, AI assistance, and AB testing
There are no relevant criteria applicable. With regard to profiling, imposing legal or similar effects without consent or a legitimate purpose is not allowed under the law or the Unless terms and conditions. Also, processing with little or no effect on individuals does not match this specific criterion.
Process Customer Input through AI models
Sensitive data or data of a highly personal nature may be relevant, depending on what End Users submit. Free-text fields submitted by End Users are processed by large language models hosted within UNLESS' cloud sub-processors. Customer Data does not leave the boundary of those cloud sub-processors, and no data is sent to separate third-party LLM vendors such as OpenAI or Anthropic. Controllers should instruct End Users not to submit sensitive personal data via free-text inputs unless the Customer Application is explicitly designed for that purpose.
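One way a controller could reinforce that instruction is a pre-submission filter on free-text inputs. The sketch below is hypothetical and not something UNLESS prescribes or provides; simple patterns like these catch only obvious identifiers and are a usability nudge, not a compliance control.

```python
# Hypothetical pre-submission filter for free-text AI inputs.
# Regex patterns catch only obvious identifiers (emails, IBAN-like
# strings) and do not constitute a compliance control.
import re

PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "iban": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
}

def redact(text: str) -> str:
    """Replace obviously identifying substrings before the text is
    forwarded to the AI model."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} removed]", text)
    return text

print(redact("Contact me at jane@example.com about NL91ABNA0417164300"))
# → Contact me at [email removed] about [iban removed]
```

A production filter would need far broader coverage (health terms, national ID formats, phone numbers) and should be paired with clear UI guidance, since no pattern list can reliably detect all special-category data.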
Enrich the visitor data by integrating a third party data source
Matching or combining datasets, sensitive data or data of a highly personal nature may all be relevant, depending on the controller's actions. By default, visitors are anonymous and there are no active integrations. Should the controller choose to use any third party integrations, assessment may be required by the controller.
Additional DPIA information
The following describes the information that is required to do a DPIA in relation to the processing activities of UNLESS for your use case. You may use it to create a report.
Systematic description of the processing
Nature of the processing
UNLESS provides organisations with an end-to-end platform to deploy AI assistance across the full customer lifecycle, covering acquisition, retention, sales expansion, and customer support. The platform combines conversational AI, AI-powered search, and agentic task automation on top of a shared layer for knowledge management, analytics, governance, and compliance.
UNLESS operates a multi-LLM architecture. All large language models used by the Services are hosted within the environments of the cloud sub-processors listed in the DPA, keeping Customer Data within the cloud sub-processor boundary at all times.
User data
Our data consists of public request information (standard data that browsers send with every page request, using the internet), plus any Customer Input that End Users or Users submit through free-text fields. See a comprehensive list here.
For further details about Processing, Data Retention and Destruction, see the Data Processing Addendum.
Functional description of the processing operation
UNLESS allows End Users of a customer's product to interact with an AI through a catalogue of UI components, browser extensions, or native apps that act as conversation starters or ambient assistants. There are several tools that can be used to do this: the user profile, audiences, and AI topics.
The user profile is a protected service where our AI Privacy Safeguard can safely store and retrieve information that is related to a user. This user profile may contain PII.
Based on profile attributes as well as public browser information, a customer can define audiences. These audiences can be used to either show or hide certain UNLESS components, and to allow for personalised responses from the AI, which never include PII unless the controller explicitly configures this.
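Conceptually, an audience is a set of predicates evaluated against profile attributes and public browser data. A minimal sketch of that matching logic follows; the attribute names are invented for illustration and do not reflect the UNLESS data model.

```python
# Hypothetical sketch of audience matching over profile attributes;
# attribute and audience names are invented for illustration.

def matches(profile: dict, rules: dict) -> bool:
    """A profile belongs to the audience when every rule matches exactly."""
    return all(profile.get(key) == value for key, value in rules.items())

# An audience defined by the controller: returning visitors from NL.
returning_nl_visitors = {"country": "NL", "returning": True}

profile = {"country": "NL", "returning": True, "plan": "trial"}
print(matches(profile, returning_nl_visitors))  # True
```

Because matching operates on whatever attributes the controller stores, the privacy impact of audiences is determined by the controller's configuration rather than by the matching mechanism itself.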
AI topics organise content, conversations, or AI knowledge into thematic categories. They bound the AI to approved material and help prevent hallucinations and AI mistakes.
Assets on which personal data rely
The dynamic part of the system that returns audience memberships and user profiles is based on the AWS API Gateway. All code is executed using serverless Lambda functions, which execute in parallel automatically for each request.
All user profiles and all historical visitor data UNLESS collects are stored electronically in the Amazon Web Services eu-west-1 region (Ireland). Our application servers and database servers run inside an Amazon Virtual Private Cloud (VPC). The database containing visitor and usage data is accessible only from the application servers; no outside sources are allowed to connect to it. Our data retention periods do not exceed 365 days. See the Data Processing Addendum for details.
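For orientation, a request to an audience-membership endpoint could be served by a handler shaped like the following. This is a schematic sketch: the function names, payload shapes, and lookup logic are invented, and the real UNLESS implementation is internal.

```python
# Schematic AWS Lambda handler behind API Gateway that returns audience
# memberships for a pseudonymous visitor id. Names and payload shapes
# are illustrative, not the actual UNLESS API.
import json

def lookup_audiences(visitor_id: str) -> list:
    # Placeholder for a database lookup inside the VPC.
    return ["returning-visitors"] if visitor_id else []

def handler(event: dict, context: object = None) -> dict:
    """Standard Lambda proxy-integration signature: event dict in,
    statusCode/body dict out."""
    params = event.get("queryStringParameters") or {}
    visitor_id = params.get("vid", "")
    body = {"visitorId": visitor_id,
            "audiences": lookup_audiences(visitor_id)}
    return {"statusCode": 200, "body": json.dumps(body)}

print(handler({"queryStringParameters": {"vid": "anon-123"}}))
```

Each invocation is stateless, which matches the document's description of Lambda functions executing in parallel per request while all persistent data stays inside the VPC.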
International transfers
Personal Data processed by UNLESS is stored in the EU. Where UNLESS engages sub-processors that process data outside the EEA (for example Stripe or Chargebee), transfers rely on the EU Standard Contractual Clauses (Commission Implementing Decision (EU) 2021/914), supplemented by a Transfer Impact Assessment and, where applicable, the UK International Data Transfer Addendum. Controllers preparing a DPIA should take these mechanisms into account.
Necessity and proportionality
For a description of the measures that are taken, contributing to the proportionality and the necessity of the processing as well as to the rights of the data subjects, see the Data Processing Addendum and the Terms and Conditions.
Risks
For risk assessments and measures that have been taken, see Exhibit A of the Data Processing Addendum.
Processor assistance
Under the current Data Processing Addendum, UNLESS commits to providing reasonable assistance to controllers with:
- responses to data subject rights requests under Chapter III GDPR;
- DPIAs under Article 35 GDPR;
- prior consultations with supervisory authorities under Article 36 GDPR; and
- inquiries, investigations, or requests for information from a supervisory authority.
Standard assistance is included in the Services. Disproportionate assistance may be subject to a reasonable fee at UNLESS' then-current hourly rates.
The UNLESS Data Protection Officer can be reached at dpo@unless.com.
Interested parties
The DPO of the controller should be involved in finalising the DPIA. Where the deployment qualifies as a high-risk AI system under the EU AI Act, the controller's legal, compliance, and fundamental rights stakeholders may also need to participate in an accompanying Fundamental Rights Impact Assessment.