
YOU CAN'T OUTSOURCE RESPONSIBILITY

Why Every Organisation Using Facial Recognition Is a Data Controller — Whether They Like It or Not

10 February 2026


The Uncomfortable Truth

If you use facial recognition on your premises, you are a data controller. Full stop. It does not matter what your supplier's contract says. It does not matter if your technology partner calls themselves the sole data controller. It does not matter if there is an indemnity clause buried on page 47 of the agreement. Under UK GDPR and the Data Protection Act 2018, if you are deciding why and how personal data is processed — and by deploying facial recognition on your property, using your cameras, against your watchlist, you absolutely are — then you are a data controller. There is no way around this, and pretending otherwise puts your entire organisation at risk.

Controller, Joint Controller, or Processor — What's the Difference?

In plain terms: a data controller decides the purpose and means of processing. A data processor acts on the controller's instructions. A joint controller shares those decisions with another party. In facial recognition, the end user (the retailer, the venue, the building operator) is always at minimum a joint data controller. You are processing biometric data, which is special category data under UK GDPR and among the most sensitive information the law recognises, on your property, through your cameras, using your watchlist data. Your supplier may also be a controller or a joint controller depending on their role. But no contract structure can make you a mere processor when you are the one deciding whose faces go on the watchlist and where the cameras point.

Indemnity Doesn't Mean Immunity

Some suppliers offer indemnity clauses that promise to cover you if something goes wrong. This sounds reassuring, but it is fundamentally misleading when it comes to regulatory enforcement. The Information Commissioner's Office — the ICO — does not care about your commercial agreements. If there is a data breach or a complaint, the ICO will investigate every party involved in the processing chain. If they find that you, the end user, were acting as a data controller, which you were, they will hold you accountable regardless of what your supplier promised. An indemnity clause is a commercial arrangement between two businesses. It has zero bearing on your regulatory obligations. You cannot contract out of the law.

The Regulator Holds Everyone Accountable

When the ICO investigates a facial recognition deployment, they look at who decided to deploy it, who chose the locations, who built the watchlist, who had access to the data, and who was responsible for informing the public. If you are the organisation operating the system on your premises, you tick most of those boxes. Even if your supplier built the system, configured the software, and hosted the data, you are still the one who decided to use it. That makes you a controller. The ICO has made this clear in multiple enforcement actions and guidance documents. They do not accept "my supplier told me it was fine" as a defence.

Why "It's Their System" Doesn't Work

Many organisations assume that because they bought a facial recognition system from a third party, the third party carries all the responsibility. This is wrong. You chose to deploy it. You chose which premises and which cameras. You populate the watchlist with individuals you want to identify. You decide what happens when an alert triggers. These are all controller-level decisions. The fact that someone else built the software is irrelevant to your regulatory status. If you buy a car and drive it into a wall, you do not blame the manufacturer for your decision to accelerate.

What Responsible Deployment Looks Like

At FaiceTech, we are transparent about this. We tell every customer that they are a data controller, and we help them understand their obligations under UK GDPR: completing a Data Protection Impact Assessment (DPIA) before deployment, establishing a lawful basis for processing biometric data, putting proper signage and privacy notices in place, setting data retention and deletion policies, and training staff on how data is handled. We do not hide behind contractual language. We work with our customers as partners to make sure every deployment is lawful, proportionate, and defensible. That is what responsible facial recognition looks like.

Know Your Role Before You Deploy

If you are considering facial recognition for your organisation, start by understanding your role in the data processing chain. You will be a data controller. Accept that, plan for it, and build your compliance framework around it. Do not rely on a supplier to shield you from responsibility — it will not work, and the ICO will not buy it. If your supplier tells you that you have no data controller obligations, that should be a red flag, not a comfort. Work with a partner who is honest about the regulatory landscape and who helps you get it right from day one. That is what we do at FaiceTech, and it is why our customers deploy with confidence. Visit our compliance page to learn more about how we support responsible deployment.
