
Facial Scans, Fingerprints, Selfies: Are You Really Protected?
By Hermon Demsas
Biometric Breaches: A Modern Challenge
- Find good lighting.
- Wait for the border to turn green.
- Click.
- Verify.
Whether you are seeking a soul mate or logging in to check your savings account, facial recognition tools increasingly shape user authentication and trust. With fast-growing start-ups and demand for easy-to-use platforms, can a growing reliance on biometric verification obscure how sensitive data travels, multiplies and, at times, leaks?
The United Kingdom recently implemented the Online Safety Act 2023, which aims to prevent underage users from accessing adult and other harmful content. Its execution has sparked heated debate, however, as the OSA requires UK users to upload government IDs and facial scans to access certain websites. The act has also been criticised for placing the burden of compliance on content providers, many of whom are based or operate outside the UK.
In light of this development, we will examine the gap between perceived and actual security, the legal implications, and the evolving responsibilities involved in providing, processing, and storing highly sensitive personal data.
The Illusion of Security
Biometric data (including fingerprints, voice recordings, selfies, and photo ID scans) may feel inherently safe because it is uniquely yours. Yet, in an era of data breaches and deepfake technology, the real measure of protection lies in what happens after collection. Once data is collected, the following factors determine how secure it really is:
- How and where is data held?
- Who can access it?
- Is it duplicated for backups or analytics?
- What measures are being taken to protect the data from bad actors?
- How long is the retention period?
To ground this topic in reality, let us look at a recent development and what lessons we can take from where it all went wrong.
Case Study: The Tea App Breach
In the early hours of Friday 25 July 2025, Tea Dating Advice, Inc. (known as Tea app) discovered unauthorised access to its systems, sparking an immediate investigation. A mobile application originally designed to help women stay safe while dating fell victim to two confirmed data breaches. Official reports indicated a breach of around 72,000 images, approximately 13,000 of which were selfies and photo IDs submitted for account verification. The second cyber incident leaked private messages, some featuring sensitive topics.
Outrage erupted in response to the breach, amplified by the mass reposting and recycling of leaked user data by internet opportunists across various social media platforms. Following the breach, many Tea app users expressed confusion over the privacy policy that was in place before its update on Monday 11 August 2025. Some users recall reading that sensitive data collected during the verification process would be stored temporarily and then deleted. However, data dating back to before February 2024 was compromised. While we cannot confirm the accuracy of these recollections, this situation highlights the importance of ensuring privacy policies are not only aligned with actual practices but also written clearly and without ambiguity.
An official statement published by Tea for Women traced the breach to a legacy storage system: an old digital archive that held the sensitive data of users who joined before February 2024. Several class action lawsuits have since been filed over the alleged mishandling of biometric data and the failure to secure this legacy system.
While cybersecurity specialists continue to investigate and work to prevent further damage, this case stands as a cautionary tale: any platform can falter if there is a gap in protecting sensitive user data at any point in its lifecycle.
Handling Biometric Data: A Lesson for Small Businesses
Pause and Evaluate
Do you genuinely need this data to serve users or comply with regulations? Where possible, consider methods that avoid long-term storage, such as storing images temporarily under strict deletion policies (e.g. deleting after 48 hours) or performing real-time analysis in which sensitive data is deleted as soon as the task completes.
Lock It Down
Segregate any sensitive personal data you collect and use encrypted storage with airtight access controls and audit trails. In simple terms, access should only be granted to those holding the right key to read the data; unauthorised access proves fruitless because the data is gibberish without the magic key.
Caution:
As a data controller, losing the magic key needed to decrypt the data may result in a loss of data availability (itself a type of security incident). Whatever measures you take to secure sensitive data, always ensure that data subjects can access their data if and when they request it.
Encrypted data can support strict access controls, but encryption must be handled by a professional, and even then no system is breach-proof. It is vital to take precautions so you are prepared to respond effectively in the event of a breach.
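The access-control and audit-trail idea can be sketched in a few lines. The role names and in-memory log below are illustrative stand-ins for a real identity system and an append-only audit store; the key point is that every access attempt, granted or denied, is recorded:

```python
from datetime import datetime, timezone

AUTHORISED_ROLES = {"verification-service", "dpo"}  # hypothetical role names

audit_log: list = []  # stand-in for an append-only audit store


def read_record(requester_role: str, record_id: str) -> bool:
    """Grant or deny access to a sensitive record, logging every attempt."""
    granted = requester_role in AUTHORISED_ROLES
    audit_log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "who": requester_role,
        "record": record_id,
        "granted": granted,
    })
    return granted
```

The denied attempts are often the most valuable entries: they are the early warning that someone is probing where they should not be.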
Don’t Sign on the Dotted Line Until You Connect the Dots
Do your due diligence on any third-party ID verification providers you choose to work with. Enquire about their breach history, deletion policies, and transparency protocols. You may also consider tailoring the processing scope, retention periods, and access rights in your contractual agreements.
Now You See It, Now You Don’t
When handling identity verification, it is crucial to recognise that biometric data, such as fingerprints, facial recognition data, or iris scans, falls under special category data in the UK General Data Protection Regulation (UK GDPR). This type of data is considered highly sensitive because of its uniquely identifying nature and the potential risks if it is misused or breached. Organisations must therefore apply stricter safeguards, including limiting the scope of processing, minimising data retention, and establishing robust access controls. Data minimisation principles should guide the collection and storage of biometric data, ensuring only what is strictly necessary is retained and processed.
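Data minimisation can be enforced mechanically by allow-listing the fields a verification flow is permitted to keep and dropping everything else before storage. A minimal sketch; the field names are illustrative, not from any real schema:

```python
# Only the fields strictly necessary for the stated verification purpose.
REQUIRED_FIELDS = {"user_id", "selfie_hash", "verified_at"}


def minimise(submission: dict) -> dict:
    """Strip a verification submission down to the allow-listed fields.

    Anything not explicitly required (device details, location, etc.)
    never reaches storage, so it can never be breached.
    """
    return {k: v for k, v in submission.items() if k in REQUIRED_FIELDS}
```

An allow-list beats a block-list here: new fields added upstream are excluded by default rather than silently retained.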
Drills, Drills, Drills
No one wants to be the next news headline. Create a reliable playbook with a well-structured response plan and a list of key contacts to call in an emergency.
Providing Biometric Data: A Lesson for Data Subjects
Don’t Take ID Verification Requests at Face Value
Read the privacy notice before uploading any biometric data. Look out for the expressed purpose of data collection, data retention periods, third-party sharing, your rights, and contact information.
Separate the Snaps
Avoid using the same pictures for personal social media profiles and official verification scans. It is extremely easy to screenshot a social media profile picture and impersonate a user. Keep profile pictures and verification photos distinct.
Don’t Let Your Photo Snitch
As a general rule of thumb, strip metadata where possible. Some photos contain hidden data revealing GPS coordinates, device information, or timestamps. Before uploading pictures anywhere, update your settings or use tools to remove this data so you do not give away more than is asked for.
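In JPEG files, EXIF and similar metadata live in so-called "APPn" segments that sit ahead of the actual image data, so they can be dropped without touching the picture itself. The following is a rough, standard-library-only sketch of that idea; real tools (such as exiftool) handle far more edge cases and formats:

```python
def strip_app_segments(jpeg: bytes) -> bytes:
    """Remove APP1..APP15 metadata segments (EXIF, XMP) from JPEG bytes.

    Simplified sketch: assumes a well-formed JPEG and ignores rare
    standalone markers before the scan data.
    """
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i + 4 <= len(jpeg):
        if jpeg[i] != 0xFF:
            break  # malformed; stop rather than guess
        marker = jpeg[i + 1]
        if marker == 0xDA:  # Start of Scan: image data follows, copy verbatim
            out += jpeg[i:]
            break
        seg_len = int.from_bytes(jpeg[i + 2:i + 4], "big")
        if 0xE1 <= marker <= 0xEF:  # APP1..APP15 carry EXIF/XMP metadata
            i += 2 + seg_len        # skip the whole segment
        else:
            out += jpeg[i:i + 2 + seg_len]  # keep everything else (JFIF, DQT, ...)
            i += 2 + seg_len
    return bytes(out)
```

The image pixels are untouched; only the metadata segments between the file header and the scan data disappear.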
Your Data, Your Rules
Under the UK GDPR, data controllers must give UK data subjects access to their personal data on request. Data subjects generally have the right to obtain confirmation of whether personal data concerning them is being processed and, if so, access to that data. Responses to a data subject's request must be made without undue delay, and within one month at the latest.
There are some limitations: the response must be proportionate, the request must not be deemed to be made with malicious intent, and legal exemptions apply (for example, data disclosing crime or subject to legal professional privilege).
International data controllers and processors are, under Article 27 of the UK GDPR, required to appoint a UK representative who will act as the organisation's contact point for data subjects and the ICO in the UK. The representative, who can be a natural or legal person, is not to be confused with a Data Protection Officer.
The Conclusion
Whether you are a business or a user, hackers do not issue pre-action letters. Act as soon as you can to strengthen your policies and systems.
Want help reviewing your privacy policy or need further guidance on your rights and obligations under GDPR and the Data Protection Act 2018? At Burley Law, that is our cup of tea.
Don’t wait for a breach to act. Get in touch today to secure your business 👇
📧 des@burleylaw.co.uk | 📞 0121 661 6501 | 🌐 www.burleylaw.co.uk