
Twitch has launched mandatory facial scanning for UK users to verify their age before they can watch mature content. The Amazon-owned streaming platform introduced the controversial system to comply with the UK’s Online Safety Act, sparking widespread debate about privacy and digital security.
The new verification process requires users to complete a facial scan within the Twitch app. Without it, UK viewers lose access to streams carrying mature-content labels, including sexual themes, drugs, violence, gambling, and excessive tobacco use.
“We are introducing mandatory age verification for users in the UK who want to watch certain types of content on Twitch that may not be suitable for everyone,” Twitch stated in emails to affected users. The system targets users between ages 13 and 18, as Twitch already blocks anyone under 13 from creating accounts.
Privacy Concerns Mount Over Data Security
The facial scanning requirement has triggered significant backlash from streamers and privacy advocates. Critics point to Discord’s recent data breach, in which ID photos of roughly 70,000 users were leaked after the platform implemented similar age verification in the UK and Australia.
“A third-party company does not need an in-app facial scan to determine your age, along with other personal identifiable information,” tweeted cosplayer Nelku, echoing widespread community concerns about the security risks.
The timing could hardly be worse for public confidence. Discord’s breach exposed passport and driver’s license photos, yet the UK government offered no assistance to affected users. Critics argue this shows the Online Safety Act prioritizes compliance over actual user safety.
Broader Impact on Young Streamers
The age verification system represents just one part of increasing restrictions on young users. UK regulator Ofcom has proposed additional guidelines that could prevent under-18 streamers from earning donations, subscriptions, or other revenue sources.
These proposed rules would block all commenting, gifting, and recording of livestreams featuring minors. Ofcom claims these measures would reduce grooming, bullying, and inappropriate influence through monetary gifts.
A “Repeal the Online Safety Act” petition will be debated in Parliament next month. The petition argues the legislation risks “clamping down on civil society talking about trains, football, video games or even hamsters because it can’t deal with individual bad faith actors.”
The verification checks trigger during account creation, on first login after the update, or when accessing streams with mature-content labels. Streams of popular mature-rated games, such as the upcoming GTA VI, will likely require age verification.
This development marks a significant shift in how streaming platforms handle age verification, potentially setting precedent for other countries considering similar legislation. The balance between child safety and user privacy remains a contentious issue as digital platforms navigate increasingly complex regulatory landscapes.