10 (Not So) Hidden Dangers of Age Verification

Curated from Deeplinks — Here’s what matters right now:

It’s nearly the end of 2025, and half of US states and the UK now require you to upload your ID or scan your face to watch “sexual content.” A handful of states and Australia now have various requirements to verify your age before you can create a social media account. Age-verification laws may sound straightforward to some: protect young people online by making everyone prove their age. But in reality, these mandates force users into one of two flawed systems, mandatory ID checks or biometric scans, and both are deeply discriminatory. These proposals burden everyone’s right to speak and access information online, and structurally exclude the very people who rely on the internet most. In short, although these laws are often passed with the intention of protecting children from harm, the reality is that they harm both adults and children. Here’s who gets hurt, and how:

1. Adults Without IDs Get Locked Out

Document-based verification assumes everyone has the right ID, in the right name, at the right address. About 15 million adult U.S. citizens don’t have a driver’s license, and 2.6 million lack any government-issued photo ID at all. Another 34.5 million adults don’t have a driver’s license or state ID with their current name and address. Specifically:

- 18% of Black adults don’t have a driver’s license at all.
- Black and Hispanic Americans are disproportionately less likely to have current licenses.
- Undocumented immigrants often cannot obtain state IDs or driver’s licenses.
- People with disabilities are less likely to have current identification.
- Lower-income Americans face greater barriers to maintaining valid IDs.

Some laws allow platforms to ask for financial documents like credit cards or mortgage records instead. But they still overlook the fact that nearly 35% of U.S. adults don’t own homes, and close to 20% of households don’t have credit cards. Immigrants, regardless of legal status, may also be unable to obtain credit cards or other financial documentation.

2. Communities of Color Face Higher Error Rates

Platforms that rely on AI-based age-estimation systems often use a webcam selfie to guess users’ ages. But these algorithms don’t work equally well for everyone. Research has consistently shown that they are less accurate for people with Black, Asian, Indigenous, and Southeast Asian backgrounds; that they often misclassify those adults as being under 18; and that they sometimes take longer to process, creating unequal access to online spaces. This mirrors the well-documented racial bias in facial recognition technologies. The result is that the technology’s inherent biases can block people from speaking online or accessing others’ speech.

3. People with Disabilities Face More Barriers

Age-verification mandates most harshly affect people with disabilities. Facial recognition systems routinely fail to recognize faces with physical differences, affecting an estimated 100 million people worldwide who live with facial differences, and “liveness detection” c

Next step: Keep your day-to-day compliant and secure—find privacy-forward devices that help you stay protected.

Original reporting: Deeplinks
