Why 13?

June 2, 2021

What is the difference between 13, 16 and 18? This isn’t a math problem, but a question that social media companies, policymakers, and parents grapple with every day when it comes to the services that minors can and should be accessing online. 

Historically, at least since the Children’s Online Privacy Protection Act (COPPA) was passed in the United States in 1998, the age at which a minor could consent to the processing of personal information without a parent’s permission has been 13. This wasn’t based on scientific research into adolescent brain development or anything so formal; rather, it was the outcome of a negotiation between policymakers and industry. “Under 12” had generally been the rule for marketing to children offline, so it became the starting point for online content as well.

It was from this point that the global Internet ecosystem developed. Those 13 and over could consent to giving up their data, and those under 13 required parental consent. Many companies adopted a prohibition on under-13s using their services to avoid the somewhat burdensome requirements of contacting parents, communicating terms of service, and obtaining consent. In the US, strict fines accompany COPPA violations, and companies are keen to comply with the Rule. But enforcement is not always straightforward. We know that children lie about their age, sometimes with the cooperation of their parents, and the lines can blur when deciding whether or not a website is ‘directed to children’ and should therefore fall under the COPPA regime.

Today, this rule, which has formed the global standard for more than 20 years, has come into question. Some countries have made considerable changes to the laws that govern data collection for all users, including children, and an increased focus on privacy in the US has produced numerous proposals to change the age of digital consent.

The General Data Protection Regulation (GDPR) in the European Union requires parental consent for the processing of the data of any child under 16, though member states can derogate from this and set the cut-off as low as 13. This led to a situation where minors who had been legally using social networks for two years were suddenly required to obtain permission from their parents to continue. Unsurprisingly, busy parents were not necessarily privy to these changes, and enterprising teens quickly found a wide variety of ways to circumvent asking Mom and Dad.

The United Kingdom introduced the Age Appropriate Design Code, which regulates the processing of minors’ data. The code defines a minor as anyone under the age of 18, drawing on the definition of ‘child’ in the United Nations Convention on the Rights of the Child. The 15 standards contained in the code require that services be developed with the best interests of the child in mind, that default settings be at their highest level of privacy, that nudge techniques not be used, and that transparency standards be clear and comprehensible.

None of the prescribed ages takes into account the individual characteristics of each child, which, as every parent will tell you, make a big difference in the level of responsibility, maturity, and capacity they have to make good decisions online. Choosing an age of consent that applies uniformly to millions of children is incredibly difficult. But foundational rules are vital: rules that allow the Internet ecosystem to work and that offer increased privacy protections for children without treating all users as minors or collecting too much information.

Parents need to remain informed and in regular communication with their kids about the services they’re accessing. Many caregivers think that the age-13 requirement relates to the appropriateness of the content their kids will be exposed to, as with movie ratings, but that isn’t the case: it is about consent to data collection. We all need to do more to improve users' understanding of privacy generally, and especially as it relates to children.

Data minimization is a key component of privacy, and requiring the sharing of incredibly personal information such as a birthdate or other age-identifying data runs contrary to that overarching aim. It is especially concerning when the data belongs to a child. Requesting information to prove age in order to prohibit or permit access to a site is problematic: it can be circumvented, it leads to more data collection, and it severely impacts free speech and access rights.

Minors have rights to access information, just like adults, and it is the duty of all to ensure those rights are protected. Obviously, there are limits for children where there wouldn’t be for adults, but older teens especially need to be able to research important health and safety information and get support, sometimes free from parental oversight.

Setting the age of consent at 13 doesn’t mean that minors under 18 don’t deserve privacy protections. Indeed, many companies currently offer higher levels of safety and privacy for this age group. But older teens don’t necessarily need the same levels of protection from data processing as young kids, as they possess greater situational awareness and the critical-thinking skills to better consent to certain uses of their information. Education is key: gradually teaching kids to take control of their own data and to set their own controls will help ensure that they mature into privacy-aware, empowered digital citizens.

Written by

Stephen Balkam

For the past 30 years, Stephen Balkam has had a wide range of leadership roles in the nonprofit sector in both the US and UK. He is currently the Founder and CEO of the Family Online Safety Institute (FOSI), an international, nonprofit organization headquartered in Washington, DC. FOSI’s mission is to make the online world safer for kids and their families. FOSI convenes the top thinkers and practitioners in government, industry and the nonprofit sectors to collaborate and innovate and to create a “culture of responsibility” in the online world.

Prior to FOSI, Stephen was the Founder and CEO of the Internet Content Rating Association (ICRA) and led a team which developed the world’s leading content labeling system on the web. While with ICRA, Stephen served on the US Child Online Protection Commission (COPA) in 2000 and was named one of the Top 50 UK Movers and Shakers by Internet Magazine in 2001.

In 1994, Stephen was named the first Executive Director of the Recreational Software Advisory Council (RSAC) which created a unique self-labeling system for computer games and then, in 1996, Stephen launched RSACi – a forerunner to the ICRA website labeling system. For his efforts in online safety, Stephen was given the 1998 Carl Bertelsmann Prize in Gutersloh, Germany, for innovation and responsibility in the Information Society and was invited to the first and subsequent White House Internet Summits during the Clinton Administration.

Stephen’s other positions include the Executive Director of the National Stepfamily Association (UK); General Secretary of the Islington Voluntary Action Council; Executive Director of Camden Community Transport as well as management positions at the Institute of Contemporary Arts (London) and Inter-Action. Stephen’s first job was with Burroughs Machines (now Unisys) and he had a spell working for West Nally Ltd – a sports sponsorship PR company.

Stephen received a BA, magna cum laude, in Psychology from University College, Cardiff, Wales in 1977. A native of Washington, DC, Stephen spent many years in the UK and now has dual citizenship. He writes regularly for the Huffington Post, has appeared on nationally syndicated TV and radio outlets such as MSNBC, CNN, NPR, and the BBC, and has been interviewed by leading newspapers such as the Washington Post, the New York Times, and The Wall Street Journal. He has given presentations and spoken in 15 countries on 4 continents.