Why we need a design code
When kids are in digital spaces for learning, socializing, and relaxing, they deserve the most positive experience possible, designed in a way that understands and supports their unique ways of seeing the world.
That’s exactly why we need a design code: guidelines, rules, and laws to ensure that any digital services children use are safe and fit for them.
A design code requires tech companies to think about what children need and how to reduce risks in their products. It’s a legal approach that is already protecting children in other parts of the world!
Learn more below about evidence-based reasons for a design code in the United States.
Kids come first
Wherever our kids are – including the internet – we want them to be happy, healthy, and safe! The digital environment is one of the most powerful forces shaping childhood. But for too long, the digital world has not considered children’s needs. The rules we have to protect children in the offline world have been ignored in the online environment. We want to build an internet where children are protected by the design of all digital services!
Better design just makes sense
It’s simple. When websites and apps are designed for people instead of profits, our kids are safer. Design elements like friend recommendations, autoplay, and surveillance advertising have exposed children in the US to sexual grooming, inappropriate content, and a whole host of harms. By requiring sites to make the best interests of children a primary design consideration, we can ensure digital media works for, instead of against, children and families.
Families deserve better
As much as we’d like to, parents can’t do it all! It’s hard enough to raise kids without worrying about manipulative design that keeps kids anxiously checking their devices, or the risks of cyberbullying, inappropriate content, and strangers lurking on supposedly kid-safe games. Even if parents tried to make the internet less risky for their kids, they’d come up against unreadable privacy policies and complicated profile settings designed to benefit tech companies. It’s time to hold Big Tech accountable for making parenting even harder than it already is!
Big Tech needs guardrails
Time and again, Big Tech has put profits before kids. Platforms rely on manipulative design to trick kids into spending time and money on their sites and to illegally collect their data. They serve up complex privacy policies written in legalese that no one can understand, deploy sneaky designs and dark patterns to trick children and teens into giving up more of their data, and use that data in ways that undermine young people’s well-being.
Children and teens need protection
Even adults have a hard time deciding the best way to avoid risks on the internet and choose what’s best for them. How many of us have spent far more time on social media than we intended to? How many of us have really read and understood all the terms and conditions? How many of us have surrendered personal information to an app without stopping to think what will be done with that data?
For kids and teens, it’s even harder. While they’re still growing up, it’s hard for them to determine what’s an ad and what isn’t, or what details they should or shouldn’t share. The digital world is going to be a big part of children’s lives, and we need to make sure that it’s designed to support their well-being, not interfere with it.
Want to learn more about Designed with Kids in Mind and our efforts?
Stay up to date with the latest news from our organization and learn more about how you can get the conversation started with your legislators.