I don’t remember much of my life before social media. It took over in seventh grade, and my individual style started bending toward whatever the algorithm pushed to the top.
My wants were no longer my own but reflections of what Instagram, Facebook, Snapchat, and YouTube algorithmically advertised as desirable. Sleepovers that used to be chaotic with pillow fights and hide-and-seek were replaced by late-night scrolling sessions, my friends and I quietly measuring ourselves against everyone else. Social media was supposed to give my generation a lifeline: A way to stay close, to find community, to express ourselves. Instead, tech executives built digital nicotine rather than real digital communities, engineered to tap into our insecurities and keep us coming back.
By the time we realized what it was doing to us, it already felt impossible to put it down. Infinite scroll, autoplay that makes stopping feel like an interruption, notifications calibrated to pull us back at our most vulnerable moments — these weren’t accidents. They were design decisions. Decisions meant to keep us hooked.
Now, for the first time, Big Tech executives have been forced to defend those design decisions in court, and a jury has made something clear: This harm was real and not accidental. The verdict marks a turning point, a signal that the era of Big Tech operating without consequence is beginning to crack.
But even before that verdict, something bigger had already started to shift. Inside that courtroom, one young plaintiff was barging through doors that were never meant to open. She made visible what millions of us lived through silently.
And she is not alone.
We see ourselves in her. We see our friends. We see the filtered selfies, the 2 a.m. scrolling, the comparison spirals, the quiet shame. These are not edge cases. They are not outliers. Big Tech’s platforms don’t discriminate. When a platform is designed to manipulate, everyone is vulnerable.
Because what we grew up with was not neutral technology. It was technology designed for maximum engagement at any cost.
Evidence from these cases shows that Big Tech followed Big Tobacco’s playbook: Addict them young to create lifelong consumers, no matter the cost to their lives. Their business model didn’t just deprioritize young people’s wellbeing; it depended on undermining it to sell products.
I recently read whistleblower Sarah Wynn-Williams’ memoir, Careless People. She revealed that Meta could detect when teen girls deleted a selfie, reading it as a moment of low self-esteem, and then prompt beauty brands to target them with ads at that exact moment. It wasn’t an accident. So I have to wonder how many Instagram ads I was served when I felt the worst about myself. Was the face serum, lip kit, or lip filler I bought with my limited income exactly the outcome Mark Zuckerberg and Adam Mosseri designed?
I came out as gay before Caitlyn Jenner came out as trans. When she publicly debuted her transition, I didn’t have the words for why it hit me so hard. It wasn’t jealousy. It wasn’t resentment. It was that overnight, she became my Kendall Jenner — the glossy, perfected, impossible image of womanhood that a closeted trans girl like me was supposed to aspire to.
Girls my age were drowning in Kendall’s edited photos and endless beauty ads. I was drowning in a trans version of the same thing. Caitlyn appeared effortlessly beautiful, and I bought product after product trying to chase even a fraction of that feeling. Every time I took a selfie and got called a slur, Instagram responded by pushing more beauty ads at me instead of community or creators who could’ve helped me understand myself.
Sarah Wynn‑Williams’ revelations make the pattern clear: Meta saw our lowest moments and treated them as sales opportunities. They could have protected us. They could have connected us. Instead, they monetized the wound.
We were children when we first downloaded these apps. Our brains were still developing. Our sense of self was still forming. And yet we were placed inside systems engineered to override self-control and exploit social validation.
That’s why these trials matter.
They aren’t just about one young plaintiff or one tragic story. They’re about thousands of families demanding accountability — and now, with this verdict, proving that accountability is possible. They’re about internal documents that companies tried to hide now becoming public. They’re about executives being forced to answer questions under oath rather than hide behind PR statements.
The verdict is a milestone — but it’s not the finish line. The truth being exposed is what makes lasting change possible. The courtroom doors finally being open is what ensures this doesn’t stop here.
Every day, more cases are filed. Every day, more children are harmed. Pressure will not let up until Meta, YouTube, TikTok, and Snap fix their addictive products.
People need to understand that this isn’t an attack on social connection or creativity. The problem is that Big Tech built the digital equivalent of nicotine instead of designing ways for people to enjoy short‑form media without being trapped by it. I don’t need endless scroll or autoplay to enjoy fan edits of my favorite shows. I don’t need an algorithm to serve me a sad dog video because it knows I’ll need a mood booster before closing the app. I’m not trying to get rid of the joy — I’m trying to get rid of the traps.
There is a future without these traps, and this verdict is proof that we can start to build it.
Lennon Torres is a former Dance Moms performer now fighting for young people’s safety online. A trans activist and University of Southern California alum, she uses her pop-culture fluency and lived experience to power her work at the Heat Initiative, taking on tech giants and demanding that platforms protect and empower the next generation.