Social media algorithms may be exposing children to aggressive gun marketing, according to a new report released by Children and Screens in collaboration with Sandy Hook Promise. The study highlights a lack of transparency from tech companies regarding how frequently firearm-related content reaches minors, and warns that vulnerable children — particularly boys — are being targeted with messaging that links guns to power, sex appeal, and conflict resolution.

The Problem: Targeted Exposure

The report describes a disturbing trend in which firearm manufacturers actively seek to cultivate young consumers. Boys are disproportionately exposed to this content, often on a weekly basis, as they navigate their formative years and develop their identities. Platforms reportedly leverage user data, including emotional or mental health indicators, to serve firearm-related content to children already struggling with depression or loneliness, making them even more susceptible.

“Firearm manufacturers have been targeting children as a future consumer.” — Nicole Hockley, Sandy Hook Promise

This isn’t just about random exposure; the report suggests that marketing tactics deliberately exploit psychological vulnerabilities to normalize gun ownership and associate it with desirable traits. The study stresses that parents deserve better insight into what their children are seeing online and how platforms operate behind the scenes.

Lack of Oversight and Key Questions

Currently, social media companies provide little to no information about the prevalence of firearm-related content served to minors. The report outlines six critical questions tech companies should answer to improve transparency, including:

  • How often are minors exposed to gun content?
  • How does user data influence content recommendations?
  • Do emotional or mental health signals affect algorithmic targeting?

Without this information, parents remain in the dark about the potential influence of violent marketing on their children. The report argues that tech platforms must strengthen oversight and provide parents with the tools to monitor and control their children’s online experiences.

What’s Next?

Greater transparency is not just a matter of parental rights; it’s a matter of public safety. By understanding how algorithms amplify harmful content, platforms can take steps to protect vulnerable youth from predatory marketing tactics. The report emphasizes that parents should be empowered with the information they need to navigate the digital landscape safely and protect their children’s well-being.

The issue raises deeper questions about the ethics of targeted advertising and the responsibility of tech companies to mitigate harm, especially when dealing with impressionable minors.