Shareholders to Demand Action from Mark Zuckerberg and Meta on Child Safety
Investors will vote on child safety resolution at Meta's Annual General Meeting
MENLO PARK, Calif., May 27, 2025 - Tomorrow, Meta shareholders will vote on a resolution asking Meta to assess its child safety impacts and whether harm to children on its platforms has been reduced. The vote follows reports that the company's Instagram Teens feature still recommends harmful content, including sexual, racist, and violent material. The resolution - filed by Proxy Impact on behalf of Dr. Lisette Cooper and co-filed by 18 institutional investors from North America and Europe - will be presented by child safety advocate Sarah Gardner, CEO of the Heat Initiative.
"Two weeks ago, I stood outside of Meta's office in NYC with bereaved parents whose children died as a result of sextortion, cyberbullying, and drug purchases on Meta's platforms and demanded stronger protections for kids," said Sarah Gardner, CEO of the Heat Initiative. "Meta's most recent 'solution' is a band-aid. They promised parents that Instagram Teens would protect their kids from harm. In reality, it still recommends sexual, racist, and violent content on their feeds. We are asking shareholders to hold Mark Zuckerberg and Meta accountable and demand greater transparency about why child safety is still lagging."
"Meta algorithms designed to maximize user engagement have helped build online abuser networks, normalize cyberbullying, enable the exponential growth of child sexual abuse materials, and flood young users with addictive content that damages their mental health," said Michael Passoff, CEO of Proxy Impact. "And now, a major child safety concern is Meta's doubling down on AI despite the unique threats it poses to young users. Just this year, the National Center for Missing and Exploited Children saw a 1,325% increase from 2023 in reports involving generative AI. Meta's continued failure to address these issues poses significant regulatory, legal, and reputational risk, in addition to the toll on innumerable young lives."
The resolution asks the Meta Board of Directors to publish "a report that includes targets and quantitative metrics appropriate to assessing whether and how Meta has improved its performance globally regarding child safety impacts and actual harm reduction to children on its platforms." It has been filed with the SEC.
Meta has faced pressure for years over online child safety risks, including:
- Lawsuits alleging that Meta Platforms has intentionally built programs with addictive features that harm young users.
- Harms to minors documented in Meta's own internal research.
- Allegations from members of Meta's SSI expert panel on suicide prevention and self-harm that Meta is willfully neglecting harmful content, disregarding expert recommendations, and prioritizing financial gain.
- Reports of widespread harm to children on its platforms in 2021; Meta took no action until executives were called for Senate testimony.
- Internal research leaked by Meta whistleblower Frances Haugen showing that the company knew its products harm young users' mental health, including thoughts of suicide and eating disorders.
Since 2019, Proxy Impact and Dr. Cooper have worked with pension funds, foundations, and asset managers to empower investors to use their leverage to encourage Meta and other tech companies to strengthen child safety measures on social media.
Proxy Impact provides shareholder engagement and proxy voting services that promote sustainable and responsible business practices. For more information, visit Proxy Impact's website.
The Heat Initiative works to hold the world's most valuable and powerful tech companies accountable for failing to protect kids from online child sexual exploitation. Heat Initiative sees a future where children's safety is at the forefront of any existing and future technological developments.
Contact: Sloane Perry
SOURCE Heat Initiative