Investors Raise Concerns to Meta Regarding Child Safety on Social Media Platforms
Meta shareholders, led by Lisette Cooper, PhD, from Fiduciary Trust International and Proxy Impact, have filed a resolution for Meta's annual meeting on May 29, 2024. The proposal urges Meta's Board to set targets within a year to mitigate child safety risks on its platforms and to publish an annual report on progress. Supported by Institutional Shareholder Services (ISS) and Glass Lewis, the resolution aims to address dangers such as cyberbullying, grooming, and exposure to harmful content. Cooper highlights the long-term benefits for investor security, emphasizing Meta's responsibility amid rising legal and regulatory pressures globally.
- Resolution aims to protect children on Meta platforms and improve long-term financial stability.
- Institutional Shareholder Services (ISS) and Glass Lewis recommend voting in favor of the resolution.
- Potential for improved public perception and trust in Meta’s commitment to user safety.
- Alignment with new legislation in the U.S., U.K., and E.U. could mitigate future financial and legal penalties.
- Focus on setting clear targets and performance metrics for evaluating child safety improvements.
- Meta faces significant legal and regulatory challenges globally regarding child safety issues.
- Social media platforms linked to widespread child safety concerns, including exploitation and mental health risks.
- End-to-end encryption on Facebook Messenger may reduce the visibility of child abuse reports.
- Meta fined over $400 million by Ireland’s Data Protection Commission for child safety breaches.
- Potential for continued lawsuits and penalties if child safety measures are deemed insufficient.
Insights
Meta's current situation regarding child safety presents a multifaceted challenge that deserves detailed scrutiny. Instituting child safety measures could safeguard Meta against regulatory and legal repercussions. However, from a business perspective, it is important to note that implementing these measures may initially incur high financial costs due to the necessary changes in technology, policies, and staffing. In the short term, these expenses could affect Meta's operating margins.
Furthermore, increasing scrutiny and legal obligations could influence advertisers' perceptions and impact advertising revenue, which forms a significant part of Meta's income. On the flip side, successfully addressing child safety concerns might significantly enhance Meta's reputation, leading to long-term benefits. Enhanced user trust could foster greater user engagement and loyalty, thereby potentially increasing the user base and advertising revenues over the long haul.
Finally, the involvement of multiple regulatory bodies across different regions means that Meta must navigate a complex legal landscape. Failure to comply with these regulations could result in hefty fines and further damage to the company's reputation. In contrast, proactive compliance could alleviate some regulatory pressures and position Meta as a responsible leader in the tech industry.
Analyzing the legal ramifications of Meta's current situation requires understanding multiple layers of jurisdictional oversight. The new legislation in the U.S., U.K., and the E.U. imposes stringent requirements on tech companies to monitor and remove child sexual abuse material. Failing to do so could lead to substantial fines, further litigation, and lasting reputational damage.
The proactive measures called for in the shareholder resolution align with these legal requirements and can serve as a compliance strategy to mitigate legal risks. Adopting the proposed targets and metrics could help Meta demonstrate its commitment to regulatory compliance, potentially reducing the likelihood of future lawsuits and penalties. However, the implementation of these measures must be diligent and transparent to withstand legal scrutiny.
Moreover, shifting towards more transparent practices, as suggested by the resolution, can have the added benefit of reducing the incidence of reactive, crisis-driven legal responses, which often result in higher financial and reputational costs. In the long term, consistent compliance and proactive measures will be important for maintaining operational stability and shareholder confidence.
From a financial standpoint, the proposed resolution could have mixed implications for Meta's short-term and long-term performance. In the immediate future, the financial burden of implementing new safety measures and the potential for increased operational costs due to regulatory compliance cannot be ignored. This could lead to a squeeze on profit margins and may have a negative impact on quarterly earnings, which are closely watched by investors.
However, in the longer term, improving child safety on Meta’s platforms could yield substantial benefits. Enhanced public perception and investor confidence can translate into stable user growth and potentially more resilient advertising revenue. Furthermore, reducing legal and regulatory risks could prevent costly penalties and litigations, preserving Meta's financial health over time.
Overall, while the short-term financial outlook may face some pressures due to increased spending, the long-term view presents a potential for stronger, more sustainable growth driven by improved trust and compliance. Investors should weigh these factors carefully, considering both immediate financial performance and the broader impacts on Meta’s market position and reputation.
Meta Shareholders Represented by Proxy Impact—Including Lead Filer Lisette Cooper, PhD, Vice Chair of Fiduciary Trust International—Seek to Protect Children and the Long-Term Financial Performance of Meta
The proposal, filed on behalf of Dr. Cooper and other Meta shareholders by Proxy Impact, calls on Meta’s Board of Directors to, within one year, adopt targets for reducing dangers and threats to children on its global social media platforms, as well as quantitative metrics for assessing the company’s improvement in this area. The resolution also calls for Meta’s Board of Directors to ensure these targets and performance metrics are published in an annual report, enabling investors and stakeholders to judge how effective Meta’s tools, policies, and actions for protecting children have been.
Institutional Shareholder Services (ISS) and Glass Lewis, the two largest proxy advisory services, both recommend voting for the resolution.
“Meta is the largest social media company in the world, with billions of users, but its platforms—including Facebook, Instagram, Messenger, and WhatsApp—have been shown to pose a variety of physical and psychological risks to children and teens,” said Lisette Cooper, PhD, vice chair of Fiduciary Trust International. “As a parent, and an investor, with a deep personal connection to this issue, I support this shareholder resolution as a meaningful step to encourage Meta’s leadership to do more to protect the young people who use its platforms—which we believe will also protect the long-term security of shareholders’ investments.”
Meta’s social media platforms have been linked to many dangers to the physical and mental wellbeing of children and teenagers. These range from sextortion, grooming, and human trafficking to cyberbullying, harassment, exposure to sexual or violent content, depression, anxiety, self-harm, and self-image distortion.
- The National Center for Missing and Exploited Children reported that its CyberTipline received nearly 36 million reports of online exploitation of children in 2023, including child sexual abuse material, child sex trafficking, and online enticement—and almost 31 million of them came from Meta platforms.
- A Wall Street Journal investigation published in June 2023 found that Meta’s algorithms for Instagram guide pedophiles to sellers of child sexual abuse materials, essentially “connecting a vast pedophile network.”
- Meta has also begun end-to-end encryption of Facebook Messenger, despite warnings from law enforcement and child safety organizations that doing so will hide millions of reports of child sexual abuse materials—masking the actions of predators, and making children more vulnerable.
- In the wake of the U.S. Surgeon General’s Advisory on social media and youth mental health, 42 U.S. state attorneys general have filed lawsuits against Meta, claiming Facebook and Instagram algorithms are designed to intentionally make the platforms addictive, and that they harm young people’s mental health.
- In September 2022, Meta was fined €405 million, or just over $400 million, by Ireland’s Data Protection Commission for not safeguarding children’s information on Instagram.
“The Internet is like the Wild West for children and teens. Meta and other social media companies need to do more to prevent their technology from being weaponized against their youngest users,” said Michael Passoff, chief executive officer of Proxy Impact. “The more tech companies try to evade responsibility for the harm caused by algorithms designed to maximize user engagement, the more the world is fighting back. Shareholders in Meta and other social media companies can make an enormous difference by raising their voices against business practices that treat children as collateral damage.”
If Meta does not sufficiently address child safety issues, it faces potential financial, regulatory, and legal penalties under new legislation in the U.S., U.K., and E.U.:
- The E.U.’s Digital Services Act and Digital Markets Act, which went into effect in February 2024, will require companies like Meta to identify, report, and remove child sexual abuse materials.
- The U.K.’s Online Safety Act of 2023 includes measures to keep children and other online users safe from harmful and fraudulent content.
- In the U.S., the REPORT Act was signed into law on May 7, 2024. The legislation will strengthen the capabilities of the National Center for Missing and Exploited Children’s national tipline to collect reports of online exploitation, and require the reports and evidence to be preserved for a longer period—thereby giving law enforcement more time to investigate and prosecute.
Dr. Cooper’s daughter Sarah is a founding member of the Brave Movement and has been deeply involved in the Heat Initiative’s campaign as a survivor/lived experience expert. She is a survivor of child sexual abuse by an older man who misrepresented himself on Facebook Messenger. Sarah Cooper has spoken at two of Meta’s previous annual meetings.
Dr. Cooper is also a member of the Interfaith Center on Corporate Responsibility’s working group on child safety and technology. Since 2019, Proxy Impact and Dr. Cooper have worked with members of the Interfaith Center on Corporate Responsibility to empower investors to utilize their leverage to encourage Meta and other tech companies to strengthen child safety measures on social media.
About Fiduciary Trust International
Fiduciary Trust International is a global wealth management firm and part of Franklin Templeton.
About Franklin Templeton
Franklin Resources, Inc. [NYSE: BEN] is a global investment management organization with subsidiaries operating as Franklin Templeton and serving clients in over 150 countries. Franklin Templeton’s mission is to help clients achieve better outcomes through investment management expertise, wealth management and technology solutions. Through its specialist investment managers, the company offers specialization on a global scale, bringing extensive capabilities in fixed income, equity, alternatives and multi-asset solutions. The company has more than 1,500 investment professionals and offices in major financial markets around the world.
About Proxy Impact
Proxy Impact provides shareholder engagement and proxy voting services that promote sustainable and responsible business practices. For more information, visit www.proxyimpact.com.
Copyright © 2024 Fiduciary Trust International. All rights reserved.
View source version on businesswire.com: https://www.businesswire.com/news/home/20240521870308/en/
Rebecca Radosevich: 212-632-3207
rebecca.radosevich@franklintempleton.com
Sabrina Scarpa: 973-309-0051
ft@jconnelly.com
Source: Fiduciary Trust International