Notice of Exempt Solicitation
NAME OF REGISTRANT: Facebook, Inc.
NAME OF PERSONS RELYING ON EXEMPTION: Arjuna Capital
ADDRESS OF PERSON RELYING ON EXEMPTION: 1 Elm St., Manchester, MA 01944
WRITTEN MATERIALS: The attached written materials are submitted pursuant to Rule 14a-6(g)(1) (the “Rule”) promulgated under the Securities Exchange Act of 1934,* in connection with a proxy proposal to be voted on at the Registrant’s 2021 Annual Meeting. *Submission is not required of this filer under the terms of the Rule but is made voluntarily by the proponent in the interest of public disclosure and consideration of these important issues.
April 16, 2021
Dear Facebook Inc. Shareholders,
We are writing to urge you to VOTE “FOR” PROPOSAL 7 on the proxy card, which asks our Company’s Board of Directors to nominate at least one candidate with human and/or civil rights expertise. The proposal makes the following request:
RESOLVED: Shareholders request that Facebook’s Board of Directors nominate for the next Board election at least one candidate who:
• has a high level of human and/or civil rights expertise and experience and is widely recognized as such, as reasonably determined by Facebook’s Board, and
• will qualify as an independent director within the listing standards of the New York Stock Exchange.
We believe shareholders should vote “FOR” the Proposal for the following reasons:
1. Facebook’s track record on civil and human rights issues has been abysmal and demonstrates a repeated pattern of changing policies and practices only after substantial and highly visible public pressure from a variety of key stakeholders, including regulators and legislators, journalists, civil and human rights organizations, advertisers, litigants, and ordinary Facebook users.
2. Highly publicized changes in policies and practices are not consistently or adequately enforced. The result is often a double standard in which advocates for civil and human rights are censored while racists and conspiracy theorists persist on Facebook’s platform.
3. Addressing the many civil and human rights concerns at Facebook requires a board member who is capable of challenging a business model and culture that is resistant to change. While the company has appointed its first-ever Vice President of Civil Rights and adopted a Human Rights Policy in the past year, such changes are not sufficient to address board-level governance and business strategy concerns.
The Board’s Statement of Opposition:
In its statement of opposition to this proposal, the board contradicts itself: it argues on the one hand that a board-level human and civil rights expert could not address so broad a range of issues, while stressing on the other hand how broadly human and civil rights affect Facebook’s operations, how much work is underway, and how much more work is needed. The Company carefully sidesteps the question of who on its board of directors has responsibility for civil and human rights concerns and how any such concerns will be evaluated, suggesting that the board’s oversight might be conducted only on an annual basis. Moreover, the statement implies that a director with expertise in technology or financial matters could sufficiently oversee civil and human rights concerns, a claim the proponent believes is not supported by recent history.
Legal, Financial, and Reputational Risk:
The recent evidence cited below strongly suggests that Facebook’s governance model has not worked in the past and is not working today, exposing the Company to enormous legal, financial, and reputational risk. Alarm bells are sounding. The public record demonstrates a need for greater expertise and oversight on civil and human rights issues at the highest level of the company: its Board of Directors.
1. Discriminatory Advertising Practices: One of the most critical civil and human rights issues at Facebook stems from the company’s troublesome targeted advertising policies and practices. These have been highlighted for the company’s leadership in highly public fashion on numerous occasions in recent years.
a. Sex-based Discrimination: Most recently, an April 2021 study by independent researchers at the University of Southern California found that Facebook’s ad-delivery system shows different job ads to women and men even though the jobs require the same qualifications - for instance, showing pizza delivery ads to more men and grocery delivery ads to more women. The Wall Street Journal said the study “reflects Facebook’s difficulties in understanding and managing the societal effects of its content-recommendation systems.” MIT Technology Review observed: “This is considered sex-based discrimination under US equal employment opportunity law, which bans ad targeting based on protected characteristics. The findings come despite years of advocacy and lawsuits, and after promises from Facebook to overhaul how it delivers ads.”
Indeed, discrimination in its targeted advertising systems has been a problem at Facebook for many years, one the company has repeatedly assured shareholders was under control.
b. Race/Gender/Age Profiling: In 2019, for example, the U.S. Department of Housing and Urban Development sued Facebook over what it called biased ad targeting; civil rights organizations filed separate lawsuits. Facebook settled both lawsuits - calling the settlement with civil rights groups “historic” - and said it would stop allowing advertisers in key categories to show their messages only to people of a certain race, gender, or age group. Six months later, research by ProPublica found that the company’s new system to ensure diverse audiences for housing and employment ads had many of the same problems as its predecessor. It was not until August 2020 that Facebook quietly removed the “multicultural affinity” label, which enabled advertisers to target users by using proxies for race - and, even then, only after a media outlet contacted Facebook about a discriminatory job advertisement that appeared on its platform.
Despite that history, the patterns and harms of discriminatory advertising persist. In April 2021, Northeastern University researcher Piotr Sapieżyński, who has conducted three audits of Facebook’s platform, said that “Facebook still has yet to acknowledge that there is a problem.” And Christo Wilson, another researcher at Northeastern who studies algorithmic bias, agrees:
“How many times do researchers and journalists need to find these problems before we just accept that the whole ad-targeting system is bankrupt?”
2. Failed Content Moderation Policies and Practices: Facebook management has repeatedly told regulators and legislators, as well as the media and general public, that its content moderation policies adequately address civil and human rights concerns. Such assurances date back at least to the Cambridge Analytica scandal in 2018, but they persist today and have led to increasingly serious allegations about the integrity of Facebook’s representations regarding content moderation, which poses legal risk.
a. Anti-Muslim Content: An April 2021 lawsuit filed by Muslim Advocates, a social justice organization, accuses Facebook and its top executive of organizing a “coordinated campaign to convince the public, elected representatives, federal officials, and non-profit leaders in the nation’s capital that Facebook is a safe product—by misrepresenting that Facebook takes down or removes any content that violates Facebook’s Community Standards or other policies...If Facebook’s executives had enforced their own Community Standards and policies as they promised, a significant amount of the anti-Muslim hate and real world damage could have been avoided.”
In 2020, Muslim Advocates published a ground-breaking report, Complicit: The Human Cost of Facebook’s Disregard for Muslim Life, which documents Facebook’s instrumental role in anti-Muslim violence and threats in nine nations around the world, as well as Facebook’s support of anti-Muslim authoritarian regimes and its anti-Muslim senior staff. The report is the first comprehensive, worldwide review of Facebook’s role in enabling anti-Muslim hate.
The lawsuit, filed in a Washington, D.C. court, alleges: “Facebook has been used, among other things, to orchestrate the Rohingya genocide in Myanmar, mass murders of Muslims in India, and riots and murders in Sri Lanka that targeted Muslims for death. Anti-Muslim hate groups and hate speech run rampant on Facebook with anti-Muslim posts, ads, private groups, and other content. Armed, anti-Muslim protests in the United States have been organized on Facebook event pages. The Christchurch, New Zealand, mosque massacres were live-streamed on Facebook. The resulting video was shared via Facebook an untold number of times worldwide.”
3. Prioritizing PR over a Systemic Approach: An April 2021 investigation by the Guardian concluded that “Facebook has repeatedly allowed world leaders and politicians to use its platform to deceive the public or harass opponents despite being alerted to evidence of the wrongdoing.” Relying on “extensive internal documentation” and a former Facebook employee, the Guardian reported:
“The investigation shows how Facebook has allowed major abuses of its platform in poor, small and non-western countries in order to prioritize addressing abuses that attract media attention or affect the US and other wealthy countries. The company acted quickly to address political manipulation affecting countries such as the US, Taiwan, South Korea and Poland, while moving slowly or not at all on cases in Afghanistan, Iraq, Mongolia, Mexico, and much of Latin America.
“There is a lot of harm being done on Facebook that is not being responded to because it is not considered enough of a PR risk to Facebook,” said Sophie Zhang, a former data scientist at Facebook who worked within the company’s “integrity” organization to combat inauthentic behavior. “The cost isn’t borne by Facebook. It’s borne by the broader world as a whole.”
“Facebook pledged to combat state-backed political manipulation of its platform after the historic fiasco of the 2016 US election, when Russian agents used inauthentic Facebook accounts to deceive and divide American voters.
“But the company has repeatedly failed to take timely action when presented with evidence of rampant manipulation and abuse of its tools by political leaders around the world.”
4. Enabling White Supremacist Violence: Facebook facilitated deadly white supremacist violence in Kenosha, Wisconsin, in August 2020, when violent, racist posts went unchecked until after two protesters were murdered, despite the posts clearly violating Facebook’s “Dangerous Individuals and Organizations” policy. CEO Mark Zuckerberg acknowledged in a company video post that Facebook had erred, calling it a “largely operational mistake,” despite the fact that the post was reported 455 times and ignored by Facebook’s content moderators. The company was subsequently sued by four individuals, including the partner of one of the civil rights protesters who was shot and killed. The lawsuit alleges that Facebook was negligent in failing to remove posts calling on local militia members to take up arms.
5. Perpetuating Racial Bias: In July 2020, NBC News reported that a year earlier, researchers at Facebook had found that users on the Facebook-owned Instagram in the United States whose activity on the app suggested they were Black were about 50 percent more likely under new rules to have their accounts automatically disabled by the moderation system than those whose activity indicated they were white, according to two current employees and one former employee. “The researchers took their findings to their superiors, expecting that it would prompt managers to quash the changes,” NBC News reported. “Instead, they were told not [to] share their findings with co-workers or conduct any further ‘research into racial bias in Instagram’s automated account removal system.’” Facebook did not deny that some researchers were told to stop exploring racial bias but said that it was because the methodology used was flawed. Alex Schultz, Facebook’s vice president of growth and analytics, said research and analyses on race are important to Facebook but [it] is a “very charged topic” and so needs to be done in a rigorous, standardized way across the company.
6. Inconsistent Approach to Civil Rights: Independent experts hired by Facebook to conduct a civil rights audit of the company concluded in July 2020 that in “numerous legal filings, Facebook attempts to place itself beyond the reach of civil rights laws.” “What the Auditors have experienced is a very inconsistent approach to civil rights,” the report said. “Facebook must establish clarity about the company’s obligations to the spirit and the letter of civil rights laws.” In discussing the civil rights audit, Rashad Robinson, CEO of Color of Change, a national civil rights organization, said: “What we get is recommendations that they end up not implementing.”
As the New York Times reported:
The civil rights audit found that Facebook had not done enough to protect people on the platform from discriminatory posts and ads and that its decisions to leave up President Trump’s inflammatory posts were “significant setbacks for civil rights.”
“Many in the civil rights community have become disheartened, frustrated and angry after years of engagement where they implored the company to do more to advance equality and fight discrimination, while also safeguarding free expression,” wrote the auditors, Laura W. Murphy and Megan Cacace, who are civil rights experts and lawyers.
7. Advertiser Boycott: In July 2020, civil rights groups organized a #StopHateForProfit campaign and urged advertisers to stop paying for ads on Facebook to protest the platform’s handling of hate speech and misinformation. More than 1,000 advertisers joined in the boycott, reducing their spending by millions of dollars. Following a meeting with Facebook executives to discuss the boycott and their concerns, civil rights leaders said the company’s CEO and other executives had reverted to “spin” and PR tactics rather than address substantive issues. “Facebook approached our meeting today like it was nothing more than a PR exercise,” said Jessica González, co-chief executive officer of Free Press, a nonprofit media advocacy group.
8. Lack of Confidence in Facebook’s Future Leadership on Civil and Human Rights:
Access Now, a tech policy organization that addresses tech issues on a global scale, said Facebook’s newly-adopted Human Rights Policy “raises more questions than answers around how Facebook will more effectively integrate human rights considerations into its day-to-day operations and strategic decision making.” Access Now said the “ultimate success” of the policy “will depend greatly on the extent to which Facebook’s leadership prioritizes keeping human rights at the forefront of every area of the company’s work, and on the human rights team inside of Facebook — alongside external experts — being given additional resources, top-level support, and a more central role in everything from product development to safety and security to sales and beyond.” Facebook could start that process, according to Access Now, by “heeding its own shareholders’ call to appoint a member to the Board of Directors with a high level of human and/or civil rights expertise.”
An Amnesty International report titled Surveillance Giants: How the Business Model of Google and Facebook Threatens Human Rights questions whether Facebook’s efforts to respect human rights would even be possible without changes to the business model, while also suggesting that both companies’ existing rights-respecting efforts through the Global Network Initiative may not be enough, stating:
“In line with international human rights standards, Google and Facebook should be carrying out due diligence to identify and address the potential and actual impacts of their business model on specific human rights... However, the fact that the harvesting, analysis and monetisation of data is so core to their business model...means that the companies should also be assessing whether their surveillance-based business model can ever be compatible with their responsibility to respect human rights. Google and Facebook have both made a longstanding commitment to the rights to privacy and freedom of expression through participation in the Global Network Initiative (GNI). However, the scope of the GNI means it does not address risks to other rights beyond freedom of expression and privacy; it is also primarily focused on how companies respond to government requests for data.”
Conclusion
For all the reasons provided above, we strongly urge you to support the Proposal. To say that Facebook has adequately addressed human and civil rights is a gross overstatement. Rightly, the company is scrambling to build some internal scaffolding to address the racism, sexism, hate, and violence perpetuated on its platforms. But Facebook continues to ignore the root cause: a business model that is fueled by clickbait and user profiling. It is the board’s responsibility to steward business strategy and address this root-cause issue. Please contact Natasha Lamb at natasha@arjuna-capital.com for additional information.
Sincerely,
Natasha Lamb
Arjuna Capital
This is not a solicitation of authority to vote your proxy. Please DO NOT send us your proxy card. Arjuna Capital is not able to vote your proxies, nor does this communication contemplate such an event. The proponent urges shareholders to vote for Proxy Item 7 following the instructions provided in management’s proxy mailing.
The views expressed are those of the authors and Arjuna Capital as of the date referenced and are subject to change at any time based on market or other conditions. These views are not intended to be a forecast of future events or a guarantee of future results. These views may not be relied upon as investment advice. The information provided in this material should not be considered a recommendation to buy or sell any of the securities mentioned. It should not be assumed that investments in such securities have been or will be profitable. This piece is for informational purposes and should not be construed as a research report.