Why didn’t Facebook crack down on the Border Patrol group?

The company has not explained how thousands of former and current border patrol officials were able to post hateful content.

A US Border Patrol patch on a border agent's uniform in McAllen, Texas, on January 15, 2019. (Photo credit: SUZANNE CORDEIRO/AFP/Getty Images)

Revelations that thousands of federal law enforcement officers were part of a Facebook group that mocked the very people they are sworn to help have turned the spotlight on glaring flaws in Facebook’s hate speech policy.

On Monday, ProPublica revealed that nearly 10,000 former or current Border Patrol agents were part of a secret Facebook group that regularly shared derogatory, misogynistic and racist content. In one instance, members responded with amusement or indifference to a story posted about a 16-year-old Guatemalan migrant who died in Border Patrol custody.

In another interaction, a member of the group encouraged agents to hurl a “burrito at these bitches,” referring to Reps. Alexandria Ocasio-Cortez (D-NY) and Veronica Escobar (D-TX), who at the time were visiting a border facility near El Paso.

Customs and Border Protection (CBP) officials said this week that they were investigating the incident. “We take all the posts that were put out today very seriously. These do not represent the thoughts of the men and women of the US Border Patrol,” Brian Hastings, US Border Patrol chief of operations, told CNN Monday. “Each one of these allegations will be thoroughly investigated.”


The incident is hardly an isolated one. Last month, Reveal reported that hundreds of retired and active-duty law-enforcement officers were sharing racist, misogynistic and Islamophobic content across dozens of Facebook groups. Some of the content was sympathetic toward “Patriot” groups like the Oath Keepers and Three Percenters, the latter of which have vowed to protect a Republican state senator in Oregon who threatened to shoot state police.

Numerous law enforcement agencies have begun investigations into their officers’ posts on these Facebook groups. One of these, the Philadelphia Police Department, recently placed 72 officers on desk duty amid revelations that they took part in the groups.

On Tuesday, Facebook said it was cooperating with investigators looking into the Border Patrol group. “We want everyone using Facebook to feel safe. Our Community Standards apply across Facebook, including secret Groups,” a spokesperson told ThinkProgress. “We’re cooperating with federal authorities in their investigation.”

Pressed on how both that group and the other law enforcement groups managed to sail under the radar of the social networking site, the Facebook spokesperson declined to comment, citing the ongoing federal investigation.

Such incidents provide stark examples of the cracks in Facebook’s policies surrounding hate speech. The social media giant has frequently been eager to point out how serious it is about tackling violent and hateful content. For instance, after a far-right terror attack in Christchurch, New Zealand, in March, Facebook announced it would ban all posts praising and supporting white nationalism.


On Sunday, Facebook’s Chief Operating Officer Sheryl Sandberg also announced that the company would organize a civil rights task force to better tackle misinformation and content policy.

But Facebook’s success at moderating offensive content — of which the Border Patrol group is just the latest example — lags far behind the sunny image it is eager to present to the public. The same day that Sandberg announced the civil rights task force, Facebook released its Civil Rights Audit, which showed the company’s continued failures in content moderation. Civil rights advocates who reviewed the audit, including some who participated in it, were highly critical, saying current plans to combat hate on the platform were not nearly comprehensive enough.

“We cannot move forward to protect targeted groups harmed by activity unless we have both an unvarnished look at the cesspools of hate and misinformation growing and spreading on Facebook with the company’s detailed plan for action to be taken on an urgent timeline,” Keegan Hankes, interim research director of the Southern Poverty Law Center, said about the audit. “This update provided the public with neither.”

“Facebook remains turtle-slow to change. They need to move now to build a diverse team of experts with real authority to oversee ending hate on their platform to get it moving,” added Henry Fernandez, senior fellow at the Center for American Progress. (ThinkProgress is an editorially independent newsroom housed within the Center for American Progress.)

“Relying primarily on monthly meetings of executives and a couple of outside consultants with civil rights expertise is a step forward, but insufficient,” he added.