Human rights groups say Facebook is stifling an independent report it commissioned to investigate hate speech on its services in India, the company’s largest market by users and one where scrutiny of its operations is increasing.

Representatives for the organizations say they have provided extensive input to a U.S. law firm that Facebook commissioned in mid-2020 to undertake the report. The groups say they supplied hundreds of examples of inflammatory content and suggested ways Facebook could better police its services in India.

Facebook executives from the company’s human rights team, which is overseeing the law firm’s effort, have since narrowed the draft report’s scope and are delaying a process that has already taken more than a year, the groups say.

“They are trying to kill it,” said Ratik Asokan of India Civil Watch International, one of the organizations that provided the law firm with input. Mr. Asokan said that Facebook has raised technical objections through the law firm that have caused delays, such as changing definitions of what can be considered hate speech and included in the report, undermining what Facebook said would be an independent study. The law firm hasn’t provided a timeline for completing it, he said.

Much of the material his group flagged as dangerous, such as posts comparing Muslims to locusts and calls for Hindus to gather weapons in preparation for violence against Muslims, hasn’t been removed by Facebook even though it violates the company’s rules, he said.

“With a complex project like this, the goal is to be thorough, not to meet an arbitrary deadline,” said Andy Stone, a spokesman for Facebook, a unit of Meta Platforms Inc., referring to the timing of the report. “We look forward to our independent assessor, Foley Hoag, completing their India assessment,” he said, referring to the law firm Facebook has commissioned to write the report.

Foley Hoag is running the process, and Facebook doesn’t know which groups were contacted and isn’t in touch with them, Mr. Stone said. Facebook has removed material that violates its rules that the groups flagged to Foley Hoag, he said.

“Our work still is ongoing,” said Gare Smith, partner and chair of Foley Hoag’s global business and human rights practice. The firm is committed to its thoroughness and has taken “numerous steps to ensure the assessment is conducted fairly and independently,” he said. “Although the pandemic has considerably slowed” the firm’s work, “there has never been a specific date set for its completion,” he said.

Facebook has faced sustained criticism from rights groups for failing to police its platform in India, where it has more than 300 million Facebook users and more than 400 million people on its WhatsApp messaging service.


As part of its Facebook Files project, The Wall Street Journal separately reported last month that the company’s researchers have found that its products in India are awash with inflammatory content that one report linked to deadly religious riots, according to internal documents. Facebook said at the time that it has invested significantly in technology to find hate speech in India and is improving its enforcement.

The Journal reported in August 2020 that Facebook’s then top public-policy executive in India had opposed applying Facebook’s hate-speech rules to a Hindu nationalist politician and others. The executive didn’t respond to requests for comment at the time and later departed the company. A group of civil-society organizations citing the Journal’s reporting published an open letter urging Facebook to “address dangerous content in India.”

Nick Clegg, Facebook’s vice president of global affairs, said in a Sept. 30, 2020, letter responding to the groups that the company prohibits hate speech and that it had “several months ago” commissioned Foley Hoag to undertake an “independent human rights impact assessment” on Facebook in India.

“That work has already started, and it will continue,” Mr. Clegg wrote in the letter, which was reviewed by the Journal. He said Facebook would brief stakeholders on “the ways in which we are using the findings of our HRIA to improve our work” and looked forward to sharing more “as our work unfolds.”

Mr. Clegg said in the letter that Foley Hoag would have “complete independence” in determining its methods and groups to consult but that Facebook would suggest the law firm incorporate feedback from individual Facebook users and vulnerable groups.

Ritumbra Manuvie, who co-founded Stichting London Story, an Indian-diaspora-led group based in The Hague that studies disinformation and hate speech, said her organization has provided Foley Hoag with numerous pieces of content that she says violate Facebook’s rules.

But much of the material remains on Facebook, she said, such as a video that has received an estimated 40 million views in which a Hindu speaker tells an audience that Muslims and Islam should be exterminated. Facebook’s “lack of oversight” of content on its services has “normalized dehumanization and hate speech against Indian Muslims,” Ms. Manuvie said.

Foley Hoag pushed her group to demonstrate that specific content on Facebook’s platforms has caused harm, a higher bar than human rights impact assessments typically require and one that isn’t in the spirit of an independent report, she said. Mr. Stone said Facebook hasn’t given such guidance.

The law firm has also challenged her group on what it flagged as hate speech, asking first whether it had reported it through Facebook’s official reporting tools, and then later whether it had done so within specific time frames. “There is a lot of moving of the goal posts,” Ms. Manuvie said.

Mr. Smith, of Foley Hoag, said the firm can’t comment on specific elements of the report or its exchanges with Facebook, citing attorney-client privilege.

Dia Kayyali, a technology-policy researcher who worked for one of the human-rights groups that wrote to Facebook last year, said Facebook hasn’t fulfilled its promise in Mr. Clegg’s letter to keep research groups apprised of the status of the report and how Facebook might be acting on it.

Mr. Stone said Mr. Clegg was referring to Facebook doing so once the report was completed.

In recent years, Facebook has released executive summaries of human rights impact assessments it commissioned on its operations in Indonesia, Sri Lanka and Cambodia. It released the full version of one it commissioned on Myanmar. In each instance, it said the consultants it engaged completed their work in less than one year.

Following the Journal’s Facebook Files report last month, India’s Ministry of Electronics and Information Technology wrote to Facebook’s top executive in India to ask for details on how the company monitors and removes inflammatory content on its platform, according to government officials.

Write to Newley Purnell at newley.purnell@wsj.com