MeitY seeks details on Facebook algorithm after Whistleblower allegations

Weeks after Facebook whistleblower Frances Haugen’s testimony, the Ministry of Electronics & Information Technology (MeitY) has written to Facebook India’s Managing Director Ajit Mohan seeking information on the processes the company follows to moderate content on its platform and prevent harm to online users, ET reported.

The government has asked for information on the algorithms that moderate content on Facebook’s network, and will investigate the issue further after receiving a response.

What information does the government want, and why?

People familiar with the development told ET that the revelations have “alarmed the government”. Of particular concern is the ‘test account’ whose feed displayed more images of dead people in three weeks than a researcher had seen in his entire life, along with the promotion of violent and provocative posts, especially anti-Muslim content.

You can read this special newsletter issue to understand the matter in detail.

The person said, “The government has to probe how their [Facebook’s] systems currently work and how they plan to reform or change them”.

Officials say the government can demand such information by exercising India’s sovereign power and invoking the due diligence requirements under the IT Rules.

Facebook has done very little to control inflammatory content

The report, titled “An Indian test user’s descent into a sea of polarizing, nationalist messages”, adds to past exposés showing how little Facebook has done to curb inciteful and inflammatory content, particularly in regional languages.

Read Further: Facebook Whistleblower testimony before the Senate – Key Takeaways

Mark Zuckerberg has denied Haugen’s allegations in a letter to Facebook employees. He said the company cares “deeply about issues like safety, well-being and mental health”. He added:

“Many of the claims don’t make any sense. If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place? If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space — even ones larger than us?”

Meanwhile, Nick Clegg, Facebook’s vice president of global affairs, has said that the company is willing to subject itself to greater oversight to ensure its algorithms are not harming users.


Pukhraj Biala

I am an undergraduate student at Symbiosis Law School, Noida, pursuing a B.A. LL.B. I am a problem solver who believes in reaching a conclusion by weighing all the options and identifying the best possible one. I find technology law fascinating and continue to follow and learn the subject.
