Instagram corrects algorithm after Palestinian censorship allegations

Instagram is changing its app to show more viral and current-affairs content, after complaints from its own staff that users were not shown pro-Palestinian posts during the recent conflict in Gaza.

Until now, the social media app favoured original content in the "Stories" displayed above a user's feed over content that had been reshared or reposted by other people.

Instagram will now rank original and reshared content equally, according to internal staff messages, in a move that will help news posts reach a wider audience.

According to a spokesman, the number of users resharing posts about the recent conflict in Gaza had surged, but the way the app currently ranked content "had a bigger impact than expected" on how many people saw those posts.

"Stories that reshare posts aren't reaching the people who expect to see them, and that's not a good experience," he said. "Over time, we will move to weight reshared stories equally with originally produced ones."

Instagram said the move was not solely a response to the concerns about pro-Palestinian content, and had been under discussion for some time.

The spokesman said the algorithm "made people believe that we were suppressing stories on certain topics or points of view", but added: "This applies to any post that is reshared in Stories, no matter what it is about."

A group of about 50 employees at Facebook, which owns Instagram, has raised concerns that pro-Palestinian voices are being suppressed.

One employee said the group had filed more than 80 internal complaints about content being censored by the company's automated moderation systems. BuzzFeed earlier reported the group's existence.

Facebook's algorithms had flagged words commonly used by Palestinian users, such as "martyr" and "resistance", as incitement to violence, according to media reports.

The employee told the Financial Times that they did not believe Facebook was deliberately censoring content, but said that "moderation at scale is biased against marginalised groups" and had led to over-removal.

Facebook said: "We know there have been several issues that have impacted people's ability to share on our apps. We're sorry to anyone who felt they couldn't bring attention to important events, or who believed this was a deliberate suppression of their voice. This was never our intention, nor do we ever want to silence a particular community or point of view."
