Meta shuts down its content moderation hub in East Africa

In a move that will affect approximately 200 employees, Meta, the parent company of Facebook, WhatsApp and Instagram, is shutting down its East African content moderation hub as its contractor there moves away from policing harmful content.

The decision comes as Meta's third-party contractor, Sama, announced that it will focus solely on data labelling and computer vision data annotation, work that includes positioning animations, such as bunny ears, in augmented reality filters.

Meta first contracted Sama in 2017 to assist with data labelling and AI training, and Sama ultimately hired around 1,500 employees for the work. However, within two years, the Nairobi office was moderating some of the most graphic and harmful material on Meta's platforms, including beheadings and child abuse.

Meta, which works with more than 15,000 content moderators worldwide, has stated that it has a new partner in place and that its moderation capabilities will remain the same.

The news comes two months after Meta announced it would cut its global headcount by 13%, or around 11,000 employees, as the company contends with falling revenue, a slump in digital advertising and fierce competition from rivals, including TikTok.