
The role of Big Tech in upholding election integrity in Africa as countries head to the polls


A dozen countries in Africa, including Nigeria, the continent’s biggest economy and democracy, are expected to hold presidential elections next year, and questions linger over how well social media platforms are prepared to curb misinformation and disinformation after claims that Big Tech botched content moderation during Kenya’s polls last August.

25 January 2021, Berlin: The logos of social media platforms WhatsApp (l-r), Twitter, TikTok, Microsoft Teams, Clubhouse, Facebook, Instagram, Slack and Telegram are seen on an iPhone 12 Pro Max. Photo: Christoph Dernbach/picture alliance via Getty Images

Concerns are mounting as it emerges that Twitter has scaled back content moderation since Elon Musk took over and laid off more than half of its employees, nearly clearing out the entire Africa team, a decision that also left outsourced moderators out of jobs. With very limited support to filter or stop the spread of propaganda, Africa will likely be a casualty of Twitter’s often erratic or slow response to falsehoods, which catalyze violence in times of political polarization.

But this is not unique to Twitter; widely used platforms like Facebook, TikTok, WhatsApp and YouTube have also been faulted for doing little to stop misinformation and disinformation in Africa.


In Nigeria, for instance, sitting president Muhammadu Buhari has voiced concerns over how disinformation and misinformation on social media are fanning conflict, insecurity and distrust in the government in the lead-up to the February elections, even as the country’s struggling economy adds to a sense of instability. Yet, as momentum builds for one of the country’s most hotly contested elections, activists, researchers and some citizens are apprehensive about the mounting spread of negative campaigning.

Researchers anticipate that hateful content and falsehoods, meant to stir confusion or sway voters in Nigeria, will continue to be shared online. They are urging tech companies to hire and train local experts with knowledge of local languages and context to intercept misleading, violent or intimidating posts that could undermine election integrity.

“Social-media platforms especially – Twitter, Meta (Facebook), YouTube, WhatsApp and Telegram – should step up efforts to identify and deal with election-related misinformation, disinformation and conspiracies as well as intercepting violent or intimidating messages,” said Audu Bulama Bukarti, a senior fellow at the Tony Blair Institute for Global Change, in a report on security risks in Nigeria published a fortnight ago.

Nigeria’s youthful and tech-savvy population is Africa’s most active on social media. The calls for platforms to step up content moderation, while not new, follow increased use of social sites driven by growing smartphone and internet penetration.

“The reach and influence of social media have grown ever larger in the years since the 2019 election. It will play a pivotal role in the 2023 election, in terms of positive political communication and in terms of its ability to spread misinformation and disinformation,” said Bukarti.

In Nigeria, Meta claims to have invested in people, including content moderators, and in technology to stem the misuse of its platforms ahead of the elections. The social media giant is also taking the same measures it took before and during Kenya’s elections, which included verifying the identities of people posting political ads. But Mozilla tech and society fellow Odanga Madung is not convinced that Facebook and other social sites are well enough prepared.

“Social media platforms are still not completely ready to deal with election environments, especially because they’ve had massive layoffs that have greatly affected how they work within several of the areas where these elections will be held,” said Madung.

“And quite frankly, they have consistently failed to address the key aspects that make an election environment a dangerous information environment in the first place, where things are neither true nor false and information tends to get weaponized quite a bit. Election environments are incredibly low-trust environments. I do not think they’re going to actually succeed on this.”

Away from Nigeria, a pivotal moment is also approaching for social media platforms and fragile nations such as Sudan, South Sudan, DR Congo, Libya and Mali, most of which have blocked social media access in the recent past to quell protests against their governments, as they head to the polls next year.


Bungled labeling and moderation

Social sites like Facebook, Twitter and TikTok recently came under heavy scrutiny over their role in undermining election integrity in Kenya. A Mozilla Foundation report claims that content labeling failed to stop misinformation, while platforms such as Facebook profiteered from political advertising that served to amplify propaganda.

Twitter’s and TikTok’s spotty labeling of posts that called the elections ahead of the official announcement made the platforms seem partisan and failed to stop the spread of falsehoods, despite their partnerships with fact-checking organizations.

Facebook, the leading social media platform in Africa, fell short on this front by not applying “any visible labels” during the elections, allowing propaganda to spread, such as claims of the kidnapping and arrest of a prominent politician that had already been debunked by local media houses. Facebook only labeled the original post months later.

Sluggish responses to falsehoods by Facebook are now at the center of a lawsuit filed last week claiming that Meta is fueling violence and hate in eastern and southern Africa.

Abrham Meareg, one of the petitioners, whose father, Professor Meareg Amare, was killed during the Tigray War after Facebook posts doxed him and called for violence against him, says that Facebook failed, despite multiple requests, to take down the posts that put his father’s life in danger. He said one post was taken down only recently, a year after his father’s murder. More than 600,000 Ethiopians were killed during the two-year war, which began in 2020.

The case claims that Facebook’s algorithm fuels viral hate and violence, and that content moderation in Africa is bungled because moderators lack the local knowledge needed to moderate content posted in local languages.

“Many of them (platforms) lack context and they are always going to fall short in terms of the promises they make to their users because, again, a lie is able to move very fast across platforms before they are able to get ahold of it,” said Madung.

Whistleblower Frances Haugen previously accused Facebook of “literally fanning ethnic violence” in Ethiopia, and a recent Global Witness investigation also noted that the social site was “extremely poor at detecting hate speech in the main language of Ethiopia.”

“Something is wrong with the way Facebook moderates content, and … there is a lack of investment in content moderation, especially for African countries. When you compare to other regions, we are getting the second-rate treatment. And what’s the effect? We are seeing a catalyst for civic unrest, civil war coming from normal interactions; viral posts that make fun of people and then escalate to inciteful posts that my client is proof do end up causing violence in real life,” said Meareg’s lawyer, Mercy Mutemi.

Meanwhile, social media remains central to the spread of political propaganda and the dilution of important investigations into economic and social corruption. Last year, the former Kenyan president, Uhuru Kenyatta, was named in the Pandora Papers, a leak of files detailing the hidden wealth of a number of global leaders, celebrities and billionaires in offshore havens. Researchers noticed, however, that two hashtags, #offshoreaccountfacts and #phonyleaks, soared to the top of trending topics and drowned out organic discussion on Twitter in Kenya, undermining the findings of the investigation.

Foreign-sponsored campaigns with political objectives have also affected more than three-quarters of the countries in Africa as “disinformation campaigns become increasingly sophisticated in camouflaging their origins by outsourcing posting operations.”

According to an Africa Center for Strategic Studies report published in April this year, Russian-sponsored disinformation campaigns by the Wagner Group mercenary force, promoting the Kremlin’s interests on the continent, have affected more than 16 countries in Africa.

Samuel Musila
https://techknow.africa
Passionate software developer and tech content creator from Nairobi, Kenya
