Between 2020 and 2023, Facebook introduced a series of feed-ranking adjustments designed to reduce the visibility of political content and the spread of hostile discourse. While these algorithmic interventions were presented as efforts to improve user well-being and protect electoral integrity (Meta, 2021), their effects in Nigeria during the 2023 general elections were decidedly mixed. Drawing on computational text analysis of over two million election-related Facebook posts, a two-wave panel survey of 2,400 respondents, and focus group discussions across four regions, this study finds that visible incivility on Facebook declined by 43 percent. Hostile speech was not eliminated, however; much of it migrated to encrypted or closed messaging platforms such as WhatsApp and Telegram, echoing global concerns about the “displacement effect” of content moderation (Bradshaw & Howard, 2018; Howard, 2020). Alarmingly, the survey evidence reveals that exposure to down-ranked feeds was associated with declining trust in the Independent National Electoral Commission (INEC) and increased belief in electoral conspiracy theories, consistent with prior findings on the link between algorithmic opacity and institutional mistrust (Orji, 2023; Tucker et al., 2018). Looking ahead to the 2027 elections, the study models three scenarios: maintaining the status quo, stricter throttling of political content, and a co-governance framework involving INEC, civil society, and digital rights actors. The evidence suggests that only the co-governance approach can balance the reduction of incivility with the preservation of electoral trust. The Nigerian case underscores the need for democratic legitimacy in platform governance, showing that algorithmic depoliticisation without local oversight may undermine rather than strengthen democratic integrity.