Opinions of Thursday, 14 August 2025

Columnist: Dr. Florence Odarkor Entsua-Mensah

From Data to Influence: How social media now decides what we 'know'

Two days before Ghana’s hotly contested election of December 7, 2024, a crystal-clear video surfaced on Facebook. In it, a presidential candidate appeared to admit that he would repeal the free SHS policy.

The video quickly went viral through TikTok duets, WhatsApp groups, and even radio talk shows. According to reporting by Citi Newsroom in January 2025, the Ghana Fact-Checking Coalition had flagged it as an AI-generated deepfake, just one of more than 100 false publications the coalition logged in a single week. Fully 85% of those fabrications travelled through social media first.

That incident captures the new physics of knowledge in 2025: a world where algorithms, virality, and cheap artificial-intelligence tools can outrun traditional gatekeepers and rewrite public perception before breakfast.

According to DataReportal’s “Digital 2025: Ghana” report, 7.95 million Ghanaians, nearly one in four people, hold at least one social media profile. Worldwide, 5.24 billion people, about 94% of those with internet access, scroll, post, and comment daily. Those figures are not merely large; they mark a historic shift in how knowledge is composed. Universities, libraries, and legacy newsrooms used to filter information for relevance and accuracy. These days, billions of unscreened thumbs share that role.

Personalized feeds promised to show you more of the things you enjoy. The danger is that there might not be much else to see. According to internal documents surfaced in US court proceedings, TikTok’s algorithm can pull a young user into a self-harm “filter bubble” in just 35 minutes of browsing. Apply that pattern to politics, science, or health advice, and it becomes clear why public consensus breaks down so easily.

In knowledge-management terms, the classic problem of “information overload” has morphed into “narrative overload”: competing storylines, each incubated in its own bubble and reinforced by emotionally charged design.

According to David Gilbert’s Wired article on a pro-Russian disinformation campaign, researchers monitoring a Kremlin-linked operation bluntly named Operation Overload found that free AI image and voice-cloning tools produced more than twice as many disinformation artifacts over a nine-month period as in the whole previous year. Cheap deepfakes now require only a laptop and a sour attitude, not a Hollywood budget.

Similarly, a report by the Disinformation Social Media Alliance (DISA) asserted that similar strategies were likely used on a smaller scale during Ghana’s 2024 election cycle, including doctored videos that twisted comments on the contentious LGBTQ bill and recycled images passed off as new in violent smears directed at both major parties. The conclusion is obvious: the velocity of false knowledge increases exponentially when production costs drop to nearly zero.

Scholars Nonaka and Takeuchi once described knowledge creation as a SECI spiral: Socialization, Externalization, Combination, Internalization. In the age of Facebook, that first step (Socialization) happens in megaphone mode, where ideas emerge publicly, instantly, and at eye-watering scale. The result is what some academics call a social SECI: a loop where informal, emotionally resonant snippets leapfrog formal validation.

Institutions, whether a newsroom, a university, or a health agency, must therefore build sense-making hubs that sit between their verified repositories and the wild social stream. Key elements include:

1. Real-time listening dashboards to flag emerging narratives (the Ghana Fact-Checking Coalition used one to great effect).

2. AI-assisted triage, not to decide truth automatically, but to prioritize human fact-checkers where the stakes are highest.

3. Context layering: attaching concise explainers or hyperlinks to credible datasets whenever an institutional account addresses a trending claim.

4. Community co-creation: inviting informed citizen moderators or local experts to annotate content, improving both trust and reach.
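The AI-assisted triage step above can be sketched in a few lines. Everything below is illustrative: the scoring rule, field names, and watch-list are assumptions for the sketch, not any coalition’s actual system. The point is that the machine only ranks posts for attention; humans still decide what is true.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    shares_per_hour: float  # how fast the post is spreading
    reach: int              # audience size so far

# Hypothetical topics a newsroom might watch during an election cycle.
WATCH_TERMS = {"election", "ballot", "shs", "vaccine", "lgbtq"}

def triage_score(post: Post) -> float:
    """Rank a post for human review: spread velocity weighted by reach,
    boosted when it touches a watched topic. The score prioritizes;
    it never decides truth."""
    topical = any(term in post.text.lower() for term in WATCH_TERMS)
    base = post.shares_per_hour * (post.reach ** 0.5)
    return base * (2.0 if topical else 1.0)

def review_queue(posts: list[Post], top_n: int = 3) -> list[Post]:
    """Return the highest-priority posts for fact-checkers."""
    return sorted(posts, key=triage_score, reverse=True)[:top_n]
```

A post about a watched topic can outrank a faster-spreading but harmless one, which is exactly the behaviour a fact-checking desk wants from a triage layer.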

Grass-roots experiments worth watching

• Fact-checking as a Service: The Coalition’s new AI-image forensics unit can run a suspected photo through error-level analysis in under a minute, supplying radio hosts with on-air rebuttals before rumors harden.
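Error-level analysis works by recompressing an image and comparing the result with the original: a region pasted in from elsewhere usually carries a different compression history, so its errors stand out. Here is a toy, pure-Python sketch of that principle only; the coalition’s real unit would operate on full JPEG images with a proper codec, and the threshold below is an arbitrary assumption.

```python
def recompress(row: list[int], step: int = 16) -> list[int]:
    """Toy stand-in for lossy re-encoding: snap pixel values to a
    coarse grid, the way JPEG quantisation discards fine detail."""
    return [round(p / step) * step for p in row]

def error_levels(row: list[int], step: int = 16) -> list[int]:
    """Per-pixel difference between a row and its recompressed copy."""
    return [abs(p - q) for p, q in zip(row, recompress(row, step))]

def suspicious_indices(row: list[int], threshold: int = 6) -> list[int]:
    """Flag pixels whose error level exceeds the threshold, so a human
    analyst can look closer at those regions."""
    return [i for i, e in enumerate(error_levels(row)) if e > threshold]
```

In practice the whole image is analysed and the error map is rendered visually, but the logic is the same: uniform error levels suggest one compression history, while a patch of outliers suggests a splice.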

• Digital-literacy bootcamps: Several Ghanaian universities now embed critical platform studies into first-year orientation, teaching students how algorithmic ranking works and how to break out of a bubble.

• Regional moderation hubs: African policy researchers propose situating content-moderation centers in Nairobi and Accra to bring linguistic nuance (and local labour) closer to the decision loop.

These initiatives share a common thread: they treat social platforms not merely as distribution channels but as contested knowledge spaces requiring active stewardship.

What individuals can do today

1. Pause and triangulate: Adopt a three-source rule before sharing explosive claims.

2. Reverse-image search: A 10-second Google Lens scan often debunks recycled visuals.

3. Algorithm hygiene: Periodically reset or diversify your “For You” feed by following accounts outside your usual interest zones.

4. Support credible outlets: Quality journalism and peer-reviewed research remain the backbone of collective understanding if we fund and cite them.

Whether it is a voter casting a ballot, a parent weighing vaccine advice, or a teenager wrestling with self-worth, each decision sits atop a personal stack of known facts. When that stack is built by invisible algorithms optimized for engagement rather than truth, society risks substituting popularity for accuracy.

Yet the same platforms that spread chaos also carry the antidote. They can amplify verified data, foster cross-cultural dialogue and surface marginalized expertise if we, the users, demand it and if institutions learn to manage knowledge at algorithmic speed.

So, the next time a sensational clip lands in your group chat, remember: hit pause, verify, then decide whether it deserves to travel further. Knowledge is still power but only when we keep it honest, shared and accountable.
