Adichie’s Influence Matters. Here’s Where The Evidence Points On Social Media And AI

Image via Instagram | Notesphere, all rights belong to original owner

As the Yoruba warn, closing the eyes does not make the sun set; looking away will not make social media and artificial intelligence disappear. Chimamanda Ngozi Adichie shapes public conversation, and her cautions about these powerful tools deserve respect. They also deserve calibration. Communities that rely on them to connect, organize, and create need guidance that is accurate, workable, and fair.

At a recent community event, a friend praised Chimamanda’s call to step back from social media, while a young entrepreneur showed me an online store built entirely through Instagram. Both were right in their own way, and that exchange highlights the gap we need to bridge.

Social media

Claims that platforms are “bad for our brains” cast too wide a net. Current research shows small average effects on well-being and attention, with harms concentrated in specific conditions—compulsive use, poor design, and vulnerable groups. Benefits also exist. Moderated use in pro-social spaces supports learning, professional discovery, and civic ties. These are practical gains for African and Caribbean diaspora audiences navigating distance, culture, and local news gaps. The question is less “social media or literature” and more “how to pair deep reading with disciplined digital practice”.

AI

Fears that creative work will be replaced underrate both the current evidence and the craft of authorship. In field and lab settings, generative tools raise drafting speed and improve average quality on many professional tasks. The risks are real: sameness, clichés, and hallucinated facts when used uncritically. The fix is straightforward: treat AI as a co-pilot for ideation, outlining, and polishing, then apply your own reporting, judgment, and voice. As a Swahili saying puts it, hiding the head in the sand does not hide the body; guidance should meet reality, not avoid it.

Power and policy

Concentration among a few AI firms is a valid concern, yet public guardrails exist and are growing. Comprehensive regulation in Europe and widely used risk-management standards show that democratic shaping is possible. The productive stance is engaged scrutiny: push for transparency and safety, support public-interest research, and build civic coalitions that influence how systems are made and used.

What this means for us

For communities in Philadelphia, across the diaspora, and in the motherland, social media and AI now function as civic infrastructure. These are not abstract tools; they help the Nigerian dressmaker find clients, enable content creators to make a living, keep families connected from here to Kingston, and let community leaders tell their own stories. They also amplify disinformation and outrage when used carelessly. The job is to tilt the balance toward value through habits, skills, and local institutions.

A practical playbook:

  • Set rules of use. Time-box sessions and trim push alerts.
  • Curate feeds. Follow credible sources and block rage-bait.
  • Use AI with intent. Draft with a model, then rewrite, fact-check, and make the voice yours.
  • Teach forward. Host clinics on digital hygiene and AI literacy, pairing the youth’s fluency with the elders’ context.
  • Demand better design. Ask platforms for accessible controls and features that reduce spam and harassment.

A constructive ask

We trust leaders like Adichie to champion dignity and clear thinking. When a voice like hers suggests retreat from these tools, many listen, but that retreat leaves people unprepared. The concept of Sankofa offers a frame: it is not wrong to go back and fetch what was missed, and leaders can revisit and refine positions as evidence evolves.

The call is to update the guidance to encourage balanced participation, showing how to protect attention and preserve craft while still showing up online, where communities build power. Communities that learn to use these tools well will tell fuller stories and mobilize faster when it counts. This moment calls for leadership that meets people where they are: connected, creative, and ready to build.

Dr. Eric John Nzeribe is the Publisher of FunTimes Magazine and has worked in the publishing industry since 1992. His interests include using data to understand and solve social issues, narrative storytelling, digital marketing, community engagement, and online/print journalism. Dr. Nzeribe is a social media and communication professional with certificates in Digital Media for Social Impact from the University of Pennsylvania and Digital Strategies for Business: Leading the Next-Generation Enterprise from Columbia University, a Master of Science (MS) in Publication Management from Drexel University, and a Doctorate in Business Administration from Temple University.
