The Word ‘Delve’: A Linguistic Time Bomb Set Off by Empire, Gig Work, and AI

So. You ask ChatGPT a question. It replies with a smooth, competent, oddly formal tone. Perhaps it says: “Let’s delve into that topic.” Now, here’s the odd thing: that word delve—you’ve probably never used it. Your friends don’t say it. Your professors don’t email it. But ChatGPT says it a lot. In fact, it’s one of a cluster of words that show up in AI-generated text far more often than in modern American or British English.

Others in the same curious club? Foster. Leverage. Harness. Empower. Tapestry. Nuanced. Discourse. Robust. You’ll find them peppered through AI responses like tinsel on a Victorian Christmas tree. Overly formal. Vaguely pompous. And yet somehow… familiar.
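How do linguists spot these fingerprints? Essentially by comparing word rates across text samples. Here is a minimal sketch of that idea in Python; the marker list comes from the words named above, and the two sample sentences are invented for illustration, not drawn from any published study:

```python
import re

# Words the essay flags as overrepresented in AI-generated text
# (illustrative list, not a validated stylometric lexicon).
MARKERS = {"delve", "foster", "leverage", "harness", "empower",
           "tapestry", "nuanced", "discourse", "robust"}

def marker_rate(text: str) -> float:
    """Occurrences of marker words per 1,000 tokens of text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    hits = sum(1 for tok in tokens if tok in MARKERS)
    return 1000 * hits / len(tokens)

# Toy samples (invented): the same sentence in two registers.
ai_like = "Let's delve into this nuanced topic and harness a robust framework."
casual = "Let's look at this tricky topic and build a solid framework."

print(marker_rate(ai_like))   # noticeably higher
print(marker_rate(casual))    # zero for this toy sample
```

Run over real corpora instead of toy sentences, the same comparison is what lets researchers say a word like delve appears "far more often" in AI output than in everyday American or British prose.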

So how did this happen?



Let’s go back to Victorian England—because of course we do. That’s where this story starts. In the late 19th century, the British Empire was not just conquering land; it was standardizing language. English became a tool of administration, hierarchy, and class. In places like Nigeria, Kenya, and Uganda, colonial education systems taught a formal, elevated version of English that emphasized writing essays, following rules, and avoiding slang. The Queen’s English wasn’t just a language—it was a passport to upward mobility.

Fast forward to post-independence Africa. The formal register stuck around, especially in academia, government, and business. While American and British English relaxed—shedding hats, ties, and adverbs—many African Englishes kept their collars buttoned and their thesauruses nearby. A Nigerian student might write, “This essay seeks to elucidate the nuanced dimensions of the sociopolitical tapestry,” and nobody would blink.

Now freeze that thought.

Jump to the 2000s and 2010s: the rise of Business Process Outsourcing. Western companies—Amazon, Facebook, Google—need workers to tag images, transcribe audio, moderate violent content. But where to find fluent English speakers willing to work for $2 an hour? Enter Nairobi, Lagos, and Kampala—cities with young, educated, English-speaking workforces and rapidly expanding internet infrastructure, thanks in part to undersea fiber-optic cables and World Bank telecom initiatives.

Enter also OpenAI.

In 2021, they needed thousands of people to help fine-tune a new AI model—what would become ChatGPT. These workers wrote sample prompts and responses, judged how helpful or polite an AI reply was, and reinforced the model’s behavior through something called Reinforcement Learning from Human Feedback (RLHF).
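For readers curious what "judging replies" means mechanically: in the RLHF preference step, a rater picks the better of two candidate responses, and a reward model is then trained so the preferred one scores higher. A toy sketch of that training signal (my own illustration of the standard Bradley–Terry preference loss, not OpenAI's actual code):

```python
import math

def preference_loss(score_chosen: float, score_rejected: float) -> float:
    """Bradley-Terry preference loss: -log(sigmoid(chosen - rejected)).

    Small when the reward model already scores the human-preferred
    reply higher; large when it scores the rejected reply higher.
    """
    diff = score_chosen - score_rejected
    return -math.log(1.0 / (1.0 + math.exp(-diff)))

# Reward model agrees with the rater: low loss.
print(preference_loss(2.0, 0.5))
# Reward model disagrees with the rater: high loss.
print(preference_loss(0.5, 2.0))
```

Multiply that judgment by thousands of raters making millions of comparisons, and whatever those raters consider "good English" gets baked into the reward signal.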

Many of those workers were in Kenya and Nigeria, hired via subcontractors like Sama. They were told to write good English. So they did. The kind they were taught. The kind that used "delve," "foster," and "harness"; the kind that said "in addition" instead of "so." And that English—African English in its formal register—was then encoded into the very DNA of ChatGPT's behavior.

The AI learned how to be polite from postcolonial bureaucratic tone. It learned how to sound “smart” from workers trained in Commonwealth-style English. It learned how to structure arguments from the rhetorical habits of those who grew up balancing the Queen’s grammar and the internet’s chaos. In a very real way, the AI sounds the way it does because of empire, outsourcing, and postcolonial aspiration.

And here’s the kicker.

When Americans now read AI-generated text and think, “That sounds like a robot,” they’re often reacting to African English. The very language AI learned from the people who trained it is now misrecognized as artificial. It’s an irony almost too perfect for satire: a Nigerian graduate writes a careful answer; ChatGPT copies that style; and then American teachers accuse Nigerian students of sounding like bots because their writing is too clean, too structured, too… AI-like.

This, too, is geography.

It’s the geography of linguistic flow, where empire lays down rails, global capitalism runs the trains, and AI brings back cargo that we don’t always recognize as ours. It’s the geography of digital labor, where workers in Nairobi shape the language of machines used in Tokyo, Toronto, and Toulouse. It’s also the geography of discourse power—who gets to define what sounds “natural,” “neutral,” or “intelligent.”

So the next time ChatGPT offers to “delve into a robust framework for harnessing strategic insights,” remember: it’s not just the model talking. It’s the ghost of colonial schooling. It’s a young Kenyan making $1.32 an hour to label toxic memes. It’s the tangled history of empire, labor, and language—all whispering through the circuits of your AI.

And that—like all the best stories in geography—is about connections you didn’t know were there.

Comments

  1. Shorter version: Delving into the Machine: How a Word from Lagos Ended Up in Your AI

    So. You open ChatGPT and ask it a question. It replies—politely, helpfully, even charmingly. Perhaps it says, “Let’s delve into that.” You might not notice the word delve, but linguists did. It’s one of several words that show up in AI text far more often than they do in regular American or British English. Odd, right?

    To understand why, we have to go back. Not to Silicon Valley. To Africa.

    Because behind every answer the AI gives you is a long chain of human labor. People writing prompts, labeling data, checking whether a reply is helpful or hateful. That invisible labor force? Much of it lives in Kenya, Nigeria, and Uganda—countries where English is widely spoken, but not in the way it’s spoken in Chicago or Manchester. No, it’s more formal. More careful. Colonial school systems emphasized that. So did the legacy of needing English for global commerce.

    Now imagine you're a university graduate in Nairobi, hired to help train an AI. You’re told to write polite answers, review chatbot responses, and make sure they’re helpful and inoffensive. Naturally, you use the kind of English you’ve been taught to see as professional: delve, foster, leverage, empower. It’s the language of reports, NGOs, the BBC.

    And the AI? It learns from you.

    But the story doesn't start there either. To really understand it, we have to go further back—to the late 1800s, when the British Empire standardized English instruction in its colonies. That made English not just a language, but a class signifier. Then skip to the 1980s, when global telecoms started wiring up Africa. By the 2000s, you had outsourcing firms popping up in Nairobi and Lagos, helping Western corporations with data entry, customer service... and eventually, AI training.

    By 2021, when OpenAI needed thousands of workers to help align ChatGPT, the infrastructure was already there. So they turned to Kenya. Why? English fluency. Cheap labor. And fast connections.

    Which brings us back to you. Sitting in your classroom, asking your AI to explain photosynthesis. It responds with a cheerful, “Let’s delve into the topic.” But that phrase? That’s not American. It’s not British. It’s globalized African English, formalized through colonial pedagogy, professionalized through outsourcing, and embedded into AI by the underpaid labor of the digital South.

    So the next time your chatbot speaks, remember: it’s not just drawing on the internet. It’s channeling the voice of someone who spent hours in a Nairobi office building, writing answers for two dollars an hour. And that—like all good geography—is about power, place, and the hidden currents that shape what we hear as “neutral” or “intelligent.”

    Just a word, right?

    Let’s delve into that.

  2. oh and an older more technical report on this https://arxiv.org/pdf/2412.11385

