This is part one of a two-part series on China's annotation workers.

When the public talks about AI today, the conversation is usually grand and abstract. Will it replace your job, or humanity itself? Are our ethical guidelines strong enough, or will it spiral out of control? And when people picture those who work in AI, they tend to imagine a class of brilliant elites: men and women fought over by global capital, whose ideas may, indeed, alter the course of civilization.

But when I think of AI, a different image comes to mind.

It is a little after nine in the morning. A group of young mothers has just dropped their children off at kindergarten and primary school. They turn and hurry to work. Once inside the computer lab, they barely have time to catch their breath before logging in and scrambling to claim the latest batch of tasks — image-labeling assignments for autonomous vehicles, freshly posted by one of China’s tech giants.

This is the world that Xia Bingqing, of East China Normal University, and I have spent five years studying. AI, after all, must be fed. It must practice. It must be taught, piece by piece, the knowledge, values, and norms of human society. In that sense, AI is not nearly as futuristic, or as elite, as its mythology suggests. It does not hover in the clouds. It rests on the ground — specifically, in the valleys of inland China, inside “data-labeling centers” built in relocation communities created through poverty-alleviation campaigns. It lives in rows of workstations, in mice and headsets, in timers, correction slips, and rework orders.