A defaults problem.
Not a values problem.
The pattern that repeats at every layer -
and the question nobody asked.
To calibrate color film processing, every lab in the world needed a standard - a face, a benchmark. The chemistry was checked against it daily. If your print matched the card, the process was working.
They chose a white woman named Shirley Page. Her image was printed on cards and sent to every darkroom in the world.
Nobody at Kodak decided that brown skin didn't matter. They made a reasonable choice - and then stopped thinking about it. The choice became infrastructure. Infrastructure becomes invisible.
Kodak updated their Shirley Cards in the 1990s - after complaints from furniture and chocolate manufacturers. Not skin tone advocates. Furniture manufacturers. That is what moved the timeline.
The Shirley Card wasn't a film problem. It was the first iteration of a pattern that repeated at every layer technology built next.
Kodak's Shirley Card defined correct color reproduction for the entire photographic industry. The standard wasn't built against anyone. It was built without thinking about everyone.
The technicians weren't racists. They were optimizing against the only reference they had. That reference happened to be one type of face.
Early search algorithms optimized for the users who generated the most feedback signals: English-language, desktop, Western query patterns. The engineers weren't exclusionary. They built for their users.
Their users happened to be a narrow slice of the world. The algorithm learned from what existed. What existed was not everyone.
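The feedback loop is mundane enough to fit in a few lines. A minimal sketch, assuming a toy click-log ranker (the queries, result IDs, and log below are all invented for illustration): relevance is learned by counting clicks, so whoever generates the most clicks defines relevance for everyone else.

```python
from collections import defaultdict

# Hypothetical click log: the only signal this ranker ever sees.
# Most logged users issue English desktop queries; everyone else
# barely registers. Nothing here encodes intent - only volume.
click_log = [
    ("restaurants near me", "en-desktop-result"),
    ("restaurants near me", "en-desktop-result"),
    ("restaurants near me", "en-desktop-result"),
    ("mikahawa iliyo karibu", "sw-mobile-result"),  # one Swahili click
]

# The ranker "learns" by accumulating clicks per result.
scores = defaultdict(int)
for query, clicked_result in click_log:
    scores[clicked_result] += 1

# Ranking is just sorting by accumulated feedback, so the majority's
# behavior becomes the definition of "relevant" for every user.
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # ['en-desktop-result', 'sw-mobile-result']
```

No line of that code mentions a demographic. The skew lives entirely in which clicks exist to be counted.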
Commercial facial recognition systems have shown error rates of up to 34% for darker-skinned women, versus under 1% for lighter-skinned men. The systems learned from what existed. What existed was not everyone.
A man in Detroit was misidentified by one of these systems and arrested. Nobody designed it to fail him. That is precisely the point.
AI agents will negotiate contracts, apply for jobs, manage finances, assert rights - on behalf of individuals. They can only represent you if they know you. They learn who you are from your digital history.
Four billion people have thin or no digital history. Their agents will operate from defaults. Aggregate data. Demographic assumptions. What a person like them typically does - not what this person actually does.
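That substitution is not a conspiracy; it is a fallback branch. A minimal sketch, assuming a hypothetical personalization step (the threshold, field names, and DEMOGRAPHIC_PRIORS table are all invented for illustration): when the observed history is thin, the agent silently negotiates from the aggregate instead.

```python
from dataclasses import dataclass, field

# Hypothetical aggregate priors: what "a person like them" typically does.
DEMOGRAPHIC_PRIORS = {
    "risk_tolerance": "low",
    "salary_expectation": "regional_median",
}

MIN_HISTORY_EVENTS = 50  # invented threshold below which history is "thin"

@dataclass
class UserProfile:
    user_id: str
    history: list = field(default_factory=list)  # observed events, maybe none

def build_agent_model(profile: UserProfile) -> dict:
    """Return the model the agent will represent this person with."""
    if len(profile.history) >= MIN_HISTORY_EVENTS:
        # Rich history: the agent models this person.
        return {"source": "observed", "events": profile.history}
    # Thin history: the agent models the demographic default instead.
    # No warning, no error - the substitution is invisible to the user.
    return {"source": "default", **DEMOGRAPHIC_PRIORS}

# Four billion people land on this branch.
print(build_agent_model(UserProfile("user-with-no-footprint")))
```

The fallback is the reasonable choice, exactly as the Shirley Card was. The sketch only makes visible who lands on it.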
Malice has a face. You can point to it, argue with it, legislate against it. Defaults are invisible - especially to the people who set them - because the cost of a bad default is paid entirely by the people it excludes.

The Kodak engineer who chose Shirley Page never sat in front of a camera and watched his face render incorrectly. He never had to. That's the mechanism. Not the exception - the rule.