AI as physical and social infrastructure

Seen only as a collection of concrete and light poles, a highway is not racist. But a highway is a system within a system, and its path through a city or town can trace the boundaries set by prejudice. These paths are carved into cities based on decisions that prioritize categories of people through a lens of economic value. Once completed, highways often remained inaccessible to the very lower-income communities they had segregated.

Likewise, when biases based on gender, race, and other social traits are baked into LLMs, they perpetuate social and economic inaccessibility in ways indistinguishable from racism or sexism. The word choices predicted by a large language model are paths, too, and they are directed by probabilities. Whatever progress has been made in minimizing the bias of these models has been contingent on pressure from policymakers and regulators to make it a priority.

Eryk Salvaggio in Tech Policy Press

Comments

  1. GI/GO – garbage in/garbage out
The fact that models can be trained to say things that the underlying training data doesn't support IS the problem.

    It means the answers we get from LLMs can ultimately be determined by the secret decisions of unknown people that we have no control over.

    I use ChatGPT almost daily. But I have come to understand that it is one of the most dangerous technologies ever invented – right up there with massed nuclear weapons systems and a global economy based on fossil fuels.