
Computing and conceptual cul-de-sacs
On "garbage in, garbage out", the walk to Dodger Stadium, and the problem with turning to AI tools without first examining the human assumptions
Many, if not most, people are familiar with the phrase "garbage in, garbage out", an idea that has guided computer programming since its mechanical origins. People tend to be much less familiar with another principle that deserves to be just as well known: that all computer code embeds human assumptions and values, whether or not we are aware of them. "Beware algorithmic bias", unfortunately, isn't quite as catchy as "GIGO".
■ California Governor Gavin Newsom has announced statewide plans to put generative artificial intelligence ("Gen AI") to work within state government. In particular, he has sought fanfare for having Gen AI help fix traffic congestion in the notoriously traffic-heavy state.
■ This, regrettably, sets up a case study in the problem of assuming that all we need to do is get the code right and our problems will be solved. It's possible, of course, that data analysis and clever algorithms might shave a few minutes off travel times at the margins. But the fundamental problem in places like Los Angeles is that California has a deeply embedded car culture, and even the most obvious candidates for good mass transit service (like Dodger Stadium) remain effectively designed to be hostile to pedestrians.
■ There is no way for artificial intelligence to conclusively resolve that underlying problem, and assuming that it can only stands in the way of making things better. Induced demand, the new traffic that shows up to fill new lanes of road, compounds the issue. Suppose AI really could make car commutes faster: that would only encourage more people to commute by car.
■ Technology can do ever so many things to make life easier, better, and safer. But it can't substitute for human judgment, particularly when we use it to mask bad assumptions and values, with perverse consequences. We have to be willing to critique our own basic principles before hoping that a computer will program its way out of a conceptual cul-de-sac.