As artificial intelligence technology gains prominence, new ways to deploy it could help or hamper progress toward equity in city transportation.
AI could make transportation more accessible, affordable and safe, Kofi Nyarko, an electrical and computer engineering professor at Morgan State University, said last week at a webinar hosted by the National Academies of Sciences, Engineering, and Medicine's Transportation Research Board. For example, AI could optimize routes for services such as demand-responsive transit serving people in rural areas.
But Nyarko emphasized that "responsible AI" is critical because "AI has the potential to either exacerbate or alleviate existing biases and discrimination in transportation." Because AI systems learn from data, he said, biases in the training data fed to the algorithms could lead those systems to perpetuate existing biases and inequities. Cities must therefore ensure that their datasets are diverse, that communities are robustly involved in program design and rollout, and that systems are monitored and improved long after implementation, Nyarko said.
Ziping Wang, an associate professor of information science at MSU, offered her team's rural drone delivery project as an example. Wang lives in rural northern Maryland, so she knows firsthand that truck delivery there is rare because it is expensive and inefficient.
Wang described how the team gathered input on rural residents' delivery preferences, sharing an important oversight that will inform future research: the team assumed that one group, farmers, held the same views as the broader community. "No bias is bias," Wang said. Unlike their neighbors, farmers were skeptical of drones intruding on their land. The next step in her research, Wang said, will be to develop a more inclusive understanding of delivery demand in rural areas.
Jamie Morgenstern, an assistant professor of computer science at the University of Washington, explained how to minimize bias in data drawn from large, diverse communities and ensure that conclusions drawn from the data are reliable. Her main takeaway: data from human sources change over time. "Modeling those changes will be crucial for making sure the systems work as we hope," Morgenstern said. Traffic patterns, for instance, change daily, and neighborhood density may vary by income.
Before committing to massive AI projects, cities may want to deploy "small-scale studies" such as pilots, Morgenstern said. She cautioned against making sweeping generalizations from surveys of program participants, because the people who most strongly support or oppose a project are the most likely to respond. Such bias could be mitigated by hosting small focus groups to gather better qualitative data, she added.