A new partnership between UT-Austin and the city plans to automate the data-gathering process for traffic studies, while making the results immediately available.
In theory, it's simple: Take the video from the city's 360 traffic cameras and make a computer count all the bikes and buses and pedestrians and cars on a given roadway.
In reality, it's not that simple.
Typically, this sort of research is used for initiatives like the Guadalupe Corridor Plan, which used traffic counts to give the Austin Transportation Department and city planners a clearer picture of how traffic functions on the Drag. The research was rolled into a plan that detailed how the city could reduce congestion along that strip.
But those traffic counts and modeling efforts rely on data collected over a set period of time, and on recorded footage rather than a constant stream of video. The result is a snapshot, more or less.
Not only that, but the results won't be fully analyzed for months, and it'll take even longer for any subsequent plans and insights to reach decision-makers' desks. In the case of the Guadalupe plan, for example, Austin City Council commissioned a study of the street way back in 2014; the subsequent plan wasn't released until last week.
Jen Duthie, an engineer working on the new project, says the city doesn't currently use the footage for research outside of those instances, but it does use it for traffic monitoring.
"There’s a lot of useful information that we could be extracting from those streams that we’re not today," she said. "So, what we’re working with UT on is, ‘Can we come up with algorithms or work with them to extract useful data?' [The idea is to] get a better sense of traffic patterns and how they’re changing all the time, as opposed to these kind of one-off, focused data-collection efforts that happen for things like the corridor studies."
The project uses an AI created by UT's Texas Advanced Computing Center that will mine the feeds and will – again, theoretically – get better and better at identifying everything on a road at any given moment over the course of the project, which is expected to last a year.
The key to the project is the real-time analysis, says Weijia Xu, who leads the Data Mining & Statistics group at the Texas Advanced Computing Center. The city won't have to store hours and hours of footage just to study where people are jaywalking along Lamar or how many cars drive the wrong way down Sixth Street.
"We do not need to record the actual videos, and we can just let the computers tell, 'Oh, here’s a car coming. Here’s a cyclist coming this way. Here’s a pedestrian trying to cross,'" Xu says. "So, I think our approach is trying to help in resolving that issue.”
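The idea Xu describes – run a detector on each frame as it arrives, keep only the counts, and throw the frame away – can be sketched in a few lines. This is a hypothetical illustration, not the UT/TACC system: the `detect_objects` stand-in here just returns labels, where a real pipeline would run a trained vision model on live camera frames.

```python
# Minimal sketch of streaming traffic counts: process each frame as it
# arrives, update aggregate counts, and discard the frame immediately,
# so no video is ever stored.
from collections import Counter

def detect_objects(frame):
    # Hypothetical placeholder for a trained vision model that would
    # return a label ("car", "cyclist", "pedestrian", ...) for each
    # object it finds in the frame. In this sketch, a "frame" is
    # already just a list of such labels.
    return frame

def count_stream(frames):
    # Aggregate per-class counts across the stream of frames.
    totals = Counter()
    for frame in frames:
        totals.update(detect_objects(frame))
        # frame goes out of scope here; nothing is recorded
    return totals

# Example with stand-in "frames":
stream = [["car", "car", "pedestrian"], ["cyclist", "car"]]
print(count_stream(stream))
```

The point of the design is the one Xu makes: only the running tallies leave the loop, so the city gets traffic patterns without warehousing hours of footage.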
Xu says an early benchmark for the project is to train the AI to focus on pedestrian-vehicle interaction, allowing the Austin Transportation Department to pull real-time insights from a video feed and identify less-than-pedestrian-friendly intersections.
The project is using UT's Stampede 2 – one of the fastest supercomputers in the world – to process the data. It has, so far, identified bikes, buses, pedestrians and cars with 95 percent accuracy at a handful of intersections in Austin.
Duthie says the transportation department hopes to have a beta to work with by the project's end. After that, she says, the city could start using the AI on traffic cameras across the city. Then, the transportation department could partner with a third party like Alphabet's Sidewalk Labs or Numina to both anonymize the data and provide insights for transportation and traffic planning – while also helping city planners and engineers prepare for future needs like infrastructure for self-driving cars.
DaLyah Jones contributed to this report.