Today I read an article about how my hometown, Austin, Texas, has the 10th-worst traffic bottleneck in the USA, on a 3- to 5-mile stretch of the interstate through downtown.
It shows up in a lot of news stories, but they are all derived from a report by the American Highway Users Alliance.
My first thought was that this must be wrong. In my role at Smarter Solutions, I travel a lot and drive rental cars in most major cities. Austin does not have a top-10 traffic problem.
So this posting is not about traffic, but about metrics. Many Lean Six Sigma practitioners are handed problem statements or performance metrics that do not match what our own observations support. To me, this traffic statistic is one of those metrics.
Behind the metric
As I looked at the study, the word bottleneck stuck out. What is a bottleneck with respect to traffic? I could not find a specific traffic definition, so I will use the Theory of Constraints (TOC) definition: a point in a process that controls the overall throughput of that process. Another word for it is a constraint. It is a place to start. In the TOC method, you look for a bottleneck by finding a big queue in front of a process step, which does not fit a highway well, because there is no real queue, just delays. So we can instead consider the time impact of a constraint: a process step where a transaction spends a long time relative to the value-added time of the process.
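To make that TOC idea concrete, here is a minimal sketch, with made-up step names and times, of flagging the step where a transaction spends the most time relative to its value-added time. Nothing here comes from the traffic study; it is just an illustration of the constraint-finding logic.

    # Hypothetical process steps with total time and value-added time (minutes)
    steps = {
        "order entry":  {"total_min": 10, "value_added_min": 5},
        "credit check": {"total_min": 90, "value_added_min": 10},
        "fulfillment":  {"total_min": 30, "value_added_min": 20},
    }

    # The constraint is the step with the largest total-to-value-added time ratio
    constraint = max(steps, key=lambda s: steps[s]["total_min"] / steps[s]["value_added_min"])
    print(constraint)  # credit check -- the step controlling overall throughput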
So what do we know about traffic delay data? It turns out that there are systems that track cellphone movement on highways, watching how phones move from cell tower to cell tower to judge speed. The phones may also provide GPS position and speed data to systems that can be used to track delays. I found that the American Highway Users Alliance uses GPS trackers to monitor cars too. OK, so they know the speed of cars on a road. Given this information, how would a bottleneck be estimated? My guess would be the ratio of the slowest speed (longest time) on a stretch of road to the highest speed (shortest time) on the same stretch. I would probably use a time ratio, since a bigger number would mean a greater constraint, and it is how a driver would feel the pain of a bottleneck. I could not find a definitive reference indicating that this is how the referenced study identified bottlenecks, but I found a number of inferences that it is, and one of my MBB students, who has a background in traffic studies, swears that this is how it is done.
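Here is a minimal sketch of that guessed-at calculation: congested travel time over free-flow travel time for the same stretch. The function and the numbers are my own illustration, not the American Highway Users Alliance methodology.

    def bottleneck_ratio(length_miles, congested_mph, free_flow_mph):
        """Congested travel time divided by free-flow travel time for one stretch."""
        congested_hours = length_miles / congested_mph
        free_flow_hours = length_miles / free_flow_mph
        return congested_hours / free_flow_hours

    # Example: a 4-mile stretch that drops from 65 mph to 15 mph at rush hour
    print(round(bottleneck_ratio(4, 15, 65), 2))  # 4.33

Notice that the stretch length cancels out, so this ratio is really just the free-flow speed divided by the congested speed.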
Austin traffic characteristics
If you have ever been to Austin, you know the traffic in the city is generally free flowing, albeit slow. But the interstate freeway downtown during rush hour, in both directions and both morning and evening, comes to a near-complete stop for an hour or so twice a day. During non-commuting hours, and when downtown events are not occurring, it flows mostly at the speed limit. A speed limit of 70 mph is published by the state for I-35 downtown, but I believe it is currently set at 65 mph. You may be surprised, but this is Texas, where the state thinks we are entitled to go fast. We have a stretch of highway southeast of Austin with a speed limit of 85 mph.
I do not think many of the nation's big cities set their downtown freeway speed limits at 70 mph. I do know, from experience, that nearly every big city has moments when freeway speeds drop to near zero during heavy-use periods. I think of Chicago, Los Angeles, and New York City, all of which I have driven in this year. Those freeways (called thruways in NYC) run slow all day and all night. That is slow to a Texan, which means around 50 to 60 mph. Yes, there are speeders, but the top speeds are truly slower than on I-35 in Austin at night.
Bottleneck reporting through time ratios
If we accept that a bottleneck generates a high ratio of travel time under congestion (numerator) to travel time without congestion (denominator), then we can consider all of the big-city traffic. The numerator could get very large if traffic stops, but it is self-limiting, because people take other routes when the delays are too high. It is the denominator that is more fixed. The free-flowing travel time is a function of the posted speed limit: the higher the limit, the less time spent on the highway per mile. Since Austin has such a high speed limit, its denominator should be one of the smallest travel times per mile in the entire US. This makes its bottleneck ratio larger than that of a city with the same speed during congestion, which is what I believe is happening when Austin is reported as having the 10th-worst bottleneck.
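A quick hypothetical comparison shows the effect. Assume two stretches come to the same 15 mph crawl at rush hour, but one has a 70 mph free-flow speed and the other 55 mph; the numbers are made up for illustration.

    def bottleneck_ratio(length_miles, congested_mph, free_flow_mph):
        return (length_miles / congested_mph) / (length_miles / free_flow_mph)

    print(round(bottleneck_ratio(4, 15, 70), 2))  # 4.67 -- high-speed-limit city
    print(round(bottleneck_ratio(4, 15, 55), 2))  # 3.67 -- same congestion, smaller "bottleneck"

The congestion experience is identical in both cities, but the city with the higher speed limit looks worse on the ratio.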
A better bottleneck metric
What might be better? How about not using the ratio of the worst time to the best time? A better formulation should address the issue as a driver experiences it, perhaps using the longest one-hour average travel time divided by the average daytime travel time across all major US cities. Do not include nighttime travel times, which most drivers never experience, and compare all cities using the same denominator. Then the highest values of the metric would represent the worst congestion-related delays.
Another way to measure congestion could be the overall daytime average travel time divided by the time it takes to travel the same stretch at the speed limit. This metric would be a good measure of the overall system and would penalize locations with all-day congestion relative to locations with only momentary congestion at rush hour.
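Here is a sketch of both alternative formulations, again with made-up travel times in minutes; the national baseline value and all of the numbers are assumptions for illustration only.

    NATIONAL_DAYTIME_AVG_MIN = 6.0  # assumed common denominator across all cities

    def peak_vs_national(worst_hour_avg_min):
        """Worst one-hour average travel time over a shared national daytime baseline."""
        return worst_hour_avg_min / NATIONAL_DAYTIME_AVG_MIN

    def daytime_vs_speed_limit(daytime_avg_min, speed_limit_min):
        """Overall daytime average travel time over the time at the posted limit."""
        return daytime_avg_min / speed_limit_min

    # A stretch that is awful for one rush hour but fine the rest of the day
    print(round(peak_vs_national(16.0), 2))            # 2.67
    print(round(daytime_vs_speed_limit(5.0, 3.5), 2))  # 1.43

Notice that a road with one bad rush hour scores high on the first metric but low on the second, so ranking cities by one formula or the other can produce very different lists of worst offenders.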
As you can guess, there are many ways to create a congestion metric. All are valid, but which one you choose can change the rank ordering of the worst offenders. The same thing happens in Lean Six Sigma problems.
Your lesson
Whenever you see a metric used to make a statement or a decision, and the user of the metric does not operationally define how it is calculated, you should question the metric. Too many people in the news and in politics purposely use odd definitions to make issues look worse than they really are.