Encourage agencies collecting data to use the collected data to improve their own agency functions.
National experience using archived traffic detector data for monitoring highway performance.
Lessons Learned: Monitoring Highway Congestion and Reliability Using Archived Traffic Detector Data


The Mobility Monitoring Program (http://mobility.tamu.edu/mmp/) provides valuable insights into the use of archived traffic detector data for monitoring highway performance (e.g., traffic congestion and travel reliability). The Program was initiated in 2000 using archived freeway detector data from 10 cities; by 2004, it had grown to include nearly 30 cities with about 3,000 miles of freeway. Over the Program's first four years, the project team gained considerable experience gathering archived data from State and local agencies. These experiences were captured in the report "Lessons Learned: Monitoring Highway Congestion and Reliability Using Archived Traffic Detector Data." The lessons documented in the report fall into three general areas: analytical methods, data quality, and institutional issues. They are useful to the Federal Highway Administration (FHWA) as it expands the national congestion monitoring program and to State and local agencies as they develop their own congestion monitoring capabilities.

Lessons Learned

A vested interest in data collection is one of the best motivators for quality data. Poor data quality can result when data collectors are physically or institutionally distant from the data users. A common example is a State agency collecting data solely to meet Federal reporting requirements; another is a division within a state DOT charged with collecting data of primary interest to a different division or department. A vested interest exists when the data collectors are also data users, or are directly affected by decisions made with the data they collect. To the extent such a vested interest can be created, data quality will improve.

  • Recognize that setting quality criteria does not always improve data quality. When a data user encounters poor quality data collected by another agency, the first response is typically to notify the data collectors, who in many cases are unaware of the problems because they do not use the data themselves. When quality problems persist, the next response is typically to encourage or require the data collectors to meet certain data quality criteria. This may yield some improvement, but some agencies may "game" the system or "post-process" data to pass the quality checks without actually improving the underlying data collection process.
  • Encourage the agency collecting data to use the data within its own organization to improve agency functions. An example of this practice comes from the Highway Performance Monitoring System (HPMS), through which state DOTs report various highway and travel data to the FHWA. Originally developed in 1978, the HPMS was seen by some state DOTs as simply another Federal reporting requirement that did not produce usable data for their own agency, and the quality of HPMS data in its early years suffered in some states. In the 1990s, many state DOTs began to integrate HPMS data collection into their own data programs and to supplement their own analyses with data collected for the HPMS. The net result has been greater scrutiny of HPMS data by state DOTs and fewer concerns about data quality.
  • Another example comes from the use of archived traffic detector data itself. In some cities, users of the archived data lamented its poor quality for their particular applications, and the typical response was to notify the traffic operations center, which in some instances led to modest improvements. More fundamentally, however, many traffic operations centers have become interested in improving detector data quality because they want to use the archived data for additional functions within their own workgroup, such as performance monitoring, ramp metering, and posting estimated travel times on dynamic message signs. As more traffic operations centers use archived data for these new and more sophisticated applications, greater attention will be paid to data quality.
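The concern above about agencies "post-processing" data to pass quality checks presupposes some set of automated validity rules. As a minimal sketch of what such rule-based screening might look like for archived detector records, the following applies simple range and consistency checks; the field names, units, and thresholds here are illustrative assumptions, not values from the report or from any particular agency's criteria:

```python
# Illustrative rule-based quality checks for one archived traffic
# detector record (e.g., a 5-minute lane record). Field names and
# thresholds are hypothetical examples, not an agency standard.

def check_record(record):
    """Return a list of quality flags for one detector record."""
    flags = []
    volume = record.get("volume")        # vehicles per interval
    speed = record.get("speed")          # miles per hour
    occupancy = record.get("occupancy")  # percent of time occupied

    # Missing values make the other checks meaningless.
    if volume is None or speed is None or occupancy is None:
        flags.append("missing_value")
        return flags

    if not 0 <= volume <= 250:           # implausible count for one lane
        flags.append("volume_out_of_range")
    if not 0 <= speed <= 100:
        flags.append("speed_out_of_range")
    if not 0 <= occupancy <= 100:
        flags.append("occupancy_out_of_range")
    # Cross-field consistency: vehicles counted but zero speed reported.
    if volume > 0 and speed == 0:
        flags.append("inconsistent_volume_speed")
    return flags

print(check_record({"volume": 80, "speed": 55, "occupancy": 12}))   # []
print(check_record({"volume": 300, "speed": 55, "occupancy": 12}))  # ['volume_out_of_range']
print(check_record({"volume": 40, "speed": 0, "occupancy": 10}))    # ['inconsistent_volume_speed']
```

The point of the lesson is that checks like these flag symptoms; an agency could zero out or interpolate failing records to satisfy the rules without fixing the detectors, which is why a vested interest in the data matters more than the criteria themselves.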

When data collectors are no longer physically or institutionally distant from the data users, or when they become data users themselves, data quality should improve. As data quality improves, the satisfaction of the data users will increase as well.