The iFlorida Model Deployment, which was started in May 2003, called for the Florida Department of Transportation (FDOT) District 5 (D5) to complete the design, build, and integration of the infrastructure required to support operations in 2 years. The required infrastructure was extensive, spanned numerous stakeholders, and included many technologies that were new to FDOT D5, such as sophisticated traffic management center (TMC) operations software, a wireless network deployed along I-4, an interface to Florida Highway Patrol Computer Aided Dispatch (FHP CAD) data, statewide traffic monitoring, and many others. The iFlorida plans also called for deployment of these technologies in ways that required coordination among more than 20 stakeholders. It was an ambitious plan that would result in dramatically different traffic management operations for FDOT D5 and other transportation stakeholders in the Orlando area.
In implementing the iFlorida plan, FDOT faced many challenges, ranging from higher-than-expected failure rates for some field hardware to difficulties with the Condition Reporting System (CRS) and Central Florida Data Warehouse (CFDW) software. "Despite these challenges, it can be readily claimed that the overall iFlorida Model Deployment was successful," noted the final evaluation report for the iFlorida Model Deployment, published in January 2009.
The difficulties associated with the iFlorida Model Deployment provided many opportunities to identify lessons learned. The most important of these are presented below in a series of lessons learned articles.
As noted previously, FDOT experienced significant difficulties in reaching its objectives with the iFlorida Model Deployment. Early in the deployment, the problems were centered on the deployed field equipment, with a large number of the arterial toll tag readers failing by the time the software systems were in place to use that data. As FDOT brought the field equipment back online, problems with the CRS software became more apparent. From November 2005 through November 2007, FDOT's efforts focused on eliminating the problems with the CRS software. Through this process, FDOT identified a number of lessons learned that might benefit others attempting to deploy a new (or upgrade an existing) traffic management system.
- Ensure that experienced staff oversee the development of a complex software system like the CRS. FDOT D5 had no ITS staff with such experience and declined FHWA's offer to provide a software training course for the FDOT iFlorida staff.
- Beware that one of the biggest sources of problems in a complex system is the interfaces between subsystems. With the CRS, long-standing problems occurred with the interfaces to the FHP CAD system, the weather provider, the travel time server, and the dynamic message signs (DMS). Approaches for reducing the risk associated with these interfaces include:
- Adopt ITS standards that have been used effectively in other, similar applications.
- Develop the interfaces early in the development process and test the interfaces independently of the other parts of the system.
- Include interface diagnostic tools that can sample data passing through the interface and assess whether the interface is operating correctly.
- Include tools for diagnosing problems that might occur in individual subsystems.
- When errors occurred with the arterial travel time system, it was difficult to identify whether the error was caused by the readers, the travel time server that computed travel times from the reader data, the CRS that used the computed travel times, or the interfaces and network connections between these systems.
- When errors occurred with updating DMS messages, it was difficult to identify whether the CRS was sending incorrect data to the system that interfaced with the signs or the sign interface was not updating the signs correctly.
- Incorporate alternate methods for accessing data and updating signs and 511 messages in case the primary software tools for doing so are not functioning correctly.
- The I-4 loop detector data was available through the Cameleon 360 software after the CRS failed, enabling Regional Traffic Management Center (RTMC) operators to continue to estimate I-4 travel times.
- The backup interface for updating DMS messages allowed FDOT to disable the signs in the CRS and manage the sign messages through the backup interface when the CRS interface for updating the sign messages proved unreliable.
- The backup interface for updating 511 messages allowed FDOT to continue to provide 511 services when the CRS failed.
- Treat testing as a critical part of a software development project. While testing is primarily the responsibility of the contractor, the lead agency may want to review the test plans and documentation of test results during development, including unit and integration testing. The testing should cover both the software and the configuration information used to initialize the system. To achieve this, the contract must give the agency visibility into the testing process.
- Errors in the CRS travel time calculations were not discovered until OOCEA reviewed the resulting travel times, and tools to test them (such as the static tests) were not available until more than 6 months after the problem was first discovered. Better testing for configuration data validity might have prevented errors from reaching the production software. For example, tools could have been developed to verify computed travel times independently. Tools that generated a map-based display of the configuration data would have provided an alternate means of testing that data.
- More detailed tests of the component that related FHP CAD incidents to roadways should have identified the fact that the software sometimes miscalculated incident locations.
- Tests of the CRS interface to the DMSs should have revealed some of the problems that FDOT experienced with this interface, such as the fact that the Cameleon 360 and CRS software interpreted sign message priorities in opposite ways.
- Include software system requirements related to configuration and administration of the system.
- With the CRS, the configuration was performed by the CRS contractor. When errors with the configuration were identified, FDOT discovered that the configuration was too complex to correct without assistance from the CRS contractor. Even the CRS contractor failed to eliminate all of the configuration errors in the CRS. Requirements that described the configuration process might have resulted in a simpler, less error-prone configuration process.
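The interface-diagnostic lesson above can be illustrated with a brief sketch. The example below is hypothetical and not drawn from the iFlorida systems: it assumes a generic interface that passes incident records (dictionaries with `id`, `timestamp`, and `location` fields) between two subsystems, and shows how a pass-through tap might sample messages and count those that are stale or missing required fields, so that operators can tell whether the interface itself is healthy.

```python
import time

REQUIRED_FIELDS = {"id", "timestamp", "location"}
MAX_AGE_SECONDS = 300  # treat records older than 5 minutes as stale


class InterfaceTap:
    """Pass-through diagnostic tap for messages crossing a subsystem interface.

    Forwards every message unchanged to the downstream handler while keeping
    a small sample of recent messages and counting records that fail basic
    validity checks.
    """

    def __init__(self, forward, sample_size=10):
        self.forward = forward          # downstream handler (e.g., a CRS-like ingest)
        self.sample = []                # most recent sampled messages
        self.sample_size = sample_size
        self.total = 0
        self.invalid = 0

    def check(self, message, now=None):
        """Return a list of problems found in one message (empty if valid)."""
        now = time.time() if now is None else now
        problems = []
        missing = REQUIRED_FIELDS - message.keys()
        if missing:
            problems.append(f"missing fields: {sorted(missing)}")
        elif now - message["timestamp"] > MAX_AGE_SECONDS:
            problems.append("stale timestamp")
        return problems

    def handle(self, message):
        """Sample, validate, and forward a single message."""
        self.total += 1
        self.sample.append(message)
        del self.sample[:-self.sample_size]   # retain only the newest N
        if self.check(message):
            self.invalid += 1
        self.forward(message)

    def report(self):
        """Summary counts an operator could poll to assess interface health."""
        return {"total": self.total, "invalid": self.invalid}
```

A tap like this is inserted between the producer and the consumer, so a rising `invalid` count localizes a fault to the interface rather than to either subsystem.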
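The testing lesson about independently verifying computed travel times can also be sketched. The example below is hypothetical and assumes a simplified data model, not the CRS's actual one: paired toll-tag reads (tag ID and read timestamp at an upstream and a downstream reader) are used to recompute a segment travel time directly from the raw matches, and published values that disagree with the recomputed ones by more than a tolerance are flagged for review.

```python
def recompute_travel_time(upstream_reads, downstream_reads):
    """Median travel time (seconds) from raw toll-tag reads at two readers.

    Each argument maps tag ID -> read timestamp (seconds). Only tags seen
    at both readers, in the correct order, contribute a sample.
    """
    samples = sorted(
        downstream_reads[tag] - upstream_reads[tag]
        for tag in upstream_reads.keys() & downstream_reads.keys()
        if downstream_reads[tag] > upstream_reads[tag]
    )
    if not samples:
        return None
    mid = len(samples) // 2
    if len(samples) % 2:
        return samples[mid]
    return (samples[mid - 1] + samples[mid]) / 2


def flag_discrepancies(published, recomputed, tolerance=60):
    """Return segment IDs whose published travel time differs from the
    independently recomputed value by more than `tolerance` seconds."""
    return [
        seg for seg, value in published.items()
        if recomputed.get(seg) is not None
        and abs(value - recomputed[seg]) > tolerance
    ]
```

Because the verification path shares no code with the production calculation, a discrepancy points to an error in the production software or its configuration rather than in the raw data.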
The deployment experience also highlighted the challenges of taking a "top-down" rather than a "bottom-up" approach to development. FDOT expressed only a high-level vision for system operations and then wavered in its direction of the contractors. Guidance provided to contractors by lower-level FDOT staff was often overridden by upper management, and lower-level staff had little or no voice in raising concerns. FDOT also did not provide a well-developed document describing the traffic and travel management operations the iFlorida systems were expected to support. As a result, some iFlorida contractors had limited documentation of the requirements for the systems they were developing and received contradictory feedback from different FDOT staff. Many critical client-contractor relationships suffered, and the continued miscommunication magnified the errors of each successive phase of development until the effort became too difficult to manage.
Despite the failures in developing the CRS, the availability of alternate methods for performing key operations, such as updating 511 and DMS messages, meant that FDOT was able to continue these traffic management operations even when the CRS was not functioning.