Motor Carrier Safety: Federal Safety Agency Identifies Many	 
High-Risk Carriers but Does Not Assess Maximum Fines as Often as 
Required by Law (28-AUG-07, GAO-07-584).			 
                                                                 
The Federal Motor Carrier Safety Administration (FMCSA) has the  
primary federal responsibility for reducing crashes involving	 
large trucks and buses. FMCSA uses its "SafeStat" tool to target 
carriers for reviews of their compliance with the agency's safety
regulations based on their crash rates and safety violations. As 
requested, this study reports on (1) the extent to which FMCSA's 
policy for prioritizing compliance reviews targets carriers with 
a high risk of crashes, (2) how FMCSA ensures compliance reviews 
are thorough and consistent, and (3) the extent to which FMCSA	 
follows up with carriers with serious safety violations. To	 
complete this work, GAO reviewed FMCSA's regulations, policies,  
and safety data and contacted FMCSA officials in headquarters and
nine field offices.						 
-------------------------Indexing Terms------------------------- 
REPORTNUM:   GAO-07-584 					        
    ACCNO:   A75228						        
  TITLE:     Motor Carrier Safety: Federal Safety Agency Identifies   
Many High-Risk Carriers but Does Not Assess Maximum Fines as	 
Often as Required by Law					 
     DATE:   08/28/2007 
  SUBJECT:   Fines (penalties)					 
	     Internal controls					 
	     Motor carriers					 
	     Motor vehicle safety				 
	     Policy evaluation					 
	     Program management 				 
	     Risk assessment					 
	     Safety regulation					 
	     Safety standards					 
	     Transportation safety				 
	     Policies and procedures				 


Report to the Chairman, Committee on Transportation and Infrastructure,
House of Representatives

United States Government Accountability Office

GAO

August 2007

MOTOR CARRIER SAFETY

Federal Safety Agency Identifies Many High-Risk Carriers but Does Not
Assess Maximum Fines as Often as Required by Law

GAO-07-584

Contents

Letter

Results in Brief
Background
FMCSA's Policy for Prioritizing Compliance Reviews Targets Many High-Risk
Carriers, but Changes to the Policy Could Target Carriers with Even Higher
Risk
FMCSA's Management of Its Compliance Reviews Promotes Thoroughness and
Consistency
FMCSA Follows Up with Many Carriers with Serious Safety Violations but
Does Not Assess Maximum Fines against All of the Serious Violators
Required by Law
Conclusions
Recommendations for Executive Action
Agency Comments
Appendix I Other Assessments of SafeStat's Ability to Identify High-Risk
Motor Carriers
Appendix II FMCSA's Crash Data Used to Compare Methods for Identifying
High-Risk Carriers
Appendix III Review of Studies on Predictors of Motor Carrier and Driver
Crash Risk
Appendix IV Scope and Methodology
Appendix V GAO Contact and Staff Acknowledgments

Tables

Table 1: SafeStat Categories
Table 2: How FMCSA Determines Carrier Safety Ratings Based on Ratings in
Six Safety Areas
Table 3: Crash Rates of Motor Carriers in Various SafeStat Categories in
the 18 Months following the June 2004 SafeStat Categorization
Table 4: Regression Model Approach Compared with Refined Prioritization
Approach and with Current SafeStat Approach
Table 5: Percentages of Compliance Reviews for Fiscal Years 2001 through
2006 That Covered Each of the Major Applicable Areas of the Safety
Regulations
Table 6: Time Elapsed before Carriers Rated Conditional Received Follow-up
Compliance Reviews, Fiscal Years 2002 through 2004, as of September 2006
Table 7: Number of Motor Carriers That Would Have Been Subject to Maximum
Fines under Various Definitions of a Pattern of Serious Violations, Fiscal
Years 2004 through 2006
Table 8: Number of Motor Carriers That Would Have Been Subject to Maximum
Fines under Two Strikes and Three Strikes Repeat Violator Policies, Fiscal
Years 2004 through 2006

Figures

Figure 1: Commercial Motor Vehicle Fatality Rate, 1975 to 2005
Figure 2: FMCSA's Safety Oversight Approach
Figure 3: Percentage of Crashes Submitted to MCMIS within 90 Days of
Occurrence, Fiscal Years 2000 through 2006

Abbreviations

FMCSA Federal Motor Carrier Safety Administration
MCMIS Motor Carrier Management Information System
PRISM Performance and Registration Information Systems Management
SafeStat Motor Carrier Safety Status Measurement System

This is a work of the U.S. government and is not subject to copyright
protection in the United States. The published product may be reproduced
and distributed in its entirety without further permission from GAO.
However, because this work may contain copyrighted images or other
material, permission from the copyright holder may be necessary if you
wish to reproduce this material separately.

United States Government Accountability Office
Washington, DC 20548

August 28, 2007

The Honorable James L. Oberstar
Chairman
Committee on Transportation and Infrastructure
House of Representatives

Dear Mr. Chairman:

About 5,500 people die each year as a result of crashes involving large
commercial trucks or buses,^1 and about 160,000 more people are injured.
These crashes may result from errors by truck, bus, or passenger vehicle
drivers, vehicle condition, and other factors. The Federal Motor Carrier
Safety Administration (FMCSA) within the U.S. Department of Transportation
shoulders the primary federal responsibility for reducing crashes,
injuries, and fatalities involving large trucks and buses. FMCSA's primary
means of preventing these crashes is to develop and enforce regulations to
help ensure that drivers and motor carriers are operating safely. FMCSA
uses several enforcement activities to ensure compliance with its safety
regulations, including detailed inspections of motor carriers' operations
at their places of business, called compliance reviews. FMCSA also funds
and oversees similar enforcement activities at the state level.

Because of resource constraints, each year FMCSA and its state partners
are able to conduct compliance reviews of only about 2 percent of the
nation's estimated 711,000 motor carriers that are subject to the federal
safety and hazardous materials regulations.^2 FMCSA targets these reviews
toward those carriers that its Motor Carrier Safety Status Measurement
System (SafeStat) identifies as having the greatest potential for being
involved in crashes and assigns these carriers to its two highest priority
categories--SafeStat categories A and B. SafeStat's assessments are based
on indicators such as crash rates and safety violations identified during
roadside inspections of vehicles and drivers and during prior compliance
reviews. To be given high priority for a compliance review, a carrier must
score among the worst 25 percent of carriers^3 in at least two of
SafeStat's four evaluation areas (the four areas are accident, driver,
vehicle, and safety management; the scores for the last three of these are
based on a carrier's violations). As a result, carriers that score poorly
in only a single area do not necessarily receive a compliance review.

^1Large trucks are those with a gross vehicle weight greater than 10,000
pounds. A bus is a motor vehicle that is used to carry more than eight
passengers (including the driver).

^2According to FMCSA, this is the number of commercial motor carriers
registered in its Motor Carrier Management Information System, as of
February 2007. It includes an unidentified number of carriers that are
registered, but are no longer in business. Furthermore, it includes only
carriers classified as interstate carriers (about 696,000 carriers) or
intrastate carriers of hazardous materials (about 15,000 carriers). For
the sake of simplicity, we refer to these carriers collectively as
"interstate carriers."

Federal law requires FMCSA to determine whether carriers are fit to
operate safely and to place those carriers that it finds unfit out of
service. Out-of-service carriers cannot come back into service until FMCSA
determines that they have corrected the conditions that rendered them
unfit. FMCSA determines safety fitness by conducting compliance reviews,
and it assigns unfit carriers a rating of "unsatisfactory." It also
requires follow-up compliance reviews on carriers that it rates
"conditional."^4 FMCSA can assess fines against carriers for violations of
the safety regulations, and federal law requires FMCSA to assess the
maximum allowable fine^5 for each serious violation^6 for those carriers
whose performance demonstrates a pattern of serious violations or
violations that are the same as or related to a previous serious violation
(we call these "repeat" violations).

You asked us to examine how FMCSA identifies and takes action against the
freight and passenger commercial motor carriers that are the most
egregious offenders of federal motor carrier safety regulations.
Accordingly, this report focuses on

^3Within each safety evaluation area, this includes only those carriers
for which FMCSA had sufficient data to calculate a value.

^4A conditional safety rating means a motor carrier, as a result of not
having adequate safety management controls, has had serious violations of
the safety regulations.

^5We use the term "fine" to refer to civil fines as opposed to criminal
fines.

^6We use the term "serious violations" to refer to acute or critical
violations. Acute violations are so severe that FMCSA will require
immediate corrective actions by a motor carrier regardless of the overall
safety status of the motor carrier. An example of an acute violation is a
carrier failing to implement an alcohol or drug testing program for
drivers. Critical violations are less severe than acute violations and
most often point to gaps in carrier management or operational controls,
such as not maintaining records of driver medical certificates.

           o the extent to which FMCSA's policy for prioritizing compliance
           reviews targets carriers that subsequently have high crash rates,
           o how FMCSA ensures that its compliance reviews are conducted
           thoroughly and consistently, and
           o the extent to which FMCSA follows up with carriers with serious
           safety violations.

           You also asked us to review other studies on SafeStat's ability to
           identify motor carriers with high crash risks and the impact of
           data quality on SafeStat's predictive ability. This report
           presents the findings on those issues from our June 2007 report.
           ^7 (See apps. I and II.) Finally, as you requested, this report
           discusses studies on predictors of motor carrier and driver crash
           risk. (See app. III.)

           In our June 2007 report, we assessed the extent to which changes
           in the SafeStat model, by using regression modeling techniques,
           could improve FMCSA's ability to identify commercial motor
           carriers that pose high crash risks. In contrast, this report
           assesses whether changes in how FMCSA prioritizes carriers for
           compliance reviews based on their scores in SafeStat's four
           evaluation areas could target carriers with higher aggregate crash
           risks.

           To determine the extent to which FMCSA's policy for prioritizing
           compliance reviews targets carriers that subsequently have high
           crash rates, we analyzed data from FMCSA's Motor Carrier
           Management Information System (MCMIS) on the June 2004 SafeStat
           assessment of carriers and on the assessed carriers' crashes in
           the 18 months (July 2004 through December 2005) following the
           SafeStat assessment.^8 We defined various groups of carriers for
           analysis, including those to which FMCSA gave high priority, as
           well as those based on alternatives to FMCSA's prioritization
           policy. We then calculated the aggregate crash rate in the 18
           months following the SafeStat assessment for each of these groups
           and compared crash rates among the various groups to determine
           whether there were any groups with substantially higher rates than
           the carriers in SafeStat categories A or B. We also talked to
           FMCSA officials about how FMCSA developed SafeStat, their views on
           other evaluations of SafeStat, and FMCSA's plans to replace
           SafeStat with a new tool.
			  
^7GAO, Motor Carrier Safety: A Statistical Approach Will Better Identify
Commercial Carriers That Pose High Crash Risks Than Does the Current
Federal Approach, GAO-07-585 (Washington, D.C.: June 11, 2007). Our
findings are summarized in the section of this report dealing with FMCSA's
policy for prioritizing compliance reviews.

^8FMCSA requires that states report crashes within 90 days. Sometimes
states report crashes late. To allow for this occurrence, we analyzed data
on crashes occurring from June 2004 through December 2005, but which may
have been reported as late as June 2006.

           To assess how FMCSA ensures that its compliance reviews are
           completed thoroughly and consistently, we identified our key
           internal control standards related to the communication of policy,
           documentation of results, and monitoring and reviewing of
           activities and findings.^9 In our view, these standards are
           critical to maintaining the thoroughness and consistency of
           compliance reviews. We gathered information on these key internal
           controls through discussions with FMCSA officials, reviews of
           policy documents and reports, and reviews of FMCSA information
           systems used to communicate policy, document findings, and review
           findings. We interviewed investigators who conduct compliance
           reviews and their managers in FMCSA's headquarters office, as well
           as in 7 of FMCSA's 52 field division offices that work with
           states, two of its four regional service centers that support
           division offices, and three state offices that partner with 3 of
           the FMCSA division offices in which we did our work.^10 The
           division offices we reviewed partner with states that received 30
           percent of the grant funds that FMCSA awarded to all states in
           fiscal year 2005 (the latest year for which data were available)
           through its primary grant program, the Motor Carrier Safety
           Assistance Program. Because we chose the seven states judgmentally
           (representing the largest grantees), we cannot project our
           findings nationwide. Reviewing a larger number of grantees would
           not have been practical due to resource constraints. We assessed
           the extent to which FMCSA conducts vehicle inspections and covers
           applicable safety regulations during compliance reviews by
           analyzing FMCSA data.

           To assess the extent to which FMCSA follows up with carriers with
           serious violations, we reviewed regulations and FMCSA policies
           directing how FMCSA must follow up and track these violators,
           analyzed data to determine if FMCSA had met these requirements,
           and held discussions with FMCSA officials. We also used data from
           MCMIS to assess the timeliness of FMCSA's follow-up compliance
           reviews. To assess FMCSA's implementation of the requirement to
           assess the maximum fine in certain cases, we compared FMCSA's
           policy with the language of the act, held discussions with FMCSA
           officials, estimated the number of carriers that could have been
           assessed the maximum fine based on different definitions of a
           "pattern" of violations, and reviewed the Department of
           Transportation Inspector General's report on the implementation of
           the policy.^11 In assessing these various areas, we used the most
           recent data available at the time we conducted our fieldwork. The
           period of analysis varies depending on the time permitted by law,
           policy, or our judgment for FMCSA's follow-up.
			  
^9GAO, Internal Control: Standards for Internal Control in the Federal
Government, GAO/AIMD-00-21.3.1 (Washington, D.C.: November 1999).

^10We did not interview managers or investigators in three of the seven
states because they do not conduct compliance reviews of interstate
carriers, and we did not interview managers or investigators in one state
because they did not respond to our attempts to contact them.

           As part of our review, we assessed internal controls and the
           reliability of FMCSA's data on motor carriers' safety history and
           its compliance review and enforcement activities that were
           pertinent to this effort. While there are known problems with the
           quality of the crash data reported to FMCSA for use in SafeStat,
           we determined that the data were sufficiently reliable for our
           use, which was to assess whether different approaches to
           prioritizing carriers could lead to better targeting of carriers
           that subsequently have high crash rates. We conducted our work
           from February 2006 through August 2007 in accordance with
           generally accepted government auditing standards. (See app. IV for
           additional information on our scope and methodology.)
			  
			  Results in Brief
			  
           By and large, FMCSA does a good job of identifying carriers that
           pose high crash risks for subsequent compliance reviews, ensuring
           the thoroughness and consistency of those reviews, and following
           up with high-risk carriers.

           FMCSA's policy for prioritizing carriers for compliance reviews
           based on their SafeStat scores leads FMCSA to conduct compliance
           reviews on many high-risk carriers but not on other higher risk
           ones. Our analysis indicates that modifications to the policy
           could result in the selection of carriers with a higher aggregate
           crash risk than those selected under the current policy. Currently,
           carriers must score among the worst 25 percent of carriers in at
           least two of SafeStat's four evaluation areas to receive high
           priority for a compliance review. Using data from FMCSA's June
           2004 SafeStat categorization, we found that the 492 carriers that
           scored among the worst 5 percent of carriers in the accident
           safety evaluation area--an area that, by itself, FMCSA gives low
           priority for compliance reviews--had an aggregate rate of
           subsequent crashes that was more than twice as high as that of the
           4,989 carriers to which FMCSA gave high priority.^12 This suggests
           that FMCSA could target a higher risk group of carriers for
           compliance reviews by changing its prioritization policy so that
           high priority is also assigned to carriers that score among the
           worst 5 percent of carriers in the accident area. We recognize
           that giving such carriers high priority for a compliance review
           would increase FMCSA's and the states' compliance review workloads
           unless FMCSA were to make an offsetting change to its prioritization
           rules that removed from the high-priority categories A and B an
           equal number of carriers with lower crash rates than the ones added.
           FMCSA officials told us that the agency plans
           to assess whether giving high priority to carriers that perform
           very poorly in the accident evaluation area alone would be an
           effective use of its resources. Furthermore, as part of a reform
           initiative aimed at improving how the agency identifies and deals
           with unsafe carriers, called the Comprehensive Safety Analysis
           2010, FMCSA is considering replacing SafeStat with a new tool by
           2010. While the new tool may use some of the same data included in
           SafeStat, such as carriers' crash rates and driver and vehicle
           violations identified during compliance reviews and roadside
           inspections, it may also consider additional information from
           crash reports, such as whether driver fatigue or a lack of driver
           experience was cited as a causal or contributing factor.
			  
^11U.S. Department of Transportation Office of Inspector General,
Significant Improvements in Motor Carrier Safety Program Since 1999 Act
but Loopholes for Repeat Violators Need Closing, Report MH-2006-046
(Washington, D.C.: Apr. 21, 2006).

           FMCSA's management of its compliance reviews meets our standards
           for internal controls, thereby promoting thoroughness and
           consistency. FMCSA records its compliance review policies and
           procedures in an electronic operations manual and distributes the
           manual to investigators and managers in FMCSA's 52 division
           offices and in the offices of its 56 state and territorial
           partners (hereafter called state partners).^13 FMCSA also provides
           training to investigators on these policies and procedures,
           including initial classroom training, on-the-job training, and ad
           hoc training on new policies and procedures. Many investigators we
           spoke with generally found both the electronic manual and the
           training to be effective means of communicating policies and
           procedures. FMCSA and state investigators use an information
           system to document the results of the compliance reviews. This
           information system supports thoroughness and consistency by
           alerting investigators if they are not following key policies or
           if data appear suspect; the system also provides managers with
           readily available data to review. Managers in the division
           offices, states, and FMCSA's service centers use monthly activity
           reports to monitor performance at the investigator level,
           including the number of reviews completed and the number and types
           of violations identified. The service centers also conduct
           triennial reviews of the compliance review activities of each
           division and state office. In 2002, FMCSA performed an agencywide
           review of its compliance review program and made several
           improvements based on the findings of this review. One such
           improvement was to discourage repeat visits to high-risk motor
           carriers that had received an unsatisfactory rating during their
           last compliance review within the past 12 months because the
           agency believed that not enough time had elapsed to show whether
           safety improvements had taken effect. For the most part, FMCSA and
           state investigators cover the nine major applicable areas of the
           safety regulations (e.g., driver qualifications and vehicle repair
           and maintenance) in 95 percent or more of compliance reviews,
           demonstrating thoroughness and consistency.
			  
^12We applied the SafeStat model to retrospective data. Because of changes
to the MCMIS crash file over the past 2 years, our number does not
correspond exactly to the number of carriers identified by FMCSA as high
risk on June 25, 2004. Had all crash data been reported within 90 days of
when the crashes occurred, 182 of the carriers identified by SafeStat as
highest risk would have been excluded (because other carriers had higher
crash risks), and 481 carriers that were not originally designated as
posing high crash risks would have scored high enough to be considered
high risk, resulting in a net addition of 299 carriers.

           FMCSA follows up with many carriers with serious safety
           violations, but it does not assess maximum fines against all of
           the serious violators that we believe the law requires. FMCSA
           followed up with almost all the 1,196 carriers that received a
           proposed safety rating of unsatisfactory following a compliance
           review that was completed in fiscal year 2005 to ensure that these
           carriers either made safety improvements that resulted in an
           upgraded final safety rating or were placed out of service. For
           example, FMCSA upgraded the safety ratings of 881 carriers
           primarily on the basis of safety improvements it identified during
           follow-up compliance reviews and reviews of documentary evidence
           of improvements submitted by carriers. FMCSA assigned a final
           rating of unsatisfactory to 312 of the remaining carriers, and
           placed 309 of them out of service. FMCSA monitors carriers to
           identify those that are violating out-of-service orders, but in
           fiscal years 2005 and 2006, it cited only 36 of 677 carriers that
           its monitoring showed had a roadside inspection or crash while
           subject to an out-of-service order. An FMCSA official told us that
           some of the 677 carriers, such as carriers that were operating
           intrastate,^14 may not have been violating the out-of-service
           order, and that FMCSA did not have enough resources to determine
           whether each of the carriers was violating the out-of-service
           order. With regard to fines against carriers, we found that while
           FMCSA assesses maximum fines against carriers that repeat a
           serious violation, it does not, as we believe federal law
           requires, assess maximum fines against carriers with a pattern of
           serious violations. The law requires FMCSA to assess maximum fines
           against carriers in both situations. The annual number of carriers
           that would be subject to maximum fines under a definition of
           pattern that is consistent with the law varies greatly depending
           on the definition--for the eight definitions that we assessed, the
           number of such carriers in fiscal year 2006 varied from 7 to
           3,348.^15 In addition, FMCSA assesses maximum fines only for the
           third instance of a violation. We read the statute as requiring
           FMCSA to assess the maximum fine if a serious violation is
           repeated once--not only after it is repeated twice.
			  
^13FMCSA partners with each of the 50 states, the District of Columbia,
and the U.S. territories of American Samoa, Guam, the Northern Marianas,
Puerto Rico, and the Virgin Islands.

           We are recommending that FMCSA (1) select carriers with very poor
           scores in the accident safety evaluation area for compliance
           reviews, regardless of their scores in the other areas; (2)
           establish reasonable time frames within which it conducts
           follow-up compliance reviews on carriers rated conditional; and
           (3) revise its implementation of the requirement to assess maximum
           fines to meet our interpretation of the applicable law. We
           provided a draft of this report to the Department of
           Transportation for its review and comment. The department did not
           offer overall comments on the draft report. It said that it would
           assess the efficacy of the first recommendation, but it did not
           comment on the other recommendations. It offered several technical
           comments, which we incorporated where appropriate.
			  
^14Except for carriers of hazardous materials, FMCSA does not have the
authority to prohibit motor carriers from operating intrastate.

^15These eight definitions were chosen to illustrate the effect of
different potential definitions of pattern.

           Background

           The interstate commercial motor carrier industry, primarily the
           trucking industry, is an important part of the nation's economy.
           Trucks transport over 11 billion tons of goods, or about 60
           percent of the total domestic tonnage shipped.^16 Buses also play
           an important role, transporting an estimated 860 million
           passengers in 2005. FMCSA estimates that there are 711,000
           interstate commercial motor carriers, about 9 million trucks and
           buses, and about 10 million drivers. Most motor carriers are
           small; about 51 percent operate one vehicle, and another 31
           percent operate two to four vehicles. Carrier operations vary
           widely in size, however, and some of the largest motor carriers
           operate upwards of 58,000 vehicles. Carriers continually enter and
           exit the industry. Since 1998, the industry has increased in size
           by an average of about 29,000 interstate carriers per year.

           In the United States, commercial motor carriers account for fewer
           than 5 percent of all highway crashes, but these crashes result in
           about 13 percent of all highway deaths, or about 5,500 of the
           approximately 43,000 highway fatalities that occur nationwide
           annually. In addition, on average, about 160,000 of the
           approximately 3.2 million highway injuries per year involve motor
           carriers. The fatality rate for trucks has generally decreased
           over the past 30 years but has been fairly stable since 2002. The
           fatality rate for buses decreased slightly from 1975 to 2005, but
           it has more annual variability than the fatality rate for trucks
           due to a much smaller total number of vehicle miles traveled. (See
           fig. 1.)

^16This figure is from 2002, the most recent year for which data are
available.

           Figure 1: Commercial Motor Vehicle Fatality Rate, 1975 to 2005

           Notes: Fewer buses are involved in fatal or nonfatal accidents
           than large trucks, but bus accidents tend to involve more people.

           The latest year for which data were available was 2005.

           In an attempt to reduce the number and severity of crashes
           involving large trucks, FMCSA was established by the Motor Carrier
           Safety Improvement Act of 1999. FMCSA assumed almost all of the
           responsibilities and personnel of the Federal Highway
           Administration's Office of Motor Carriers. The agency's primary
           mission is to reduce the number and severity of crashes involving
           large trucks and buses. It carries out this mission by (1)
           issuing, administering, and enforcing federal motor carrier safety
           regulations and hazardous materials regulations; (2) providing
           education and outreach for motor carriers and drivers on the
           safety regulations and hazardous materials regulations; (3)
           gathering and analyzing data on motor carriers, drivers, and
           vehicles; (4) developing information systems to improve the
           transfer of data; and (5) researching new methods and technologies
           to enhance motor carrier safety.

           FMCSA relies heavily on the results of compliance reviews to
           determine whether carriers are operating safely and, if not, to
           take enforcement action against them. (See fig. 2.) FMCSA conducts
           these on-site reviews to determine carriers' compliance with
           safety regulations that address areas such as testing drivers for
           alcohol and drugs, insurance coverage, crashes, driver
           qualifications, driver hours of service, vehicle maintenance and
           inspections, and transportation of hazardous materials. Due to
           resource constraints, FMCSA and its state partners are able to
           conduct compliance reviews on only about 2 percent of the nation's
           estimated 711,000 interstate motor carriers each year. It is
           FMCSA's policy to target these reviews at carriers that have been
           assessed by SafeStat as having the highest risk of crashes, have
           been the subject of a safety-related complaint submitted to FMCSA,
           have been involved in a fatal accident, have requested an upgraded
           safety rating based on safety improvements, or have been assigned
           a safety rating of conditional following a previous compliance
           review.

Figure 2: FMCSA's Safety Oversight Approach

Based largely on the number and severity of violations that it identifies
during compliance reviews, FMCSA assigns carriers safety ratings that
determine whether they are allowed to continue operating. FMCSA can take a
range of enforcement actions against carriers with violations, including

           o issuing notices of violation informing carriers of identified
           violations and indicating that additional enforcement action may
           be taken if the violations are not corrected;
           o issuing compliance orders directing carriers to perform certain
           actions that FMCSA considers necessary to bring the carrier into
           compliance with regulations;
           o assessing fines for violations of the safety regulations; fines
           require carriers to pay a specific dollar amount to FMCSA;
           o placing carriers or drivers out of service for unsatisfactory
           safety performance, failure to pay a fine, or imminently hazardous
           conditions or operations;
           o revoking the operating authority of carriers for failure to
           carry the required amount of insurance coverage;
           o pursuing criminal penalties in some instances when knowing and
           willful violations can be proved; and
           o seeking injunctions from a court for violations of a final order
           such as an out-of-service order.

           FMCSA has 52 division offices that partner with the 56 recipients
           of its Motor Carrier Safety Assistance Program grants. FMCSA also
           funds and oversees enforcement activities, including compliance
           reviews, at the state level through this grant program. The
           program was appropriated $188 million, or about 38 percent, of
           FMCSA's $501 million appropriation for fiscal year 2006. In fiscal
           year 2006, FMCSA conducted 9,719 compliance reviews, and its state
           partners conducted 5,463 compliance reviews.

           SafeStat assesses carriers' risks relative to all other carriers
           based on safety indicators such as their crash rates and safety
           violations identified during roadside inspections and during prior
           compliance reviews. A carrier's score is calculated on the basis
           of its performance in the following four safety evaluation areas:

           o The accident area reflects a carrier's crash history relative to
           other motor carriers based on data from states and MCMIS.
           o The driver area reflects a carrier's driver-related safety
           performance and compliance relative to other motor carriers based
           on driver violations identified during roadside inspections and
           compliance reviews.
           o The vehicle area reflects a carrier's vehicle-related safety
           performance and compliance relative to other motor carriers based
           on vehicle-related violations identified during roadside
           inspections and compliance reviews.
           o The safety management area reflects the carrier's safety
           management performance relative to other motor carriers based on
           safety-management-related violations (such as failing to implement
           a drug or alcohol testing program) and hazardous-materials-related
           violations identified during compliance reviews and on closed
           enforcement cases resulting from compliance reviews.

           A motor carrier's score is based on the carrier's relative
           ranking, indicated as a value, in each of the four safety
           evaluation areas. This value can range from 0 to 100 in each area,
           and any value of 75 or greater is considered deficient. Any value
           of less than 75 is not considered deficient and is not used in
           calculating a SafeStat score. FMCSA assigns categories to carriers
           ranging from A to H according to their performance in each of the
           safety evaluation areas. (See table 1.) Although a carrier may
           receive a value in any of the four safety evaluation areas, the
           carrier receives a SafeStat score only if it is deficient in two
           or more safety evaluation areas. The calculation used to determine
           a motor carrier's SafeStat score is

           SafeStat score = 2 x accident value + 1.5 x driver value + vehicle
           value + safety management value

           As shown in the formula, the accident and driver areas have 2.0
           and 1.5 times the weight, respectively, of the vehicle and safety
           management areas. FMCSA assigned more weight to these areas
           because accidents and driver violations correlate relatively
           better with future crash risk. In consultation with state
           transportation officials, insurance industry representatives,
           safety advocates, and the motor carrier industry, FMCSA used its
           expert judgment and professional knowledge to assign these
           weights, rather than determining them through a statistical
           approach, such as regression modeling.

Table 1: SafeStat Categories

                                                            Priority for      
Category Condition                                       compliance review 
Deficient in two or more areas                                             
A        Deficient in all four safety evaluation areas   High              
            or deficient in three safety evaluation areas                     
            that result in a weighted SafeStat score of 350                   
            or more                                                           
B        Deficient in three safety evaluation areas that High              
            result in a weighted SafeStat score of less                       
            than 350 or deficient in two safety evaluation                    
            areas that result in a weighted SafeStat score                    
            of 225 or more                                                    
C        Deficient in two safety evaluation areas that   Medium            
            result in a weighted SafeStat score of less                       
            than 225                                                          
Deficient in one area only                                                 
D        Deficient in the accident safety evaluation     Low               
            area (area value between 75-100)                                  
E        Deficient in the driver safety evaluation area  Low               
            (area value between 75-100)                                       
F        Deficient in the vehicle safety evaluation area Low               
            (area value between 75-100)                                       
G        Deficient in the safety management safety       Low               
            evaluation area (area value between 75-100)                       
Not deficient in any area                                                  
H        Not deficient in any of the safety evaluation   Low               
            areas                                                             

Source: GAO summary of FMCSA data.
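
The weighted formula and the category definitions in table 1 can be expressed compactly in code. The sketch below is our illustration of those published rules, not FMCSA's production SafeStat software; the area names and function interface are ours.

    # Minimal sketch of the SafeStat scoring and categorization rules above
    # (an illustration of table 1, not FMCSA's production SafeStat code).
    # Area values range from 0 to 100; a value of 75 or more is "deficient,"
    # and only deficient values enter the weighted score.
    WEIGHTS = {"accident": 2.0, "driver": 1.5, "vehicle": 1.0, "safety_mgmt": 1.0}
    SINGLE_AREA_CATEGORY = {"accident": "D", "driver": "E",
                            "vehicle": "F", "safety_mgmt": "G"}

    def safestat_category(values):
        """Return (category, score); score is None when no score is computed."""
        deficient = {area: v for area, v in values.items() if v >= 75}
        if not deficient:
            return "H", None                              # not deficient anywhere
        if len(deficient) == 1:
            return SINGLE_AREA_CATEGORY[next(iter(deficient))], None
        score = sum(WEIGHTS[area] * v for area, v in deficient.items())
        if len(deficient) == 4 or (len(deficient) == 3 and score >= 350):
            return "A", score                             # high priority
        if len(deficient) == 3 or score >= 225:
            return "B", score                             # high priority
        return "C", score                                 # medium priority

    # Deficient in accident (90) and driver (80): 2 x 90 + 1.5 x 80 = 300 -> B
    print(safestat_category({"accident": 90, "driver": 80,
                             "vehicle": 40, "safety_mgmt": 10}))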

Based on the results of a compliance review, FMCSA assigns the carrier a
safety rating of satisfactory, conditional, or unsatisfactory. The safety
rating, which is distinct from a carrier's SafeStat category, reflects
FMCSA's determination of a carrier's fitness to operate safely. FMCSA
issues out-of-service orders to carriers rated unsatisfactory, and these
carriers are not allowed to resume operating until they make improvements
that result in an upgraded safety rating. Carriers rated conditional are
allowed to continue operating, but FMCSA aims to conduct follow-up
compliance reviews on these carriers. FMCSA assigns safety ratings based
on a carrier's performance in six areas. (See table 2.) One area is the
carrier's accident rate, and the other five areas involve its compliance
with regulations. The five regulation-based areas are (1) minimum
insurance coverage and procedures for handling and evaluating accidents;
(2) drug and alcohol use and testing, commercial driver's license
standards, and driver qualifications; (3) driver hours of service; (4)
vehicle parts and accessories necessary for safe operation; inspection,
repair, and maintenance of vehicles; and (5) transportation of hazardous
materials.

Table 2: How FMCSA Determines Carrier Safety Ratings Based on Ratings in
Six Safety Areas

A carrier receives a          if it receives this          and this number of
safety rating of              number of unsatisfactory     conditional safety
                              safety area ratings          area ratings
Satisfactory                  0                             2 or fewer
Conditional                   0                             more than 2
Conditional                   1                             2 or fewer
Unsatisfactory                1                             more than 2
Unsatisfactory                2 or more                     0 or more

Source: GAO presentation of FMCSA information.
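
Expressed as a simple decision rule, table 2 reduces to the sketch below. It is our restatement of the table for illustration, not FMCSA's rating software.

    # Minimal sketch of table 2 (illustration only): the overall safety rating
    # depends on how many of the six safety areas are rated unsatisfactory and
    # how many are rated conditional.
    def overall_safety_rating(unsatisfactory, conditional):
        if unsatisfactory >= 2:
            return "Unsatisfactory"
        if unsatisfactory == 1:
            return "Unsatisfactory" if conditional > 2 else "Conditional"
        return "Conditional" if conditional > 2 else "Satisfactory"

    # One unsatisfactory area rating and two conditional area ratings -> Conditional
    print(overall_safety_rating(unsatisfactory=1, conditional=2))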

Regardless of a carrier's safety rating, FMCSA can assess a fine against a
carrier with violations, and it is more likely to assess higher fines when
these violations are serious. FMCSA uses a tool to help it determine the
dollar amounts of its fines. Federal law requires FMCSA to assess the
maximum allowable fine against a carrier for each serious violation of
federal motor carrier safety and commercial driver's license laws if the
carrier is found to have a pattern of such violations or a record of
previously committing the same or a related serious violation.

FMCSA's Policy for Prioritizing Compliance Reviews Targets Many High-Risk
Carriers, but Changes to the Policy Could Target Carriers with Even Higher Risk

SafeStat identifies many carriers that pose high crash risks.^17 However,
modifying FMCSA's policy, under which carriers must score among the
worst 25 percent of carriers in two or more safety evaluation areas to
receive high priority for a compliance review, to focus more on crash
risk could result in the selection of carriers with a higher aggregate
crash risk.^18 FMCSA recognizes that SafeStat can be improved, and as part
of its Comprehensive Safety Analysis 2010 reform initiative, which is
aimed at improving its processes for identifying and dealing with unsafe
carriers, the agency is considering replacing SafeStat with a new tool by
2010.

FMCSA's Policy for Prioritizing Compliance Reviews Leads the Agency to Conduct
Compliance Reviews on Many High-Risk Carriers but Not on Other Higher Risk Ones

FMCSA's policy for prioritizing carriers for compliance reviews based on
their SafeStat scores results in FMCSA's conducting compliance reviews on
carriers with a higher aggregate crash risk than carriers that are not
selected. As a result, FMCSA's prioritization policy has value as a method
for targeting high-risk carriers. But changes to the policy could result
in targeting carriers with an even higher aggregate crash risk. According
to our analysis of SafeStat's June 2004 categorization of carriers, the
4,989 carriers that received high priority for a compliance review
(SafeStat categories A or B) had a higher aggregate crash risk (102
crashes per 1,000 vehicles in the 18 months following the SafeStat
categorization) than the remaining 617,034 carriers (27 crashes per 1,000
vehicles). (See table 3.) However, the 2,464 carriers that scored among
the worst 25 percent of carriers in the accident evaluation area alone
(SafeStat category D) had a slightly higher aggregate crash risk (112
crashes per 1,000 vehicles) than did the carriers in SafeStat categories A
or B. Furthermore, the 1,090 carriers that scored among the worst 10
percent and the 492 carriers that scored among the worst 5 percent of
carriers in the accident area (and did not score among the worst 25
percent of carriers in any other area) had even higher aggregate rates of
148 and 213 crashes per 1,000 vehicles, respectively.

^17We found that SafeStat is about twice as effective in identifying these
high-risk carriers as randomly selecting them for compliance reviews.
See GAO-07-585.

^18We are defining "crash risk" as the number of crashes for the carrier
per 1,000 vehicles in the 18 months following the SafeStat categorization.
By "aggregate" crash risk, we mean the total number of crashes for all
carriers in the group per 1,000 vehicles in the 18 months following the
SafeStat categorization.

Table 3: Crash Rates of Motor Carriers in Various SafeStat Categories in
the 18 Months following the June 2004 SafeStat Categorization

SafeStat                                   Crash  Priority for         Number of
category(ies)    Description              rate^a  compliance review    motor carriers^b
A                Deficient in three or       107  High                 631 
                    four safety evaluation                                    
                    areas; SafeStat score                                     
                    350 or more                                               
B                Deficient in two or         101  High               4,358 
                    three safety evaluation                                   
                    areas; SafeStat score                                     
                    225 or more, and less                                     
                    than 350                                                  
Subtotal A+B     See above                   102  High               4,989 
C                Deficient in two safety      48  Medium             3,683 
                    evaluation areas;                                         
                    SafeStat score less than                                  
                    225                                                       
D                Accident safety             112  Low                2,464 
                    evaluation area value 75                                  
                    or more                                                   
Subset of D      Accident safety             148  Low                1,090 
                    evaluation area value 90                                  
                    or more                                                   
Subset of D      Accident safety             213  Low                  492 
                    evaluation area value 95                                  
                    or more                                                   
All categories                                27  Medium or low    617,034 
other than A and                                                           
B                                                                          

Source: GAO analysis of FMCSA data.

^aCrash rates are crashes per 1,000 vehicles in the 18 months following
the June 2004 SafeStat categorization. As discussed in appendix IV, we
used data from FMCSA's June 2004 SafeStat categorization because these
were the latest available data that we could use at the time of our
analysis to obtain relatively complete data on carriers' numbers of
crashes in the 18 months following the categorization.

^bThe table includes only those carriers listed as having one or more
vehicles.
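
The aggregate crash rates in table 3 are, as defined in footnote 18, total crashes for all carriers in a group per 1,000 vehicles over the 18-month window. The sketch below illustrates that calculation on a toy carrier-level table; the column names and values are hypothetical, not FMCSA data.

    # Minimal sketch of the aggregate crash-rate calculation behind table 3
    # (hypothetical toy data, not FMCSA's MCMIS records).
    import pandas as pd

    carriers = pd.DataFrame({
        "safestat_category": ["A", "B", "B", "D", "D"],
        "vehicles":          [20,  10,   5,   8,   2],
        "crashes_18mo":      [ 3,   1,   0,   2,   1],
    })
    # As in table 3, keep only carriers listed with at least one vehicle.
    carriers = carriers[carriers["vehicles"] >= 1]

    grouped = carriers.groupby("safestat_category")[["crashes_18mo", "vehicles"]].sum()
    grouped["crashes_per_1000_vehicles"] = (
        1000 * grouped["crashes_18mo"] / grouped["vehicles"]
    )
    print(grouped)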

Our analysis suggests that FMCSA's targeting of high-risk carriers could
be enhanced by giving high priority for a compliance review to carriers
that score among the worst 25, 10, or 5 percent of carriers in the
accident evaluation area alone. We recognize that giving such carriers
high priority for a compliance review would increase FMCSA's and the
states' compliance review workloads unless FMCSA were to make another
change to its prioritization policy that resulted in removing the same
number of carriers from the high-priority categories A and B.^19 For
example, if FMCSA had given high priority to the 492 carriers that scored
among the worst 5 percent of carriers in the accident evaluation area in
June 2004, it could have removed the 492 carriers in categories A or B
with the lowest SafeStat score in order to hold its and the states'
compliance review workloads constant. The lowest-scoring carriers in
categories A and B had an aggregate crash risk of 65 crashes per 1,000
vehicles, less than one-third the crash risk of the carriers that could
have replaced them (214 crashes per 1,000 vehicles).

^19To give a sense of FMCSA's and the states' compliance review workload,
in fiscal year 2006, FMCSA and the states conducted 15,182 compliance
reviews; about half of these were on carriers that were in SafeStat
categories A or B.

We also found that carriers that scored among the worst 25 percent, 10
percent, or 5 percent of carriers in either the driver, vehicle, or safety
management areas (and did not score among the worst 25 percent of carriers
in any other area) had a lower aggregate crash risk than carriers in
SafeStat categories A or B. Of these various groups of carriers with poor
performance in a single area, the carriers that scored among the worst 10
percent of carriers in the driver area had the highest aggregate crash
risk (70 crashes per 1,000 vehicles).

A Regression Model Performs Better Than the Current SafeStat Model and the
Prioritization Approach We Developed

In our June 2007 report, we estimated that FMCSA could improve SafeStat's
performance by about 9 percent by using a statistical regression model
approach to weight the accident, driver, vehicle, and safety management
evaluation areas instead of its current approach, which is based on expert
judgment.^20 Employing this approach would have allowed FMCSA to identify
carriers with almost twice as many crashes in the following 18 months as
those carriers identified under its current approach. We found that
although the driver, vehicle, and safety management evaluation area scores
are correlated with the future crash risk of a carrier, the accident
evaluation area correlates the most with future crash risk and should be
weighted more heavily than the current SafeStat formula weights this area.
These results corroborate studies performed by the Volpe National
Transportation Systems Center and Oak Ridge National Laboratory, the
latter of which also employed statistical approaches. (See app. I for a
discussion of these studies.)
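
GAO-07-585 describes the regression approach in detail. As one illustration of how such a model might be specified, the sketch below fits a negative binomial regression of 18-month crash counts on the four evaluation-area values, using fleet size as the exposure term, and then ranks carriers by predicted crash rate. It is a sketch under assumed column names and model form, not the specification used in our June 2007 report.

    # Minimal sketch (assumed column names and model form, not GAO's actual
    # specification): let the data, rather than expert judgment, weight the
    # four evaluation areas when predicting future crashes.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    carriers = pd.read_csv("carrier_history.csv")       # hypothetical input file

    X = sm.add_constant(carriers[["accident_value", "driver_value",
                                  "vehicle_value", "safety_mgmt_value"]])
    exposure = np.log(carriers["vehicles"])              # fleet size as exposure
    model = sm.GLM(carriers["crashes_18mo"], X,
                   family=sm.families.NegativeBinomial(),
                   offset=exposure).fit()

    # The fitted coefficients play the role of the expert-judgment weights in
    # the current SafeStat formula; rank carriers by predicted crashes per vehicle.
    carriers["predicted_crashes"] = model.predict(X, offset=exposure)
    carriers["predicted_rate"] = carriers["predicted_crashes"] / carriers["vehicles"]
    high_priority = carriers.nlargest(4989, "predicted_rate")   # June 2004 A+B workload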

We believe that our regression model approach from our June 2007 report is
preferable to the prioritization approach we developed in this report
because it provides for a systematic assessment of the relative
contributions of accidents and driver, vehicle, and safety management
violations. That is, by its very nature, the regression model approach
looks for the "best fit" in identifying the degree to which prior
accidents and driver, vehicle, and safety management violations identify
the likelihood of carriers having crashes in the future, compared with the
current SafeStat approach and the prioritization approach we developed for
this report, both of which use expert judgment to establish the
relationship among the four evaluation areas. In addition, because the
regression model could be run monthly--as is the current SafeStat
model--any change in the degree to which accidents and driver, vehicle,
and safety management violations better identify future crashes will be
automatically considered as different weights are assigned to the four
evaluation areas. This is not the case with the current SafeStat model, in
which the evaluation area weights generally remain constant over time.^21
Thus, the systematic assessment and the automatic updating of evaluation
area weights using a regression model approach better ensure the targeting
of carriers that pose high crash risks--both currently and in the future.

^20 [51]GAO-07-585.

We compared the performance of our regression model approach to the
current SafeStat model and to two alternative approaches that employ the
current SafeStat model approach (with the current weighting of evaluation
areas) but give higher priority to some carriers in category D (carriers
that scored among the worst 25 percent of carriers in only the accident
evaluation area). The two alternatives were substituting carriers in the
worst 5 percent of the accident evaluation area for carriers in SafeStat
categories A and B with (1) the lowest accident area scores and (2) the
lowest overall SafeStat numerical scores.^22 The regression model approach
performed better than the current SafeStat approach and at least as well
as the alternatives discussed in this report, in terms of identifying
carriers that experienced a higher aggregate crash rate or a greater
number of crashes. (See table 4.) For example, the regression model
approach identified carriers with an average of 111 crashes per 1,000
vehicles over an 18-month period compared with the current SafeStat
approach that identified carriers for compliance reviews with an average
of 102 crashes per 1,000 vehicles. The regression model approach also
performed at least as well as the alternatives discussed in this report in
terms of identifying carriers with the highest aggregate crash rate and
much better than the alternatives in identifying carriers with the
greatest number of crashes. Finally, the alternatives discussed in this
report were superior to the results of FMCSA's current prioritization
policy in terms of identifying carriers with both a higher aggregate crash
rate and a greater number of crashes.

^21The weights on the safety evaluation areas have remained unchanged
since September 1999, when the weight on the driver area was increased
from 1.0 to 1.5.

^22These alternatives are for use as examples only. FMCSA could choose
other cut points, such as carriers in the worst 10, 15, or 20 percent of
the accident evaluation area. Our analyses show that these other
alternatives provided superior results to the current SafeStat approach.

Table 4: Regression Model Approach Compared with Refined Prioritization
Approach and with Current SafeStat Approach

                                                          Number of crashes
Approach                                      Crash rate^a    in 18 months
Regression model approach                            111.4          19,580
Refined prioritization approach alternative 1:
  substitute SafeStat category D (accident)
  carriers for category A and B carriers with
  the lowest overall SafeStat scores                 111.0          10,682
Refined prioritization approach alternative 2:
  substitute SafeStat category D (accident)
  carriers for category A and B carriers with
  the lowest accident area scores                    107.8          10,887
Current SafeStat approach                            102.2          10,076

Source: GAO analysis of FMCSA data.

Note: The relationship between the number of crashes and the crash rate is
not linear because the different analyses identified carriers with
different fleet sizes as posing a high crash risk.

^aCrash rates are crashes per 1,000 vehicles in the 18 months following
the June 2004 SafeStat categorization.
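
As a purely arithmetic illustration of the note above, the short Python
sketch below uses hypothetical numbers to show why a group of carriers can
have a higher crash rate per 1,000 vehicles yet fewer total crashes than
another group with larger fleets.

# Illustrative arithmetic only; the fleet sizes and crash counts are
# hypothetical, not FMCSA data.
small_fleet_crashes, small_fleet_vehicles = 1_100, 10_000
large_fleet_crashes, large_fleet_vehicles = 2_000, 20_000

# Crash rate, as in table 4: crashes per 1,000 vehicles over 18 months.
rate_small = 1000 * small_fleet_crashes / small_fleet_vehicles  # 110.0
rate_large = 1000 * large_fleet_crashes / large_fleet_vehicles  # 100.0

# The smaller-fleet group has the higher rate but the lower crash count.
print(rate_small, small_fleet_crashes)
print(rate_large, large_fleet_crashes)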

FMCSA officials told us that the agency plans to assess whether the
approach developed in this report--giving high priority to carriers that
perform very poorly in only the accident evaluation area (such as those
that scored among the worst 5 percent)--would be an effective use of its
resources. However, FMCSA officials expressed concern that adopting our
regression model approach would reduce the effectiveness of FMCSA's
compliance review program by targeting many compliance reviews at carriers
that, despite high crash rates, have good compliance records. FMCSA
believes that compliance reviews of such carriers, compared with
compliance reviews of carriers in SafeStat categories A or B (carriers
that, by definition, have a history of noncompliance), have less potential
to reduce accidents. FMCSA said that this is because compliance reviews
are designed to reduce crashes by identifying safety violations that some
carriers then correct, and compliance reviews of carriers with good
compliance records but high crash rates have historically identified fewer
serious violations than compliance reviews of carriers in SafeStat
categories A and B. FMCSA officials told us that, as part of its
Comprehensive Safety Analysis 2010 reform initiative, the agency is
evaluating the potential for new ways to address motor carriers that are
having crashes, but that it believes are not good candidates for the
compliance review tool. (See the discussion on FMCSA's Comprehensive
Safety Analysis 2010 reform initiative in a subsequent section.)

We agree with FMCSA that the use of our model could tilt enforcement
heavily toward carriers with high crash rates and away from carriers with
compliance problems. Nevertheless, we believe that use of the model would
enhance motor carrier safety, even if it resulted in FMCSA reviewing
carriers with good compliance records. FMCSA's mission--and the ultimate
purpose of
compliance reviews--is to reduce the number and severity of truck and bus
crashes. As previously discussed, we found that while driver, vehicle, and
safety management evaluation area scores are correlated with the future
crash risk of a carrier, high crash rates are a stronger predictor of
future crashes than is poor compliance with safety regulations. These
facts suggest that FMCSA would improve motor carrier safety more by
targeting carriers with high crash rates, even if they have better
compliance records, than by targeting carriers in SafeStat categories A
and B with significantly lower crash rates but with worse compliance
records. The missing piece in the puzzle is that FMCSA does not have a
good understanding of why some carriers, despite good compliance records,
have high crash rates; how compliance reviews affect their crash rates;
and what other approaches may be effective in reducing their crash rates.
We believe that developing this understanding would be a natural outgrowth
of implementing our regression model approach.

FMCSA officials also said that placing more emphasis on the accident
evaluation area would increase emphasis on the least reliable type of data
used by SafeStat--crash data--and in so doing, it would increase the
sensitivity of the results to crash data quality issues. However, our June
2007 report found that FMCSA has made a considerable effort to improve the
reliability of crash data. That report also concluded that as FMCSA
continues its efforts to have states improve crash data, any sensitivity
of results from our regression model approach to crash data quality issues
should diminish.

FMCSA officials were also concerned that our issuing two reports on
SafeStat within several months of each other could be interpreted as an
indictment of SafeStat and of FMCSA's responsiveness to our June 2007
report on this issue. This is not the case. SafeStat does a good job of
identifying carriers that pose high crash risks. As we reported in June
2007, we found that SafeStat is nearly twice as effective as random
selection (83 percent better) in identifying carriers that pose high crash
risks and, therefore, has value for improving safety. Nonetheless,
we found that FMCSA's policy for prioritizing compliance reviews could be
improved by applying either our regression model approach or one of the
prioritization approaches we developed in this report. While we believe
that the regression model approach provides somewhat better safety
results, we understand, as discussed in our June 2007 report, that it
could require FMCSA to re-educate the motor carrier industry and others,
such as safety advocates, insurers, and the public, about the new
approach. We would prefer that FMCSA implement our recommendation that it
use our regression model approach, but adopting either that approach or one
of the prioritization approaches we developed in this report would, in our
opinion, improve FMCSA's targeting of high-risk
carriers. The recommendation that we make in this report reflects this
conclusion. Finally, FMCSA has been very helpful and responsive during
both our--largely concurrent--reviews.

FMCSA Has Acted to Address Data Quality Problems That Potentially Hinder
SafeStat's Ability to Identify High-Risk Carriers

For our June 2007 report, we assessed the quality of the data used by
SafeStat and the degree to which the quality of the data affects
SafeStat's identification of high-risk carriers, and we identified actions
FMCSA has taken to improve the quality of the data used by SafeStat. We
found that crash data reported by the states from December 2001 through
June 2004 have problems in terms of timeliness, accuracy, and completeness
that potentially hinder FMCSA's ability to identify high-risk carriers.
Regarding timeliness, we found that late-reported data had a small impact
on SafeStat. Had all crash data been reported within 90 days of when the
crashes occurred, 182 of the carriers identified by SafeStat as highest
risk would have been excluded (because other carriers had higher crash
risks), and 481 carriers that were not originally designated as posing high
crash risks would have scored high enough to be considered high risk. The
net effect would have been the addition of 299 carriers (or 6 percent) to
the original 4,989 carriers that the SafeStat model ranked as highest risk
in June 2004. We were not able to quantify the effect of incomplete or
inaccurate data on SafeStat's ability to identify carriers that pose high
crash risks, because doing so would have required us to gather crash
records at the state level--an effort that was impractical. FMCSA has
acted to improve the quality of SafeStat's data by completing a
comprehensive plan for data quality improvement, implementing an approach
to correct inaccurate data, and providing grants to states for improving
data quality, among other things. We could not quantify the effects of
FMCSA's efforts to improve the completeness or accuracy of the data for
the same reason as just mentioned. (See app. II for a more detailed
discussion of the quality of the data used by SafeStat.)
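
The net effect of late reporting described above can be verified with the
simple calculation sketched below; the figures are taken from the text, and
the calculation itself is only a worked check.

# Worked check of the timeliness figures reported above.
originally_identified = 4_989  # carriers SafeStat ranked highest risk, June 2004
excluded_if_timely = 182       # would have dropped out with 90-day reporting
added_if_timely = 481          # would have scored high enough to be added

net_addition = added_if_timely - excluded_if_timely          # 299 carriers
percent_change = 100 * net_addition / originally_identified  # about 6 percent
print(net_addition, round(percent_change, 1))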

FMCSA Is Considering Replacing SafeStat with a New Tool by 2010

As part of its Comprehensive Safety Analysis 2010, a reform initiative
aimed at improving its processes for identifying and dealing with unsafe
carriers and drivers, FMCSA is considering replacing SafeStat with a new
tool by 2010. The new tool could take on greater importance in FMCSA's
safety oversight framework because the agency is considering using the
tool's assessments of carriers' safety to determine whether carriers are
fit to continue operating.^23 In contrast, SafeStat's primary use now is
in prioritizing carriers for compliance reviews, and determinations of
operational fitness are made only after the compliance reviews are
completed.

While the new tool may use some of the same data included in SafeStat,
such as carriers' crash rates and driver and vehicle violations identified
during compliance reviews and roadside inspections, it may also consider a
broader range of behavioral data related to crashes than does SafeStat.
For example, the new tool may consider information from crash reports,
such as whether driver fatigue, a lack of driver experience, a medical
reason, a mechanical failure, shifting loads, or spilled or dropped cargo
was cited as a causal or contributing factor. An FMCSA official told us
that the agency is analyzing the relationship between these factors and
crash rates to help it determine how the factors should be assessed and
the relative weights to place on the factors. We believe that, compared
with the expert-judgment-based approach that FMCSA used to select the
weights for SafeStat's evaluation areas, this analytical approach has the
potential to better identify high-risk carriers.

^23Based on results from its 2006 study of the causes of large truck
crashes, which indicated that driver behavior rather than vehicle
condition was the primary reason for most crashes, FMCSA also plans to
develop a tool to assess the safety status of individual drivers, along
with tools for dealing with unsafe drivers.

FMCSA's Management of Its Compliance Reviews Promotes Thoroughness and
Consistency

FMCSA manages its compliance reviews in a fashion that meets our standards
for internal control, thereby promoting thoroughness and consistency in
the reviews.^24 For example, it records its policies and procedures
related to compliance reviews in an operations manual. FMCSA also provides
investigators with classroom and on-the-job training on how to plan for
and conduct compliance reviews. In addition, it employs an information
system that documents the results of compliance reviews and allows FMCSA
and state managers to review the compliance reviews for thoroughness,
accuracy, and consistency. FMCSA uses several approaches to monitor its
compliance review program, including an agencywide review in 2002 that led
to several changes in the program.

FMCSA Communicates Its Compliance Review Policies and Procedures through an
Electronic Manual and Training

FMCSA's communication of its policies and procedures related to conducting
compliance reviews meets our standards for internal control. These
standards state that an organization's policies and procedures should be
recorded and communicated to management and others within the entity who
need them, in a form (e.g., clearly written and provided as a paper or
electronic manual) and within a time frame that enables them to carry out
their responsibilities. FMCSA records and communicates its policies and
procedures electronically through its "Field Operations Training Manual"
(hereafter called the operations manual), which it provides to all federal
and state investigators and their managers. The operations manual includes
guidance on how to prepare for a compliance review. For example, it tells
investigators that they must download and review a report that includes
information on the carrier's accidents, drivers, and inspections, and it
explains how this information can help the investigator focus the
compliance review. It also specifies the minimum number of driver and
vehicle maintenance records to be examined and the minimum number of
vehicle inspections to be conducted during a compliance review. FMCSA aims
to update its operations manual twice a year. It posts updates to the
operations manual that automatically download to investigators and
managers when they connect to the Internet. In between these updates,
FMCSA communicates policy changes by e-mail.

^24See [52]GAO/AIMD-00-21.3.1. In assessing the extent to which FMCSA's
management of its compliance reviews is consistent with our standards for
internal control, we were not able to verify the statements made by FMCSA and
state officials and investigators about their performance and management
of compliance reviews because doing so was not practicable given our time
and resource constraints.

In addition to the operations manual, FMCSA provides training to
investigators on its policies and procedures related to compliance
reviews. FMCSA policy requires that investigators successfully complete
classroom training and examinations before they conduct a compliance
review. The training covers the safety and hazardous materials regulations
and software tools used during compliance reviews. According to FMCSA
officials, investigators then receive on-the-job training, which allows
them to accompany an experienced investigator during compliance reviews.
This training lasts until managers decide that the trainees are ready to
complete a compliance review on their own, typically after 3 to 6 months
on the job. Investigators can also take additional classroom training on
specialized topics throughout their careers. Furthermore, according to
FMCSA officials, FMCSA's division offices hold periodic and ad hoc
meetings to train investigators about policy changes related to compliance
reviews. In addition, in commenting on a draft of this report, FMCSA noted
that it has an annual safety investigator certification process to ensure
that only qualified personnel conduct compliance reviews.

FMCSA Investigators Use an Information System to Document the Results of
Compliance Reviews

FMCSA's documentation of compliance reviews meets our standards for
internal control. These standards state that all transactions and other
significant events should be clearly and promptly documented, and the
documentation should be readily available for examination. This applies to
the entire process or life cycle of a transaction or event from the
initiation and authorization through its final classification in summary
records. The standards also state that control activities, including
reviews of information and system edit checks, should help to ensure that
all transactions are completely and accurately recorded. FMCSA and state
investigators use an information system to document the results of their
compliance reviews, including information on crashes and any violations of
the safety regulations that they identify. This documentation is readily
available to FMCSA managers, who told us that they review it to help
ensure completeness and accuracy. FMCSA officials told us that the
information system also helps ensure thoroughness and consistency by
prompting investigators to follow FMCSA's policies and procedures, such as
requirements to meet a minimum sample size. The information system also
includes checks for consistency and reasonableness and prompts
investigators when the information they enter appears to be inaccurate. An
FMCSA manager told us that managers typically assess an investigator's
thoroughness by comparing the investigator's rate of violations identified
over the course of several compliance reviews with the average rate for
investigators in their division office; a rate that is substantially below
the average suggests insufficient thoroughness. Generally, FMCSA and state
investigators and managers said they found the information system to be
useful.
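
A minimal sketch of the kind of comparison the FMCSA manager described to
us appears below: it flags investigators whose rate of violations
identified per compliance review falls well below their division office's
average. The 50 percent threshold, the data, and the function name are our
assumptions for illustration, not FMCSA's actual criteria.

# Illustrative sketch: flag investigators whose violation-identification
# rate is substantially below the division office average. The threshold
# and data are hypothetical.
def flag_low_thoroughness(violations_per_review, threshold=0.5):
    """Return investigators whose rate is below threshold * division average."""
    average = sum(violations_per_review.values()) / len(violations_per_review)
    return [name for name, rate in violations_per_review.items()
            if rate < threshold * average]

division_rates = {"investigator_a": 4.2, "investigator_b": 3.8,
                  "investigator_c": 1.1, "investigator_d": 4.5}
print(flag_low_thoroughness(division_rates))  # ['investigator_c']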

FMCSA Monitors the Performance of Its Compliance Reviews and Has Taken Actions
to Address Identified Issues

FMCSA's performance measurement and monitoring of compliance review
activities meet our standards for internal control. These standards state
that managers should compare actual performance to planned or expected
results and analyze significant differences. Monitoring of internal
controls should include policies and procedures for ensuring that the
findings of audits and other reviews are promptly resolved. According to
FMCSA and state managers and investigators, the managers review all
compliance reviews in each division office and state to ensure
thoroughness and consistency across investigators and across compliance
reviews. The investigators we spoke with generally found these reviews to
be helpful, and several investigators said that the reviews helped them
learn policies and procedures and ultimately perform better compliance
reviews. FMCSA and state managers told us that they also use monthly
reports to track the performance of investigators using measures such as
the numbers of reviews completed and the rates of violations found.
Managers generally found that these reports provide useful information on
investigators' performance, and several managers said that they use the
reports to help identify specific areas where an investigator needs
additional coaching or training. However, several state managers said that
monitoring of their investigators' performance would be enhanced if they
had access to FMCSA's monthly report on their investigators; currently,
states rely on their own custom reports. FMCSA told us that it plans to
make its monthly report on state investigators available to state managers
by October 2007.

In addition to assessing the performance of individual investigators,
FMCSA periodically assesses the performance of FMCSA division offices and
state agencies, and it conducted an agencywide review of its compliance
review program in 2002. According to officials at one of FMCSA's service
centers, the service centers lead triennial reviews of the compliance
review and enforcement activities of each division office and its state
partner. These reviews assess whether the division offices and state
partners are following FMCSA policies and procedures, and they include an
assessment of performance data for items such as number of compliance
reviews conducted, rate of violations identified, and number of
enforcement actions taken. The officials said that some reviews identify
instances of deviations by division offices from FMCSA's compliance review
policies, but that only minor adjustments by the division offices are
needed. The officials also said that the service centers compile best
practices identified during the reviews and share these among the division
offices and state partners. To ensure that concerns identified during the
reviews are addressed, the officials said that the service centers monitor
the quality of individual compliance reviews that lead to enforcement
cases and the monthly reports on division office and state activities. The
officials said that the service centers also check on responses to
previously identified concerns during the triennial reviews.

FMCSA's agencywide review indicated that inconsistencies and bottlenecks
in the compliance review process were reducing its efficiency and
effectiveness, and FMCSA made several changes in 2003 aimed at improving
compliance review policies, procedures, training, software, and supporting
motor carrier data. Examples of problems identified and actions taken are
as follows:

           o FMCSA discouraged repeat visits to high-risk motor carriers that
           had received unsatisfactory ratings during their last compliance
           review within the past 12 months because the agency believed that
           not enough time had elapsed to show whether safety improvements
           had taken effect.
           o FMCSA discouraged safety investigators from their earlier
           practice of favoring violations of drug and alcohol regulations
           over violations of hours-of-service regulations when choosing
           which violations to document for enforcement, because crash data
           and FMCSA's survey of its field staff suggest that compliance with
           hours-of-service regulations is more important for safety.
           o FMCSA revised its operations manual to encourage FMCSA's
           division offices to document the maximum number of areas of the
           regulations where major safety violations are discovered, rather
           than penalizing motor carriers for a few violations in a
           particular area at the expense of other areas.

           FMCSA's review also concluded that most investigators were not
           following FMCSA's policy requiring them to perform vehicle
           inspections as part of a compliance review if the carrier has not
           already received the required number of roadside vehicle
           inspections.^25 FMCSA has since changed its policy so that
           inspecting a minimum number of vehicles is no longer a strict
           requirement--if an investigator is unable to inspect the minimum
           number of vehicles, he or she must explain why in the compliance
           review report.^26 FMCSA told us that, as part of their review of
           individual compliance reviews, division office managers ensure
           that when compliance reviews have fewer than the minimum number of
           vehicle inspections, investigators provide adequate justification
           in their reports. We did not verify this statement because we did
           not have enough time or resources. We did, however, assess the
           extent to which compliance reviews included the minimum number of
           vehicle inspections. In fiscal year 2005, FMCSA and its state
           partners conducted 7,436 compliance reviews on carriers that had
           not already received the minimum number of vehicle inspections; of
           these, only 254 compliance reviews (3 percent) included the
           minimum number of vehicle inspections.

^25The required number of inspections was based on the number of vehicles
operated by the carrier.

           FMCSA's review also found that investigators considered
           inspections to be the one aspect of compliance reviews, other than
           licensing and insurance verification, that had the smallest effect
           on carriers' safety performance. FMCSA's review team recommended
           that FMCSA establish new criteria for conducting vehicle
           inspections during compliance reviews, and suggested that
           inspections could be made optional. In contrast, in 2002, the
           National Transportation Safety Board (the Safety Board)
           recommended that FMCSA require that all compliance reviews include
           vehicle inspections. The Safety Board based its recommendation on
           its belief that the vehicles that receive roadside inspections may
           be less likely to have violations than the vehicles that could be
           inspected during a compliance review. In July 2006, FMCSA
           responded that implementing this recommendation would be imprudent
           because it would divert attention from driver and other safety
           factors, and FMCSA's recent study of the causes of large truck
           crashes indicates the importance of driver factors, such as
           driving too fast for conditions and driver fatigue. FMCSA has not
           changed its policy, but an FMCSA official told us that under the
           operational model that FMCSA has proposed for its Comprehensive
           Safety Analysis 2010 reform initiative, vehicle inspections during
           compliance reviews would be optional. FMCSA also told us that it
           is developing a policy that would allow investigators conducting
           compliance reviews to inspect vehicles that operate in intrastate
           commerce. FMCSA believes that this policy will increase the number
           of compliance reviews with the minimum number of vehicle
           inspections.
			  
^26An inspector would not be able to inspect the minimum number of
vehicles if, for example, fewer than the minimum number of vehicles were
available on-site for inspection.

           Finally, FMCSA's review found that although investigators
           generally sampled the number of carrier records required by
           FMCSA's policies, the number of undersized samples of drivers'
           work hour logs was a cause for concern. The review said that a
           lack of clarity in FMCSA's requirements for how carriers must
           document drivers' hours was likely resulting in some carriers
           having too few records to sample. FMCSA is working to clarify its
           documentation requirements, but it has not set a date for
           completing this task.
			  
Each of the Major Applicable Areas of the Safety Regulations Is Covered by
Most Compliance Reviews

           From fiscal year 2001 through fiscal year 2006, each of the nine
           major applicable areas of the safety regulations was covered by
           most of the approximately 76,000 compliance reviews conducted by
           FMCSA and the states. (See table 5.)

           Table 5: Percentages of Compliance Reviews for Fiscal Years 2001
           through 2006 That Covered Each of the Major Applicable Areas of
           the Safety Regulations
			  
Regulatory area                                            Percentage 
Procedures for handling and evaluating accidents                  97% 
Drivers' qualifications                                            96 
Drivers' hours of service                                          96 
Inspection, repair, and maintenance of vehicles                    96 
Drug and alcohol use and testing                                   95 
Commercial driver's license standards                              95 
Driving of motor vehicles                                          94 
Minimum insurance coverage                                         90 
Vehicle parts and accessories necessary for safe operation         80 			  

           Source: GAO analysis of FMCSA data.

           An FMCSA official told us that not every compliance review is
           required to cover all nine areas and cited the following reasons:

           o Follow-up compliance reviews of carriers rated unsatisfactory or
           conditional are sometimes streamlined to cover only the area or
           areas of the regulations in which the carrier had violations.
           o Commercial driver's license standards and drug and alcohol use
           and testing regulations apply primarily to those carriers that
           operate one or more vehicles weighing over 26,000 pounds (gross
           vehicle weight rating), that haul hazardous material, or that
           transport more than 15 passengers.
           o Minimum insurance coverage regulations apply only to for-hire
           carriers and private carriers of hazardous materials; they do not
           apply to private passenger and nonhazardous materials carriers.

           However, according to an FMCSA official, the area of these
           regulations that had the lowest rate of coverage--vehicle parts
           and accessories necessary for safe operation--is required for all
           compliance reviews except streamlined reviews that exclude this
           area. Vehicle inspections are supposed to be a key investigative
           technique for assessing compliance with this area, and the FMCSA
           official said that the lower rate of coverage for this area likely
           reflects the small number of vehicle inspections that FMCSA and
           the states conduct during compliance reviews.

           In addition to the safety regulations, compliance reviews of
           hazardous materials carriers, shippers, and cargo tank facilities
           must cover hazardous materials regulations. In fiscal years 2005
           and 2006, FMCSA conducted about 6,000 compliance reviews of
           hazardous materials operators. Collectively, these compliance
           reviews covered between 40 percent and 80 percent of the various
           individual areas of these regulations. However, none of these
           compliance reviews was required to cover all areas of the
           hazardous materials regulations; the required areas vary with the
           type of operator. Because the categories that MCMIS uses to
           classify hazardous materials operators are different from the
           categories used to determine which areas of the regulations must
           be covered, we could not determine, for the different types of
           operators, the extent to which FMCSA's compliance reviews covered
           the required areas.
			  
FMCSA Follows Up with Many Carriers with Serious Safety Violations but Does
Not Assess Maximum Fines against All of the Serious Violators Required by Law

           FMCSA placed many carriers rated unsatisfactory in fiscal year
           2005 out of service and followed up with nearly all of the rest to
           determine whether they had improved. In addition, FMCSA monitors
           carriers to identify those that are violating out-of-service
           orders. However, it does not take additional action against many
           of the violators of out-of-service orders that it identifies.
           Furthermore, FMCSA does not assess the maximum fines against all
           of the serious violators that we believe the law requires, partly
           because FMCSA does not distinguish between carriers with a pattern
           of serious safety violations and those that repeat a serious
           violation.
			  
FMCSA Followed Up with Almost All Carriers That Received a Proposed Safety
Rating of Unsatisfactory

           FMCSA followed up with 1,193 of 1,196 carriers (99.7 percent) that
           received a proposed safety rating of unsatisfactory following a
           compliance review that was completed in fiscal year 2005. FMCSA's
           follow-up generally ensured that these carriers either made safety
           improvements that resulted in an upgraded final safety rating
           or--as required for carriers that also receive a final safety
           rating of unsatisfactory--were placed out of service. More
           specifically, FMCSA used the following approaches to follow up
           with these carriers:

           o Follow-up compliance review. Based on such reviews, FMCSA
           upgraded the final safety ratings of 663 carriers (329 to
           satisfactory, and 334 to conditional).
           o Assignment of a final rating of unsatisfactory and issuance of
           an out-of-service order. FMCSA assigned a final rating of
           unsatisfactory to 312 carriers and issued an out-of-service order
           to 309 (99 percent) of them. An FMCSA official told us that it did
           not issue an out-of-service order to 2 carriers because it could
           not locate them, and it did not issue an out-of-service order to
           another carrier because the carrier was still subject to an
           out-of-service order that FMCSA issued several years prior to the
           2005 compliance review.
           o Review of evidence of corrective action. Carriers can request an
           upgraded safety rating by submitting evidence of corrective action
           to FMCSA. Based on reviews of such evidence, FMCSA upgraded the
           final safety ratings of 217 carriers (23 to satisfactory, and 194
           to conditional).
           o Administrative review. Carriers that believe FMCSA made an error
           in assigning their proposed safety rating may request the agency
           to conduct an administrative review. Based on the administrative
           review, FMCSA upgraded the final safety rating of 1 carrier to
           conditional.

           FMCSA did not assign final safety ratings to the remaining 3
           carriers. For 1 of these carriers, MCMIS indicates that the
           compliance review that resulted in the proposed rating of
           unsatisfactory did not identify any violations, even though
           carriers without violations are not supposed to receive a proposed
           unsatisfactory rating. For another of the carriers, MCMIS shows
           crashes, inspections, and a compliance review while also
           indicating that the carrier is inactive. FMCSA has been unable to
           locate the final carrier, and MCMIS indicates that the carrier is
           inactive.

           Unless FMCSA upgrades a proposed unsatisfactory safety rating or
           grants a carrier an extension, the agency is required under its
           policy to assign the carrier a final rating of unsatisfactory and
           to issue it an out-of-service order on the 46th day after the date
           of FMCSA's notice of a proposed unsatisfactory rating for carriers
           of hazardous materials or passengers and on the 61st day for other
           types of carriers.^27 Of the 309 out-of-service orders that FMCSA
           issued to carriers rated unsatisfactory following compliance
           reviews conducted in fiscal year 2005, 276 (89 percent) were
           issued on time, 28 (9 percent) were issued between 1 and 10 days
           late, and 5 (2 percent) were issued more than 10 days late. FMCSA
           also assigned final upgraded safety ratings within these time
           frames in 837 (95 percent) of the 881 cases in which it upgraded
           these ratings. FMCSA assigned 20 upgrades (2 percent) between 1
           and 10 days late, and it assigned another 20 (2 percent) more than
           10 days late. MCMIS did not have information on the timing of the
           other 4 upgrades. An FMCSA official told us that when an
           out-of-service order was issued more than 1 week late, the primary
           reason for the delay was that the responsible FMCSA division
           office had difficulty scheduling a follow-up compliance review and
           thus waited to issue the orders. The official said that other
           delays were caused by clerical errors; extended periods during
           which certain division offices operated without a person serving
           in the position with primary responsibility for ensuring that
           out-of-service orders are issued on time; a lack of complete
           compatibility between MCMIS and FMCSA's enforcement database; and,
           in one service center whose policy is to personally serve
           out-of-service orders to carriers, insufficient advance
           notification by the service center to its division offices that an
           order was to be served. The official noted that the last two
           issues have been addressed and said that FMCSA plans to more
           closely monitor the timeliness of the issuance of out-of-service
           orders in all of FMCSA's division offices.

^27FMCSA may allow a carrier with a proposed rating of unsatisfactory
(unless the carrier is transporting passengers or hazardous materials) to
continue to operate in interstate commerce for up to 60 days beyond the 60
days specified in the proposed rating if FMCSA determines that the carrier
is making a good faith effort to improve its safety. For carriers of
passengers or hazardous materials, FMCSA may extend by up to 10 days the
45-day period before which the proposed safety rating becomes final, but
it may not extend the 45-day period before which these carriers are to be
placed out of service.
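
The timing rule described above reduces to a simple date calculation,
sketched below in Python. The function name and example date are ours; the
sketch ignores the extensions discussed in the footnote.

# Illustrative sketch of the out-of-service timing rule described above.
from datetime import date, timedelta

def out_of_service_due_date(notice_date, hazmat_or_passenger):
    """Day on which the final unsatisfactory rating and out-of-service order
    are due: the 46th day after the notice for passenger and hazardous
    materials carriers, the 61st day for other carriers (absent an upgrade
    or extension)."""
    return notice_date + timedelta(days=46 if hazmat_or_passenger else 61)

notice = date(2005, 3, 1)  # hypothetical notice of a proposed unsatisfactory rating
print(out_of_service_due_date(notice, hazmat_or_passenger=True))   # 2005-04-16
print(out_of_service_due_date(notice, hazmat_or_passenger=False))  # 2005-05-01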

FMCSA Monitors Carriers to Identify Those That Are Violating Out-of-Service
Orders, but It Does Not Take Additional Action against Many of the Violators
It Identifies

           FMCSA uses two primary means to try to ensure that carriers that
           have been placed out of service do not continue to operate. First,
           FMCSA partners with states to help them suspend, revoke, or deny
           vehicle registration to carriers that have been placed out of
           service. FMCSA refers to these partnerships as the Performance and
           Registration Information Systems Management program (PRISM). PRISM
           links FMCSA databases with state motor vehicle registration
           systems and roadside inspection personnel to help identify
           vehicles operated by carriers that have been issued out-of-service
           orders. As of January 2007, 45 states had been awarded PRISM
           grants, and 27 states were operating with PRISM capabilities.
           FMCSA officials told us that some states have not applied for
           PRISM grants because they do not want to bear the costs that are
           not covered by the grants or they have not made the legislative
           changes required to implement PRISM. According to an FMCSA
           official, FMCSA has also begun working with PRISM states to enable
           them to receive automated notifications of carriers that have been
           placed out of service. PRISM can also identify carriers that
           attempt to register vehicles under a different carrier name, and
           FMCSA provided us with information on two out-of-service carriers
           that Connecticut, using PRISM, had caught trying to register
           vehicles by using a new company name. In addition, in commenting
           on a draft of this report, FMCSA said that data reported by states
           indicated that, during the first 6 months of fiscal year 2007, at
           least 104 motor carriers had their state vehicle
           registrations suspended, revoked, or denied based on an FMCSA
           order to cease interstate operations.

           FMCSA and its state partners also monitor carriers for
           indicators--such as roadside inspections, moving violations, and
           crashes--that the carriers may be violating an out-of-service
           order. First, FMCSA recently began to require the state partners
           that receive Motor Carrier Safety Assistance Program grants to
           check during roadside inspections whether carriers are operating
           under revoked authority and to take enforcement action against any
           that are. Second, FMCSA visits some suspect carriers that it
           identifies by monitoring crash and inspection data to examine
           their records to determine whether they did indeed violate the
           order. FMCSA told us it is difficult for it to verify that such
           carriers were operating in violation of out-of-service orders
           because its resources do not allow it to visit each carrier or
           conduct roadside inspections on all vehicles, and we agree. In
           fiscal years 2005 and 2006, 677 of 1,741 carriers (39 percent)
           that were subject to an out-of-service order had a roadside
           inspection or crash; FMCSA cited only 36 of these 677 carriers for
           violating the out-of-service order. An FMCSA official told us that
           some of these carriers, such as carriers that were operating
           intrastate or leasing vehicles to other carriers, may not have
           been violating the out-of-service order. The official said that
           the agency did not have enough resources to determine whether each
           of the carriers was violating the out-of-service order. He also
           said that FMCSA recently completed a pilot program in which the
           agency cited obvious violators such as carriers that have a
           roadside inspection outside their home state. In commenting on a
           draft of this report, FMCSA said that it is developing new
           policies and procedures intended to establish a uniform national
           approach for follow-up, as well as additional enforcement action
           against motor carriers that have violated an out-of-service order.
			  
The Safety Board Recently Concluded That FMCSA Is Making Adequate Progress
in Ensuring That Carriers Do Not Operate under Revoked Authority

           In 2006, the Safety Board assessed FMCSA's approach to ensuring
           that carriers whose operating authority has been revoked do not
           operate and concluded that it was inadequate.^28 The Safety Board
           recommended that FMCSA establish a program to address this issue.
           In response to this recommendation, FMCSA noted that, because the
           number of carriers that have been placed out of service or have
           had their operating authority revoked has significantly increased
           in recent years, it is difficult to ensure that these carriers do
           not continue to operate. An FMCSA official attributed this
           difficulty to FMCSA's lack of resources to visit each carrier or
           conduct roadside inspections on all vehicles--the same reason
           FMCSA cites for not following up on all carriers that may be
           violating an out-of-service order. Despite this difficulty, FMCSA
           responded that it (1) is linking its licensing and insurance
           database to its primary carrier database to improve the ability of
           roadside inspection personnel in all states and registration
           offices in PRISM states to identify carriers that have had their
           operating authority revoked and (2) has directed division office
           managers to assess fines when data accessed during roadside
           inspections indicate that carriers were operating under revoked
           authority. In March 2007, the Safety Board said that FMCSA was
           making acceptable progress on the recommendation, but expressed
           concern that some states will choose not to implement PRISM and
           that, based on the program's rate of implementation thus far, it
           will take too long to become fully operational in many other
           states. The Safety Board, therefore, encouraged FMCSA to implement
           PRISM more rapidly in all states. An FMCSA official told us that
           the agency is already making a concerted effort to encourage the 5
           states without PRISM to adopt the program and the 18 PRISM states
           that do not yet have full PRISM capabilities to achieve them.

^28FMCSA's policy calls for the agency to revoke the operating authority
of any carrier that does not have the minimum required amount of insurance
coverage; the minimum amount depends on whether the carrier is for-hire or
private, whether it transports commodities or passengers, and what type of
commodity or number of passengers is transported. Operating without the
minimum required amount of insurance coverage is a serious violation of
the safety regulations.

FMCSA Has Reduced the Number of Carriers Rated Conditional That Need
Follow-up Compliance Reviews, but the Timeliness of These Reviews Is
Difficult to Assess

           FMCSA's policy requires the agency to conduct follow-up compliance
           reviews on all carriers rated conditional and, over the last
           several years, the agency has reduced the number of such carriers
           needing review. After the Department of Transportation Inspector
           General reported in 1999 that FMCSA allowed motor carriers with
           less than satisfactory ratings to continue operations for extended
           periods of time, FMCSA began requiring follow-up compliance
           reviews on all carriers rated conditional. In fiscal years 2005
           and 2006, respectively, FMCSA conducted 2,537 and 2,692 follow-up
           reviews of carriers rated conditional or unsatisfactory,^29
           exceeding its annual goal of 2,500 follow-up reviews.^30 In
           addition, from fiscal year 2000 through fiscal year 2006, the
           number of carriers rated conditional that needed a follow-up
           review decreased from about 40,000 to about 30,000.

           While FMCSA has reduced the number of carriers rated conditional
           that need a follow-up review, it is difficult to assess the
           agency's timeliness in conducting these reviews because FMCSA's
           policy does not specify a time frame for following up on carriers
           with conditional safety ratings. The policy does discourage
           follow-up reviews within 12 months because FMCSA believes that
           more time is needed to show the effects of safety improvements.
           Yet the policy also gives FMCSA's division office administrators
           the discretion to determine whether a follow-up review should be
           conducted within 12 months. Almost half of all carriers that
           received a conditional rating from fiscal year 2002 through fiscal
           year 2004 received a follow-up review within 12 months; however,
           because of the policy's allowance for discretion, we could not
           determine how many, if any, of these follow-up reviews occurred
           too soon. (See table 6.) In addition, because FMCSA does not
           specify a deadline for conducting follow-up reviews, we could not
           determine whether any of the reviews occurred too late. Our
           analysis of the timing of follow-up reviews shows that from fiscal
           year 2002 through fiscal year 2004, 66 percent of the carriers
           that received a conditional rating received a follow-up review
           within 24 months, while 7 percent received a follow-up review more
           than 24 months after they received their conditional rating.
           Another 27 percent of the carriers still needed a review as of
           September 2006.
			  
^29FMCSA also aims to conduct follow-up compliance reviews of carriers
rated unsatisfactory (1) that request a follow-up review or (2) that
received their ratings before November 20, 2000, when FMCSA's regulation
requiring the agency to place carriers rated unsatisfactory out of service
became effective.

^30FMCSA's goal for follow-up reviews includes only those follow-up
reviews conducted by FMCSA. An FMCSA official told us that the agency
chose not to include reviews conducted by the states as part of the goal
because FMCSA receives an appropriation that covers its follow-up reviews.
However, follow-up compliance reviews conducted by states are funded
through a separate appropriation, the Motor Carrier Safety Assistance
Program. FMCSA could choose to have states establish goals when applying
for these funds.

           Table 6: Time Elapsed before Carriers Rated Conditional Received
           Follow-up Compliance Reviews, Fiscal Years 2002 through 2004, as
           of September 2006
			  
                                  2002         2003         2004        Total 
Time elapsed before       Number of    Number of    Number of    Number of 
follow-up compliance      follow-up    follow-up    follow-up    follow-up 
review                      reviews      reviews      reviews      reviews 
0 to 12 months          1,203 (51%)  1,132 (42%)  1,021 (42%)  3,356 (45%) 
More than 12 months to                                                     
18 months                  311 (13)     413 (15)     398 (16)   1,122 (15) 
More than 18 months to                                                     
24 months                    86 (4)      163 (6)      191 (8)      440 (6) 
More than 24 months         180 (8)     274 (10)       86 (4)      540 (7) 
Still need a review        568 (24)     722 (27)     723 (30)   2,013 (27) 
Total                  2,348 (100%) 2,704 (100%) 2,419 (100%) 7,471 (100%) 

           Source: GAO analysis of FMCSA data.
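
The elapsed-time categories in table 6 can be expressed as a simple
bucketing rule, sketched below; the function, the month cutoffs, and the
example data are our illustration, not the method used to produce the
table.

# Illustrative sketch: bucket the months elapsed between a carrier's
# conditional rating and its follow-up compliance review into the
# categories used in table 6. The example data are hypothetical.
from collections import Counter

def elapsed_bucket(months):
    if months is None:
        return "Still need a review"
    if months <= 12:
        return "0 to 12 months"
    if months <= 18:
        return "More than 12 months to 18 months"
    if months <= 24:
        return "More than 18 months to 24 months"
    return "More than 24 months"

elapsed_months = [6, 14, 30, None, 11, 23]  # None = no follow-up review yet
print(Counter(elapsed_bucket(m) for m in elapsed_months))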
			  
FMCSA Is Developing a New Safety Rating Methodology

           In 1999, the Safety Board recommended that FMCSA lower its
           threshold for rating a carrier unsatisfactory to include carriers
           with an unsatisfactory rating in either the driver or vehicle
           factor of the rating scheme. The Safety Board has classified this
           recommendation as one of its "most wanted" safety improvements
           since 2000.^31 Although FMCSA has not yet decided whether it will
           implement this recommendation, it is developing a new rating
           methodology as part of its Comprehensive Safety Analysis 2010
           reform initiative, and it plans to implement the methodology in
           2010. As mentioned previously, the new methodology would base
           determinations of whether carriers are fit to continue operating
           on assessments made by the tool that FMCSA is developing to
           replace SafeStat, rather than on the results of compliance
           reviews. FMCSA believes that the new approach will enable the
           agency to assess the safety fitness of a larger share of the motor
           carrier industry.
			  
^31The Safety Board's most wanted list, which is drawn up from issued
safety recommendations, is intended to emphasize the transportation safety
issues the Safety Board deems most critical.

           FMCSA is also considering determining the safety fitness of
           drivers and applying interventions to those that it determines need
           them. FMCSA believes that the increased focus that this would
           bring to the safety of drivers is important because the results of
           its recent study on the causes of large truck crashes indicate
           that drivers of large trucks and other vehicles involved in truck
           crashes are 10 times more likely to be the cause of the crash than
           other factors, such as weather, road conditions, and vehicle
           performance. In addition, FMCSA is considering eliminating the
           conditional rating and using only two ratings--"continue to
           operate" and "unfit." An FMCSA official told us that FMCSA may
           eliminate the conditional rating because the agency feels that some
           government agencies and members of the public that hire carriers
           misinterpret the current satisfactory rating as FMCSA's seal of
           approval. The official said that the agency
           believes that the "continue to operate" rating, which would be
           given to all carriers that are allowed to continue to operate, is
           less likely to be viewed as a seal of approval than the
           satisfactory rating, which indicates a level of safety that is
           greater than the conditional rating that also allows carriers to
           continue operating. Depending on their safety performance,
           carriers or drivers allowed to continue operating could be subject
           to interventions, such as Web-based education, warning letters,
           requests for submission of documents, targeted roadside
           inspections, focused on-site reviews, comprehensive on-site
           reviews (similar to compliance reviews), and enforcement actions.
			  
Policy Change Gives FMCSA Appropriate Discretion in Performing Statutorily
Required Reviews of High-Risk Carriers

           From August 2006 through February 2007, data from MCMIS indicate
           that FMCSA performed compliance reviews on 1,136 of the 2,220 (51
           percent) carriers that were covered by FMCSA's mandatory
           compliance review policy.^32 Under the Safe, Accountable,
           Flexible, Efficient Transportation Equity Act: A Legacy for Users,
           FMCSA is required to conduct compliance reviews on carriers rated
           in SafeStat categories A or B for 2 consecutive months. In
           response to this requirement, in June 2006, FMCSA implemented a
           policy requiring a compliance review within 6 months for any such
           carrier unless the carrier had received a compliance review within
           the previous 12 months.^33 An FMCSA official told us that the
           agency did not have enough resources to conduct compliance reviews
           on all of the 2,220 carriers within the first 6-month period.

           In April 2007, FMCSA revised the policy because the agency
           believes that it required compliance reviews for some carriers
           that did not need them, leaving FMCSA with insufficient resources
           to conduct compliance reviews on other carriers that did need
           them. The carriers that did not need compliance reviews were those
           that had already had a compliance review and had corrected
           identified violations, but these violations continued to adversely
           affect their SafeStat rating because SafeStat penalizes carriers
           for violations regardless of whether they have been corrected.
           This unnecessary targeting drained resources, leaving FMCSA
           without the means to conduct compliance reviews of carriers that
           had never received such a review, but, in FMCSA's view, should
           have received one because of current safety performance issues
           that led to their placement in SafeStat categories C, D, or E. The
           new policy requires compliance reviews within 6 months for
           carriers that have been in SafeStat categories A or B for 2
           consecutive months and received their last compliance review 2 or
           more years ago (or have never received a compliance review).^34 In
           addition, compliance reviews are recommended for carriers that
           have been in SafeStat categories A or B for 2 consecutive months
           and received their last compliance review more than 1 year ago but
           less than 2 years ago. FMCSA division offices can decide not to
           conduct a compliance review on such a carrier if (1) its SafeStat
           category changes to a category other than A or B or (2) its safety
           evaluation area values are based largely on prior compliance
           review violations that have been corrected or on accidents or
           inspections that occurred prior to the carrier's last compliance
           review. We believe that these changes are consistent with the
           act's requirement and give FMCSA appropriate discretion in
           allocating its compliance review resources.
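
           As an illustration only, the prioritization logic described above
           could be sketched as follows; the function and field names are our
           own and do not represent FMCSA's actual systems or data:

# Minimal sketch of the April 2007 prioritization policy described above.
# Names and thresholds reflect our reading of the policy, not FMCSA code.
def compliance_review_priority(in_a_or_b_two_months, months_since_last_review):
    """Return 'required', 'recommended', or 'none' for a carrier.

    months_since_last_review is None if the carrier has never had a review.
    """
    if not in_a_or_b_two_months:
        return "none"
    if months_since_last_review is None or months_since_last_review >= 24:
        return "required"      # compliance review within 6 months
    if months_since_last_review > 12:
        return "recommended"   # division office may waive under the two conditions noted
    return "none"

print(compliance_review_priority(True, None))  # 'required'
print(compliance_review_priority(True, 18))    # 'recommended'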
			  
^32An FMCSA official told us that the agency believes that using MCMIS
data results in an overestimate of the number of carriers that were
required to receive, but did not receive, a compliance review, primarily
because the agency has indications that some carriers listed as active in
MCMIS are actually inactive. The official said that FMCSA's eastern
service center examined the cases of 95 of the 162 carriers that MCMIS
indicated did not receive a compliance review even though one was required
and found that 39 of them did not require a compliance review, and 7
actually did receive a compliance review.

^33The first group of carriers to be affected by this policy was the 2,220
carriers in SafeStat categories A or B in both July and August 2006 that
did not receive a compliance review in the previous 12 months (another
2,887 carriers that were in SafeStat categories A or B in both July and
August 2006 did receive a compliance review in the previous 12 months).

           FMCSA Has Substantially Reduced Its Backlog of Enforcement Cases

           From October 2005 through October 2006, FMCSA reduced its backlog
           of enforcement cases that had been open for 6 months or more by
           about 70 percent (from 807 to 247).^35 As the Department of
           Transportation Inspector General has noted, a large backlog of
           enforcement cases negatively affects the integrity of the
           enforcement process for two reasons. First, because FMCSA
           considers only closed enforcement cases when targeting motor
           carriers for a compliance review, high-risk motor carriers are
           less likely to be selected if they have an open enforcement case.
           Second, because FMCSA assesses smaller fines against carriers with
           open cases than against those with closed cases, it may not assess
           appropriate fine amounts against carriers with multiple
           enforcement cases (the number of prior enforcement cases is one of
           the criteria that FMCSA uses to determine fine amounts). FMCSA's
           2002 review of its compliance review program also found that
           delays in closing enforcement cases were negatively affecting the
           integrity of the agency's enforcement process. An FMCSA official
           told us that in response to this review, the agency assigned a
           second attorney to work on enforcement cases. In 2005, we
           recommended that FMCSA establish a goal specifying how much it
           would like to reduce the enforcement backlog and by what date. In
           March 2007, FMCSA implemented this recommendation by establishing
           goals to (1) close, by the end of 2007, its backlog of 63
           enforcement cases in its division offices that had been open for
           270 days or more and (2) close, by August 31, 2007, its backlog of
           14 cases pending before its Assistant Administrator for
           Enforcement for more than 18 months, without adding other cases to
           this backlog.
			  
^34For the carriers that have received a prior compliance review, FMCSA
would be able to extend the deadline to 12 months if it has applied an
alternative intervention, such as a consent agreement. A consent agreement
is an agreement between FMCSA and a carrier that can lower the amount of
an assessed fine in exchange for corrective action and additional safety
improvements by the carrier.

^35We defined the backlog as consisting of enforcement cases that had been
open for 6 months or more to be consistent with our and the Inspector
General's earlier work on the backlog. See GAO, Large Truck Safety:
Federal Enforcement Efforts Have Been Stronger Since 2000, but Oversight
of State Grants Needs Improvement, [53]GAO-06-156 (Washington, D.C.:
Dec.15, 2005) and U.S. Department of Transportation Office of Inspector
General, Motor Carrier Safety Program, Federal Highway Administration,
Report TR-1999-091 (Washington, D.C.: Apr. 26, 1999).We did not compare
how FMCSA closed the cases that were and were not backlogged because doing
so would have required too many resources.	

           FMCSA Does Not Assess Maximum Fines Against All of the Serious
			  Violators That the Law Requires

           FMCSA does not assess maximum fines against all of the serious
           violators that we believe the law requires. The law requires FMCSA
           to assess the maximum allowable fine for each serious violation by
           a carrier that is found (1) to have a pattern of committing such
           violations (pattern requirement) or (2) to have previously
           committed the same or a related serious violation (repeat
           requirement).^36 The legislative history of this provision
           provides evidence that FMCSA must assess maximum fines in these
           two distinct situations.^37 However, FMCSA's policy on maximum
           fines does not fully meet these requirements. FMCSA enforces both
           requirements using what is known as the "three strikes rule,"
           applying the maximum allowable fine when it finds that a motor
           carrier has violated the same regulation three times within 6
           years. FMCSA officials said they interpret both parts of the act's
           requirements to refer to repeat violations, and because they
           believe that having two distinct policies on repeat violations
           would confuse motor carriers, FMCSA has chosen to address both
           requirements with its single three strikes policy. According to
           FMCSA officials, FMCSA developed the three strikes policy in
           response to a provision in the Motor Carrier Safety Act of
           1984,^38 which permitted FMCSA's predecessor to assess a fine of
           up to $1,000 per offense (capped at $10,000) if the agency
           determined that "a serious pattern of safety violations" existed
           or had occurred. FMCSA officials told us that when Congress in
           1999 enacted the current "pattern of violations" language in the
           Motor Carrier Safety Improvement Act, the agency interpreted it to
           be similar to the previous language and to mean three strikes.^39

^36Motor Carrier Safety Improvement Act of 1999, Pub. L. No. 106-159,
§ 222(b)(2), 113 Stat. 1748, 1769 (49 U.S.C.A. § 521 Note).

^37See  statement of Congressman Oberstar, then ranking member of the
Committee on Transportation and Infrastructure, explaining, along with
then-Chairman Shuster, the Motor Carrier Safety Improvement Act of 1999,
145 Cong. Rec. H12868-12870 (Daily ed. Nov. 9, 1999). After observing that
prior federal efforts at motor carrier oversight had proved to have major
deficiencies, he stated:

"The bill makes numerous programmatic changes to improve safety by keeping
dangerous drivers off the roads and enhancing oversight....

"Violators of safety laws and regulations will face penalties high enough
to promote future compliance. Maximum fines will be assessed for repeat
offenders as well as a pattern of violations of our safety laws and
regulations." (Emphasis added.)

While the congressional committees did not submit reports on this
legislation, the Chairman introduced materials to serve as the joint
statement of managers for the legislation. Those materials and other floor
statements also referred to repeat offenders or a pattern of violations.
Id. at H.12874.

           FMCSA's interpretation does not carry out the statutory mandate to
           impose maximum fines in two different cases. In contrast to FMCSA,
           we read the statute's use of the distinct terms "a pattern of
           violations" and "previously committed the same or a related
           violation" as requiring FMCSA to implement two distinct policies.
           A basic principle of statutory interpretation is that distinct
           terms should be read as having distinct meanings. In this case,
           the statute not only uses different language to refer to the
           violations for which maximum fines must be imposed, but it also
           sets them out separately and makes either type of violation
           subject to the maximum penalties. Therefore, one carrier may
           commit a variety of serious violations and another carrier may
           commit a serious violation that is the same as, or substantially
           similar to, a previous serious violation; the language on its face
           requires FMCSA to assess the maximum allowable fine in both
           situations--for a pattern of violations, as well as a repeat
           offense.

^38Pub. L. No. 98-554, title II, 98 Stat. 2832, 2842 (1984).

^39In making its argument, FMCSA is referring to the Office of Motor
Carriers, which was an office within the Federal Highway Administration
until 1999, the year when FMCSA was created with the adoption of the Motor
Carrier Safety Improvement Act. That act strengthened and transferred to
FMCSA the functions previously assigned to the Office of Motor Carriers.
Furthermore, section 222(b)(2) not only used different language in the
requirements for the imposition of fines; it also made the imposition of
the maximum fines mandatory and specifically included repeat, as well as
patterns of violations of critical or acute regulations. In this context,
we do not agree that section 222(b)(2) was just a continuation of earlier,
less specific, discretionary authority. Section 222(b)(2), along with
other changes, was part of a congressional design to remedy what Congress
viewed as serious shortcomings in the Office of Motor Carriers. Congress
denied funding to that office under section 338 of the Department of
Transportation and Related Agencies Appropriations Act, 2000, Pub. L. No.
106-69, 113 Stat. 986 (1999), with responsibility for trucking safety
being temporarily transferred to the Office of the Secretary. Only
thereafter was FMCSA created as a separate administration within the
Department of Transportation.

           FMCSA could define a pattern of serious violations in numerous
           ways that are consistent with the act's pattern requirement. Our
           application of eight potential definitions shows that the number
           of carriers that would be subject to maximum fines depends greatly
           on the definition. (See table 7.) For example, a definition
           calling for two or more serious violations in each of at least
           four different regulatory areas during a compliance review would
           have made 38 carriers subject to maximum fines in fiscal year
           2006. In contrast, a definition calling for one or more serious
           violations in each of at least three different regulatory areas
           would have made 1,529 carriers subject to maximum fines during
           that time.^40

Table 7: Number of Motor Carriers That Would Have Been Subject to Maximum
Fines under Various Definitions of a Pattern of Serious Violations, Fiscal
Years 2004 through 2006

             Number of carriers in  Number of carriers in  Number of carriers in
                   2004 with              2005 with              2006 with
 Number of                                                                       
 regulatory   1 or more  2 or more   1 or more  2 or more   1 or more  2 or more 
 areas with     serious    serious     serious    serious     serious    serious 
 serious     violations violations  violations violations  violations violations 
 violations    per area   per area    per area   per area    per area   per area 
 2 or more        2,935        177       3,004        158       3,348        225 
 3 or more        1,372         64       1,430         58       1,529        114 
 4 or more          494         16         557         25         530         38 
 5 or more           83          2         115          9         115          7 

Source: GAO analysis of FMCSA data.
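
The following minimal sketch illustrates the kind of tallying these pattern
definitions imply; the records and names are hypothetical and do not
reproduce the analysis underlying table 7:

# Minimal sketch (hypothetical data): tally carriers that would meet a given
# "pattern" definition, where each record is one serious violation found
# during a single compliance review, tagged with a regulatory area.
from collections import Counter, defaultdict

# Hypothetical records: (carrier_id, regulatory_area)
violations = [
    ("C1", "hours of service"), ("C1", "hours of service"),
    ("C1", "driver qualification"), ("C1", "vehicle maintenance"),
    ("C2", "hours of service"), ("C2", "drug and alcohol"),
]

def carriers_matching(violations, min_areas, min_violations_per_area):
    """Carriers with at least `min_violations_per_area` serious violations in
    each of at least `min_areas` different regulatory areas."""
    per_carrier = defaultdict(Counter)
    for carrier, area in violations:
        per_carrier[carrier][area] += 1
    return [carrier for carrier, areas in per_carrier.items()
            if sum(1 for n in areas.values()
                   if n >= min_violations_per_area) >= min_areas]

# The "1 or more serious violations in 3 or more areas" definition:
print(carriers_matching(violations, min_areas=3, min_violations_per_area=1))  # ['C1']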

We also interpret the statutory language for the repeat requirement as
calling for a "two strikes" rule as opposed to FMCSA's three strikes rule.
FMCSA's interpretation imposes the maximum fine only after a carrier has
twice previously committed a serious violation. The language of the
statute does not allow FMCSA's interpretation; rather it requires FMCSA to
assess the maximum allowable fine for each serious violation against a
carrier that has previously committed the same serious violation.^41 In
addition, in 2006, the Department of Transportation Inspector General
found that FMCSA's implementation of its three strikes rule had allowed
many third strike violators to escape maximum fines.^42 Specifically, of
the 533 third strike violators of the hours of service or the drug and
alcohol regulations between September 2000 and October 2004, only 33 (6
percent) were assessed the maximum fine. The
Inspector General found that FMCSA did not consider many of these
violators to be third strike violators because the agency, in keeping with
its policy, did not count the carriers' violations as strikes unless a
violation resulted in the assessment of a fine. FMCSA does not always
notify carriers of serious violations without fines and, therefore, FMCSA
believes that counting such violations as strikes would violate the due
process rights of carriers. The Inspector General agreed and recommended
that FMCSA assess a no-dollar-amount fine or use another appropriate
mechanism to legally notify a motor carrier of the violation and the
policy that future violations will result in the maximum fine amount. An
FMCSA official said that the agency is developing a policy designed to
address this recommendation and plans to consider the related
recommendation in this report as it develops the policy. FMCSA plans to
implement the policy by June 2008.

^40Our definitions are for analysis purposes only. We are not suggesting
which, if any, of these pattern definitions FMCSA should adopt as its
policy, nor does our exclusive focus on violations identified during a
single compliance review imply that a pattern definition could not require
serious violations to occur over multiple compliance reviews.

^41The statute (section 222(c)) does allow the Secretary to determine and
document that extraordinary circumstances merit a lower-than-maximum fine
in a particular case if, for example, a carrier can establish that
repetition was not a result of its failure to take appropriate remedial
action.

In fiscal years 2004 through 2006, there were more than four times as many
carriers with a serious violation that constituted a second strike as
there were carriers with a third strike. (See table 8.) For example, in
fiscal year 2006, 1,320 carriers had a serious violation that constituted
a second strike, whereas 280 carriers had a third strike.^43

Table 8: Number of Motor Carriers That Would Have Been Subject to Maximum
Fines under Two Strikes and Three Strikes Repeat Violator Policies, Fiscal
Years 2004 through 2006

Policy           2004  2005  2006 Total 
Two strikes     1,251 1,292 1,320 3,863 
Three strikes^a   269   284   280   833 

Source: GAO analysis of FMCSA data.

^aFMCSA's policy currently assesses the maximum fine for three violations
in the same regulatory area.

^42Office of Inspector General, Report MH-2006-046.

^43These figures count all serious violations as strikes, regardless of
whether they resulted in a fine. This is consistent with the policy that
FMCSA is developing in response to the Inspector General's recommendation.

Carriers with a pattern of violations may also commit a second strike
violation. For example, three of the seven carriers that had two or more
serious violations in each of at least five different regulatory areas
also had a second strike in fiscal year 2006. Were FMCSA to make policy
changes along the lines discussed here, we believe that the new policies
should address how to deal with carriers with serious violations that both
are part of a pattern and repeat the same or similar previous violations.
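
As an illustration only, a "two strikes" repeat-violator check along the
lines we describe could be sketched as follows; the look-back window, data
layout, and names are assumptions for this sketch, not FMCSA policy or code:

# Minimal sketch of a "two strikes" check: the current serious violation
# repeats a regulation the carrier has violated at least once before within
# an assumed look-back window (6 years, mirroring the three strikes rule).
from datetime import date, timedelta

LOOKBACK = timedelta(days=6 * 365)

def max_fine_required(history, current_reg, current_date, lookback=LOOKBACK):
    """history: list of (violation_date, regulation) for prior serious violations.
    Returns True if the current violation is a repeat (a second strike)."""
    return any(reg == current_reg and current_date - when <= lookback
               for when, reg in history)

prior = [(date(2004, 3, 1), "395.8 hours-of-service records")]
print(max_fine_required(prior, "395.8 hours-of-service records", date(2006, 7, 15)))  # True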

Conclusions

FMCSA's policy for prioritizing carriers for compliance reviews based on
their SafeStat scores furthers motor carrier safety because it targets
many carriers that pose high crash risks and thus has value for reducing
both the number and severity of motor carrier crashes. However, the policy
does not always target the carriers that have the highest crash risks.
Modifications to the policy that we identified could improve FMCSA's
targeting of high-risk carriers, thereby leading to compliance reviews
that would have a greater potential to avoid crashes and their associated
injuries and fatalities. Our June 2007 report found that a regression
model approach would better identify carriers that pose high crash risks
than does SafeStat, enabling FMCSA to better target its resources. We
recommended in that report that FMCSA implement such an approach. However,
if FMCSA does not implement this recommendation, the analysis presented in
this report suggests an alternative approach that would also better target
carriers that pose high crash risks. This approach would give high
priority for compliance reviews to carriers with very poor scores (such as
the worst 5 percent) in the accident safety evaluation area.

While FMCSA follows up with most carriers with serious safety violations,
it has not established a time frame for carriers rated conditional to
receive a follow-up compliance review. As a result, many carriers with
conditional ratings can continue to operate for 2 years or more without a
follow-up compliance review, posing safety risks to themselves and the
public.

Finally, we found that FMCSA assesses maximum fines against carriers that
twice repeat a serious violation. However, because of FMCSA's
interpretation of the statutory requirement to assess maximum fines
against serious violators, many carriers that continue to accrue serious
violations do not have the maximum fine assessed against them. Therefore,
neither the statutory requirement nor FMCSA's enforcement is as effective
as possible in deterring unsafe practices and, as a result, additional
accidents could occur.

Recommendations for Executive Action

In our June 2007 report on the effectiveness of SafeStat, we recommended
that FMCSA use a regression model approach to identify carriers that pose
high crash risks rather than its expert judgment approach. Should the
Secretary of Transportation decide not to implement that recommendation,
we recommend that the Secretary of Transportation direct the FMCSA
Administrator to take the following action:

           o to improve FMCSA's targeting of carriers that pose high crash
           risks, modify FMCSA's policy for prioritizing compliance reviews
           so that carriers with very poor scores (such as the worst 5
           percent) in the accident safety evaluation area will be selected
           for compliance reviews, regardless of their scores in the other
           areas.

           We also recommend that the Secretary of Transportation direct the
           FMCSA Administrator to take the following two actions:

           o to help ensure that carriers rated conditional make safety
           improvements in a timely manner, establish a reasonable time frame
           within which FMCSA should conduct follow-up compliance reviews on
           such carriers and
           o to meet the Motor Carrier Safety Improvement Act's requirement
           to assess maximum fines and improve the deterrent effect of these
           fines, revise FMCSA's related policy to include (1) a definition
           for a pattern of violations that is distinct from the repetition
           of the same or related violations and (2) a two strikes rule
           rather than a three strikes rule.
			  
			  Agency Comments

           We provided a draft of this report to the Department of
           Transportation for its review and comment. The department did not
           offer overall comments on the draft report. It said that it would
           assess the efficacy of the first recommendation, but it did not
           comment on the other recommendations. It offered several technical
           comments, which we incorporated where appropriate.

           As agreed with your office, unless you publicly announce the
           contents of this report earlier, we plan no further distribution
           until 30 days from the report date. At that time, we will send
           copies of this report to congressional committees and
           subcommittees with responsibilities for commercial motor vehicle
           safety issues; the Secretary of Transportation; the Administrator,
           FMCSA; and the Director, Office of Management and Budget. We also
           will make copies available to others upon request. In addition,
           the report will be available at no charge on the GAO Web site at
           http://www.gao.gov.

           If you have any questions about this report, please contact me at
           (202) 512-2834 or [email protected]. Contact points for our Offices
           of Congressional Relations and Public Affairs may be found on the
           last page of this report. Staff who made key contributions to this
           report are listed in appendix V.

           Sincerely yours,

           Susan A. Fleming
			  Director, Physical Infrastructure Issues
			  
			  Appendix I: Other Assessments of SafeStat's Ability to
           Identify High-Risk Motor Carriers

           Several studies by the Volpe National Transportation Systems
           Center (Volpe), the Department of Transportation Office of
           Inspector General, the Oak Ridge National Laboratory (Oak Ridge),
           and others have assessed the predictive capability of the Motor
           Carrier Safety Status Measurement System (SafeStat) model and the
           data used by that model. In general, studies that assessed the
           predictive power of SafeStat offered suggestions to increase that
           power, and studies that assessed data quality found weaknesses in
           the data that the Federal Motor Carrier Safety Administration
           (FMCSA) relies upon.

			  Assessments of SafeStat's Predictive Capability

           The studies we reviewed compared SafeStat with random selection to
           determine which does a better job of selecting carriers that pose
           high crash risks and assessed whether statistical approaches could
           improve that selection and whether carrier financial positions or
           driver convictions are associated with crash risk.
			  
			    Predictive Capability of SafeStat Compared with Random Selection

           In its 2004 and 1998 studies of the SafeStat model,^1 Volpe
           analyzed retrospective data to determine how many crashes carriers
           in SafeStat categories A and B experienced over the following 18
           months. The 2004 study used the carrier rankings from an
           application of the SafeStat model on March 21, 2001. Volpe then
           compared the SafeStat carrier safety ratings with state-reported
           data on crashes that occurred between March 22, 2001, and
           September 21, 2002, to assess the model's performance. For each
           carrier, Volpe calculated a total number of crashes, weighted for
           time and severity, and then estimated a rate per 1,000 vehicles
           for comparing carriers in SafeStat categories A and B with the
           carriers in other SafeStat categories. The 1998 Volpe study used a
           similar methodology. Each study used a constrained subset of
           carriers rather than the full list contained in the Motor Carrier
           Management Information System (MCMIS).^2 Both studies found that
           the crash rate for the carriers in SafeStat categories A and B was
           substantially higher than for the other carriers during the 18
           months after the particular SafeStat run. On the basis of this
           finding, Volpe concluded that the SafeStat model worked.
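
           As a simple illustration of the comparison Volpe made, the
           following sketch computes a weighted crash rate per 1,000 vehicles
           for two groups of carriers; the numbers are hypothetical, and the
           actual time and severity weights are not reproduced here:

# Illustration only: weighted crash rate per 1,000 vehicles for a carrier group.
def crash_rate_per_1000(weighted_crashes, vehicles):
    return 1000.0 * weighted_crashes / vehicles

# Hypothetical 18-month totals for carriers in categories A/B versus all others:
ab_rate = crash_rate_per_1000(weighted_crashes=450.0, vehicles=30_000)      # 15.0
other_rate = crash_rate_per_1000(weighted_crashes=900.0, vehicles=250_000)  # 3.6
print(ab_rate / other_rate)  # the A/B group shows the higher rate in this example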

^1David Madsen and Donald Wright, Volpe National Transportation Systems
Center, An Effectiveness Analysis of SafeStat (Motor Carrier Safety Status
Measurement System), Paper No. 990448, November 1998 and John A. Volpe
National Transportation Systems Center, Motor Carrier Safety Assessment
Division, SafeStat Effectiveness Study Update, March 2004.

^2Volpe included only carriers which met one or more of the following
conditions: two or more reported crashes; three or more roadside
inspections during the preceding 30 months; an enforcement action within
the past 6 years; or a compliance review within the previous 18 months.
This is consistent with the SafeStat minimum event requirements.

           In response to a recommendation by the Department of
           Transportation Office of Inspector General,^3 FMCSA contracted
           with Oak Ridge to independently review the SafeStat model. Oak
           Ridge assessed the SafeStat model's performance and used the same
           data set (for March 21, 2001) provided by Volpe, which Volpe had
           used in its 2004 evaluation. Perhaps not surprisingly, Oak Ridge
           obtained a similar result for the weighted crash rate of carriers
           in SafeStat categories A and B over the 18-month follow-up period.
           Like the Volpe studies, the Oak Ridge study was constrained
           because it was based on a limited data set rather than the entire
           MCMIS data set.
			  
			    Application of Regression Models to Safety Data

           While SafeStat does better than simple random selection in
           identifying carriers that pose high crash risks, other methods can
           also be used. Oak Ridge extended Volpe's analysis by applying
           regression models to identify carriers that pose high crash risks.
           Specifically, Oak Ridge applied a Poisson regression model and a
           negative binomial model using the safety evaluation area scores as
           independent variables to a weighted count of crashes that occurred
           in the 30 months before March 21, 2001.^4
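
           As an illustration of this kind of analysis (not Oak Ridge's actual
           code or data), the following sketch fits Poisson and negative
           binomial regressions of simulated crash counts on four hypothetical
           safety evaluation area scores using the statsmodels library:

# Illustration only: Poisson vs. negative binomial regression of crash counts
# on safety evaluation area (SEA) scores, using simulated data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "acc_sea": rng.uniform(0, 100, n),
    "drv_sea": rng.uniform(0, 100, n),
    "veh_sea": rng.uniform(0, 100, n),
    "sms_sea": rng.uniform(0, 100, n),
})
# Simulate overdispersed counts whose mean rises with the accident and driver SEAs.
mu = np.exp(0.02 * df["acc_sea"] + 0.01 * df["drv_sea"] - 1.0)
df["crashes"] = rng.negative_binomial(n=1, p=1.0 / (1.0 + mu))

formula = "crashes ~ acc_sea + drv_sea + veh_sea + sms_sea"
poisson_fit = smf.glm(formula, data=df, family=sm.families.Poisson()).fit()
negbin_fit = smf.glm(formula, data=df, family=sm.families.NegativeBinomial()).fit()
# The negative binomial fit should show the lower (better) AIC on these
# overdispersed simulated counts.
print(poisson_fit.aic, negbin_fit.aic)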

           In addition, Oak Ridge applied the empirical Bayes method to the
           negative binomial regression model and assessed the variability of
           carrier crash counts by estimating confidence intervals.^5 Oak
           Ridge found that the negative binomial model worked well at
           identifying carriers that pose high crash risks. However, the data
           set Oak Ridge had to use did not include any carriers with one
           reported crash in the 30 months before March 21, 2001. Because the
           data included only carriers with zero or two or more reported
           crashes, the distribution of crashes was truncated.
			  
^3U.S. Department of Transportation Office of Inspector General,
Improvements Needed in the Motor Carrier Safety Status Measurement System,
Report MH-2004-034 (Washington, D.C.: Feb. 13, 2004).

^4Both the Poisson model and the negative binomial model are statistically
appropriate for modeling count data, that is, outcomes that take on
nonnegative integer values. The two models differ in their assumptions about
the mean and variance. Whereas the Poisson model assumes that the mean and
the variance are equal, the negative binomial model allows the variance to
exceed the mean (overdispersion).

^5The empirical Bayes method takes a weighted average of the rate of
crashes for a carrier from a prior period of time and the predicted mean
number of crashes from the negative binomial regression. This method
optimizes the identification of carriers with the highest number of future
crashes.
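
As an illustration of this weighted average (following the Hauer and others
tutorial cited later in this appendix; the dispersion value and crash counts
below are assumed, not Oak Ridge's), the empirical Bayes estimate for a
single carrier could be computed as follows:

# Illustration only: empirical Bayes shrinkage of an observed crash count
# toward the mean predicted by a negative binomial regression.
def empirical_bayes(observed_crashes, predicted_mean, dispersion_k):
    """Weighted average of the observed prior-period count and the model mean;
    a larger shape parameter k puts more weight on the model mean."""
    weight = 1.0 / (1.0 + predicted_mean / dispersion_k)
    return weight * predicted_mean + (1.0 - weight) * observed_crashes

# A carrier with 4 observed crashes but a model-predicted mean of 1.5:
print(empirical_bayes(observed_crashes=4, predicted_mean=1.5, dispersion_k=2.0))
# about 2.57 -- shrunk toward the model prediction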

           Since the Oak Ridge regression model analysis did not cover
           carriers with safety evaluation area data and one reported crash,
           the findings from the study are limited in their
           generalizability. However, other modeling analyses of crashes at
           intersections and on road segments have also found that the
           negative binomial regression model works well.^6 In addition, our
           analysis, using a more recent and more comprehensive data set,
           supports the finding that the negative binomial regression model
           performs better than the SafeStat model.

           The studies carried out by other authors advocate the use of the
           empirical Bayes method in conjunction with a negative binomial
           regression model to estimate crash risk. Oak Ridge also applied
           this model to identify motor carriers that pose high crash risks.
           We applied this method to the 2004 SafeStat data and found that
           the empirical Bayes method best identified the carriers with the
           largest number of crashes in the 18 months after June 25, 2004.
           However, the crash rate per 1,000 vehicles was much lower than
           that for carriers in SafeStat categories A and B. We analyzed this
           result further and found that although the empirical Bayes method
           best identifies future crashes, it is not as effective as the
           SafeStat model or the negative binomial regression model in
           identifying carriers with the highest future crash rates. The
           carriers identified with the empirical Bayes method were
           invariably the largest carriers. This result is not especially
           useful from a regulatory perspective. Companies operating a large
           number of vehicles often have more crashes over a period of time
           than smaller companies. However, this does not mean that the
           larger company is necessarily violating more safety regulations or
           is less safe than the smaller company. For this reason, we do not
           advocate the use of the empirical Bayes method in conjunction with
           the negative binomial regression model as long as the method used
           to calculate the safety evaluation area values remains unchanged.
           If changes are made in how carriers are rated for safety, this
           method may in the future offer more promise than the negative
           binomial regression model alone.
			  
^6Ezra Hauer, Douglas Harwood, and Michael Griffith, The Empirical Bayes
Method for Estimating Safety: A Tutorial, Transportation Research Record
1784, National Academies Press, 2002, 126-131.
 
           Appendix II: FMCSA's Crash Data Used to Compare Methods for
			  Identifying High-Risk Carriers

           The quality of crash data is a long-standing problem that hinders
           FMCSA's ability to accurately identify carriers that pose high
           crash risks.^1 Despite the problems of late-reported crashes and
           incomplete and inaccurate data on crashes, the data were of
           sufficient quality for our use, which was to assess whether
           different approaches to categorizing carriers could lead to better
           identification of carriers that subsequently have high crash
           rates. Our reasoning is based on our use of the same data set to
           compare the crash risk of carriers in SafeStat categories A or B
           and of carriers that score among the worst 25, 10, or 5 percent in
           an individual safety evaluation area. Limitations in the data
           would apply equally to both results. FMCSA has undertaken a number
           of efforts to improve crash data quality.
			  
  			  Late Reporting Had a Small Effect on SafeStat's Ability to
			  Identify High-risk Carriers

           FMCSA's guidance requires states to report all crashes to MCMIS
           within 90 days of their occurrence. Late reporting can cause
           SafeStat to miss some of the carriers that should have received a
           SafeStat score. Moreover, since SafeStat scoring involves a
           relative ranking of carriers, a carrier may receive a SafeStat
           score and have to undergo a compliance review because crash data
           for a higher risk carrier were reported late and not included in
           the calculation.

           Late reporting affected SafeStat's ability to identify all
           high-risk carriers to a small degree--it missed about 6
           percent--for the period that we studied. Late reporting of
           crashes by states also affected the safety rankings of more than
           600 carriers, both positively and negatively. When SafeStat
           analyzed the 2004 data, which did not include the late-reported
           crashes, it identified 4,989 motor carriers as highest risk,
           meaning they received a category A or B ranking. With the addition
           of late-reported crashes, 481 carriers moved into the highest risk
           category, and 182 carriers dropped out of the highest risk
           category, resulting in a net increase of 299 carriers (6 percent)
           in the highest risk category. After the late-reported crashes were
           added, 481 carriers that originally received a category C, D, E,
           F, or G SafeStat rating received an A or B rating. These carriers
           would not originally have been given a high priority for a
           compliance review because the SafeStat calculation did not take
           into account all of their crashes. On the other hand, a number of
           carriers would have fared better if the late-reported crashes had
           been included in their score. Specifically, 182 carriers--or fewer
           than 4 percent of those ranked--fell from the A or B category into
           the C, D, E, F, or G category once the late-reported crashes were
           included.^2 These carriers would have avoided a compliance review
           if all crashes had been reported on time. Overall, however, the
           vast majority of carriers (96 percent) were not negatively
           affected by late reporting.
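
           The arithmetic behind these figures, using the values reported in
           this paragraph (and assuming that "those ranked" refers to the
           4,989 carriers originally in categories A or B), is as follows:

# Worked arithmetic for the late-reporting figures above (values from the text).
baseline_a_or_b = 4_989           # carriers in categories A or B before late crashes
moved_in, dropped_out = 481, 182  # changes after adding the late-reported crashes
net_increase = moved_in - dropped_out                 # 299
pct_net = 100 * net_increase / baseline_a_or_b        # about 6 percent
pct_dropped = 100 * dropped_out / baseline_a_or_b     # about 3.6 percent, "fewer than 4"
print(net_increase, round(pct_net), round(pct_dropped, 1))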
			  
^1For another assessment of data quality, see U.S. Department of
Transportation Office of Inspector General, Improvements Needed in the
Motor Carrier Safety Status Measurement System, Report MH-2004-034
(Washington, D.C.: Feb. 13, 2004).
			  
           The timeliness of crash reporting seems to be improving. The
           median number of days it took states to report crashes to MCMIS
           dropped from 225 days in calendar year 2001 to 57 days in 2005
           (the latest data available at the time of our analysis).^3 In
           addition, the percentage of crashes reported by states within 90
           days of occurrence has jumped from 32 percent in fiscal year 2000
           to 89 percent in fiscal year 2006. (See fig. 3.)

           Figure 3: Percentage of Crashes Submitted to MCMIS within 90 Days
           of Occurrence, Fiscal Years 2000 through 2006
			  
^2These 182 carriers were no longer in the worst 25 percent for the
accident safety evaluation area after the addition of the late-reported
crashes.

^3One reason for the apparent improvement in the timeliness of reporting for
the most recent year is that an unknown number of crashes that occurred in
2005 had still not been reported as of June 2006, when we obtained these
data; the 2005 figure therefore reflects only the crashes that had been
reported relatively quickly.

           Incomplete Data from States Limit SafeStat's Identification of All
			  Carriers That Pose High Crash Risks

           FMCSA uses a motor carrier identification number, which is unique
           to each carrier, as the primary means of linking inspections,
           crashes, and compliance reviews to motor carriers. Approximately
           184,000 (75 percent) of the 244,000 crashes reported to MCMIS
           between December 2001 and June 2004 involved interstate carriers.
           Of these 184,000 crashes, nearly 24,000 (13 percent) were missing
           this identification number. As a result, FMCSA could not match
           these crashes to motor carriers or use data from them in SafeStat.
           In addition, the carrier identification number could not be
           matched to one listed in MCMIS for 15,000 (8 percent) other
           crashes that involved interstate carriers. Missing data or data
           that cannot be matched to carriers for nearly one quarter of the
           crashes for the period of our review potentially have a large
           impact on a motor carrier's SafeStat score because SafeStat treats
           crashes as the most important source of information for assessing
           motor carrier crash risk. Theoretically, information exists to
           match crash records to motor carriers by other means, but such
           matching would require too much manual work to be practicable.
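
           As an illustration of this linking step (with hypothetical data,
           not MCMIS itself), the sketch below shows how crash records with a
           missing or unmatched carrier identification number drop out when
           crashes are joined to the carrier list:

# Illustration only: link crash records to carriers by identification number.
import pandas as pd

carriers = pd.DataFrame({"carrier_id": ["100001", "100002"],
                         "name": ["Acme Trucking", "Beta Freight"]})
crashes = pd.DataFrame({
    "crash_id":   [1, 2, 3, 4],
    "carrier_id": ["100001", None, "999999", "100002"],  # one missing, one unmatched
})

linked = crashes.merge(carriers, on="carrier_id", how="left", indicator=True)
usable = linked[linked["_merge"] == "both"]           # can be used in SafeStat
unusable = linked[linked["_merge"] == "left_only"]    # missing or unmatched ID
print(len(usable), len(unusable))  # 2 2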

           We were not able to quantify the actual effect of the missing data
           and the data that could not be matched for MCMIS overall. To do
           so, we would have had to gather crash records at the state
           level--an effort that was impractical. For the same reason, we
           cannot quantify the effects of FMCSA's efforts to improve the
           completeness of the data (discussed later). However, the
           University of Michigan Transportation Research Institute issued a
           series of reports analyzing the completeness of the data submitted
           to MCMIS by the states.^4 One of the goals of the research was to
           determine the states' crash reporting rates. Reporting rates
           varied greatly among the 14 states studied, ranging from 9 percent
           in New Mexico in 2003 to 83 percent in Missouri in 2005. It is not
           possible to draw wide-scale conclusions about whether states'
           reporting rates are improving over time because only 2 of the
           states--Missouri and Ohio--were studied in multiple years.
           However, the reporting rates of these 2 states did improve.
           Missouri experienced a large improvement in its reporting rate,
           with 61 percent of eligible crashes reported in 2001, and 83
           percent reported in 2005. Ohio's improvement was more modest,
           increasing from 39 percent in 2000 to 43 percent in 2005.
			  
^4The University of Michigan Transportation Research Institute's reports
on state crash reporting can be found at
[61]http://www.umtri.umich.edu/publicationList.php?divID=4&t=8uFEHJI&plc=63|9||5|CHRON||||.
State reports issued by the
University of Michigan Transportation Research Institute cover California,
Florida, Illinois, Iowa, Louisiana, Maryland, Michigan, Missouri, New
Jersey, New Mexico, North Carolina, Ohio, Washington, and Nebraska. We
included all of these reports in our review.

           The University of Michigan Transportation Research Institute's
           reports also identified a number of factors that may affect
           states' reporting rates. One of the main factors affecting
           reporting rates is the reporting officer's understanding of crash
           reporting requirements. The studies note that reporting rates are
           generally lower for less serious crashes and for crashes involving
           smaller vehicles, which may indicate that there is some confusion
           about which crashes are reportable. Some states, such as Missouri,
           aid the officer by explicitly listing reporting criteria on the
           police accident reporting form, while other states, such as
           Washington, leave it up to the officer to complete certain
           sections of the form if the crash is reportable, but the form
           includes no guidance on reportable crashes. Other states, such as
           North Carolina and Illinois, have taken this task out of officers'
           hands and include all reporting elements on the police accident
           reporting form. Reportable crashes are then selected centrally by
           the state, and the required data are transmitted to MCMIS.
			  
			  Inaccurate Data Potentially Limit SafeStat's Ability to Identify
			  Carriers That Pose High Crash Risks

           Inaccurate data, such as information on nonqualifying crashes
           reported to FMCSA, potentially have a large impact on a motor
           carrier's SafeStat score because SafeStat treats crashes as the
           most important source of information for assessing motor carrier
           crash risk. The University of Michigan Transportation Research
           Institute's reports on state crash reporting show that, among the
           14 states studied, incorrect reporting of crash data is
           widespread. This inaccuracy limits SafeStat's ability to identify
           carriers that pose high crash risks. In the most recent reports,
           the researchers found that, in 2005, Ohio incorrectly reported
           1,094 (22 percent) of the 5,037 cases it reported, and Louisiana
           incorrectly reported 137 (5 percent) of the 2,699 cases it
           reported. In Ohio, most of the incorrectly reported crashes did
           not qualify because they did not meet the crash severity
           threshold. In contrast, most of the incorrectly reported crashes
           in Louisiana did not qualify because they did not involve vehicles
           eligible for reporting. Other states studied by the institute had
           similar problems with reporting crashes that did not meet the
           criteria for reporting to MCMIS. The addition of these
           nonqualifying crashes could cause some carriers to exceed the
           minimum number of crashes required to receive a SafeStat rating
           and result in SafeStat's mistakenly identifying carriers as posing
           high crash risks. Because each report focuses on reporting in one
           state in a particular year, it is not possible to identify the
           number of cases that have been incorrectly reported nationwide
           and, therefore, it is not possible to determine the impact of
           inaccurate reporting on SafeStat's calculations.

           We also found examples of crashes that are reported to MCMIS but
           cannot be used by SafeStat because of data errors. Specifically,
           we found that the carrier's identification number cannot be
           matched to an identification number in MCMIS in 8 percent of
           reported crashes. FMCSA cannot link these crashes to specific
           carriers without an accurate identification number and, therefore,
           cannot use these crashes in the SafeStat model to identify
           carriers that pose high crash risks.

           As noted in the University of Michigan Transportation Research
           Institute's reports, states may be unintentionally submitting
           incorrect data to MCMIS because of difficulties in determining
           whether a crash meets the reporting criteria. For example, in
           Missouri, pickups are systematically excluded from MCMIS crash
           reporting, which may cause the state to miss some reportable
           crashes. This may occur because, in recent years, a number of
           pickups have been equipped with rear axles that may increase their
           weight above the reporting threshold and make crashes involving
           them eligible for reporting. There is no way for the state to
           determine which crashes involving pickups qualify for reporting
           without examining the characteristics of each vehicle. In this
           case, the number of omissions is likely to be relatively small,
           but this example demonstrates the difficulty states may face when
           identifying reportable crashes.

           In addition, in some states, the information contained in the
           police accident report may not be sufficient for the state to
           determine if a crash meets the accident severity threshold. It is
           generally straightforward to determine whether a fatality occurred
           as a result of a crash, but it may be difficult to determine
           whether an injured person was transported for medical attention or
           a vehicle was towed because of disabling damage. In some states,
           such as Illinois and New Jersey, an officer can indicate on the
           form if a vehicle was towed by checking a box, but there is no way
           to identify whether the reason for towing was disabling damage. It
           is likely that such uncertainty results in overreporting because
           some vehicles may be towed for other reasons.

           FMCSA has taken steps to improve the quality of crash data
           reporting. As we noted in November 2005, FMCSA has undertaken two
           major efforts to help states improve the quality of crash data.^5
           One program, the Safety Data Improvement Program, has provided
           funding to states to implement or expand activities designed to
           improve the completeness, timeliness, accuracy, and consistency of
           their data. FMCSA has also used a data quality rating system to
           rate and display ratings for the quality of states' crash and
           inspection data. Because these ratings are public, this system
           creates an incentive for states to improve their data quality.

           To further improve these programs, FMCSA has awarded additional
           grants to several states and implemented our recommendations to
           (1) establish specific guidelines for assessing states' requests
           for funding to support data improvement in order to better assess
           and prioritize the requests and (2) increase the usefulness of its
           state data quality map as a tool for monitoring and measuring
           commercial motor vehicle crash data by ensuring that the map
           adequately reflects the condition of the states' commercial motor
           vehicle crash data.

           In February 2004, FMCSA implemented Data Q's, an online system
           that allows for challenging and correcting erroneous crash or
           inspection data. Users of this system include motor carriers, the
           general public, state officials, and FMCSA. In addition, in
           response to a recent recommendation by the Department of
           Transportation Inspector General, FMCSA is planning to conduct a
           number of evaluations of the effectiveness of a training course on
           crash data collection that it will be providing to states by
           September 2008.

           While the quality of crash data has started to improve and was
           sufficient for our use--assessing whether different approaches to
           categorizing carriers could lead to better identification of
           carriers that subsequently have high crash rates--commercial motor
           vehicle crash data continue to have some problems with timeliness,
           completeness, and accuracy. These problems have been
           well-documented in several studies, and FMCSA is taking steps to
           address the problems through studies of each state's crash
           reporting system and grants to states to fund improvements. As a
           result, we are not making any recommendations in this area.

^5GAO, Highway Safety: Further Opportunities Exist to Improve Data on
Crashes Involving Commercial Motor Vehicles, [62]GAO-06-102 (Washington,
D.C.: Nov. 18, 2005).

           Appendix III: Review of Studies of Motor Carrier and Driver Crash Risk

           Several studies have identified relationships between certain
           characteristics of motor carriers and drivers and their crash
           risks. These characteristics include carrier financial
           performance, carrier size, driver pay, and driver age.
			  
			  Relationship of Motor Carrier Characteristics and Crash Risk

           The studies we reviewed assessed whether financial performance or
           other characteristics of carriers, such as size, are associated
           with crash risk.
			  
			    Carrier Financial Performance

           Our 1991 study developed a model that linked changes in economic
           conditions to declining safety performance in the trucking
           industry.^1 The study hypothesized that a decline in economic
           performance among motor carriers leads to a decline in safety
           performance in one or more of the following ways: (1) a lowering
           of the average quality of driver performance; (2) downward wage
           pressures encouraging driver noncompliance with safety
           regulations; (3) less management emphasis on safety practices; (4)
           deferred truck maintenance and replacement; and/or (5) the
           introduction of larger, heavier, multitrailer trucks. Using data
           on 537 carriers drawn from the Department of Transportation and
           the Interstate Commerce Commission, we found that seven financial
           ratios show promise as predictors of truck firms' safety. For five
           of the seven financial variables we examined, firms in the weakest
           financial position had the highest subsequent accident rates. For
           example, weakness in any of three measures of
           profitability--return on equity, operating ratio, and net profit
           margin--was associated with subsequent safety problems as measured
           by accident rates.

           On behalf of FMCSA, a study carried out by Corsi, Barnard, and
           Gibney in 2002 examined how data on carriers' financial
           performance correlate with a carrier's safety rating following a
           compliance review.^2 The authors selected motor carriers from
           MCMIS in December 2000 with complete data for the accident,
           driver, vehicle, and safety management safety evaluation areas.
           Using these data, the authors then matched a total of 700 carriers
           to company financial statements in the annual report database of
           the American Trucking Associations.^3 The authors found that
           carriers that received satisfactory ratings following a compliance
           review performed better on two financial measures--operating ratio
           and return on assets--than carriers that received lower ratings.
			  
^1GAO, Freight Trucking: Promising Approach for Identifying Carriers'
Safety Risks, [63]GAO/PEMD-91-13 (Washington, D.C.: Apr. 4, 1991).

^2T. Corsi, R. Barnard, and J. Gibney, "Motor Carrier Industry Profile:
Linkages Between Financial and Safety Performance Among Carriers in Major
Industry Segments," Robert H. Smith School of Business at the University
of Maryland, October 2002.

           Two practical considerations limit the applicability of the
           findings from these two studies to SafeStat. First, the studies'
           samples of 537 and 700 carriers, respectively, are not
           representative of the motor carriers that FMCSA oversees. For
           example, our sample included only the largest for-hire interstate
           carriers because these were the only carriers that were required
           to report financial information to the federal government. The
           carriers selected for the Corsi and others' study were also not
           representative because a very small percentage of the carriers
           evaluated by the SafeStat model in June 2004 had scores for all
           four safety evaluation areas. About 2 percent had a score for
           the safety management safety evaluation area, and of these, not
           all had complete data for the other three safety evaluation areas.
           Second, FMCSA does not receive annual financial statements from
           carriers and, according to an FMCSA official, it is unlikely that
           the agency could obtain the authority it would need to require
           financial statements from all carriers. In addition, because the
           relationships identified by our study are based on data and
           economic conditions that are almost 20 years old, the
           relationships would need to be reanalyzed within current
           conditions to determine whether they still exist. As part of its
           Comprehensive Safety Analysis 2010 reform initiative, discussed
           earlier in this report, FMCSA decided not to use financial data to
           help assess the safety risk of firms because of the limited
           availability of these data.
			  
			    Other Carrier Characteristics

           A 1994 study by Moses and Savage found that crash rates decline as
           firm size increases; the largest 10 percent of firms have an
           accident rate that is one-third the rate of the smallest 10
           percent of firms.^4 Our 1991 study found that the smallest
           carriers, as a group, had an accident rate that exceeded the rate
           for all firms by 20 percent. The study by Moses and Savage also
           found that (1) private fleets that serve the needs of their parent
           companies, such as manufacturers and retailers, have accident
           rates that are about 20 percent lower than the rates of carriers
           that offer for-hire trucking; (2) carriers of hazardous materials
           have accident rates that are 22 percent higher than the rates of
           carriers that do not transport these goods; and (3) general
           freight carriers have accident rates that are 10 percent higher
           than the rates of other freight carriers. We believe that Moses
           and Savage's findings are reasonable given their study's design,
           data, and methodology, but because the findings are based on data
           and economic conditions that are about 15 to 20 years old, current
           data would need to be reanalyzed within current conditions to
           determine whether the findings are still valid. As mentioned
           above, our study shares this limitation and is further limited by
           an unrepresentative sample of motor carriers. An FMCSA official
           told us that the agency would not want to rely directly on data on
           the size of the carrier to assess safety risk because the agency
           believes that its data on indicators of carrier size, such as
           revenue, number of drivers, and number of power units, are not of
           sufficient quality. Similarly, the agency would not want to
           distinguish between private and for-hire carriers or between
           carriers that carry different types of freight because it does not
           believe that its data are sufficiently reliable.
			  
^3The American Trucking Associations is an association of trucking
associations. Its mission is to serve and represent the interests of the
trucking industry.

^4L.N. Moses and I. Savage, "The Effect of Firm Characteristics on Truck
Accidents," Accident Analysis and Prevention 26, no. 2 (1994).

           Relationship of Driver Characteristics and Crash Risk

           The studies we reviewed assessed whether driver
           characteristics--including convictions for traffic violations, age
           and experience, pay, or frequency of job changes--are associated
           with crash risk.
			  
			    Driver Convictions for Traffic Violations

           A series of studies by Lantz and others examined the effect of
           incorporating conviction data from the state-run commercial driver
           license data system into the calculation of carriers' safety
           management safety evaluation area scores.^5 The studies found that
           the resulting driver conviction measure is weakly correlated with
           the crash-per-vehicle rate.^6 However, the studies did not
           calculate new safety management safety evaluation area scores with
           the proposed driver conviction measure and then use the updated
           measure to estimate new SafeStat scores for carriers. FMCSA uses
           data on driver convictions to help target its roadside
           inspections, and it is considering using such data in the tool it
           is developing to replace SafeStat as part of its Comprehensive
           Safety Analysis 2010 reform initiative.
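
           To make the reported weak association concrete, the sketch below
           computes a Pearson correlation between a conviction-based measure
           and a crash-per-vehicle rate for a handful of hypothetical
           carriers; the values are illustrative and will not reproduce the
           0.085 figure.

             # Pearson correlation between a hypothetical conviction-based
             # measure and a crash-per-vehicle rate; values are invented.
             from math import sqrt

             conviction_measure = [0.2, 0.5, 0.1, 0.9, 0.4, 0.3]
             crashes_per_vehicle = [0.05, 0.02, 0.04, 0.06, 0.01, 0.03]

             n = len(conviction_measure)
             mx = sum(conviction_measure) / n
             my = sum(crashes_per_vehicle) / n
             cov = sum((x - mx) * (y - my)
                       for x, y in zip(conviction_measure, crashes_per_vehicle))
             sx = sqrt(sum((x - mx) ** 2 for x in conviction_measure))
             sy = sqrt(sum((y - my) ** 2 for y in crashes_per_vehicle))
             print(round(cov / (sx * sy), 3))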
			  
^5B. Lantz and D. Goettee, An Analysis of Commercial Vehicle Driver
Traffic Conviction Data to Identify Higher Safety Risk Motor Carriers,
Upper Great Plains Transportation Institute and FMCSA, March 2004. B.
Lantz, Development and Implementation of a Driver Safety History Indicator
into the Roadside Inspection Selection System, FMCSA, April 2006.

^6Correlation = 0.085. (FMCSA, Development and Implementation of a Driver
Safety History Indicator into the Roadside Inspection Selection System,
April 2006, 14).

             Driver Age and Experience

           Campbell's 1991 study found that the risk of a fatal crash is
           significantly higher for younger truck drivers than for older
           drivers.^7 Campbell used data from surveys of fatal crashes and
           large truck travel to calculate fatal involvement rates per mile
           driven by driver age. Overall, fatal involvement rates remained
           high through age 26. The fatal crash rates for drivers under 19
           years of age were four times higher than the rate for all drivers,
           and the rates for drivers aged 19 to 20 years were six times
           higher. Our 1991 study found that younger, less experienced
           drivers posed greater-than-average accident risks. In particular,
           compared with drivers 40 to 49 years of age, drivers 21 to 39
           years of age have 28 percent greater odds of accident involvement.
           Compared with those for drivers over 50 years of age, the odds of
           the youngest group of drivers having an accident are about 60
           percent greater. The differences in accident risks among drivers
           with 0 to 13 years of experience, 14 to 20 years of experience,
           and 21 or more years of experience followed a very similar
           pattern. Although Campbell's study provides only limited
           information about the quality of the data it used, we believe that
           its findings are reasonable given the study's design and
           methodology, which relied on multiple kinds of analyses to
           substantiate a higher risk for younger drivers of large trucks. We
           believe that our 1991 findings are reasonable given our study's
           design, data, and methodology. An FMCSA official told us that, at
           this time, the agency would not be able to use driver age in
           SafeStat or in a similar model because the agency does not have
           access to data on all drivers. FMCSA said that it is exploring the
           possibility of gaining broader access to data on drivers, which
           are maintained by the states, so that the agency can use the data
           to help assess the safety of drivers as part of its Comprehensive
           Safety Analysis 2010 reform initiative.
			  
			    Driver Pay

           Belzer and others' 2002 study found that drivers with lower pay
           had higher crash rates.^8 Because economic theory predicts that
           low pay levels are associated with poorer performing workers, the
           study hypothesized that low pay levels for drivers are associated
           with unsafe driving. The study found that for every 10 percent
           more in average driver compensation (mileage rate, unpaid time,
           anticipated annual raise, safety bonus, health insurance, and life
           insurance), the carriers experienced 9.2 percent fewer crashes. We
           believe that this finding is reasonable given the study's design,
           data, and methodology. An FMCSA official told us that the agency
           could not use data on driver pay in SafeStat or in a similar model
           because such data are available only from studies or surveys that
           do not cover the full population of drivers.
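
           As a rough illustration of the size of this effect, the sketch
           below applies the reported 9.2 percent figure as a simple
           proportional rule to a hypothetical carrier; the baseline values
           are invented, and extrapolating beyond a 10 percent increase is
           only an approximation of the study's model.

             # Applies the reported relationship (9.2 percent fewer crashes
             # per 10 percent more average driver compensation) as a simple
             # proportional rule. The carrier below is hypothetical.
             def predicted_crashes(baseline_crashes, pay_increase_pct,
                                   reduction_per_10_pct=9.2):
                 reduction = (pay_increase_pct / 10.0) * reduction_per_10_pct / 100.0
                 return baseline_crashes * (1.0 - reduction)

             # A carrier with 100 crashes that raises average compensation
             # by 10 percent would be predicted to have about 90.8 crashes.
             print(predicted_crashes(100, 10))
             # Extrapolating linearly to a 20 percent raise is only a rough
             # approximation of the study's model.
             print(predicted_crashes(100, 20))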
			  
^7K. L. Campbell, "Fatal Accident Rates by Driver Age for Large Trucks,"
Accident Analysis and Prevention 23, no. 4 (1991).

^8M. H. Belzer, D. Rodriguez, and S.A. Sedo, "Paying for Safety: An
Economic Analysis of the Effect of Compensation on Truck Driver Safety,"
prepared for FMCSA, September 2002.

             Frequency of Job Changes

           Staplin and others' 2003 study for FMCSA found that drivers who
           average three or more jobs with different carriers each year have
           crash rates that are more than twice as high as those of drivers
           who average fewer job changes.^9 Although the study authors
           acknowledge several limitations in the data used in the study, we
           believe that the data and the analysis approach were sufficiently
           reliable to support the study's finding of a relationship between
           the number of jobs and the number of crashes. An FMCSA official
           told us that, as for data on driver pay, the agency could not use
           data on the frequency of job changes in SafeStat or in a similar
           model because such data are available only from studies or surveys
           that do not cover the full population of drivers.
			  
^9L. Staplin, K. Gish, L. Decina, and R. Brewster, "Commercial Motor
Vehicle Driver Retention and Safety," FMCSA-RT-03-004 (Washington, D.C.:
March 2003).

           Appendix IV: Scope and Methodology

           To determine the extent to which FMCSA's policy for prioritizing
           compliance reviews targets carriers that subsequently have high
           crash rates, we analyzed data from FMCSA's MCMIS on the June 2004
           SafeStat assessment of carriers and on the assessed carriers'
           crashes in the 18 months following the SafeStat assessment. We
           selected June 2004 because this date enabled us to examine MCMIS
           data on actual crashes that occurred in the 18-month period from
           July 2004 through December 2005.^1 We defined various groups of
           carriers for analysis, such as those in each SafeStat category,
           those to which FMCSA gave high priority (i.e., those in categories
           A or B), and those in the worst 5 or 10 percent of carriers in a
           particular safety evaluation area without being in the worst 25
           percent of carriers in any other area. We then calculated the
           aggregate crash rate in the 18 months following the SafeStat
           assessment for each of these groups by dividing the total crashes
           experienced by all the carriers in a group during that time period
           by the total number of vehicles operated by those carriers, as
           reported on their motor carrier census form. We then compared
           crash rates among the various groups to determine whether there
           were any groups with substantially higher aggregate crash rates
           than the carriers in SafeStat categories A or B. We also talked to
           FMCSA officials about how FMCSA developed SafeStat, their views on
           other evaluations of SafeStat, and FMCSA's plans to replace
           SafeStat with a new tool.
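
           To illustrate this aggregate crash-rate calculation, the sketch
           below (in Python) applies it to a few hypothetical carrier
           records; the group labels and field layout are ours and do not
           reflect MCMIS's actual structure.

             # Aggregate crash rate per group: total crashes in the 18
             # months after the SafeStat run divided by total vehicles
             # reported on the census form. Carrier records are hypothetical.
             from collections import defaultdict

             carriers = [
                 # (carrier_id, group, crashes_18_months, census_vehicles)
                 ("C001", "category_A_or_B", 4, 20),
                 ("C002", "category_A_or_B", 1, 35),
                 ("C003", "worst_5_pct_accident_only", 6, 15),
                 ("C004", "worst_5_pct_accident_only", 3, 10),
             ]

             total_crashes = defaultdict(int)
             total_vehicles = defaultdict(int)
             for _, group, crashes, vehicles in carriers:
                 total_crashes[group] += crashes
                 total_vehicles[group] += vehicles

             for group in total_crashes:
                 rate = total_crashes[group] / total_vehicles[group]
                 print(group, round(rate, 3))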

           In assessing how FMCSA ensures that its compliance reviews are
           completed thoroughly and consistently, we reviewed our report on
           internal control standards for the federal government. We
           identified key standards in the areas that we believe are critical
           to maintaining the thoroughness and consistency of compliance
           reviews, namely the recording and communication of policy to
           management and others, the clear documentation of processes, and
           the monitoring and reviewing of activities and findings. We
           assessed the extent to which FMCSA's management of its compliance
           reviews is consistent with these internal control standards by
           interviewing FMCSA and state managers and investigators. We
           interviewed investigators who conduct compliance reviews and their
           managers in FMCSA's headquarters office, as well as in seven of
           FMCSA's 52 field division offices that work with states, two of
           its four regional service centers that support division offices,
           and three state offices that partner with three of the FMCSA
           division offices in which we did our work.^2 We also interviewed
           two safety investigators in each of the same seven division
           offices. The division offices and states that we
           reviewed--California, Georgia, Illinois, New York, Ohio,
           Pennsylvania, and Texas--received 30 percent of all of the grant
           funds that FMCSA awarded to the
           states in fiscal year 2005 (the latest year for which data were
           available) through its primary grant program, the Motor Carrier
           Safety Assistance Program. Because we chose the seven states
           judgmentally (representing the largest grantees), we cannot
           project our findings nationwide.^3 Reviewing a larger number of
           grantees would not have been practical because of resource
           constraints.
			  
^1We obtained crash data for this period that were reported to FMCSA
through June 2006. This allowed us to obtain data on late-reported crashes
for the July 2004 through December 2005 period.

           We gathered information on the recording and communication of
           policy from discussions with FMCSA officials, documents, and
           system software, including the electronic operations manual. We
           obtained information about how FMCSA documents the findings of
           compliance reviews through discussions with FMCSA officials and
           reviews of FMCSA documents. We obtained information on how FMCSA
           monitors and reviews the performance of its compliance reviews
           through discussions with FMCSA officials and reviews of FMCSA
           documents, including the 2002 report of FMCSA's Compliance Review
           Work Group. FMCSA provided us with the data assessments of the
           number of vehicles inspected during compliance reviews and the
           percentage of applicable areas of the regulations covered by
           compliance reviews since 2001.

           In assessing the extent to which FMCSA follows up with carriers
           with serious violations, we reviewed regulations directing how
           FMCSA should follow up with and track these violators and
           analyzed data to determine whether FMCSA had complied with these
           requirements. In particular, we
           examined FMCSA policies and discussed with FMCSA officials the
           agency's policy to perform a follow-up compliance review on
           carriers in SafeStat categories A and B, its policy to place
           carriers rated unsatisfactory out of service, its policy to
           perform a follow-up compliance review on carriers with a
           conditional rating, and its reduction of its enforcement backlog.
           We performed additional analysis--as of the end of each fiscal
           year from 2001 through 2006--using data from FMCSA's MCMIS to
           determine the total number of carriers with a conditional rating
           that had not received a follow-up compliance review. We also used
           MCMIS to determine how many carriers with a conditional rating
           received a follow-up compliance review and how soon after the
           original compliance review the second review occurred.
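
           The sketch below illustrates this follow-up analysis with
           hypothetical review records standing in for MCMIS data; the
           field layout, ratings, and dates are invented.

             # Carriers with a conditional rating and no follow-up
             # compliance review by the end of a fiscal year, plus the
             # elapsed time for those that did get one.
             from datetime import date

             reviews = [
                 # (carrier_id, rating, original_review, follow_up or None)
                 ("C101", "conditional", date(2003, 5, 1), date(2004, 2, 10)),
                 ("C102", "conditional", date(2004, 8, 15), None),
                 ("C103", "unsatisfactory", date(2005, 1, 20), date(2005, 7, 1)),
             ]

             fy_end = date(2006, 9, 30)  # end of fiscal year 2006

             pending = [c for c, rating, original, follow_up in reviews
                        if rating == "conditional" and original <= fy_end
                        and (follow_up is None or follow_up > fy_end)]
             print(len(pending), "conditional carriers without a follow-up")

             for c, rating, original, follow_up in reviews:
                 if rating == "conditional" and follow_up is not None:
                     print(c, (follow_up - original).days, "days to follow-up")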
			  
^2We did not interview managers or investigators in three of the seven
states because they do not conduct compliance reviews of interstate
carriers, and we did not interview managers or investigators in one state
because they did not respond to our attempts to contact them.

^3Results from nonprobability samples cannot be used to make inferences
about a population, because in a nonprobability sample some elements of
the population being studied have no chance or an unknown chance of being
selected as part of the sample.

           To assess FMCSA's implementation of the statutory requirement to
           assess the maximum fine against any carrier with either a pattern
           of violations or previously committed violations, we compared
           FMCSA's policy with the language of the act and held discussions
           with FMCSA officials. In addition, we assessed the number of
           carriers that would have been assessed the maximum fine under
           differing definitions of a pattern of violations. We also reviewed
           the report of the Department of Transportation Inspector General
           on the implementation of the policy and documents pertaining to
           FMCSA's response to the Inspector General's report.
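
           The sketch below shows how such counts can differ under
           alternative definitions of a pattern of violations; both
           definitions and the violation records are hypothetical
           illustrations, not FMCSA's or GAO's actual criteria.

             # Counting carriers under two hypothetical definitions of a
             # "pattern of violations."
             from collections import defaultdict, Counter

             violations = [
                 # (carrier_id, regulation_cited) -- invented examples
                 ("C201", "395.8"), ("C201", "395.8"), ("C201", "395.8"),
                 ("C202", "395.8"), ("C202", "382.301"), ("C202", "396.3"),
             ]

             per_carrier = defaultdict(Counter)
             for carrier, regulation in violations:
                 per_carrier[carrier][regulation] += 1

             # Narrow definition: three or more violations of one regulation.
             narrow = {c for c, regs in per_carrier.items()
                       if max(regs.values()) >= 3}
             # Broad definition: violations of three or more regulations.
             broad = {c for c, regs in per_carrier.items() if len(regs) >= 3}

             print(len(narrow), "carriers under the narrow definition")
             print(len(broad), "carriers under the broad definition")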

           In determining the reliability of FMCSA's data on compliance
           reviews, violations, and enforcement cases, we performed
           electronic testing for obvious errors in accuracy and
           completeness. As part of a recent evaluation of FMCSA's
           enforcement programs, we interviewed officials from FMCSA's data
           analysis office who are knowledgeable about the same data sources.
           We determined that the data were sufficiently reliable for the
           types of analysis we present in this report.
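
           The sketch below illustrates the kind of electronic testing for
           obvious errors described above; the field names and edit checks
           are hypothetical, not FMCSA's.

             # Simple edit checks for obvious accuracy and completeness
             # errors in hypothetical compliance review records.
             from datetime import date

             records = [
                 {"dot_number": "123456", "review_date": date(2005, 3, 1),
                  "violations": 2},
                 {"dot_number": "", "review_date": date(2005, 6, 9),
                  "violations": -1},
             ]

             problems = []
             for i, record in enumerate(records):
                 if not record["dot_number"]:
                     problems.append((i, "missing DOT number"))
                 if not date(2001, 1, 1) <= record["review_date"] <= date(2006, 9, 30):
                     problems.append((i, "review date outside expected range"))
                 if record["violations"] < 0:
                     problems.append((i, "negative violation count"))

             print(problems)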

           To assess the extent to which the timeliness, completeness, and
           accuracy of MCMIS and state-reported crash data affect SafeStat's
           performance, we carried out a series of analyses with the MCMIS
           master crash file and the MCMIS census file and surveyed the
           literature to assess other studies' findings on the quality of
           MCMIS data.
           on average it was taking each state to report crashes to FMCSA by
           year for calendar years 2000 through 2005. We also recalculated
           SafeStat scores from June 25, 2004, to include crashes that had
           occurred more than 90 days previously but had not yet been
           reported to FMCSA by that date. We compared the number and
           rankings of carriers from the original SafeStat results with those
           obtained with the addition of late-reported crashes. In addition,
           we reviewed the University of Michigan Transportation Research
           Institute's studies of state crash reporting to MCMIS to identify
           the impact of late reporting in individual states on MCMIS data
           quality.
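
           The sketch below illustrates both timeliness measures, using
           hypothetical crash records: the average reporting lag by state
           and year, and the crashes more than 90 days old that were still
           unreported as of the June 25, 2004, SafeStat run.

             # Average reporting lag by state and year, and crashes more
             # than 90 days old that were still unreported on June 25, 2004.
             from collections import defaultdict
             from datetime import date

             crashes = [
                 # (state, crash_date, date_reported_to_fmcsa)
                 ("OH", date(2004, 1, 10), date(2004, 2, 1)),
                 ("OH", date(2004, 3, 5), date(2004, 8, 20)),
                 ("TX", date(2004, 2, 14), date(2004, 3, 1)),
             ]

             lags = defaultdict(list)
             for state, crash_date, reported in crashes:
                 lags[(state, crash_date.year)].append((reported - crash_date).days)
             for key, days in sorted(lags.items()):
                 print(key, sum(days) / len(days), "days on average")

             snapshot = date(2004, 6, 25)  # date of the SafeStat run
             late = [crash_date for _, crash_date, reported in crashes
                     if (snapshot - crash_date).days > 90 and reported > snapshot]
             print(len(late), "late-reported crashes to add when rescoring")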

           To assess completeness, we attempted to match all crash records in
           the MCMIS master crash file for crashes occurring between December
           2001 and June 2004 to the list of motor carriers in the MCMIS
           census file. We used a variety of matching techniques to try to
           match the crash records without a carrier Department of
           Transportation number to carriers listed in the MCMIS census file.
           In addition, we reviewed the University of Michigan Transportation
           Research Institute's studies of state crash reporting to MCMIS to
           identify the impact of incomplete crash reporting in individual
           states on MCMIS data quality.
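
           The sketch below illustrates the matching step with hypothetical
           records; the fallback rule (a normalized carrier name plus
           state) stands in for the variety of matching techniques
           described above.

             # Match crash records to the carrier census, first by DOT
             # number and then by a normalized name-and-state rule.
             census = [
                 {"dot_number": "123456", "name": "Acme Trucking Inc", "state": "OH"},
                 {"dot_number": "654321", "name": "Best Freight LLC", "state": "TX"},
             ]
             crash_records = [
                 {"dot_number": "123456", "carrier_name": "ACME TRUCKING INC",
                  "state": "OH"},
                 {"dot_number": None, "carrier_name": "best freight llc",
                  "state": "TX"},
             ]

             by_dot = {c["dot_number"]: c for c in census}
             by_name_state = {(c["name"].lower(), c["state"]): c for c in census}

             def match(record):
                 if record["dot_number"] in by_dot:
                     return by_dot[record["dot_number"]]
                 key = (record["carrier_name"].lower(), record["state"])
                 return by_name_state.get(key)

             matched = sum(match(r) is not None for r in crash_records)
             print(matched, "of", len(crash_records), "crash records matched")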

           To assess accuracy, we reviewed an audit by the Inspector General
           that tested the accuracy of electronic data by comparing records
           selected in the sample to source paper documents. In addition, we
           reviewed the University of Michigan Transportation Research
           Institute's studies of state crash reporting to MCMIS to identify
           the impact of incorrectly reported crashes in individual states on
           MCMIS data quality.

           We determined that the data reported to FMCSA for use in
           SafeStat--while not as timely, complete, or accurate as they could
           be--were of sufficient quality for our use. Through our analyses,
           we found that the data identify many carriers that pose high crash
           risks and are, therefore, useful for the purposes of this report.

           To understand what other researchers have found about how well
           SafeStat identifies motor carriers that pose high crash risks, we
           identified studies through a general literature review and by
           asking stakeholders and study authors to identify high-quality
           studies. The studies included in our review were (1) the 2004
           study of SafeStat done by Oak Ridge National Laboratory, (2) the
           SafeStat effectiveness studies done by the Department of
           Transportation Inspector General and Volpe Institute, (3) the
           University of Michigan Transportation Research Institute's studies
           of state crash reporting to FMCSA, and (4) the 2006 audit by the
           Department of Transportation Inspector General of data for new
           entrant carriers.^4 We assessed the methodology used in each study
           and identified which findings are supported by rigorous analysis.
           We accomplished this analysis by relying on information presented
           in the studies and, where possible, discussing the studies with
           the authors. When the studies' methodologies and analyses appeared
           reasonable, we used the findings from those studies in our
           analysis of SafeStat. We discussed with FMCSA and industry and
           safety stakeholders the SafeStat methodology issues and data
           quality issues raised by these studies. We also discussed the
           aptness of the respective methodological approaches with FMCSA.
           Finally, we reviewed FMCSA documentation on how SafeStat is
           constructed and assessments of SafeStat conducted by FMCSA.

^4Campbell, Schmoyer, and Hwang, Review of the Motor Carrier Safety Status
Measurement System (SAFESTAT), 2004; U.S. Department of Transportation
Office of Inspector General, Improvements Needed in the Motor Carrier
Safety Status Measurement System, Report MH-2004-034 (Washington, D.C.:
Feb. 13, 2004); Madsen and Wright, Volpe National Transportation Systems
Center, An Effectiveness Analysis of SafeStat, November 1998; Volpe
National Transportation Systems Center, SafeStat Effectiveness Study
Update, March 2004; University of Michigan Transportation Research
Institute MCMIS State Reports; U.S. Department of Transportation Office of
Inspector General, Significant Improvements in Motor Carrier Safety
Program Since 1999 Act but Loopholes for Repeat Violators Need Closing,
Report MH-2006-046 (Washington, D.C.: Apr. 21, 2006).

           To identify studies on predictors of motor carrier and driver
           crash risk, we conducted a general literature review. We shared
           this preliminary list of studies with the members of the
           Transportation Research Board's Committee on Truck and Bus Safety
           and asked them to identify additional relevant studies.^5 We
           selected those studies that assessed a relationship between one or
           more motor carrier or driver characteristics and crash risk. Based
           on information presented in the selected studies, we assessed the
           methodology used in each study and report only those findings that
           were based on sound methodology and analysis.
			  
^5The Transportation Research Board is a unit of the National Research
Council, a private, nonprofit institution that is the principal operating
agency of the National Academy of Sciences and the National Academy of
Engineering. The board's mission is to promote innovation and progress in
transportation by motivating and conducting research, facilitating the
dissemination of information, and encouraging the implementation of
research results.

           Appendix V: GAO Contact and Staff Acknowledgments
			  
			  GAO Contact

           Susan A. Fleming, (202) 512-2834, or [54][email protected]
			  
			  Staff Acknowledgments

           In addition to the individual named above, James Ratzenberger,
           Assistant Director; Carl Barden; Elizabeth Eisenstadt; David
           Goldstein; Ryan Gottschall; Laurie Hamilton; Eric Hudson; Bert
           Japikse; and Gregory Wilmoth made key contributions to this
           report.
			  
			  GAO's Mission

           The Government Accountability Office, the audit, evaluation and
           investigative arm of Congress, exists to support Congress in
           meeting its constitutional responsibilities and to help improve
           the performance and accountability of the federal government for
           the American people. GAO examines the use of public funds;
           evaluates federal programs and policies; and provides analyses,
           recommendations, and other assistance to help Congress make
           informed oversight, policy, and funding decisions. GAO's
           commitment to good government is reflected in its core values of
           accountability, integrity, and reliability.
			  
			  Obtaining Copies of GAO Reports and Testimony

           The fastest and easiest way to obtain copies of GAO documents at
           no cost is through GAO's Web site ([55]www.gao.gov). Each
           weekday, GAO posts newly released reports, testimony, and
           correspondence on its Web site. To have GAO e-mail you a list of
           newly posted products every afternoon, go to [56]www.gao.gov and
           select "Subscribe to Updates."
			  
			  Order by Mail or Phone

           The first copy of each printed report is free. Additional copies
           are $2 each. A check or money order should be made out to the
           Superintendent of Documents. GAO also accepts VISA and Mastercard.
           Orders for 100 or more copies mailed to a single address are
           discounted 25 percent. Orders should be sent to:

           U.S. Government Accountability Office
           441 G Street NW, Room LM
           Washington, D.C. 20548

           To order by Phone:
           Voice: (202) 512-6000
           TDD: (202) 512-2537
           Fax: (202) 512-6061
			  
			  To Report Fraud, Waste, and Abuse in Federal Programs

           Contact:

           Web site: [57]www.gao.gov/fraudnet/fraudnet.htm
           E-mail: [58][email protected]
           Automated answering system: (800) 424-5454 or (202) 512-7470
			  
			  Congressional Relations

           Gloria Jarmon, Managing Director, [59][email protected], (202) 512-4400
           U.S. Government Accountability Office, 441 G Street NW, Room 7125
           Washington, D.C. 20548
			  
			  Public Affairs

           Susan Becker, Acting Manager, [60][email protected], (202) 512-4800
           U.S. Government Accountability Office, 441 G Street NW, Room 7149
           Washington, D.C. 20548

(541024)

[65]www.gao.gov/cgi-bin/getrpt?GAO-07-584.

To view the full product, including the scope
and methodology, click on the link above.

For more information, contact Susan A. Fleming at (202) 512-2834 or
[email protected].

Highlights of [66]GAO-07-584 , a report to the Chairman, Committee on
Transportation and Infrastructure, House of Representatives

August 2007

MOTOR CARRIER SAFETY

Federal Safety Agency Identifies Many High-Risk Carriers but Does Not
Assess Maximum Fines as Often as Required by Law

The Federal Motor Carrier Safety Administration (FMCSA) has the primary
federal responsibility for reducing crashes involving large trucks and
buses. FMCSA uses its "SafeStat" tool to target carriers for reviews of
their compliance with the agency's safety regulations based on their crash
rates and safety violations.

As requested, this study reports on (1) the extent to which FMCSA's policy
for prioritizing compliance reviews targets carriers with a high risk of
crashes, (2) how FMCSA ensures compliance reviews are thorough and
consistent, and (3) the extent to which FMCSA follows up with carriers
with serious safety violations. To complete this work, GAO reviewed
FMCSA's regulations, policies, and safety data and contacted FMCSA
officials in headquarters and nine field offices.

[67]What GAO Recommends

GAO is making several recommendations, including that FMCSA (1) select
certain high-risk carriers in the accident safety evaluation area for
compliance reviews and (2) revise its policy for assessing maximum fines.
The Department of Transportation said that it would assess the efficacy of
the first recommendation, but it did not comment on the other
recommendations.

By and large, FMCSA does a good job of identifying carriers that pose high
crash risks for subsequent compliance reviews, ensuring the thoroughness
and consistency of those reviews, and following up with high-risk
carriers.

FMCSA's policy for prioritizing compliance reviews targets many high-risk
carriers but not other higher risk ones. Carriers must score among the
worst 25 percent of carriers in at least two of SafeStat's four evaluation
areas (accident, driver, vehicle, and safety management) to receive high
priority for a compliance review. Using data from 2004, GAO found that 492
carriers that performed very poorly in only the accident evaluation area
(i.e., those carriers that scored among the worst 5 percent of carriers in
this area) subsequently had an aggregate crash rate that was more than
twice as high as that of the 4,989 carriers to which FMCSA gave high
priority. FMCSA told GAO that the agency plans to assess whether giving
high priority to carriers that perform very poorly in only the accident
evaluation area would be an effective use of its resources.

FMCSA promotes thoroughness and consistency in its compliance reviews
through its management processes, which meet GAO's standards for internal
controls. For example, FMCSA uses an electronic manual to record and
communicate its compliance review policies and procedures and teaches
proper compliance review procedures through both classroom and on-the-job
training. Furthermore, its investigators use an information system to
document their compliance reviews, and its managers review these data,
helping to ensure thoroughness and consistency between investigators. For
the most part, FMCSA and state investigators cover the nine major
applicable areas of the safety regulations (e.g., driver qualifications
and vehicle condition) in 95 percent or more of compliance reviews,
demonstrating thoroughness and consistency.

FMCSA follows up with many carriers with serious safety violations, but it
does not assess maximum fines against all of the serious violators that
GAO believes the law requires. FMCSA followed up with more than 99 percent
of the 1,196 carriers that received proposed unsatisfactory safety ratings
from compliance reviews completed in fiscal year 2005, finding that 881 of
these carriers made safety improvements and placing 309 others out of
service. However, GAO found that FMCSA (1) does not assess maximum fines
against carriers with a pattern of varied serious violations as GAO
believes the law requires and (2) assesses maximum fines against carriers
for the third instance of a violation, whereas GAO reads the statute as
requiring FMCSA to assess the maximum fine for the second.

References

Visible links
  48. http://www.gao.gov/cgi-bin/getrpt?GAO-07-585
  49. http://www.gao.gov/cgi-bin/getrpt?GAO/AIMD-00-21
  50. http://www.gao.gov/cgi-bin/getrpt?GAO-07-585
  51. http://www.gao.gov/cgi-bin/getrpt?GAO-07-585
  52. http://www.gao.gov/cgi-bin/getrpt?GAO/AIMD-00-21
  53. http://www.gao.gov/cgi-bin/getrpt?GAO-06-156
  54. mailto:[email protected]
  55. http://www.gao.gov/
  56. http://www.gao.gov/
  57. http://www.gao.gov/fraudnet/fraudnet.htm
  58. mailto:[email protected]
  59. mailto:[email protected]
  60. mailto:[email protected]
  61. http://www.umtri.umich.edu/publicationList.php
  62. http://www.gao.gov/cgi-bin/getrpt?GAO-06-102
  63. http://www.gao.gov/cgi-bin/getrpt?GAO/PEMD-91-13
  65. http://www.gao.gov/cgi-bin/getrpt?GAO-07-584
  66. http://www.gao.gov/cgi-bin/getrpt?GAO-07-584
*** End of document. ***