The Pipeline and Hazardous Materials Safety Administration’s (PHMSA) Pipeline Safety Research and Development (R&D) Program held its first structured peer review of active research projects in February 2006 and its most recent in April 2010. These reviews are governed by mandates from the Office of Management and Budget (OMB) and the Office of the Secretary of Transportation (OST) and keep PHMSA rated “Green” on research data quality. Conducting the peer reviews via teleconference and the Internet saves time and resources, and the format works well for panelists, researchers, Agreement Officers’ Technical Representatives and project co-sponsors. Most impressively, the PHMSA approach facilitates attendance from all U.S. time zones, Canada and Europe.
The annual peer review continues to build on an already strong and systematic evaluation process developed by PHMSA’s Pipeline Safety R&D Program and certified by the Government Accountability Office. The peer review panel consisted of twelve government and industry experts: two active Government representatives from the Bureau of Ocean Energy Management, Regulation, and Enforcement; one active Government representative from the National Institute of Standards and Technology; and nine retired Government and retired and active industry personnel who play vital roles as peers for the American Petroleum Institute, the American Society of Mechanical Engineers, the National Association of Corrosion Engineers and other standards developing organizations.
Thirty-five active research projects were peer reviewed by expert panelists using 14 evaluation criteria. These criteria were grouped within the following five evaluation categories:
- Project relevance to the PHMSA mission.
- Project management.
- Approach taken for transferring results to end users.
- Project coordination with other closely related programs.
- Quality of project results.
The rating scale comprised four levels: “Ineffective,” “Moderately Effective,” “Effective,” and “Very Effective.” During the April 2010 review, the average program rating was “Very Effective” in each of the evaluation categories. This year, 26 projects were rated “Very Effective” and 9 were rated “Effective.” The average sub-criteria scores were also very high and underpin these findings. The majority of peer-reviewed projects, and the overall program rating, have remained “Very Effective” since the initial reviews in 2006. Additional details are available in Section 7 and Tables 4, 5 and 6 of this report.