The Pipeline and Hazardous Materials Safety Administration’s (PHMSA) Pipeline Safety Research and Development (R&D) Program has held annual structured peer reviews of active research projects since 2006, in accordance with mandates from the Office of Management and Budget (OMB) and the Office of the Secretary of Transportation (OST) to maintain research data quality. PHMSA conducts these reviews virtually via teleconference and the Internet, saving time and resources. This format has also worked well for panelists, researchers, Agreement Officers’ Technical Representatives, and project co-sponsors. Most notably, the virtual approach facilitates attendance from all U.S. time zones, Canada, and Europe.
The annual peer review continues to build on the strong, systematic evaluation process developed by PHMSA’s Pipeline Safety R&D Program and certified by the Government Accountability Office. The 2011 peer review panel consisted of nine government and independent experts. Three panelists were active Government representatives from the Bureau of Ocean Energy Management, Regulation, and Enforcement; the National Institute of Standards and Technology; and the Department of Energy Biomass Program. The remaining six panelists were independent contractors and retired Government or pipeline operator employees, some of whom play vital roles as peers for the American Petroleum Institute, the American Society of Mechanical Engineers, the National Association of Corrosion Engineers, and other standards developing organizations.
Thirty-three active research projects were peer reviewed by the expert panelists using 13 evaluation criteria, grouped into the following five evaluation categories:
- Project relevance to the PHMSA mission.
- Project management.
- Approach taken for transferring results to end users.
- Project coordination with other closely related programs.
- Quality of project results.
The rating scale consisted of “Ineffective,” “Moderately Effective,” “Effective,” and “Very Effective.” During the April 2011 review, the average program rating across all evaluation categories was “Very Effective.” This year, 24 projects were rated “Very Effective” and 9 were rated “Effective.” The average sub-criterion scores were also very high and underpin these findings. The majority of peer-reviewed projects, and the overall program rating, have remained “Very Effective” since the initial reviews in 2006. Additional details are available in Section 7, Tables 4 and 5, and Appendix C of this report.