INTEGRATED MATERIALS PERFORMANCE EVALUATION PROGRAM

REVIEW OF THE [STATE] AGREEMENT STATE [OR NRC REGION] PROGRAM

DATE - DATE, YEAR

DRAFT REPORT

[GENERAL NOTES]:

  • when writing numerical values, spell out numbers less than 10 and any number that begins a sentence; use numerals for numbers 10 or greater;
  • do not begin a sentence with an acronym, even if it has been used and defined previously;
  • limit statements to facts affecting performance, not hearsay or assumptions;
  • provide enough detail, especially when performance-based issues are found, for the next team to review thoroughly;
  • make Recommendations for issues involving specific problems within an indicator, not for actions that are already required by the indicator(s)

Enclosure

EXECUTIVE SUMMARY

This report presents the results of the IMPEP review of the {STATE/REGION} Agreement State Program. The review was conducted during the period of [Month date-date, YEAR], by a review team composed of technical staff members from the U.S. Nuclear Regulatory Commission (NRC) and the State of [NAME].

Based on the results of this review, [STATE]’s performance was found [satisfactory/satisfactory, but needs improvement/unsatisfactory] for the/all indicator(s) [LIST INDICATORS], and [satisfactory/satisfactory, but needs improvement/unsatisfactory] for the [LIST INDICATORS] performance indicator(s) reviewed. [INSERT OTHER NOTABLE FINDINGS: e.g. “The finding for the Compatibility Requirements indicator remains unchanged from the previous IMPEP review.”/“Progress has been made on the indicator [NAME], but the State has not yet addressed a number of outstanding NRC comments regarding earlier regulation packages.”/“Two regulation amendments were overdue for adoption by the State.” etc.]

[INSERT OTHER NOTABLE FINDINGS, e.g. Team recommends Monitoring, Heightened Oversight, etc.]

The review team [did not make any recommendations/made (NUMBER) recommendations] regarding program performance by the State (LIST BRIEF DESCRIPTION OF RECOMMENDATIONS) and determined that the recommendation(s) from the (YEAR) IMPEP review, regarding (regulation adoption/document security markings/development and implementation of a formal training program/ETC.), should be [closed/kept open/modified].

Accordingly, the review team recommends that the (STATE) Agreement State Program be found [adequate/adequate, but needs improvement/inadequate] to protect public health and safety and [compatible/not compatible] with NRC’s program. The review team recommends that the next IMPEP review take place in approximately (NUMBER) years [and that a periodic meeting be held in … years (ADD THIS STATEMENT ONLY IF THE FREQUENCY OF THE PERIODIC MEETING IS BEING REDUCED OR EXTENDED)].



1.0 INTRODUCTION

This report presents the results of the review of the (STATE/NRC REGION) Agreement State Program. The review was conducted during the period of [MONTH DATE-DATE, YEAR], by a review team composed of technical staff members from the U.S. Nuclear Regulatory Commission (NRC) and the State of [NAME]. Team members are identified in Appendix A. The review was conducted in accordance with the “Implementation of the Integrated Materials Performance Evaluation Program and Rescission of Final General Statement of Policy,” published in the Federal Register on October 16, 1997, and NRC Management Directive 5.6, “Integrated Materials Performance Evaluation Program (IMPEP),” dated February 26, 2004. Preliminary results of the review, which covered the period of (DATE) to (DATE), were discussed with [STATE/NRC REGION] managers on the last day of the review.

[A paragraph on the results of the Management Review Board (MRB) meeting will be included in the final report.]

The [STATE/NRC REGION] Agreement State Program is administered by the [EXAMPLE: Bureau of Radiation Control (the Bureau), which is located within the Division of Environmental Health (the Division). The Division is part of the Department of Health (the Department). Organization charts for the Department and the Bureau are included as Appendix B.]

At the time of the review, the [STATE] Agreement State Program regulated [NUMBER] specific licenses authorizing possession and use of radioactive materials. The review focused on the radioactive materials program as it is carried out under the Section 274b. Agreement (of the Atomic Energy Act of 1954, as amended) between NRC and the State of [STATE].

In preparation for the review, a questionnaire addressing the common and applicable non-common performance indicators was sent to the [Bureau/Program/Division, etc.] on [Date]. The [Bureau] provided its response to the questionnaire on [date]. A copy of the questionnaire response can be found in NRC’s Agencywide Documents Access and Management System (ADAMS) using the Accession Number MLxxxxxxxx.

The review team's general approach for conduct of this review consisted of: (1) examination of the [Bureau]’s response to the questionnaire, (2) review of applicable [STATE] statutes and regulations, (3) analysis of quantitative information from the [Bureau]’s database, (4) technical review of selected regulatory actions, (5) field accompaniments of [number] inspectors, and (6) interviews with staff and managers. The review team evaluated the information gathered against the established criteria for each common and applicable non-common performance indicator and made a preliminary assessment of the [STATE] Agreement State Program’s performance.

IF PREVIOUS REPORT HAD RECOMMENDATIONS, INCLUDE THIS SENTENCE:

Section 2.0 of this report covers the State’s actions in response to recommendations made during previous reviews.

OR

There were no recommendations made during the previous review.

Results of the current review of the common performance indicators are presented in Section [2.0 OR 3.0, DEPENDING ON INCLUSION OF PREVIOUS SENTENCE]. Section [3.0 OR 4.0] details the results of the review of the applicable non-common performance indicators, and Section [4.0 OR 5.0] summarizes the review team's findings.

2.0 STATUS OF ITEMS IDENTIFIED IN PREVIOUS REVIEWS

During the previous IMPEP review, which concluded on [DATE], the review team made (NUMBER) recommendation(s) regarding the [STATE] Agreement State Program’s performance. The status of the recommendation(s) is/are as follows:

LIST PREVIOUS RECOMMENDATIONS AND BRIEF EXPLANATION AS TO WHY THE RECOMMENDATION SHOULD BE CLOSED/MODIFIED/KEPT OPEN

EXAMPLES:

The review team recommends that the State evaluate the effectiveness of their existing procedures and policies for marking and handling sensitive information and modify the existing procedures or policies, if needed, to ensure that documents containing sensitive information are appropriately marked in a consistent manner. (Section 3.3 of the 2007 IMPEP Report)

Status: The State implemented a procedure to ensure that all outgoing documents containing sensitive information are appropriately marked. Internal documents were already being appropriately marked prior to the IMPEP review in 2007. The limitation on this procedure is that, in accordance with the State’s Sunshine Law, only security-related information pertaining to physical security systems (e.g., alarm systems, room diagrams) can be withheld from the public. The review team confirmed that license and inspection documents were marked appropriately, in accordance with the limitations noted above. This recommendation is closed.

------

The review team recommends that the State take additional actions, such as increasing salary and/or benefits, to stabilize staffing and ensure successful program implementation. (Section 3.1 of the 2009 IMPEP report)

Status: In an effort to address the high staff turnover rate experienced by the Program in recent years, management increased starting salaries and introduced flexible work hours, resulting in a better work-life balance. Management has also modified its oversight of the Program to give the staff more ownership of the process. Staff members are now part of the decision-making process, are involved in the development of processes and procedures, and are involved in workload distribution. Overall, management has responded in a positive manner to the issues facing the Program. This recommendation is closed.

The review team recommends that the State update its existing procedures and develop new procedures, if necessary, to institutionalize the policies and practices of the Agreement State program and to serve as a knowledge management tool. (Section 3.1 of the 2009 IMPEP report)

Status: The Program reviewed existing procedures to ensure they were current and accurately reflected any changes to the manner in which they conduct business. This review found that several of their existing procedures needed to be updated. The Program also noted that due to recent NRC operational changes, additional procedures needed to be developed to meet these changes. In response, the staff updated existing procedures and developed new procedures where needed. They then provided staff training on the procedures to ensure they had a common understanding. This recommendation is closed.

The review team recommends that the State evaluate current and future staffing needs and business processes to develop and implement a strategy that improves the effectiveness and efficiency of the Program and ensures its continued adequacy and compatibility. (Section 3.2)

Status: The review team found that, during the time period covered by this review, the staffing issue and business process development had been addressed, as evidenced by the improvement in the status of inspections. The Program has created and utilizes a database of license activities. From this database, the Program can track inspection frequencies, which has allowed it to improve its inspection efficiency. This recommendation is closed.

3.0 COMMON PERFORMANCE INDICATORS

Five common performance indicators are used to review NRC Regional and Agreement State radioactive materials programs. These indicators are: (1) Technical Staffing and Training, (2) Status of Materials Inspection Program, (3) Technical Quality of Inspections, (4) Technical Quality of Licensing Actions, and (5) Technical Quality of Incident and Allegation Activities.

3.1 Technical Staffing and Training

Issues central to the evaluation of this indicator include the Bureau’s staffing level and staff turnover, as well as the technical qualifications and training histories of the staff. To evaluate these issues, the review team examined the Bureau’s questionnaire response relative to this indicator, interviewed managers and staff, reviewed job descriptions and training records, and considered workload backlogs.

The Bureau is managed by the [DESCRIBE ORGANIZATIONAL STRUCTURE]. The Radioactive Materials Program is responsible for [materials inspection, licensing and compliance activities, emergency response activities, etc.].

At the time of the review, there were [NUMBER] technical staff members with various degrees of involvement in the radioactive materials program, totaling approximately [NUMBER] full-time equivalents (FTE). [No OR NUMBER] positions were vacant at the time of this review. The review team determined that staffing levels were adequate for the Agreement State program.

The Bureau has a documented training plan for technical staff that is consistent with the requirements in the NRC/Organization of Agreement States Training Working Group Report and NRC’s Inspection Manual Chapter (IMC) 1246, “Formal Qualification Programs in the Nuclear Material Safety and Safeguards Program Area.” Staff members are assigned increasingly complex duties as they progress through the qualification process. The review team concluded that the Bureau’s training program is adequate to carry out its regulatory duties and noted that [STATE] management supports the Bureau training program.

Based on the IMPEP evaluation criteria, the review team recommends that [STATE/NRC REGION]’s performance with respect to the indicator, Technical Staffing and Training, be found [satisfactory/satisfactory, but needs improvement/unsatisfactory].

3.2 Status of Materials Inspection Program

The review team focused on five factors while reviewing this indicator: inspection frequency, overdue inspections, initial inspections of new licenses, timely dispatch of inspection findings to licensees, and performance of reciprocity inspections. The review team’s evaluation was based on the Bureau’s questionnaire response relative to this indicator, data gathered from the Bureau’s database, examination of completed inspection casework, and interviews with management and staff.

The review team verified that [STATE]'s inspection frequencies for all types of radioactive material licenses are [at least as frequent as/more frequent than] those for similar license types listed in IMC 2800, “Materials Inspection Program.” [NUMBER] of the [NUMBER] license categories established by the Bureau were assigned inspection priority codes that prescribe a more frequent inspection schedule than those established in IMC 2800 for similar license types.

The Bureau conducted approximately [no.] high priority (Priority 1, 2, and 3) inspections during the review period, based on the inspection frequencies established in IMC 2800. [NUMBER] of these inspections [was/were] conducted overdue by more than 25 percent of the inspection frequency prescribed in IMC 2800. In addition, the Bureau performed approximately [no.] initial inspections during the review period, [no.] of which were conducted overdue. As required by IMC 2800, initial inspections should be conducted within 12 months of license issuance. The initial inspections were conducted late due to [EXPLAIN: e.g. database entry errors, lack of resources, etc.]. The Bureau [EXPLAIN HOW/WHY/WHAT actions were taken to correct, e.g. provided additional training to personnel, diverted resources from another section to perform inspections, concentrated efforts on performing overdue inspections, etc.]. Overall, the review team calculated that the Bureau performed [no.] percent of its inspections overdue during the review period.
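[NOTE FOR REVIEW TEAMS: The "overdue by more than 25 percent" test above is simple interval arithmetic. The sketch below is illustrative only; the function name, the 365-day year, and the example dates are assumptions for illustration, not NRC tooling or IMC 2800 language:]

```python
from datetime import date

def is_overdue(last_inspection: date, inspected_on: date, frequency_years: int) -> bool:
    """Illustrative check: an inspection counts as overdue when the elapsed
    interval exceeds the prescribed IMC 2800 frequency by more than 25 percent."""
    elapsed_days = (inspected_on - last_inspection).days
    allowed_days = frequency_years * 365 * 1.25  # frequency plus 25 percent margin
    return elapsed_days > allowed_days

# A licensee on a 1-year frequency inspected 16 months after the prior
# inspection exceeds the 25 percent margin; one inspected at 14 months does not.
print(is_overdue(date(2020, 1, 1), date(2021, 5, 1), 1))  # True
print(is_overdue(date(2020, 1, 1), date(2021, 3, 1), 1))  # False
```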

The review team evaluated the Bureau’s timeliness in providing inspection findings to licensees. A sampling of [no.] inspection reports indicated that [no.] of the inspection findings were communicated to the licensees beyond the Bureau’s goal of 30 days after the inspection. [IF A LARGE MAJORITY IS DELAYED, DETAIL WHY/HOW, ESPECIALLY IF IT RESULTS IN AN INDICATOR FINDING OF SATISFACTORY, BUT NEEDS IMPROVEMENT OR UNSATISFACTORY]

During the review period, the Bureau granted [no.] reciprocity permits, [no.] of which were candidate licensees based upon the criteria in IMC 1220. The review team determined that the Bureau [met/exceeded/did not meet] the NRC’s criterion of inspecting 20 percent of candidate licensees operating under reciprocity in each of the four years covered by the review period.
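[NOTE FOR REVIEW TEAMS: The 20 percent reciprocity criterion is evaluated year by year, not over the review period as a whole. A minimal sketch of that per-year check, using hypothetical counts and an assumed "no candidates means the goal is met" convention that is not stated in IMC 1220:]

```python
def meets_reciprocity_goal(candidates: int, inspected: int) -> bool:
    """Illustrative per-year check of the 20 percent criterion: at least
    20 percent of candidate reciprocity licensees were inspected that year."""
    if candidates == 0:
        return True  # assumed convention: no candidate licensees to inspect
    return inspected / candidates >= 0.20

# Hypothetical counts of (candidates, inspected) for each year of a
# four-year review period; 1 of 8 in 2020 is only 12.5 percent.
yearly = {2019: (10, 2), 2020: (8, 1), 2021: (5, 1), 2022: (0, 0)}
results = {yr: meets_reciprocity_goal(c, i) for yr, (c, i) in yearly.items()}
print(results)  # only 2020 falls short of the criterion
```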

Based on the IMPEP evaluation criteria, the review team recommends that [STATE/NRC REGION]’s performance with respect to the indicator, Status of Materials Inspection Program, be found [satisfactory/satisfactory, but needs improvement/unsatisfactory].

3.3 Technical Quality of Inspections

The review team evaluated the inspection reports, enforcement documentation, and inspection field notes, and interviewed inspectors, for [no.] radioactive materials inspections conducted during the review period. The casework reviewed included inspections conducted by [no.] Bureau inspectors and covered inspections of various license types, including: [LIST TYPES: e.g. medical broad scope; medical institutions-therapy, including high dose rate remote afterloader, unsealed radioiodine therapy, and permanent or temporary implant brachytherapy; radionuclide production (cyclotron); medical-diagnostic; portable gauges; industrial radiography; veterinary use; panoramic and self-shielded irradiators; gamma knife; nuclear pharmacy; mobile nuclear medicine; and Increased Security Controls for Large Quantities of Radioactive Materials (Increased Controls), etc.]. Appendix C lists the inspection casework files reviewed, [with case-specific comments], as well as the results of the inspector accompaniments.

Based on the evaluation of casework, the review team noted that inspections covered all aspects of the licensees’ radiation safety programs. The review team found that inspection reports were generally thorough, complete, consistent, and of high quality, with sufficient documentation to determine whether a licensee’s performance with respect to health and safety was acceptable. The majority of the documentation supported violations, recommendations made to licensees, unresolved safety issues, the effectiveness of corrective actions taken to resolve previous violations, and discussions held with licensees during exit interviews.

The inspection procedures utilized by the Bureau are generally consistent with the inspection guidance outlined in IMC 2800. An inspection report is completed by the inspector and is then [reviewed and signed by the Regional Manager/senior reviewer/etc.]. Supervisory accompaniments were conducted annually for all inspectors.

The review team determined that the inspection findings were appropriate and prompt regulatory actions were taken, as necessary. All inspection findings were clearly stated and documented in the reports and sent to the licensees with the appropriate letter detailing the results of the inspection. The Bureau issues to the licensee either a letter indicating a clear inspection or a Notice of Violation (NOV), in letter format, which details the results of the inspection. When the Bureau issues an NOV, the licensee is required to provide a written corrective action plan, based on the violations cited, within 30 days. All findings are reviewed by the [Program Manager/Inspection Coordinator/etc.].

The review team noted that the Bureau has an adequate supply of survey instruments to support their inspection program. Appropriate, calibrated survey instrumentation, such as Geiger-Mueller (GM) meters, scintillation detectors, ion chambers, micro-R meters, and neutron detectors, was observed to be available. The Bureau also has portable multi-channel analyzers located in offices across the State. Instruments are calibrated at least annually, or as needed, by [NAME] with National Institute of Standards and Technology traceable sources. The Bureau uses a database to track each instrument, its current location, and next calibration date.

Accompaniments of [no.] Bureau inspectors were conducted by [no.] IMPEP team members during the week(s) of [date]. The inspectors were accompanied during health and safety inspections of [LIST: source manufacturing, industrial radiography, nuclear pharmacy, irradiator, medical therapy including high dose rate remote afterloader/gamma knife/unsealed radioiodine therapy/permanent implant brachytherapy, etc., and medical diagnostic licenses/ETC.]. The accompaniments are identified in Appendix C. During the accompaniments, the inspectors demonstrated appropriate inspection techniques and knowledge of the regulations, and conducted performance-based inspections. The inspectors were trained, well-prepared for the inspection, and thorough in their audits of the licensees’ radiation safety programs. The inspectors conducted interviews with appropriate personnel, observed licensed operations, conducted confirmatory measurements, and utilized good health physics practices. The inspections were adequate to assess radiological health and safety and security at the licensed facilities.