STANDARD OPERATING PROCEDURE
Document Number: S-274 Version 1.xx
Quality Assessment of Software and Computer System Suppliers


This is an example of a Standard Operating Procedure. It is a proposal and starting point only. The type and extent of documentation depends on the process environment. The proposed documentation should be adapted accordingly and should be based on individual risk assessments. There is no guarantee that this document will pass a regulatory inspection.

Publication from Labcompliance, a global on-line resource for validation and compliance.

Copyright by Labcompliance. This document may only be saved and viewed or printed for personal use. Users may not transmit or duplicate this document in whole or in part, in any medium. Additional copies and licenses for department, site or corporate use can be ordered from

While every effort has been made to ensure the accuracy of information contained in this document, Labcompliance accepts no responsibility for errors or omissions. No liability can be accepted in any way.


Company Name:
Controls:
Superseded Document / N/A, new
Reason for Revision / N/A
Effective Date / March 1, 2012
Signatures:
Author / I indicate that I have authored or updated this SOP according to applicable business requirements and our company procedure: Preparing and Updating Standard Operating Procedures.
Name:______
Signature:______
Date:______
Approver / I indicate that I have reviewed this SOP, and find it meets all applicable business requirements and that it reflects the procedure described. I approve it for use.
Name:______
Signature:______
Date:______
Reviewer / I indicate that I have reviewed this SOP and find that it meets all applicable quality requirements and company standards. I approve it for use.
Name:______
Signature:______
Date:______

1.PURPOSE

Users of software and computer systems are responsible for validation, including validation during development. A user firm can delegate part of this validation to the vendor of the software or computer system, but it should verify, through a formal vendor assessment process, that the vendor develops and validates software and computer systems according to regulatory requirements and industry practices. The user firm should select the most efficient assessment method; this can range from simply documenting experience with the vendor through to a direct vendor audit. This SOP guides user firms through the assessment process.

2.SCOPE

Assessment of suppliers of software and computer systems used in GxP regulated environments.

3.GLOSSARY/DEFINITIONS

Item / Explanation
COTS / Commercial Off-the-Shelf System.

Note: For other definitions, see .

4.REFERENCE DOCUMENTS

4.1.SOP S-273: Auditing Suppliers of Software and Computer Systems
Available through

4.2.Checklist E-321: “Software/Computer System Vendor Qualification”. Available through

4.3.GAMP, Good Automated Manufacturing Practice: A Risk-Based Approach for Compliant GxP Computerized Systems, Version 5, 2008 (order from

4.4.SOP S-134: “Risk Assessment for Systems Used in GxP Environments”
Available through

4.5.SOP S-252: “Risk-Based Validation of Computer Systems”
Available through

4.6.M-171: Validation Master Plan
Available through

4.7.M-131: Risk Management Master Plan,
Available through

5.RESPONSIBILITIES

5.1.System Owner

5.1.1.Initiates vendor assessment process.

5.1.2.Represents the operations department in the vendor audit team.

5.2.Quality Assurance

5.2.1.Leads the assessment process.

5.2.2.Defines assessment methodology and items related to internal company procedures and regulations.

5.2.3.Writes assessment report.

5.2.4.Takes the lead in follow-up action plan.

5.3.IT

5.3.1.Defines IT related assessment items.

5.3.2.Reviews IT related assessment items.

6.FREQUENCY OF USE

6.1.Before purchasing software or computer systems.

6.2.Every five years, or earlier if major upgrades or new systems have been purchased.

7.PROCEDURE

7.1.Assessment Initiation

7.1.1.The system owner initiates the process to qualify a vendor of software or a computer system as part of a validation process.

7.1.2.QA forms a vendor assessment team consisting of representatives from IT, QA and the operations department.

7.2.Assessment Conduct

7.2.1.The assessment team agrees on the assessment methodology.

7.2.1.1.Alternative methodologies are:

  • Collecting and documenting internal and external references
  • Documenting own experience
  • Mail audit
  • 3rd party audit
  • Direct vendor audit

7.2.1.2.Criteria for selecting the methodology are product risk and vendor risk.

7.2.1.3.The assessment team determines the product risk as high, medium or low. Factors for product risk are:

  • System complexity
  • Number of systems to be purchased
  • Maturity of the system
  • Level of networking
  • Influence on other systems
  • Impact on (drug) product quality
  • Impact on business continuity
  • Level of customization

7.2.1.4.The assessment team determines the vendor risk as high, medium or low. Factors for vendor risk are:

  • Size of the company
  • Company history
  • Future outlook
  • Representation in the target industry, e.g., Pharma

7.2.1.5.The assessment team agrees on the final methodology using the information from 7.2.1.3 and 7.2.1.4 and the graphic:

  • Document experience for green areas (proceed to 7.2.2)
  • Mail audit for yellow areas (proceed to 7.2.3); 3rd party audit (7.2.4) if the results of the mail audit are not satisfactory
  • Direct vendor audit for red areas (proceed to 7.2.5)

7.2.2.Documenting experience

7.2.2.1.QA collects information from operations and IT.

7.2.2.2.QA documents the experience using the form in Attachment 8.1.

7.2.2.3.QA distributes the documentation from 7.2.2.2 to the assessment team for review.

7.2.3.Mail audit

7.2.3.1.QA sends the assessment checklist from reference 4.2 to the vendor with a clear deadline for return.

7.2.3.2.QA reviews and distributes the returned assessment checklist.

7.2.3.3.QA makes an overall assessment using the forms in Attachments 8.2 and 8.3.

7.2.3.4.QA arranges a meeting with the assessment team.

7.2.3.5.The assessment team decides whether the results of the mail audit are satisfactory; if not, proceed to 7.2.4.

7.2.4.3rd party audit

7.2.4.1.QA identifies the company or organization to conduct the audit.

7.2.4.2.QA negotiates conditions and audit objectives.

7.2.4.3.The third party performs the audit and sends the audit results to QA.

7.2.4.4.QA reviews the audit results and makes an overall assessment using the forms in Attachments 8.2 and 8.3.

7.2.4.5.QA distributes the audit results to the assessment team and calls a meeting with the assessment team.

7.2.4.6.The assessment team decides whether the audit results are satisfactory.

7.2.5.Direct vendor audit

7.2.5.1.QA takes the lead for the direct audit, following the procedure in reference 4.1.

7.2.5.2.QA distributes the audit results to the assessment team and calls a meeting with the assessment team.

7.2.5.3.The assessment team decides whether the audit results are satisfactory.

7.2.6.Documentation

7.2.6.1.QA documents the assessment process, methodology and results.

7.2.6.2.QA distributes the documentation from 7.2.6.1 to the assessment team, IT and operations.
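The risk-based selection of a methodology in 7.2.1 can be sketched as a simple decision matrix. The green/yellow/red boundaries in this sketch are illustrative assumptions only; the binding mapping is the graphic referenced in the SOP, not this code.

```python
# Illustrative sketch of the methodology selection matrix in 7.2.1.
# The green/yellow/red boundaries below are ASSUMPTIONS for illustration;
# the SOP's referenced graphic, not this code, defines the actual areas.

RISK_LEVELS = {"low": 0, "medium": 1, "high": 2}

def select_methodology(product_risk, vendor_risk):
    """Map product risk and vendor risk (each low/medium/high) to an
    assessment methodology, per the assumed matrix boundaries."""
    score = RISK_LEVELS[product_risk] + RISK_LEVELS[vendor_risk]
    if score <= 1:   # green area (assumed): low combined risk
        return "document experience (7.2.2)"
    if score <= 3:   # yellow area (assumed): medium combined risk
        return "mail audit (7.2.3); 3rd party audit (7.2.4) if unsatisfactory"
    return "direct vendor audit (7.2.5)"  # red area (assumed): high combined risk

print(select_methodology("low", "low"))    # green area
print(select_methodology("high", "high"))  # red area
```

A real implementation would read the cut-offs from the approved risk-matrix graphic rather than hard-coding them.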

8.ATTACHMENTS

8.1.Attachment 1 – Template for Documenting Vendor Experience

Supplier:
Assessment for computer system category:
Experience 1 (e.g., system reliability) / Rating: Above average / Average / Below average / Not acceptable
Experience 2 (e.g., compliance services) / Rating: Above average / Average / Below average / Not acceptable
Experience 3 (e.g., support) / Rating: Above average / Average / Below average / Not acceptable
Others / Rating: Above average / Average / Below average / Not acceptable
Overall / Rating: Above average / Average / Below average / Not acceptable

8.2.Attachment 2 – Template and Examples for Software/Computer System Vendor Assessment (mail audit)

# / Audit Item / Rating: 3 / 2 / 1 / 0 / N/A
1 / Company profile
2 / Organization/quality system/responsibilities
3 / Software development procedures
4 / Testing
5 / Support
6 / Customer training
7 / Change control, e.g., enhancement requests and software updates
8 / Security/disaster recovery
9 / Archiving of software and documentation, e.g., user manuals
10 / People qualification

Additional Comments or Observations:

______
______

8.3.Attachment 3 – Rating, Meaning and Interpretation

Rating / Meaning / Interpretation
3 / Excellent / Item/area/system exists and is above average.
2 / Adequate / Item/area/system exists and is about average.
1 / Poor / Item/area/system does exist but is below average and needs to be improved.
0 / Unsatisfactory / Item/area/system does not exist or is unacceptable.
N/A / Not Applicable / Question is not applicable to the type of function or service.
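The SOP does not prescribe how the per-item ratings of Attachment 8.2 are rolled up into an overall assessment. One hedged way to do it, shown here as an assumption only, is to average the applicable (non-N/A) items and flag any item rated 0 (Unsatisfactory):

```python
# Hypothetical roll-up of Attachment 8.2 ratings into an overall score.
# The SOP prescribes no formula; averaging applicable items and flagging
# any 0 ("Unsatisfactory") rating is an assumption for illustration.

def overall_rating(item_ratings):
    """Return (mean of applicable ratings, any-unsatisfactory flag).
    item_ratings maps audit item name -> rating 0..3, or None for N/A.
    N/A items are excluded from the mean."""
    applicable = [r for r in item_ratings.values() if r is not None]
    mean = sum(applicable) / len(applicable)
    return round(mean, 2), 0 in applicable

ratings = {
    "Company profile": 3,
    "Software development procedures": 2,
    "Testing": 2,
    "Customer training": None,  # N/A
}
print(overall_rating(ratings))  # (2.33, False)
```

The assessment team would still interpret the numbers against Attachment 8.3; any item rated 0 would normally make the result unsatisfactory regardless of the mean.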

(Replace with your company’s name) FOR INTERNAL USE