Standard Operating Procedure for Computer System Validation (CSV)

1.0 Purpose:

To define the procedure and life-cycle approach for the validation of GxP-impacting computerized systems, providing documented evidence that each system functions consistently and reproducibly according to its intended use.

2.0 Scope:

This SOP (Standard Operating Procedure) applies to Computer System Validation and outlines the life cycle of various computerized systems, including Process Control Systems, SCADA- and PLC-based systems, Analytical Laboratory Systems, configurable systems (e.g., Learning Management Systems (LMS)), and customized software used at [company name].

3.0 Responsibility:

3.1 System Administrator:

3.1.1 Set system policy, assign privileges according to system-specific SOPs, and maintain user accounts.

3.1.2 Support and maintain computerized systems in accordance with system-specific SOPs.

3.1.3 Maintain the security of data residing on the system, its backups, and archives.

3.1.4 Ensure changes are documented and implemented according to QA-approved documents initiated by user departments.

3.1.4.1 IT department personnel shall act as the administrator for Analytical Laboratory Systems, Building Management Systems, Centralized Configurable systems, and Customized Software.

3.1.4.2 Because the Building Management System is site-specific, critical to operational activities, and operated across three shifts, the second administrator access control is assigned to the Site Quality Head.

3.1.4.3 The Engineering Manager/ designee shall set the password policy and have user maintenance (User Creation/ Modification/ Deactivation) privileges for Process Control Systems.

3.1.4.4 IT department personnel shall be responsible for maintaining electronic data, backup and archival of the data from Process Control Systems, and checking compliance with equipment/instrument handover checklists.

3.2 Engineering Team:

3.2.1 Plan and manage the overall project.

3.2.2 Ensure all system requirements are fulfilled as per URS in discussion with user department(s) and vendor.

3.2.3 Communicate to user department if any requirement defined in the URS is not fulfilled by the vendor.

3.2.4 Report discrepancies promptly during the project phase, take corrective action, and document and trace them in the qualification documents.

3.2.5 Ensure the support service delivered is as per the requirements laid down in the URS.

3.2.6 Ensure that maintenance and changes to the system are performed according to documented procedures that maintain the validated status of the system.

3.3 System Owner:

3.3.1 Act as the subject matter expert to define business needs from the system.

3.3.2 Manage and plan the installation of new systems, or upgrades to existing systems, when the Project/ Engineering team is not involved.

3.3.3 Define user needs and intended use of the system.

3.3.4 Provide input for the generation of the URS and clarifications with regard to system functionality.

3.3.5 Provide Functional Requirement Specification for the system in URS.

3.3.6 Ensure discrepancies are correctly addressed and resolved during validation of the system.

3.3.7 Control project activities, resources, and costs jointly with cross-functional teams (e.g., Projects, Engineering, etc.).

3.3.8 Depute resources from the department for executing validation activities.

3.3.9 Ensure that the use and maintenance of the system comply with documented procedures that maintain the system in a validated state.

3.3.10 Review lifecycle documentation and validation status.

3.3.11 Prepare validation documents and ensure they are traceable.

3.3.12 Ensure adequate training is provided to users, administrators, and maintenance staff on relevant procedures.

3.3.13 Ensure adequate training is provided to users, administrators, and maintenance staff by suppliers during installation of new systems.

3.3.14 Ensure compliance with equipment/ instrument handover checklists after completion of CSV.

3.3.15 Review and initiate system changes if any.

3.3.16 Ensure changes to computerized systems or their environment are properly documented and approved.

3.4 The Quality Assurance Representative has the following responsibilities:

3.4.1 Ensure that the system meets all regulatory, business, technical, and user requirements.

3.4.2 Support the preparation and review of project documents to ensure they meet applicable standards.

3.4.3 Review the implementation of the system life cycle and ensure the system remains in a validated state.

3.4.4 Ensure that the Validation Plan is appropriate and followed, and verify equipment/instrument handover checklist after completion of validation.

3.4.5 Review system changes and ensure they are properly assessed and validated to maintain the system's validated state.

3.4.6 Ensure proper documentation procedures are followed for changes to the computerized system or its environment, and review the documentation produced along with risk assessment.

3.5 The Quality Assurance Manager has the following responsibilities:

3.5.1 Define the overall quality standards for the system implementation.

3.5.2 Review and approve necessary project documents.

3.5.3 Review and approve changes to the computerized system or its environment.

3.5.4 Ensure proper documentation procedures are followed for changes to the computerized system or its environment.

3.5.5 Ensure the system remains in a validated state.

3.5.6 Ensure all validation deliverables are approved, available and traceable.

3.6 The Contract Service Provider for CSV Support has the following responsibilities:

3.6.1 Follow the organization’s procedures and policies related to CSV and GDP.

3.6.2 Carry out pre-approved protocols and compile reports.

3.6.3 Submit all executed CSV deliverables to the User department.

3.7 The Responsibility Matrix for Development, Review and Approval of validation documents is as follows:

The following list identifies the responsible parties for generating and approving each deliverable. However, it should be noted that the responsible parties may vary depending on the specific system being implemented. This information should be outlined in the roles and responsibility section of the individual system validation plan or validation summary report.

Activity | Doer | Reviewer and Approver

Concept Phase:
User Requirement Specification (URS) | User | User, Engineering, IT, QA
High Level Risk Assessment (HLRA) | User | User, Engineering, IT, QA
Software Vendor Assessment (SVA) | QA | Engineering, IT, QA

Design Phase:
Development of Software Design Specifications (SDS) | Supplier | User, Engineering, IT, QA

Project Phase:
Functional/ Design Specification (FDS)/ Configuration Specification (CS) | Supplier/ User | User, Engineering, IT, QA
Validation Plan (VP) | Supplier/ User | User, Engineering, IT, QA
Functional Risk Assessment (FRA) | User and cross-functional team | User, Engineering, IT, QA
Installation Qualification (IQ) | Supplier/ User | User, Engineering, IT, QA
Operational and Performance Qualification (OPQ) | Supplier/ User | User, Engineering, IT, QA
Migration Qualification (MQ) | Supplier/ User | User, Engineering, IT, QA
Traceability Matrix (TM) | Supplier/ User | User, Engineering, IT, QA
Validation Summary Report (VSR) | User | User, Engineering, IT, QA

Operation Phase:
Standard Operating Procedure (SOP) | User | User, QA

4.0 Definitions:

4.1.1 System Owner: The System Owner is the person who is responsible for managing the entire lifecycle of the system, including procurement, development, integration, modification, operation, maintenance, and retirement.

4.1.2 Administrator: An Administrator is responsible for maintaining the software, setting system policies, managing user accounts, assigning privileges, and backing up and restoring electronic data.

4.1.3 Backup: A Backup is a copy of current data, metadata, and system configuration settings that are kept to facilitate recovery, including disaster recovery.

4.1.4 Archival: Archival is the process of safeguarding records from being altered or deleted and storing them under the control of independent data management personnel throughout the retention period.

4.1.5 Computer System Validation: Documented evidence that provides a high level of confidence that a computerized system functions consistently and reproducibly according to its intended use.

5.0 Procedure:

5.1 Validation activities may be performed by the company's team or outsourced to external service providers. Only qualified external service providers approved through the vendor approval process shall be used.

5.2 If the company outsources validation support, ensure that the responsibilities of the external service provider are clearly defined in the agreement.

5.3 Any supplier engaged must perform validation activities according to the approach defined in this SOP.

5.4 This SOP does not apply to equipment or instruments without an attached Industrial PC (IPC), Human Machine Interface (HMI), or desktop PC, nor to microprocessor-based equipment/instruments such as pH meters or weighing balances.

5.5 The Engineering and IT departments are responsible for reviewing process control systems-related documents.

5.6 For existing (legacy) systems, a high-level risk assessment shall be performed to evaluate and document their GxP impact. An inventory list of GxP-impacting systems shall be prepared, and existing computerized system validation documents shall be considered valid.

5.7 Validation of GxP-impacting computerized systems shall be based on the following criteria:

5.7.1 A high-level risk assessment

5.7.2 Categorization of the system

5.7.3 Implementation of a life cycle model for GxP impacting computerized systems

5.8 The High-Level Risk Assessment process evaluates the validation requirements and follows these steps:

5.8.1 Identify the relevance to GxP

5.8.2 Assess electronic records and signatures

5.8.3 Identify the level of risk

5.8.4 Categorize the software as follows:

GAMP Category 1: Infrastructure Software

Description: (a) layered software, i.e., software upon which applications are built; (b) software used to manage the operating environment. Examples: operating systems, middleware, database engines, network monitoring tools, scheduling tools, spreadsheets.

Typical CSV Approach: record the version number and verify correct installation by following an approved installation procedure.

GAMP Category 2: No longer used (formerly firmware; this category was retired in GAMP 5).

GAMP Category 3: Non-Configured

Description: Run-time parameters may be entered or stored, but the software cannot be configured to suit the business process. Examples: laboratory instruments, firmware-based applications, commercial off-the-shelf (COTS) software.

Typical CSV Approach: abbreviated life cycle approach; URS; record the version number; verify correct installation; test against requirements; procedures in place for maintaining compliance and fitness for intended use.

GAMP Category 4: Configured

Description: Software, often very complex, that can be configured by the user to meet the user's business process; the software code itself is not altered. Examples: ERP, LMS, LIMS, SCADA, HMI software.

Typical CSV Approach: life cycle approach; risk-based approach to supplier assessment; demonstrate that the supplier has an adequate Quality Management System; record the version number; verify correct installation; risk-based testing to demonstrate that the application works as designed in the test environment; procedures in place for maintaining compliance and fitness for intended use; procedures in place for managing data.

GAMP Category 5: Customized Software

Description: Software custom-designed and coded to suit business needs. Examples: internally and externally developed custom software, customized PLCs, custom firmware.

Typical CSV Approach: as for Category 4 (life cycle approach; risk-based supplier assessment; adequate supplier Quality Management System; record the version number; verify correct installation; risk-based testing in the test environment; procedures for maintaining compliance and fitness for intended use; procedures for managing data), plus a more rigorous supplier assessment, possession of full life cycle documentation (FS, DS, and system build documents), and design and source code review.

5.9 Based on the outcome of the high-level risk assessment, prepare a validation strategy and minimum validation deliverables for each software category and its identified level of risk. Based on the categorization, at least the following validation deliverables are required:

5.9.1 For Category 1 (Infrastructure software): Infrastructure software need not be verified separately; it is verified during the installation qualification of the application software.

5.9.2 For Category 3 (Non-configured software): The operational manual or any other relevant vendor document may be considered the design specification or functional specification.

Deliverables for non-configured software:

Level of Risk | URS | HLRA | VP | DS/ FS/ CS | IQ | OQ | PQ | TM | VSR | OM
Low
Moderate
High

5.9.2.1 When the Operational Manual, Technical Manual, or any other relevant vendor document is used as the Design/ Functional Specification, the required functionalities must be traceable in the Traceability Matrix.

5.9.3 For Category 4: Deliverables for configured software

Level of Risk | URS | HLRA | SVA | VP | DS/ FS/ CS | FRA | IQ | OQ | PQ | TM | VSR | OM
Low
Moderate
High

5.9.3.1 For laboratory-based systems, software performance (i.e., PQ) may be verified as part of the calibration activity.

5.9.4 For Category 5: Deliverables for customized software

Level of Risk | URS | HLRA | SVA | VP | DS/ FS | CS | FRA | IQ | OQ | PQ | TM | VSR | OM
Low
Moderate
High

5.10 Prepare the High Level Risk Assessment (HLRA) according to the respective Format.

5.11 The following definitions apply to the HLRA:

5.11.1 Direct Impact on Product Quality: If a system's function can change any characteristic of the product, such as its physical form or chemical properties, it is considered to have a direct impact on product quality. Examples include Co-mill, Sifter, and HVAC systems.

5.11.2 Indirect Impact on Product Quality: If a system's function cannot change any characteristic of the product, but the system impacts equipment that directly affects product quality, it is considered to have an indirect impact on product quality. Examples include Chillers and Hot Water systems.

5.11.3 No Impact on Product Quality: If a piece of equipment's function does not change any characteristic of the product, it is considered to have no impact on product quality. Examples include Lifting and Positioning systems.

5.12 System Implementation Life Cycle:

The computerized system life cycle is divided into five phases: Concept Phase, Design Phase, Project Phase, Operation Phase, and Retirement Phase.

5.12.1 Concept Phase: In this phase, the overall scope of the business needs is defined, and the type of system and overall requirements are identified.

5.12.2 User Requirements Specification (URS): A URS document is typically prepared to indicate user requirements.

5.12.3 User Requirements Specification (URS) document: The URS document defines the system requirements and expectations for data reliability and data security. The URS shall include, but is not limited to:

5.12.3.1 Business Requirements

5.12.3.2 Interface Requirements

5.12.3.3 Security and Safety Requirements

5.12.3.4 Electronic Records, Reports and Electronic Signature Requirements

5.12.3.5 Audit trail Requirements

5.12.3.6 Backup, Archival and Restoration Requirements

5.12.3.7 Performance Requirements

5.12.3.8 Operational/ Maintenance Support Requirements

5.12.4: Each requirement in the User Requirement Specification (URS) should be numbered to create a Traceability Matrix.

5.12.5: Use the User Requirement Specification template provided in this SOP.

5.12.6: The software or system must be evaluated for potential high-risk issues.

5.12.7: Identify potential suppliers and vendors for the system.

5.12.8: Evaluate the maintenance and technical support practices of vendors using the respective Format.

5.12.9: Choose a vendor based on their GxP regulated industry experience, feedback from their previous customers, and the quality of their product and support.

5.13 Design Phase: Specify the design elements of the system during this phase to meet the URS requirements.

5.13.1: Provide detailed design specifications during this phase that show how the requirements will be met. This phase primarily applies to the development of customized software.

5.14 Project Phase: The vendor will provide a Functional Specification (FS), Design Specification (DS), or Design Qualification (DQ) based on the User Requirement Specification (URS). The provided documents will be reviewed and approved according to the DQ approval process. The Validation Plan will be prepared as per the respective Format to define the validation strategy.

5.14.1 The Validation Plan will be used to prepare for validation testing.

5.14.2 During this phase, the system hardware and software will be assembled/ installed and configured based on the information supplied in the design specifications.

5.14.3 All system components will be integrated to complete the system architecture.

5.14.4 The Validation Team will review the functional/ design/ configuration specifications to confirm that the system satisfies the User Requirement Specification.

5.14.5 The software operation manual or any other relevant system-specific document may be used as functional/ design/ configuration specifications and must be approved.

5.14.6 If the vendor cannot provide the specification document, the User Department/ Engineering can prepare the functional/ design/ configuration specification for required functionalities by referring to the operation manual or any other relevant system-specific document. The document title will be based on the type of document.

5.14.7 If the software operation manual or any other relevant system-specific document is used as the functional/ design specification, the required functionalities should be traceable in the Traceability Matrix.

5.14.8 If the vendor cannot provide the configuration specification, the system configuration (e.g. privileges, system policies, password policy, etc.) will be documented in the SOP, or the User Department can prepare it by referring to the system-specific document.

5.14.9 The Functional/ Design/ Configuration Specification must be prepared before the preparation of qualification protocols.

5.14.10 A Functional Risk Assessment shall be carried out based on the functionality of the software, and controls shall be examined to mitigate the identified risks. The Functional Risk Assessment (FRA) shall be performed as per the respective Format.

5.14.11 A mitigation strategy for the evaluated risks shall be developed and verified through software design, verification of functions, SOPs, and training for the business process.

5.14.12 The general criteria to be considered for the FRA include, but are not limited to:

5.14.12.1 Unauthorized system access

5.14.12.2 Abnormal process condition at the time of process operation

5.14.12.3 Power failure condition

5.14.12.4 Communication failure of software/ hardware / network

5.14.12.5  Failure of control system / set parameters

5.14.12.6 Improper training

5.14.12.7 Improper system function

5.14.12.8 Procedures not available/ inadequate

5.14.12.9  Improper safety measures

5.14.12.10 Loss of data backup

5.14.12.11 Password policy not applied/ functional

5.14.12.12 Security policy not applied/ functional

5.14.12.13 Incorrect configuration

5.14.12.14 Audit Trail not configured/ functional

5.14.12.15 Procedures for calibration do not exist

5.14.12.16  Validation documents/ User manual not available

5.14.13 Before qualifying the system, a Functional Risk Assessment should be carried out to identify any risks that should be addressed during the qualification process.

5.14.14 If the vendor is unable to provide the configuration specification, the user should prepare a document, according to the system-specific details, and include it in the Standard Operating Procedure (SOP).

5.14.15 Formal testing is essential to ensure that the computerized system has been installed correctly and operates in accordance with the User Requirement and Functional specification documents.

5.14.16 The testing will be conducted in three stages: Installation Qualification (IQ), Operational Qualification (OQ), and Performance Qualification (PQ).

5.14.17 The testing will ensure that the system meets the GxP requirements and is supported by documented evidence.

5.14.18 Typical tests in the Installation Qualification (IQ) protocol will include, but are not limited to:

5.14.18.1 Software Version identification.

5.14.18.2 Verification of Hardware and Software Configuration.

5.14.18.3 Connected Instrument/ System identification.

5.14.18.4 Identification of Software Backup copy for disaster management control

5.15: An Installation Qualification Protocol shall be prepared following the respective Format. If the vendor prepares the protocol, it shall be approved using the respective Format.

5.16: If a backup copy of the software is not available, a letter of willingness from the vendor should be obtained to support the system during a disaster condition. Alternatively, this can be managed through an Annual Maintenance Contract (AMC) or Service Level Agreement (SLA).

5.17: During the Operation and Performance Qualification (OPQ), system functionality and performance shall be verified. This includes various tests, selected for relevance, including but not limited to:

5.17.1  Audit trail verification

5.17.2 Access control verification

5.17.3 Communication Failure verification (Equipment to PC, PC to Server; as applicable)

5.17.4 System Operation verification

5.17.5 Data backup and restoration verification

5.17.6 Time zone verification

5.17.7 Alarm/ Interlock and Reset Response verification

5.17.8 Input/ Output Verification

5.17.9 Loop/ Network Testing

5.17.10 Calculation verification

5.17.11 Power Failure and Restoration verification

5.17.12 Report verification

5.18 The equipment/ instrument’s time zone should be selected and set according to India’s time zone.

5.19 For critical systems, both positive and negative case testing will be conducted to test the system’s capabilities.

5.20 Client-Server based systems may have separate or combined qualification protocols for Server and Client configuration.

5.21 Screen captures should be attached to executed Qualification Protocols as evidence for critical systems when required.

5.22 The OPQ protocol shall be prepared according to the respective Format, and vendor OPQ protocols shall be approved accordingly.

5.23 Sometimes, it is suitable to combine different testing stages such as Installation and Operational Qualification (IOQ) or Operation and Performance Qualification (OPQ).

5.24 The criteria for accepting whether a system is qualified or not will be described in the appropriate qualification protocols.

5.25 All protocol documents will be approved beforehand.

5.26 Each test result will be signed and dated by the person who conducted the test, and the test case will be reviewed by another person.

5.27 Any discrepancies or non-conformances where the actual results do not match the expected results will be documented in the respective qualification protocol.

5.28 If any non-conformances/ discrepancies require further effort to resolve, the resolution process shall be documented.

5.29 After completing all tests successfully, the executed protocols shall be approved.

5.30 To ensure the accuracy of the system, all draft and final SOPs that support the system shall be verified.

5.31 Migration Qualification:

5.31.1 The Migration Qualification (MQ) includes verifying original data, migrating data, and verifying the migrated data.

5.31.2 Data migration may be required during the following activities:

5.31.2.1 New system Implementation

5.31.2.2 System upgrade

5.31.2.3 System retirement

5.31.2.4 Data archival

5.31.3 Data migration involves moving data in cases including, but not limited to:

5.31.3.1 Database or software upgrade

5.31.3.2 Originating System to a different system

5.31.3.3 One server to another server

5.31.3.4 Multiple data sources to one data source

5.31.4 In order to perform the migration qualification, several parameters shall be verified; these may vary depending on the type of system (an illustrative sketch follows this list):

5.31.4.1 Verify that the original data is intact just before migration.

5.31.4.2 Verify the correctness of the migrated data, including raw data and audit trail.
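As a simple illustration of 5.31.4.1 and 5.31.4.2, the following Python sketch compares checksums of file-based records before and after migration. It is a minimal sketch only, assuming file-level data and hypothetical paths; a real Migration Qualification would also verify record-level content and audit trails per the approved protocol.

# Minimal sketch of one way to verify migrated data integrity, assuming
# file-based records; the paths below are hypothetical. A real Migration
# Qualification also verifies record-level content and audit trails.
import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 of a single file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def snapshot(root: Path) -> dict:
    """Map each file's relative path to its checksum."""
    return {str(p.relative_to(root)): checksum(p)
            for p in sorted(root.rglob("*")) if p.is_file()}

# Snapshot taken on the source just before migration (5.31.4.1) ...
before = snapshot(Path("/data/source"))
# ... and on the target after migration (5.31.4.2).
after = snapshot(Path("/data/target"))

# Any missing or altered file is a discrepancy to document in the protocol.
discrepancies = sorted(n for n in before if before[n] != after.get(n))
print("Discrepancies:", discrepancies or "none")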

5.31.5 If a new system requires migration qualification, the migration plan shall be included in the validation plan.

5.31.6 If a vendor performs the migration qualification for a system upgrade, the migration qualification protocol shall be approved as per the respective Format.

5.32 The Traceability Matrix should be updated to connect the User Requirement Specification with the tests designed in the IQ, OQ, PQ, OPQ, MQ protocols.

5.33 The Traceability Matrix shall be updated before the Validation Summary Report, as per the respective Format.

5.34 Any accepted discrepancy from the URS should be documented in the ‘Comment’ section of the Traceability Matrix table.
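For illustration, a minimal Python sketch of a Traceability Matrix completeness check follows, mirroring the columns of the TM Format in Section 6.0; all data values and identifiers are hypothetical examples, not prescribed content.

# Minimal sketch of a Traceability Matrix completeness check; identifiers
# and values are hypothetical.
tm_rows = [
    # (URS point, description, qualification document, test number, status)
    ("URS-001", "Audit trail enabled",      "OPQ-001", "Test-01", "Pass"),
    ("URS-002", "Password policy enforced", "OPQ-001", "Test-02", "Pass"),
]
urs_points = ["URS-001", "URS-002", "URS-003"]

# Every URS point must trace to at least one IQ/OQ/PQ/OPQ/MQ test.
covered = {row[0] for row in tm_rows}
untraced = [point for point in urs_points if point not in covered]
if untraced:
    # Accepted discrepancies belong in the 'Comment' section of the TM table.
    print("Untraced URS points:", untraced)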

5.35 The Validation Summary Report should summarize the results of the validation activities and provide a decision to release the System for use.

5.36 The system can only be used for operations once the Validation Summary Report for a particular system has been approved.

5.37 The Validation Summary Report should include information about:

5.37.1  Decision on System Release

5.37.2  List of approved deliverables

5.37.3  Result of tests and open deviations from expected results, if any

5.37.4  Discussion and conclusion including any system limitations, open deviations (if applicable) and, if required, follow-up activities.

5.37.5  All training associated with users and technical support personnel shall be completed and logged.

5.40 The Validation Summary Report needs to be prepared according to the respective Format.

5.41 If there are multiple systems associated with a single software system, an interim summary report needs to be written and approved for individual systems before releasing the system for official use. Once all associated computerized system validation is completed, a final summary report can be developed.

5.42 External Validation Support:

5.42.1 Validation activities may be outsourced to external service providers.

5.42.2 External service providers must be qualified through a vendor approval process.

5.42.3 The outsourced validation support’s responsibilities must be mentioned in the agreement.

5.42.4 The vendor must perform the validation activities according to the approach defined in this SOP.

5.43 Vendor’s Qualification Document:

5.43.1 Vendor qualification documents may be acceptable, subject to review of their adequacy.

5.43.2: Review the vendor qualification documents and record the evaluation outcomes in the respective Formats, which must be authorized by the QA department.

5.43.3: If the vendor document already has a document number, keep the numbering system of the vendor document as it is.

5.44: The computer system validation documents shall be numbered as follows:

5.45 Each validation document should have a revision number.

5.46 The first version of the document should be marked as “00”. Subsequent revisions shall be numbered “01”, “02”, “03”, “04”, and so on.

5.47 When an existing document is revised, it should be numbered according to the current version of this SOP.

5.48 If there are any modifications/ changes in the existing system, the addendum document’s numbering system should start from “Addendum 01”, followed by “Addendum 02”, “Addendum 03”, and so on.

5.49 The Equipment/ Instrument handover checklist should be completed by IT personnel and verified by the User and QA department, preferably with the Validation Summary Report.

5.50 Retirement Phase:

5.50.1 When the system no longer meets the company’s business needs or supports business operations, it should be retired or decommissioned according to QMS documents.

5.50.2 A defined retirement plan should be in place that outlines how system data will be archived and restored in the future or how hardware and software components will be decommissioned.

5.50.3 The data should be available or restored for audit purposes for the length of time required by the relevant regulatory authorities.

5.51 Strategy for Maintaining the Validated State:

5.51.1 Change Control: Any modifications made to the computerized system or its related documents that may impact the software validation status must be documented through Quality Management System, i.e., change control.

5.51.2 Up-gradation of the system must be done through the change management process, and system data should be transferred in an orderly manner to the new application software or, alternatively, archived.

5.51.3 Error Handling: Any error, incident or defect encountered with the system that may affect data integrity, product quality, or patient safety must be investigated and handled according to the Quality Management System, i.e., Deviation.

5.51.4 Periodic Review: Systems must be reviewed periodically during equipment re-qualification. Periodic review helps to determine the changes that have occurred on the system, the effects of those changes, the status of system documentation such as SOPs and verification of active users. This review must ensure that the system and all related documentation are current at all times.

6.0 Formats:

1. Format for High Level Risk Assessment

Table of Contents

1. Purpose

2. Scope

3. References

4. System Overview

5. Responsibilities

6. GxP Relevance

Sr. No. | Questions | Yes/No
1 | Is the system involved in the environmental control processes of facilities used for animals in GLP studies or for the manufacture, processing, packaging, holding or distribution of products? |
2 | Is the system used in the collection, analysis or storage of data from pre-clinical studies or clinical trials? |
3 | Is the system used to produce or process data that will be used in (drug) regulatory submissions? |
4 | Is the system used for distribution or collection of information in the event of a commercial product recall, or in patient follow-up of pre-clinical or clinical trials? |
5 | Does the system provide information that is used as evidence of compliance with a process liable to external audit or inspection by a health authority such as FDA, EMEA, WHO, MHRA, or any other health authority? |
6 | Is the system used in the collection, processing, analysis or storage of data related to product quality and patient safety? |
7 | Is the system used in the manufacture or control of products? |
8 | Is the system used to control packaging or labeling activities? |
9 | Is the system used to maintain purchasing, inventory or distribution data for the product? |

If one of the questions is answered with “Yes”, the System is classified as “GxP relevant” and Computer System Validation is required.

7. Electronic Records, Electronic Signature

Applicable if the system is classified as “GxP Relevant”

Sr. No. | Question | Yes/No/NA
1 | Does the system support business processes that are part of the product development, registration, manufacturing or the distribution of products that are to be delivered to the US and/or European market and/or to any other market that is regulated regarding ERES? |

If the question in 7.1 is answered “Yes”, proceed to the next questions.

Sr. No. | Question | Yes/No/NA
1 | Does the system create, modify, maintain, archive, retrieve or transmit electronic records specifically required by any GxP regulation? |
2 | If the system processes electronic records as noted above, are these records used in their electronic form to support GxP decisions? |
3 | Does the system support application of electronic signatures to records that are required to be signed? |

If 7.2 and/or 7.3 is answered “Yes”, “Electronic records” applies to the system.

If 7.4 is answered “Yes”, “Electronic signatures” applies to the system.

8. Level of Risk

For a GxP-relevant system, determine the ‘Level of Risk’ (severity of harm) based on the system’s GxP relevance, business relevance, and failure consequence. Use the tables below for this purpose.

Assessment Step:

Assessment regarding GxP relevance (Yes/No)

Assessment regarding Business relevance (Yes/No)

Assessment regarding Failure Consequence (High/ Moderate/ Low)

If the system is classified as GxP relevant, it must automatically be classified as business relevant, with at least a Moderate failure consequence.

If the system impacts business activities such as Research and Development, Quality Control, Production, or Sales, it should be classified as business relevant.

If the system is classified as neither GxP nor business relevant, the failure consequence should be classified as Low.

If the system has a direct impact on product quality or patient safety, the failure consequence shall be High.

If the system has an indirect impact on product quality or patient safety, the failure consequence shall be Moderate.

If the system has no direct or indirect impact on product quality or patient safety, the failure consequence shall be Low.

Scores: Yes = 10, No = 00, High = 20, Moderate = 08, Low = 02

GxP Relevance | Business Relevance | Failure Consequence (Level of Risk) | Calculated Risk
Yes | Yes | High | 40
No | Yes | High | 30
Yes | Yes | Moderate | 28
No | Yes | Moderate | 18
No | Yes | Low | 12
No | No | Low | 02
Sr. No. | Question | Response
1 | Is the system GxP relevant? (Yes/No) |
2 | Is the system business relevant? (Yes/No) |
3 | What is the failure consequence? (High/Moderate/Low) |
4 | Calculated Risk |
5 | Level of Risk |
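The scoring above is simple additive arithmetic. The following minimal Python sketch, illustrative only and not part of the official Format (the function name is hypothetical), shows how the Calculated Risk follows from the scores and how the classification rules in this section constrain the inputs.

# Illustrative sketch of the HLRA scoring described above.
SCORES = {"Yes": 10, "No": 0, "High": 20, "Moderate": 8, "Low": 2}

def calculated_risk(gxp: str, business: str, consequence: str) -> int:
    """Return the Calculated Risk for one system."""
    # Rule: a GxP-relevant system is automatically business relevant and
    # carries at least a Moderate failure consequence.
    if gxp == "Yes":
        business = "Yes"
        if consequence == "Low":
            consequence = "Moderate"
    # Rule: a system that is neither GxP nor business relevant is Low.
    if gxp == "No" and business == "No":
        consequence = "Low"
    return SCORES[gxp] + SCORES[business] + SCORES[consequence]

# Examples matching the table above:
assert calculated_risk("Yes", "Yes", "High") == 40
assert calculated_risk("No", "Yes", "Moderate") == 18
assert calculated_risk("No", "No", "Low") == 2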

9. GAMP Categorization

Category | Question | Yes/No/NA
1 | Is it a commercially available operating system? (If No, go to the next question) |
2 | This category is no longer used |
3 | Is this a commercially available standard software package providing an off-the-shelf solution to a business or manufacturing process? (If No, go to the next question) |
4 | Is the system a commercially available package that involves configuring predefined software modules, and have customized modules been developed? (If No, go to the next question) |
5 | Is the system custom-built software or a custom extension to an existing system? |
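As an illustration of how the questionnaire above is walked through (questions answered in order, with the first “Yes” fixing the category), a minimal Python sketch follows; the abridged question texts and the function name are hypothetical, for illustration only.

# Illustrative walk-through of the GAMP categorization questionnaire above:
# questions are answered in order and the first "Yes" fixes the category.
QUESTIONS = [
    (1, "Commercially available operating system?"),
    # Category 2 (firmware) is no longer used and is skipped.
    (3, "Commercial off-the-shelf standard software package?"),
    (4, "Commercial package configured from predefined modules?"),
    (5, "Custom-built software or custom extension to an existing system?"),
]

def gamp_category(answers):
    """Return the first category answered 'Yes', in questionnaire order."""
    for category, _question in QUESTIONS:
        if answers.get(category) == "Yes":
            return category
    return None  # No category matched; reassess the system.

# Example: a configurable LMS (Category 4 per section 5.8.4 of this SOP).
print(gamp_category({1: "No", 3: "No", 4: "Yes"}))  # -> 4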

10. Summary

Respond to the questions according to sections 6 to 9 of this document.

Questions | Response
Is the system classified as GxP impacting, and is computer system validation required? |
Do Electronic Records apply to the system? |
Do Electronic Signatures apply to the system? |
What is the Level of Risk? |
What is the GAMP category? |

11. Comment

12. Revision History

2. Format for Software Vendor Assessment Questionnaire

Table of Contents

1. Purpose

This questionnaire is designed to evaluate the quality management practices, organizational structure, and experience of the vendor. Its purpose is to establish documented evidence that the vendor can meet technical, functional, and regulatory requirements.

2. Scope

This questionnaire applies only to vendors of GxP-impacting computerized systems.

3. Instructions for Completion

Each question must be answered clearly and with sufficient detail to demonstrate compliance or non-compliance. Responses may be provided in a separate document or cross-referenced to each question number. Supporting documentation may be requested for certain questions. Copyright, confidentiality, and intellectual property should be respected. Providing high-quality documentation may prevent the need for an on-site audit.

4. Company Details

4.1. Provide company name, address, telephone and fax number, and contact name(s).

4.2. Provide details of your Parent/Holding Company.

4.3. Provide a brief company history and geographical organizational structure.

4.4. Provide an organizational chart showing the company structure, organization, and the responsibilities of key individuals.

4.5. Provide the number of employees globally and in India, broken down by staff type.

4.6. Provide details of your company’s working experience in the pharmaceutical sector and provide references.

4.7. Provide details of any previous products or services provided to the organization, including location and contact details.

4.8. Does your company currently have any litigation involvement?

5. Products Supplied to the Pharmaceutical Industry

Information is required on the products and/or services your company provides to the pharmaceutical industry.

5.1. Provide a list of products your company supplies to the pharmaceutical industry, including products supported under a maintenance agreement.

5.2. Identify which products are standard or configurable and the GAMP category (if known).

5.3. What is the software product approval process for release to the pharmaceutical industry?

6. Specific Products Information

This section provides details on a particular product included in the audit scope.

6.1. Name the product(s) covered under this audit.

6.2. How long has the product been in the market?

6.3. What is the current version and build number of the product?

6.4. How many versions are supported?

6.5. How many staff members work on and support the product? Please provide a breakdown of analytical experts, configuration experts, database experts, and software installation and support engineers. If any part of your support is outsourced, please provide details of the subcontracted vendor.

6.6. Provide a reference list of the major installed sites for the product or number of licenses sold.

6.7. Provide details of any user community/support groups for the product.

6.8. How does your company provide product support (e.g., hotline, website, or callout)?

6.9. What is your company policy for providing support for older or “out of support” versions?

6.10. Subcontracted:

Product Development – General: If the development of the product was subcontracted, please provide information on the company that developed the product and where this development took place. Please send a copy of this Postal Audit Questionnaire to the subcontractor for completion and submit their response with the postal audit response from your company.

6.11. Subcontracted:

Product Development – Quality Monitoring: If the development of the product was subcontracted, please provide a copy of the latest report for the audit performed on the subcontractor by your company. How often do you audit this subcontractor?

6.12. Open Source Software (OSS): If the product, or a part of the product, was developed by an Open Source Supplier (OSS), which may be a commercial firm or community, provide information on who developed the product and where the development took place. Please perform an OSS supplier assessment and submit it with the postal audit response from your company.

7. Quality Management System – Overview and General Requirements

This section requires information on the Quality Management System used to develop the product and/or to provide services.

7.1. Provide a copy of your ISO9001 registration certificate or details of the certification achieved for your Quality Management System. Please provide the name of your Certification Body, the date your current certificate was issued and date of renewal, and the frequency of the audits performed by your Certification Body. Provide details of what is covered under your Quality Management System certification (i.e., your scope of certification). When was your first Quality Management System certification?

7.2. Provide a copy of your registration certificate or details of the certification achieved for software development (Note: There are several relevant schemes, frameworks, and approaches, including TickIT, ISO15504 (SPICE), and CMMI). Please provide the name of your Certification Body, the date your current certificate was issued and date of renewal, and the frequency of the audits performed by your Certification Body. Provide details of what is covered under your software certification (i.e., your scope of certification). When was your first registration?

8. Quality Management System – Software Delivery, Installation, and Verification

This section requires information on software user documentation, verification, and training.

8.1. User Documentation: Please provide details of the user documentation (manuals, procedures, etc.) that you would provide to support the operation of the product. For each document, please provide a copy of the cover page, history page, and index page.

8.2. Verification Documentation. Provide details of the verification documentation (plans, protocols, reports, etc.) that you would provide to support the onsite verification of the product. For each document, provide a copy of the cover page, history page, and index page.

8.3. Training for Customers: Please describe the training program that will enable our personnel to operate the product effectively. Additionally, please explain how the documentation of this training will be recorded.

9. Quality Management System – Support Processes: Please provide information on the business and quality support processes utilized by your company, and their effectiveness. Specifically, please provide details on the number of customer complaints raised and resolved within the past 12 months, as well as the process followed to address these complaints.

10. Electronic Records and Electronic Signatures: Please provide information on how your product has been designed and configured to meet the supplier-specific requirements related to pharmaceutical regulatory agency regulations on electronic records and signatures. This information will help us assess the work required to integrate your product into our business in order to comply with relevant regulations (e.g., US FDA 21 CFR Part 11). Please provide documentation (e.g., a white paper) for each of your products, which describes in detail how you have interpreted individual requirements on Electronic Records and Electronic Signatures related to pharmaceutical regulatory agency regulations (e.g., US FDA 21 CFR Part 11), and how each requirement has been addressed for each product.

11. Documentation Required for Audit Response: Please provide the following documents with your response to this postal audit. If any of the documents cannot be submitted or are not relevant, please provide a reason why.

Sr. No. | Document Name/Type | Submitted (Yes) | Submitted (No) and the reason provided
1 | Organizational chart | |
2 | A list of pharmaceutical companies your company has worked with and associated contact names for references. | |
3 | A list of products your company currently supplies to the pharmaceutical companies, including those products which you support under a maintenance agreement. | |
4 | A reference list of the major installed sites for the product | |
5 | A copy of the completed Postal Audit Questionnaire from the subcontractor used for product development. | |
6 | A copy of the latest report for the audit performed on the subcontractor by your company. | |
7 | A copy of any supplier assessments. | |
8 | A copy of your ISO9001 registration certificate. | |
9 | A copy of your registration certificate, if applicable. | |

Assessment Reviewed and Approved by:

Sign and Date

3. Format for Validation Plan

1. Purpose

2. Scope

3. References

4. System Overview

5. System Lifecycle/Validation Approach

6. Activities and Deliverables

6.1. Validation deliverables and approval matrix

6.2. Validation Plan (VP) responsibilities:

6.3. Document Numbering Procedure

6.4. Acceptance Criteria

6.5. Document Management

7. Abbreviation

8. Attachment

9. Revision History

4. Format for Functional Risk Assessment (FRA)

1. Purpose

2. Scope

3. References

4. System Overview

5. Responsibilities

6. Risk Assessment – Failure Modes & Effect Analysis (FMEA)

Methodology

This section defines the various failure modes of the system and their effects.  Based on this FMEA, the extent of qualification and other mitigation activities will be determined, documented and carried out. The Risk Rating is calculated as described below:

For each risk, its severity (S) will be rated, determined and documented

(Low = 1, Medium=2, High =3)

Description | Severity (S)
No impact on product quality; no impact on overall system performance and/or functionality; no impact on operator safety; no impact on regulatory compliance | 1
Indirect impact on product quality; indirect impact on overall system performance and/or functionality; indirect impact on operator safety | 2
Direct impact on product quality; direct impact on overall system performance and/or functionality; direct impact on operator safety; direct impact on regulatory compliance | 3

Direct Impact on Product Quality: A system function that could change any characteristic of the product with respect to physical form or chemical properties is considered to have a direct impact on product quality.

E.g. Co-mill, Sifter, HVAC system

Indirect Impact on Product Quality: A system function that cannot itself change any characteristic of the product, but that impacts direct quality-impacting equipment, is considered to have an indirect impact on product quality.

E.g. Hot water system, Chiller, etc.

No Impact on Product Quality: An equipment function that does not change any characteristic of the product with respect to physical form or chemical properties is considered to have no impact on product quality.

E.g. Lifting and positioning system

For each risk, its frequency of occurrence (O) will be rated, determined and documented

(Low = 1, Medium=2, High =3)

Description | Occurrence (O)
Occurrence is extremely rare or unlikely | 1
Occurs occasionally | 2
Occurs almost every time the system is used | 3

For each risk, its likelihood of detectability (D) will be rated, determined and documented

(Low = 3, Medium=2, High =1)

Description | Detectability (D)
A 100% detection or inspection technique is in place specifically for the failure, such as an alarm, interlock, error message or system shutdown | 1
Detected by indirect means or by observation, such as an indirect indication or visual inspection | 2
No specific detection method | 3

RPN = Severity (S) x Occurrence (O) x Detectability (D)

The Risk level will then be based on the following table:

Risk Priority Number | Risk
1 to 4 | Low (L)
6 to 8 | Medium (M)
9 to 27 | High (H)
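Since the RPN is a simple product of the three ratings, a short Python sketch (illustrative only; the function names are hypothetical) makes the banding above concrete.

# Illustrative sketch of the RPN calculation and banding above.
# S, O and D each take values 1-3 per the tables in this Format.
def rpn(severity: int, occurrence: int, detectability: int) -> int:
    """Risk Priority Number = S x O x D."""
    return severity * occurrence * detectability

def risk_level(value: int) -> str:
    """Band the RPN per the table above (1-4 Low, 6-8 Medium, 9-27 High)."""
    if value <= 4:
        return "Low (L)"
    if value <= 8:
        return "Medium (M)"
    return "High (H)"

# Example: direct quality impact (S=3), occurs occasionally (O=2),
# detected by a specific alarm (D=1) -> RPN 6, Medium risk.
value = rpn(3, 2, 1)
print(value, risk_level(value))  # 6 Medium (M)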

7. Functional Risk Assessment using FMEA

8. Conclusion

5. Format for Installation Qualification Protocol

1. Purpose

2. Scope

3. References

4. Abbreviations:

5. System Overview

6. Responsibilities

7. Qualification Tests

8. Qualification Test Procedure

9. Acceptance Criteria

10. Training and Signature Identification

11. Tests:

Test Date:                                                                                          

Test Run:

Purpose:

Test Procedure:

Acceptance Criteria:

Step No. | Procedure | Expected Result | URS/FRS No. | Actual Result | Pass/Fail | Sign & Date

Acceptance Criteria Met? Yes/No _________ If No, Enter Discrepancy Reference: ___________

12. Discrepancy Handling Procedure

13. List of Attachment

14. Summary and Conclusion

15. Revision History

6. Format for Operation Qualification / Performance Qualification / Operation and Performance Qualification Protocol

1. Purpose

2. Scope

3. References

4. Abbreviations

5. System Overview

6. Responsibilities

7. Qualification Tests

8. Qualification Test Procedure

9.  Acceptance Criteria

10. Training and Signature Identification

11.  Tests

12. Discrepancy Handling Procedure

13. List of Attachment

14. Summary and Conclusion

15. Revision History

7. Format for Traceability Matrix

Sr. No. | URS Point No. | Description of URS Point | Qualification Document Number | Test Number | Status | Pass/Fail

Verified by:

Sign/ Date

8. Format for Validation Summary Report

1. Purpose

2. Scope

3. References

4. System Overview

5. Responsibilities

6. Qualification Test Result

7. Summary of the Validation Activities

8. Deviation from Validation Master Plan

9. System Release

10. Conclusion

11. Attachment

12. Revision History

9. Format for Equipment/Instrument handover checklist

Following checkpoints need to be verified before handover of equipment/instrument:

Sr. No. | Checkpoints | Compliance Status | Verified by Sign/Date
1 | Check whether the IT policy is applied to the equipment/instrument PC. | |
2 | Check whether the C, D, or any other drive is accessible at user-level login. | |
3 | Check whether all unnecessary data is removed from the C, D, E, or any other available drive. | |
4 | Check whether the recycle bin is emptied. | |
5 | Check whether the cut/copy/paste/delete functions are disabled. | |
6 | Check whether the control panel/task manager functions are disabled. | |
7 | Check whether the command function/USB drive is disabled. | |
8 | Check whether user IDs prepared during CSV execution are deactivated/deleted. | |
9 | Check that no generic ID is available in the equipment/instrument/application. If available, justify or disable it. | |
10 | Check whether all new and empty folders are deleted. | |
11 | Check whether the path for backup data is created. Mention the path. | |
12 | Verify the path for backup of files. | |
13 | Check whether the equipment/instrument is connected to the LAN/server (if applicable). | |
14 | Verify that the synchronization and time zone settings (applicable to the respective country/location) are selected. | |
15 | Check whether date and time change settings are restricted for all users in Windows. | |
16 | Remove unwanted files and software from the system. | |
17 | Verify that auto-updates for Windows are disabled. If enabled, they shall be disabled. | |
18 | Check whether antivirus software is present on the equipment IPC/instrument PC. If present, check that the portions required for the functioning of the equipment application are waived (excluded). | |
19 | Check whether the Windows event log storage size is kept at the minimum KB/MB recommended by the supplier, and that “Do not overwrite events (Clear log manually)” is selected. | |
20 | List the user IDs kept active and their roles. | |

10. Format for FS/ DS/ CS/ FDS

1. Purpose

2. Scope

3. References

4. System Overview

5. Responsibility Matrix

6. System Specifications

7. Attachments

8. Revision History

11. Format for Migration Qualification Protocol

1. Purpose

2. Scope

3. References

4. Definition

5. System Overview

6. Responsibilities

7. Qualification Tests

8. Qualification Test Procedure

9. Acceptance Criteria

10. Training and Signature Identification

11. Tests

12. Discrepancy Handling Procedure

13. List of Attachment

14. Summary and Conclusion

15. Revision History

Guideline References:

Part 11, Electronic Records; Electronic Signatures – Scope and Application: https://www.fda.gov/media/75414/download

Annex 11: Computerised Systems: https://health.ec.europa.eu/system/files/2016-11/annex11_01-2011_en_0.pdf

Computer Software Assurance for Production and Quality System Software: https://www.fda.gov/media/161521/download
