
Computer System Validation
Introduction and Regulatory Requirements
Computers are widely used during the development and manufacturing of
drugs and medical devices. Proper functioning and performance of
software and computer systems play a major role in obtaining
consistency, reliability and accuracy of data.
Therefore, computer system validation (CSV) should be part of
any good development and manufacturing practice. It is also
required by FDA regulations and guidelines through the overall
requirement that "equipment must be suitable for its intended use".
Specific requirements for computers can be found in section
211.68 of the US cGMP regulations:
- Automatic, mechanical, or electronic equipment or other
types of equipment, including computers, or related systems that
will perform a function satisfactorily may be used in the
manufacture, processing, packing, and holding of a drug product.
If such equipment is so used, it shall be routinely calibrated,
inspected, or checked according to a written program designed to
assure proper performance. Written records of those calibration
checks and inspections shall be maintained.
- Appropriate controls shall be exercised over computer or
related systems to assure that changes in master production and
control records or other records are instituted only by
authorized personnel.
- Input to and output from the computer or related system of
formulas or other records or data shall be checked for accuracy.
- The degree and frequency of input/output verification shall
be based on the complexity and reliability of the computer or
related system.
- A backup file of data entered into the computer or related
system shall be maintained except where certain data, such as
calculations performed in connection with laboratory analysis,
are eliminated by computerization or other automated processes.
In such instances a written record of the program shall be
maintained along with appropriate validation data.
- Hard copy or alternative systems, such as duplicates, tapes,
or microfilm, designed to assure that backup data are exact and
complete and that it is secure from alteration, inadvertent
erasures, or loss, shall be maintained.
The FDA has developed several specific guidance documents on
using computers in FDA regulated areas. The most detailed is the
industry guide General Principles of Software Validation (2). It
deals with the development and validation of software used in medical
devices. More recently, the FDA released a draft guidance on
using computers in clinical studies (3). The guidance states FDA’s
expectations related to computer systems and to electronic records
generated during clinical studies.
Specific requirements for computers and electronic records and
signatures are also defined in FDA’s regulation 21 CFR Part 11 on
Electronic Records and Signatures (4). This regulation applies to
all FDA regulated areas and has specific requirements to ensure the
trustworthiness, integrity and reliability of records generated,
evaluated, transmitted and archived by computer systems. In 2003 the
FDA published a guidance on the scope and application of 21 CFR Part 11
(5). In this document the FDA promoted the concept of risk based
validation.
By far the most detailed and most specific official document
that has ever been developed on using computers in regulated areas
is the “Good Practices Guide on Using Computers in GxP
Environments” (6). It was developed by inspectors for
inspectors of the Pharmaceutical Inspection Co-operation Scheme
(PIC/S), but it is also quite useful for the industry. It has more than
50 pages and includes a six-page checklist recommended for use by
inspectors.
Because of their importance, computer validation issues have been
addressed by several industry organizations and private authors:
- The Good Automated Manufacturing Practices Forum (GAMP) has
developed guidelines for computer validation (7).
- Huber has published reference books for the validation of
computerized analytical and networked systems (8).
- The Parenteral Drug Association (PDA) has developed a
technical paper on the validation of laboratory data acquisition
systems (9).
All these guidelines and publications follow a few common
principles:
- Validation of computer systems is not a one-time event. It
starts with the definition of the product or project and the setting
of user requirement specifications, and it covers the vendor selection
process, installation, initial operation, ongoing use, change
control and system retirement.
- All publications refer to some kind of life cycle model, with
a formal change control procedure being an important part of the
whole process.
- There are no detailed instructions on what should be tested.
All guidelines refer to risk assessment to determine the extent of
validation.
While in the past computer validation focused more on the
functions of single-user computer systems, recently the focus has
shifted to network infrastructure, networked systems, and the security,
authenticity and integrity of data acquired and evaluated by
computer systems (10). With the increasing use of Internet and
e-mail communications, the validation of web-based applications also
becomes more important. Labcompliance recently published a package
entitled Internet Quality and Compliance.
Scope of the Tutorial
This tutorial will guide IT personnel, QA managers, operational
managers and users of computer hardware and software through the
entire high-level validation process, from writing specifications and
vendor qualification to installation and initial and ongoing
operation.
It covers
- Qualification of computer hardware with peripherals and
accessories such as printers and disk drives.
- Validation of software loaded on a computer that is used
to control equipment, capture raw data, process the data,
and print and store results. Software typically includes operating
systems, standard application software and software written by
or for a specific user.
- Development of documentation as required by regulations.
Risk assessment and risk-based validation will be discussed for
all validation phases to optimize validation efforts versus costs for
systems with different impact and risk on product quality. This is
especially important since the FDA has been using and supporting
risk-based approaches for compliance as part of its 21st Century
Drug cGMP Initiative.
One of the main purposes of this primer is to answer the key
question regarding validation: How much validation is needed and how
much is sufficient for a specific computer system?
This primer gives a good overview and lists major validation
steps and tasks, but for an in-depth understanding and easy
implementation readers are recommended to consult further references,
for example the SOPs and validation examples included in the
Computer System Validation Package from Labcompliance.

Validation Overview
Validation of computer systems is not a one-time event. Annex 11
of the European GMP directive is very clear about this: “Validation
should be considered as part of the complete life cycle of a
computer system. This cycle includes the stages of planning,
specification, programming, testing, commissioning, documentation,
operation, monitoring and modifying”.
For new systems, validation starts when a user department
identifies the need for a new computer system and considers how the
system can solve an existing problem. For an existing system, it
starts when the system owner gets the task of bringing the system
into a validated state. Validation ends when the system is retired
and all important quality data are successfully migrated to the new
system. Important steps in between are validation planning, defining
user requirements, functional specifications, design specifications,
validation during development, vendor assessment for purchased
systems, installation, initial and ongoing testing, and change
control. In other words, computer systems should be validated during
the entire life of the system.
Because of the complexity and the long time span of computer
validation, the process is typically broken down into life cycle
phases. Several life cycle models have been described in the
literature. One model that is frequently used is the V-model, as
shown in Figure 1.
Figure 1. V-Lifecycle model
This model comprises User Requirement Specifications (URS),
Functional Specifications (FS), Design Specifications (DS),
development and testing of code, Installation Qualification (IQ),
Operational Qualification (OQ) and Performance Qualification (PQ).
The V-model as described above is quite good if the validation
process also includes software development. However, it does not
address some very important steps, for example vendor assessment.
It also looks quite complex for a true commercial off-the-shelf
system with no code development for customization, where phases like
design specification or code development and code testing are not
necessary. For such systems the 4Q model is recommended, with just
four phases: design qualification (DQ), installation qualification
(IQ), operational qualification (OQ) and performance qualification
(PQ). The process is illustrated in Figure 2.

Figure 2. 4Q Lifecycle model
Neither the 4Q model nor the V-model addresses the retirement phase.
The 4Q model is also not suitable when systems need to be configured
for specific applications or when additional software is required
that is not included in the standard product and is developed by the
user’s firm or by a third party. In this case a life cycle model
that combines system development and system integration is
preferred. An example is shown in Figure 3.

Figure 3. System Integration combined with system development
User representatives define User or System Requirement
Specifications (URS, SRS). If there is no vendor that offers a
suitable commercial system, the software needs to be developed and
validated by following the steps on the left side of the diagram.
Programmers develop functional specifications, design specifications
and the code, and perform testing in all development phases under the
supervision of quality assurance.
When commercial systems are available, either the SRS or a special
Request for Proposal (RFP) is sent to one or more vendors (see the
right side of the diagram). Vendors respond either to each requirement
or with a set of functional specifications of the system that is most
suitable for the user’s requirements. Users compare the vendors’
responses with their own requirements. If none of the vendors meets
all user requirements, the requirements may be adjusted to the best
fit, or additional software is written to fulfill the user
requirements following the development cycle on the left side of the
diagram. The vendor that best meets the user’s technical and
business requirements is selected and qualified.
The extent of validation depends on the complexity of the
computer system. The extent of validation at the user’s site also
depends on how widely the same software product and version is used:
the more widely a standard software is used, and the less it is
customized, the less testing is required by individual users. GAMP
has developed software categories based on the level of
customization. In total there are five categories. Categories one
and two cover operating systems and firmware of automated systems.
In the context of this primer, only categories three to five are of
interest. They are described in Table 1. Each computer system should
be assigned to one of these three categories.
Table 1. GAMP software categories 3 to 5

GAMP 3 | Standard software package; no customization. | Examples: MS Word (without VBA scripts), computer-controlled spectrophotometers.
GAMP 4 | Standard software package; customization of the configuration. | Examples: LIMS, Excel spreadsheet applications where formulae and/or input data are linked to specific cells, networked data systems.
GAMP 5 | Custom software package; part of or the complete package has been developed for a specific user and application. | Examples: add-ons to GAMP categories 3 and 4, Excel® with VBA scripts.
Validation Master Plan and Project Plan
All validation activities should be described in a validation
master plan which should provide a framework for thorough and
consistent validation. A validation master plan is officially
required by Annex 15 to the European GMP directive. FDA regulations
and guidelines don’t mandate a validation master plan, however,
inspectors want to know what the company’s approach towards
validation is. The validation master plan is an ideal tool to
communicate this approach both internally and to inspectors. It also
ensures consistent implementation of validation practices and makes
validation activities much more efficient. In case there are any
questions as to why things have been done or not done, the
validation master plan should give the answer.
Within an organization, a validation master plan can be developed
for
- multiple sites
- single sites
- single locations
- single system categories
- department categories, e.g., for development departments
Computer Validation master plans should include:
- Introduction with a scope of the plan, e.g., sites, systems,
processes
- Responsibilities by function
- Related documents, e.g., risk management plans
- Products/processes to be validated and/or qualified
- Validation approach, e.g., system life cycle approach
- Risk management approach with examples of risk categories
and recommended validation tasks for different categories
- Vendor management
- Steps for Computer System Validation with examples on type
and extent of testing, for example, for IQ, OQ and PQ
- Handling existing computer systems
- Validation of Macros and spreadsheet calculations
- Qualification of network infrastructure
- Configuration management and change control procedures and
templates
- Back-up and recovery
- Error handling and corrective actions
- Requalification criteria
- Contingency planning and disaster recovery
- Maintenance and support
- System retirement
- Training plans (e.g., system operation, compliance)
- Validation deliverables and other documentation
- Templates and references to SOPs
- Glossary
For larger projects a detailed individual validation project plan
should be developed. An example would be implementing a Laboratory
Information Management System (LIMS) or a networked chromatographic
data system. This plan is derived from the validation master plan
using the principles and templates of the master plan. It formalizes
qualification and validation and outlines what is to be done in
order to get a specific system into compliance. For inspectors it is
a first indication of how much control a department has over a
specific computer system, and it also gives a first impression of the
validation quality.
A validation project plan should include sections on
- Scope of the system: what it includes, what it doesn’t include
- System description
- Validation approach
- Assumptions, limitations and exclusions
- Responsibilities
- Risk assessment
- Risk-based test strategy and approach for validation steps, e.g.,
DQ, IQ, OQ, PQ
- Ongoing performance control
- Configuration management and change control
- Handling system security
- Data back-up and recovery
- Contingency planning
- Error handling
- References to other documents
- Timeline and deliverables for each phase
Design Qualification and Specifications
“Design qualification (DQ) defines the functional and operational
specifications of the instrument and details the conscious decisions
in the selection of the supplier” (8). DQ should ensure that
computer systems have all the necessary functions and performance
criteria that will enable them to be successfully implemented for
the intended application and to meet business requirements.
Errors in DQ can have a tremendous technical and business
impact, and therefore a sufficient amount of time and resources
should be invested in the DQ phase. For example, setting wrong
functional specifications can substantially increase the workload
for OQ testing; adding missing functions at a later stage will be
much more expensive than including them in the initial
specifications; and selecting a vendor with insufficient support
capability can decrease system up-time, with a negative business
impact.
Steps for design qualification normally include:
- Description of the task the computer system is expected to
perform
- Description of the intended use of the system
- Description of the intended environment (including the network
environment)
- Preliminary selection of system requirement specifications,
functional specifications and vendor
- Vendor assessment
- Final selection of the system requirement specifications and
functional specifications
- Final selection of the supplier
- Development and documentation of final system specifications
System requirement specifications (SRS) or user requirement
specifications (URS) are usually written by user representatives.
The vendor’s specification sheets can be used as guidelines.
However, it is not recommended to simply copy the vendor’s
specifications, because commercial software typically has more
functions than the user will ever need, and there should be
documented evidence that the system performs all specified
functions: compliance with the specifications must be verified
later in the process during operational qualification and
performance qualification. Specifying too many functions will
significantly increase the workload for OQ. The development of
requirement specifications should follow a well-documented
procedure. Most important is to involve representatives of all user
departments in this process.
User requirements should have a couple of key attributes. They
should be:
- Necessary. Unnecessary functions will increase development,
validation, support and maintenance costs.
- Complete. Adding missing functions at a later stage will be much
more expensive than including them initially.
- Feasible. Specified functions that cannot be implemented
will delay the project.
- Accurate. Inaccurately specified functions will not solve
the application’s problem.
- Unambiguous, to avoid guessing and wrong interpretation by
the developer.
- Specific, to avoid wrong interpretation by the developer.
- Testable. Functions that are not testable cannot be validated.
- Uniquely identified. This helps to link specifications to
test cases.
Functional specifications answer the question: what functions
does the system need to comply with the user’s requirements? They
are normally written by the developer of the system and should be
reviewed by the user.
Design specifications are also written by the developer. They
answer the question: how does the system implement the specified
functions? They should be formally reviewed by a team of developers
under the supervision of QA.

Vendor Assessment
Validation of software and computerized systems covers the
complete life cycle of the product, which includes validation during
design and development. When software and computer systems are
purchased from vendors, the user is still responsible for the
overall validation.
FDA’s guide on Principles of Software Validation states this very
clearly: “Where the software is developed by someone other than the
device manufacturer (e.g., off-the-shelf software) the software
developer may not be directly responsible for compliance with FDA
regulations. In that case, the party with regulatory responsibility
(i.e., the device manufacturer) needs to assess the adequacy of the
off-the-shelf software developer’s activities and determine what
additional efforts are needed to establish that the software is
validated for the device manufacturer’s intended use”.
The objective of vendor qualification is to gain assurance that
the vendor’s product development and manufacturing practices meet
the quality requirements of the user’s firm. For software
development this usually means that the software is developed and
validated following documented procedures.
Vendor assessment should answer questions such as: "What assurance
do you have that the software has been validated during development?"
or "How can you be sure that the software vendor followed a quality
assurance program?" Depending on the risk and impact on (drug)
product quality, answers can be derived from:
- Documented experience with the vendor. Experience may come from
the product under consideration or from other products.
- External references. Useful if there is no experience with the
vendor within your company.
- Assessment checklists (mail audits). Use checklists available
within your company, through public organizations, e.g., the PDA, or
from private authors.
- Third-party audits. These give an independent assessment of the
quality system and/or product development.
- Direct vendor audits. These give a good picture of the vendor’s
quality system and software development and validation practices.
Assessment costs increase from the first of these options to the
last, and the final procedure should be based on a justified and
documented risk assessment. Such a risk assessment includes two parts:
- Product risk
- Vendor risk
Factors for product risk include
- System complexity
- Number of systems to be purchased
- Maturity of the system
- Level of networking
- Influence on other systems, e.g., through networks
- Impact of the system on drug quality
- Impact of the system on business continuity
- Level of customization
Factors for vendor risk include
- Size of the company
- Company history
- Future outlook
- Representation in the target industry, e.g., pharma
- Experience with the vendor
Risk factors are estimated for the computer system (product) and
the vendor and entered in a table like the one shown in Figure 4.

Figure 4. Vendor Risk vs. Product Risk
Most critical is the red area, with high product and high vendor risk.
This scenario would require a vendor audit, either through the user firm
or through a trusted third party.
On the other hand, green areas could be handled with a one- to
two-page document describing who the vendor is and why the vendor was
selected. Vendors in the yellow area could be assessed through mail
audits supported by good internal or external references.
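
As an illustration of this decision logic, the sketch below (Python)
encodes product and vendor risk factors on a simple numeric scale and
maps the combined result to the red/yellow/green actions described
above. The factor names, the 1 (low) to 3 (high) scale and the
thresholds are illustrative assumptions, not values prescribed by this
primer.

# Illustrative factor scores, each rated 1 (low) to 3 (high)
product_factors = {"complexity": 3, "networking": 2,
                   "impact_on_drug_quality": 3, "customization": 2}
vendor_factors = {"company_size": 2, "history": 1,
                  "industry_representation": 2}

def risk_level(factors):
    """Average the factor scores and bin them into low/medium/high."""
    avg = sum(factors.values()) / len(factors)
    if avg >= 2.5:
        return "high"
    if avg >= 1.5:
        return "medium"
    return "low"

# Red: high/high -> vendor audit. Green: low/low -> short justification
# document. Everything else is yellow -> mail audit plus references.
ASSESSMENT = {
    ("high", "high"): "direct or trusted 3rd-party vendor audit (red)",
    ("low", "low"): "one- to two-page vendor justification document (green)",
}

product = risk_level(product_factors)
vendor = risk_level(vendor_factors)
action = ASSESSMENT.get((product, vendor),
                        "mail audit with internal/external references (yellow)")
print(f"product risk: {product}, vendor risk: {vendor} -> {action}")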
Results of vendor audits should be documented following a
standardized ranking scheme. An example is shown in Table 2.
The results of the vendor assessment and any vendor audit should
be communicated well within a company to avoid duplication of audits
of the same vendor by different departments or sites. This can be
achieved by developing a company-wide repository with entries for all
vendor assessment activities. The whole process of vendor assessment
and audits should be controlled by documented procedures.
Table 2. Vendor assessment ranking scheme

3 | Excellent | Vendor procedures and practices are above average.
2 | Adequate | Vendor procedures and practices are about average.
1 | Poor | Vendor procedures and practices are below average and need to be improved.
0 | Unsatisfactory | Vendor procedures and practices are unacceptable.
N/A | Not Applicable | Question is not applicable to the type of function or service.
Installation Qualification
Installation qualification establishes that the computer system
is received as designed and specified, that it is properly installed
in the selected environment, and that this environment is suitable
for the operation and use of the system.
The list below includes recommended steps before and during
installation.
Before installation
- Obtain the manufacturer's recommendations for installation site
requirements.
- Check the site for fulfillment of the manufacturer’s
recommendations (utilities such as electricity, water and gases,
and environmental conditions such as humidity, temperature,
vibration level and dust).
During installation
- Compare computer hardware and software, as received, with the
purchase order (including software, accessories, spare parts).
- Check documentation for completeness (operating manuals,
maintenance instructions, standard operating procedures for
testing, safety and validation certificates).
- Check computer hardware and peripherals for any damage.
- Install hardware (computer, peripherals, network devices,
cables).
- Install software on the computer following the manufacturer’s
recommendation.
- Verify correct software installation, e.g., are all files
accurately copied to the computer hard disk? Utilities to do this
should be included in the software itself (a sketch of such a check
follows this list).
- Make a back-up copy of the software.
- Configure network devices and peripherals, e.g., printers and
equipment modules.
- Identify and make a list with a description of all hardware;
include drawings where appropriate, e.g., for networked data
systems.
- Make a list with a description of all software installed on
the computer.
- Store configuration settings either electronically or on paper.
- List equipment manuals and SOPs.
- Prepare an installation report.
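
As an illustration of the installation verification step above, the
following Python sketch checks installed files against a checksum
manifest, assuming a simple manifest file with one
"<sha256-hash> <relative path>" entry per line. The manifest name and
format are assumptions for illustration; commercial systems normally
ship their own verification utilities, which should be preferred.

import hashlib
from pathlib import Path

def file_hash(path):
    """Compute the SHA-256 hash of a file, reading it in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_installation(install_dir, manifest):
    """Return the files that are missing or whose hash does not match."""
    root = Path(install_dir)
    failures = []
    for line in Path(manifest).read_text().splitlines():
        if not line.strip():
            continue  # skip empty lines in the manifest
        expected, name = line.split(maxsplit=1)
        target = root / name
        if not target.exists() or file_hash(target) != expected:
            failures.append(name)
    return failures

# Example IQ usage: an empty list means all files were copied intact.
# print(verify_installation("C:/cds", "manifest.txt"))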
Installation and installation qualification (IQ) of larger
commercial systems are normally performed by a supplier’s
representative. Both the supplier’s representative and a
representative of the user’s firm should sign off the IQ documents.

Operational Qualification
“Operational qualification (OQ) is the process of demonstrating
that a computer system will function according to its functional
specifications in the selected environment.”
Before OQ testing is done, one should always consider what the
computer system will be used for. There must be a clear link between
testing as part of OQ and the requirement specifications developed in
the DQ phase. Testing may be quite extensive if the computer system is
complex and if there is little or no information from the supplier
on what tests have been performed at the supplier’s site. The extent
of testing should be based on a justified and documented risk
assessment. Criteria are
- Impact on product quality
- Impact on business continuity
- Complexity of the system
- Information from the vendor on the type of tests and the test
environment
- Level of customization
The most extensive tests are necessary if the system has been
developed for a specific user. In this case the user should test all
functions. For commercial off-the-shelf systems that come with a
validation certificate, only functions that are highly critical for
the operation or that can be influenced by the environment should be
tested. An example is data acquisition from analytical instruments
over a relatively long distance at a high acquisition rate.
Specific user configurations should also be tested; for
example, correct settings of IP addresses of network devices should
be verified through connectivity testing.
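
A minimal sketch of such a connectivity test is shown below in Python.
The device names, IP addresses and ports are made-up examples; in a
real OQ they would come from the configuration records produced during
installation.

import socket

# Hypothetical devices taken from the system's configuration record
DEVICES = [
    ("printer-lab1", "192.168.10.20", 9100),
    ("balance-01", "192.168.10.31", 8000),
]

def is_reachable(ip, port, timeout=3.0):
    """Try to open a TCP connection; True means the device answered."""
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, ip, port in DEVICES:
    status = "PASS" if is_reachable(ip, port) else "FAIL"
    print(f"{name} ({ip}:{port}): {status}")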
Based on the risk factors above, a system risk factor should be
estimated. The extent of testing should be defined for each risk level
in a risk management master plan or in the ‘risk’ section of the
validation master plan. An example is shown in the table below. The
level of customization is expressed through GAMP categories 3, 4 and
5: category 3 is standard software without customization or
configuration settings, category 4 is a configurable system, and
category 5 is a fully customized system. The extent of testing
increases from the lower left (low risk, standard system) to the
upper right (high risk, full customization).
High risk
- GAMP 3: Test critical functions. Link tests to requirements.
- GAMP 4: Test critical standard functions. Test all non-standard functions. Link tests to requirements.
- GAMP 5: Test critical standard functions. Test all non-standard functions. Link tests to requirements.

Medium risk
- GAMP 3: Test critical functions.
- GAMP 4: Test all critical standard and non-standard functions. Link tests to requirements.
- GAMP 5: Test critical standard functions. Test all non-standard functions. Link tests to requirements.

Low risk
- GAMP 3: No testing.
- GAMP 4: Test critical non-standard functions.
- GAMP 5: Test critical non-standard functions.
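
The decision table above can also be encoded so that a validation plan
derives the testing extent from a documented risk level and GAMP
category. The Python sketch below mirrors the table’s wording; the
lookup function itself is only an illustration.

TEST_EXTENT = {
    ("high", 3): "Test critical functions; link tests to requirements.",
    ("high", 4): "Test critical standard and all non-standard functions; link tests to requirements.",
    ("high", 5): "Test critical standard and all non-standard functions; link tests to requirements.",
    ("medium", 3): "Test critical functions.",
    ("medium", 4): "Test all critical standard and non-standard functions; link tests to requirements.",
    ("medium", 5): "Test critical standard and all non-standard functions; link tests to requirements.",
    ("low", 3): "No testing.",
    ("low", 4): "Test critical non-standard functions.",
    ("low", 5): "Test critical non-standard functions.",
}

def testing_extent(risk, gamp_category):
    """Look up the recommended extent of OQ testing."""
    return TEST_EXTENT[(risk.lower(), gamp_category)]

# Example: a medium-risk configurable system (GAMP category 4)
print(testing_extent("medium", 4))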
Proper functioning of back-up and recovery and of security functions,
such as access control to the computer system and to data, should also
be tested. A full OQ test should be performed before the system is
used initially and then at regular intervals, e.g., for chromatographic
data systems about once a year and after major system updates.
Partial OQ tests should be performed after minor system updates.
Tests should be quantitative. This means inspectors would expect a
test protocol with not only test items and pass/fail information but
also expected results, acceptance criteria and actual results. An
example of a test protocol template is shown in figure 8.
Tests should be linked to requirement specifications through a
test traceability matrix. A template for such a matrix, shown in the
table below, should help to easily find the test protocol for a
specific requirement.
The matrix can be documented on paper, but for larger projects it
is recommended to use electronic document management systems. These
can range from simple Word tables to databases and software
specifically developed for managing traceability matrices.
Requirement ID | Requirement | Test protocol(s)
1.1 | Example 1 | 4.1, 4.3
1.2 | Example 2 | 1.2
1.3 | Example 3 | 3.1
1.4 | Example 4 | 3.1, 4.1
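
For small projects, even a few lines of code can serve as an
electronic traceability matrix. The Python sketch below stores the
mapping from the template above and answers the two typical questions:
which tests cover a requirement, and which requirements are not yet
covered. The IDs are the placeholder values from the template.

# Traceability matrix: requirement ID -> description and linked tests
matrix = {
    "1.1": {"requirement": "Example 1", "tests": ["4.1", "4.3"]},
    "1.2": {"requirement": "Example 2", "tests": ["1.2"]},
    "1.3": {"requirement": "Example 3", "tests": ["3.1"]},
    "1.4": {"requirement": "Example 4", "tests": ["3.1", "4.1"]},
}

def tests_for(requirement_id):
    """Return the test protocols that cover a given requirement."""
    return matrix[requirement_id]["tests"]

def untested_requirements():
    """List requirements with no linked test, i.e., coverage gaps."""
    return [rid for rid, entry in matrix.items() if not entry["tests"]]

print(tests_for("1.4"))         # -> ['3.1', '4.1']
print(untested_requirements())  # -> [] when every requirement is covered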
Performance Qualification
“Performance Qualification (PQ) is the process of demonstrating
that a system consistently performs according to a specification
appropriate for its routine use”.
Important here is the word ‘consistently’. Regular preventive
maintenance (e.g., removal of temporary files), making changes to the
system in a controlled manner, and regular testing are all important
for consistent computer system performance.
In practice, PQ can mean testing the system with the
entire application. For a computerized analytical system this can
mean, for example, running system suitability tests, where
critical key system performance characteristics are measured and
compared with documented, preset limits.
PQ activities normally can include
- A complete system test to prove that the application works as
intended. For a computerized analytical system this can mean
running a well-characterized sample through the system and
comparing the results with a result obtained previously.
- Regression testing: reprocessing data files and comparing the
result with a previous result.
- Regular removal of temporary files.
- Regular virus scans.
- Auditing of computer systems.
Most efficient is to use software for automated regression
testing. The software runs typical data sets through a series of
applications and calculates and stores the final result using
processing parameters as defined by the user. During regression
testing the data are processed again and the results are compared with
the previously recorded results. Normally such tests don’t take more
than five minutes but give assurance that the key functions of the
system work as intended.
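
A minimal sketch of such an automated regression test is shown below,
assuming the application’s processing step can be called from Python.
The baseline values and the process_data() stand-in are illustrative;
a real test would reprocess stored raw data files through the actual
application.

# Results recorded when the system was initially validated
BASELINE = {"dataset_a": 12.45, "dataset_b": 3.07}
TOLERANCE = 1e-6  # acceptance criterion for numeric agreement

def process_data(name):
    """Stand-in for the application's reprocessing step; here it just
    returns fixed values so the sketch runs on its own."""
    recomputed = {"dataset_a": 12.45, "dataset_b": 3.07}
    return recomputed[name]

def regression_test():
    """Reprocess every baseline data set and compare with the stored result."""
    all_ok = True
    for name, expected in BASELINE.items():
        actual = process_data(name)
        ok = abs(actual - expected) <= TOLERANCE
        print(f"{name}: expected {expected}, got {actual} -> "
              f"{'PASS' if ok else 'FAIL'}")
        all_ok = all_ok and ok
    return all_ok

print("Regression test", "passed" if regression_test() else "FAILED")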

Configuration Management and Change Control
Any changes to specifications, programming code or computer
hardware should follow written procedures and be documented. Changes
may be initiated because errors have been found in the program or
because additional or different software functions or hardware are
desirable. Requests for changes should be submitted by users and
authorized by the user’s supervisor or department manager. For the
initiation, authorization and documentation of changes, forms should
be used. An example is shown in figure 5.

Figure 5: Change Request Form
Most important is that changes follow standard procedures
for initiation, authorization, implementation, testing and
documentation. All activities should be planned in the validation
project plan and documented in the validation report.
After any change the program should be tested: full testing should
be done for the part of the program that has been changed, and
regression testing should be done for the entire program.
Validation Report and Other Documents
Validation Report
When the validation project is completed, a validation summary
report should be generated by the system owner. The report documents
the outcome of the validation project. The validation report should
mirror the validation project plan and should include:
- A brief description of the system.
- Identification of the system and all software versions that were
tested.
- Description of the hardware used.
- Major project activities.
- Listing of test protocols, test results and conclusions.
- Statement on system status prior to release.
- List of all major or critical issues and deviations, with risk
assessment and corrective actions.
- Statement that all tasks have been performed as defined in the
project plan.
- Statement that validation has been performed according to
the documented procedures.
- Listing of all deliverables.
- Final approval or rejection statement.
The validation report should be reviewed, approved and signed by QA
and the system owner.
Standard Operating Procedures
Validation activities should be performed according to written
procedures. Generic procedures should be taken from the corporate
SOP list; system-specific procedures should be developed for the
system to be validated. Labcompliance has examples for most of the
procedures. They are indicated by S-numbers (S-xxx) in the list
below and are either included in the Computer System Validation
Package or can be ordered from the Labcompliance SOP website.
Procedures should be available under the same or a similar title
as follows:
- Training for GxP, 21 CFR Part 11 and Computer Validation (S-125).
- Risk Assessment for Systems Used in GxP Environments (S-134).
- Validation of Commercial Off-the-Shelf (COTS) Computer Systems (S-271).
- Validation of Macro Programs and Other Application Software (S-263).
- Risk-Based Validation of Computer Systems (S-252).
- Development of User Requirement Specifications for Computers (S-253).
- Quality Assessment of Software and Computer System Suppliers (S-274).
- Auditing Software Suppliers: Preparation, Conduct, Follow-up (S-273).
- Development and Maintenance of Test Scripts for Equipment Hardware, Software and Systems (S-237).
- Handling of Problems with Software and Computer Systems.
- Data Back-Up and Restore (S-317).
- Disaster Recovery of Computer Systems (S-319).
- Archiving and Retrieval of GMP Data and Other Documents (S-162).
- Access Control to Computer Systems and Data (S-320).
- Configuration Management and Version Control of Software (S-259).
- Change Control of Software and Computer Systems (S-262).
- Revalidation of Software and Computer Systems (S-260).
- Retention and Archiving of Electronic Records (S-315).
- Qualification of PC Clients (S-289).
- Retirement of Computer Systems (S-261).
- Review of Computer Systems.
- Auditing Computer Systems (S-272).
Checklists
Checklists should help to verify that validation tasks are
identified and performed. However, some validation tasks are
specific to individual systems. Therefore, going through checklists
does not mean that everything is covered for each system, nor does it
mean that all checklist items are applicable to every system.
Labcompliance has examples of checklists related to computer system
validation. They are indicated by E-numbers (E-xxx) in the list
below and are either included in the Computer System Validation
Package or can be ordered from the Labcompliance Examples website.
Examples are checklists for:
- Commercial Off-the-Shelf Computer Systems (E-160).
- Assessment of Software Vendors (E-255).
- User Requirement Specifications for Software and Computer
Systems (E-153).
Templates and Validation Examples
Templates are useful to effectively follow and document
validation tasks and results. Validation examples help to get
adequate information on how to conduct validation and to prepare
deliverables. Labcompliance has templates and examples for
validation tasks. They are indicated by E-numbers (E-xxx) in the
list below and are either included in the Computer System Validation
Package or can be ordered from the Labcompliance Examples website.
Such documentation can include templates/examples for:
- Requirement Specifications for Chromatographic Data Systems (E-255).
- Requirement Specifications for Excel Applications (E-268).
- User Requirement Specifications - 20 Good/Bad Examples (E-308).
- Computer System and Network Identification (E-326).
- Test Protocol for Excel™ Spreadsheet Applications (with
traceability matrix): includes 12 test script examples for functional
testing, boundary testing and out-of-range testing, plus test
traceability matrices (tests vs. specifications, specifications vs.
test cases) and a test summary sheet (E-358).
- Testing of Authorized System Access (E-362).
- MD5 Checksum File Integrity Check Software with Validation
Documentation: DQ, IQ, OQ, PQ (E-306).
