Application of Design of Experiment (DOE)
Techniques to Process Validation in Medical Device
Manufacture
Journal of Validation Technology, February 2006, Vol. 12, No. 2
By Dr. D. Dixon, Dr. J. Eatock, Prof. B.J. Meenan, and M. Morgan
ABSTRACT
Process validation is a requirement in heavily regulated sectors such as the automotive and aerospace
industries. Both the International Organization for Standardization (ISO) 9000/13485 standards and the U.S.
Food and Drug Administration (FDA) Quality System Regulation (QSR), 21 CFR Part 820, “Quality Systems for
Medical Devices,” require process validation. Design of
experiment (DOE) statistical methods will allow these requirements to be met in the most resource-efficient
manner possible, whilst providing a greater understanding of the process and highlighting opportunities for
quality improvement.
This paper provides an overview of the application of DOE techniques during key aspects of process
validation and discusses two practical examples from medical device manufacturing. The attendant benefits
of this type of procedural approach to the medical device sector are considered. From the results obtained, it
is clear that meeting process validation requirements can be taken as an opportunity to better understand
and improve manufacturing processes. As such, DOE techniques are pivotal in effectively achieving these
aims in a manner that provides for regulatory compliance.
INTRODUCTION
Many medical device manufacturers view validation and other regulatory requirements as a burden peculiar
to the life science sector. Companies that have developed a mature approach to regulatory compliance
integrate it into development; many others in this sector only consider this aspect once the main elements of
product and process development have been completed. When viewed from this perspective, meeting
regulations can become, for the most part, both a paper generating exercise and a drain on resources.
It has been considered that “The blind, bureaucratic approach [to validation] being followed by some
companies is a needlessly expensive process that achieves nothing more than a temporary reprieve from the
regulatory authorities.”1
However, these requirements can be taken as an opportunity to increase process understanding, ensure that
processes are operated under optimum conditions, improve quality, and reduce costs. Management may lack
an understanding of what is required during a process validation (i.e.: to ensure that an acceptable product
is produced under all foreseeable conditions). This often produces a mindset that simply aspires to have all
the relevant documentation completed as soon as possible.
Ideally, process validation considerations should be integrated into the new product development process
from an early stage, for example, when writing a product specification or user requirements document.
Quality function deployment (QFD) can be seen as a useful method for determining what the customer
really wants, and then translating such requirements into a definitive product specification.2 This
customer-focused specification can then form the basis of the subsequent process validation acceptance criteria.
A large number of companies in the medical device sector have adopted the six-sigma approach to quality
improvement, especially those that produce high-volume products (e.g.: disposables). Based on total turnover
in 2004, 6 out of the 15 largest medical device companies worldwide promoted the fact that they have
adopted six-sigma on their respective company websites. The primary aim of the six-sigma quality method is
quality improvement by reducing the defect rate and by effectively addressing customer needs.
Six-sigma refers to six standard deviations between the mean and the nearest specification limit which,
allowing for the conventional 1.5-sigma long-term shift, equates to a defect level of <3.4 defects per million
opportunities (DPMO). In six-sigma thinking, process defects are seen as quality improvement opportunities
and as such they should be exploited to improve overall quality.3,4 The
six-sigma method uses a number of established statistical tools such as design of experiment methods (DOE)
and statistical process control (SPC). The prevalence of six-sigma training has led to a greater industrial
awareness of DOE techniques. However, it should be noted that no standards exist regarding the level of
statistical training needed to obtain six-sigma qualifications such as green or black belt. Hence, this can lead
to an inconsistency in its application across the sector.
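As an illustration of the defect-level figure quoted above, the following Python sketch (illustrative only, not part of the original article) converts a sigma level into DPMO, assuming the conventional 1.5-sigma long-term shift.

```python
# Illustrative sketch: convert a sigma level to defects per million opportunities
# (DPMO), allowing for the conventional 1.5-sigma long-term shift.
from scipy.stats import norm

def dpmo(sigma_level, shift=1.5):
    # one-sided tail area beyond the nearest specification limit, scaled to one million
    return norm.sf(sigma_level - shift) * 1_000_000

print(f"6.0 sigma -> {dpmo(6.0):.1f} DPMO")   # approximately 3.4
print(f"4.5 sigma -> {dpmo(4.5):.0f} DPMO")   # approximately 1,350
```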
In recent years, DOE has been increasingly recognised as an essential tool for the validation of medical
manufacturing processes,5 but in the authors' experience, the uptake of these methods by the medical device
sector has been limited. An investigation into UK companies across a range of industries reported that only
around 30% apply DOE techniques. This is due in part to the inadequate teaching of such statistical
methods as DOE in university engineering courses.6 The widespread teaching of statistical techniques as part
of six-sigma training may improve the situation.
The term design of experiments refers to a set of statistical approaches to experiment design and analysis.
DOE is one of the key tools used in six-sigma, and staff (e.g., six-sigma black belts) trained in its use can
also apply the methodology to efficiently fulfil regulatory requirements. DOE is used to model the effect of
one or more process factors on the chosen outputs.
In traditional experiments, only one factor at a time is varied while all others are held constant. In DOE all
the factors of interest can be investigated in a single trial, minimizing the size of the experimental schedule
required and providing information on key process interactions. This latter aspect is of importance since
optimal conditions may occur when one factor is high and another low. This type of information on
interactions between factors cannot be easily obtained by investigating the effect of each factor separately.
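The value of studying factors together can be made concrete with a small sketch. The 2 x 2 design below (factor names and response values are hypothetical) estimates both main effects and the interaction from only four runs; a one-factor-at-a-time plan of the same size could not separate the interaction.

```python
# Hypothetical 2x2 factorial: estimate main effects and the interaction
# from the four corner runs using the usual contrast formulas.
import itertools

runs = list(itertools.product([-1, +1], repeat=2))      # coded low/high levels

# Invented responses for (temperature, dwell) at each corner
response = {(-1, -1): 10.0, (-1, +1): 12.0, (+1, -1): 11.0, (+1, +1): 18.0}

temp_effect = sum(t * response[(t, d)] for t, d in runs) / 2
dwell_effect = sum(d * response[(t, d)] for t, d in runs) / 2
interaction = sum(t * d * response[(t, d)] for t, d in runs) / 2

print(temp_effect, dwell_effect, interaction)           # 3.5 4.5 2.5 (non-zero interaction)
```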
Types of DOE trials include single factor, two-level, factorial designs, Taguchi methods, mixture methods,
and response surface models. Trials which study each factor at only a high and a low level (so-called
two-level trials) are useful for determining the important sources of variability for a process and are,
therefore, often referred to as “screening trials,” highlighting the critical factors for further
detailed study. Taguchi methods are often used to design robust processes by locating the operating
conditions that minimise product variation.7 Response surface experiments are used to map the effect of
varying two or more factors across a range of values. Other experimental types include mixture trials for
investigating product formulations.
A DOE approach permits efficient use of resources (personnel time, machine time, materials, etc.), provides
detailed analysis, gives information on reproducibility and errors, and provides a predictive capability.8
Applying DOE reduces the size and hence the cost of process validation trials. It is a regulatory requirement
to run sufficient trials to demonstrate the statistical significance of results, and DOE can assist in this
procedural aspect.
Some general statistical packages such as Minitab™ and SPSS™ have DOE capability, and dedicated software
is available, including Design-Expert™, PQ Systems™, and JMP™. The dedicated packages tend to be
user-friendly, leading the user through the design and analysis of the experimental outcomes, and often have
a more comprehensive DOE capability. First, one decides on the experimental type (e.g., two levels) and then
the factors and levels are entered. At this point, the software will suggest the number of runs necessary. One
can increase the accuracy of the trial by adding replicate runs, or reduce the trial size by using a 1/2 or 1/4
fractional factorial design. Unlike a full design, a fractional design does not require tests at every combination of
factor levels. The software then generates the experimental design showing the factor levels for each run.
The experiment is then conducted, samples produced and tested, and the data entered into the software.
The analysis of the results often consists of determining significant factors, fitting a model to the results, and
checking for outliers. The results can be plotted in various forms including surfaces, box plots, etc. Despite
the ease of use of many software packages, it is still necessary to understand the assumptions made and the
underlying statistical techniques used, such as ANOVA, to avoid drawing incorrect conclusions. For example,
a two-level design assumes a linear response between the low and high levels.
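The workflow just described can also be sketched outside a dedicated package. The Python example below builds a replicated two-level design, simulates a response (the factor names, coefficients, and noise are assumptions made purely for illustration), and applies ANOVA to flag significant factors; in practice the response column would hold measured values from the actual runs.

```python
# Hedged sketch of a two-level DOE workflow: design generation, (simulated)
# experiment, and ANOVA-based identification of significant factors.
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)

# Full 2^3 factorial in coded units (-1 = low, +1 = high), two replicates per setting
design = pd.DataFrame(
    [dict(temp=a, pressure=b, dwell=c)
     for a, b, c in itertools.product([-1, 1], repeat=3)] * 2
)

# Simulated seal strength: temperature and dwell matter, pressure does not (illustrative)
design["strength"] = (
    16 + 2.0 * design.temp + 1.5 * design.dwell
    + 0.8 * design.temp * design.dwell
    + rng.normal(0, 0.5, len(design))
)

model = smf.ols("strength ~ temp * pressure * dwell", data=design).fit()
print(anova_lm(model, typ=2))   # small p-values indicate significant terms
```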
Process Validation and DOE
Process validation is defined under the FDA quality regulations QSR/Good Manufacturing Practice (GMP) as
“establishing documented evidence which provides a high degree of assurance that a specific process will
consistently produce a product that meets predetermined specification and quality attributes.”9
Validation indicates that a process has indeed been subjected to such scrutiny and that the results of the
process can be practically guaranteed. The complete validation for a product will include both design and
process validation.
The European Union (EU) and FDA regulations operated in the medical device sector do not require all
manufacturing processes to be validated. Under FDA regulations, if a process's output cannot be fully verified
during routine production by inspection or test, the process must be validated according to established
procedures [820.75(a)9]. If the output of a manufacturing process can be verified by inspection and it is
sufficient and cost effective to do so, then validation is not necessary.10 The manufacture and testing of
wiring harnesses is an example of a process that can be satisfactorily covered by verification by testing each
product. Validation is vitally important if the predetermined requirements can only be assured by destructive
testing. Examples include testing for mechanical properties, heat treatment processes, sealing and bonding
operations, and sterilization processes.
It is important to implement a quality assurance program that is appropriate to the device being
manufactured.11 The manufacturer of a simple, low-risk (e.g., Class I) device that is not exempt from
QSR/GMP regulations does not need to implement as extensive and detailed a validation and control plan as
that required for a complex, high-risk Class III device.
Process validation is normally broken into a number of stages:
Installation Qualification (IQ) establishes that all key aspects of the equipment installation
adhere to the manufacturer’s specification including: equipment description, installation and
supplies, environment, calibration, maintenance, and operator training.
Operational Qualification (OQ) demonstrates that the equipment consistently operates to
specification under normal conditions including: testing of alarms, software function,
extremes of operating ranges, and machine consistency.
Performance Qualification (PQ) demonstrates that the process produces product within
specification when operated under challenged conditions.
Validation is then an umbrella term for the set of qualifications for a particular process. These qualification
stages are not defined in an identical manner in GMP/QSR and ISO standards e.g.: GMP Part 820 refers to
process performance and product performance stages. However, it is not necessary to divide validation into
these stages. Many manufacturers have a single document to cover both the IQ and OQ.12 The ISO 13485
and QSR/GMP Part 820 regulations themselves do not provide specific guidance on how to conduct a
validation. Various bodies, including the regulatory authorities, produce guidance documents to offer more
practical and specific help.5, 6, 8 However, unlike the ISO and GMP/QSR Part 820 quality standards, these
guidance documents are optional.13 It should be possible to meet all quality system validation requirements
applicable to medical device manufacture with one procedure using the GHTF (Global Harmonisation Task
Force) document as a guide.10
Regardless of the way in which the validation is subdivided, a protocol is written for each stage stating the
tests that will be conducted and the acceptance criteria (based on the product specification). The tests are
then conducted and a report is written based on the protocol. If the acceptance criteria are not met, it may
be necessary to make changes to the process and repeat the qualification. It is often necessary to analyse
historical data or conduct a prequalification study of equipment before writing the validation documentation
since it is only possible to write a qualification protocol for a well-understood and stable process.
It is also a requirement to monitor the long-term stability of a validated process.
Statistical process control (SPC), another technique from the six-sigma toolbox, can be used to analyse the
long-term stability of a process and to highlight out of control situations before they become critical.
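As a minimal sketch of that monitoring step, the individuals-chart check below estimates control limits from an in-control reference sample using the moving-range method and flags any later reading that falls outside them; all figures are illustrative.

```python
# Illustrative individuals (I) chart: estimate limits from reference data,
# then test whether a new reading is out of control.
import numpy as np

def control_limits(reference):
    x = np.asarray(reference, dtype=float)
    mr_bar = np.abs(np.diff(x)).mean()     # average moving range
    sigma_hat = mr_bar / 1.128             # d2 constant for subgroups of size 2
    return x.mean() - 3 * sigma_hat, x.mean(), x.mean() + 3 * sigma_hat

# Hypothetical in-control platen temperatures (deg C)
lcl, centre, ucl = control_limits([137.2, 137.6, 137.4, 137.1, 137.8, 137.3, 137.5])

new_reading = 141.9
print("out of control:", new_reading > ucl or new_reading < lcl)   # True -> investigate
```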
This paper discusses two examples from the authors' practical experience of the ways in which DOE can be
applied to process validation. First, the use of a response model type DOE in determining process operation
limits, and second, the application of a two-level design for challenging a process under worst case operating
conditions.
Case Study 1.
Use of DOE to Establish Process Limits
Device and process specifications must be established before a piece of equipment can be validated, because
this information is used to write the protocol.10 Equipment can then be tested to ensure that it is capable of
operating consistently within these limits. The acceptance criteria for a capability analysis are often set at
Cpk > 1.33. The process capability index (Cpk) is defined as (Specification Limit - Mean) / 3s, where s is the
process standard deviation.
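A minimal sketch of this capability calculation is shown below. The article quotes the one-sided form; the routine here takes the worse of the two sides (the usual two-sided Cpk), and the sample data are invented for illustration.

```python
# Illustrative Cpk calculation against lower and upper specification limits.
import numpy as np

def cpk(samples, lsl=None, usl=None):
    x = np.asarray(samples, dtype=float)
    mean, s = x.mean(), x.std(ddof=1)
    sides = []
    if usl is not None:
        sides.append((usl - mean) / (3 * s))   # upper side: (USL - mean) / 3s
    if lsl is not None:
        sides.append((mean - lsl) / (3 * s))   # lower side: (mean - LSL) / 3s
    return min(sides)

# Hypothetical platen temperature readings against 130-145 deg C limits
temps = np.random.default_rng(0).normal(137.5, 1.2, 50)
print(cpk(temps, lsl=130.0, usl=145.0))        # accept if > 1.33
```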
The effect of altering process parameters on product quality can be established using DOE, thus defining an
acceptable window of operating conditions, i.e.: the process limits. This trial was conducted as part of the
pre-qualification study to define the allowable operating conditions.
The heat-sealing of medical packaging is an example of a manufacturing process requiring validation, as it is
not possible to test seal integrity non-destructively. In the process, a heated platen tool is used to thermally
seal together the two sides of a medical package. Failure of a seal during sterilisation or handling would
result in loss of sterility and thus affect patient safety. The process variables during the sealing operation are
platen temperature, pressure applied, and seal time. Each of these variables must be controlled within the
correct limits to achieve acceptable seal quality. DOE is the most efficient method to determine these limits.
The table in Figure 1 illustrates the high and low levels for each of the three variables, the influence of which
can be investigated using one compact DOE trial.14 The range investigated for each factor is chosen based on
knowledge of the system. Traditional testing methods involve using a matrix approach to assess the
acceptable processing window on form, fill, and seal equipment.15 Using this method, ten temperatures, three
pressure levels, and five dwell times were investigated resulting in 150 separate runs. Using a DOE approach,
it was possible to complete the trial with only 20 runs. This approach is, therefore, very efficient in terms of
resources required and is also more robust statistically.14
Figure 1
_____________________________________
Heat-sealing Process Factor Levels
Since DOE varies multiple factors simultaneously across the trial, it is possible to understand the effects of all
treatments and the interactions between them while limiting the size of the study. For this
trial, a central composite type of response surface model (RSM) was chosen. This method studies each
variable at five levels over the 20 runs. The required number of samples produced and tested for each run
depends upon the number of factors involved, the process variability, and what is deemed to be a
significant change in the output.
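For readers without a dedicated package, the 20-run layout can be reproduced by hand. The sketch below builds a rotatable central composite design for the three sealing factors in coded units (eight factorial corners, six axial points at about ±1.682, and six centre points) and randomises the run order, as discussed in the next paragraph. It is an illustrative construction, not the exact design used in the study.

```python
# Sketch of a three-factor central composite design (20 runs) in coded units.
import itertools
import numpy as np
import pandas as pd

alpha = 2 ** (3 / 4)                                       # ~1.682 for a rotatable design
factorial = list(itertools.product([-1, 1], repeat=3))     # 8 corner runs
axial = [tuple(s * alpha if j == i else 0 for j in range(3))
         for i in range(3) for s in (+1, -1)]              # 6 axial runs
centre = [(0, 0, 0)] * 6                                   # 6 centre runs

design = pd.DataFrame(factorial + axial + centre,
                      columns=["temperature", "pressure", "dwell"])
design = design.sample(frac=1, random_state=42).reset_index(drop=True)  # randomised order

print(len(design), "runs")
print(design.head())
```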
A software package (Design Expert 6.0) was used to design the experiment, i.e.: generate the conditions for
each of the twenty runs and analyse the results. The software randomises the run order to reduce the
influence of external factors. If, for example, one was to start with all the low temperature settings and then
the high temperature runs, a noise factor such as material variability may be wrongly attributed to the
temperature change. A randomised run order is a critical aspect of DOE, although in certain circumstances it
may lead to prolonged changeover times between runs. For example, it may be necessary to allow the
equipment to cool when changing the temperature between runs.
In this study, the equipment was set up under the conditions needed for each run and a number of samples
were then produced and tested. The results were then entered into the software. In this case, acceptance
was based on a specification being met in terms of peel strength of the bond formed during the sealing
process. This is normally the specification from the data sheet that is used to sell the product and, therefore,
its correctness is paramount.
Figure 2 illustrates the relationship between dwell time, temperature, and seal strength on a contour plot.
The values on the contours correspond to the measured bond strength in N/mm2. Of the three process
variables, pressure was found to have only a minimal effect on seal strength and it was, therefore, decided to
fix this at the normal operating level of 65 psi. It should be noted that longer dwell times and higher
temperatures result in greater bond strength. This is due to the higher heat input and, therefore, greater
melting at the interface, which consequently increases the bond strength. Short dwell times are desirable for
high machine throughput. However, higher temperatures are required at short dwell times in order to
achieve the required amount of heat transfer. The maximum temperature is limited to 150°C, as higher
temperatures cause thermal damage to the packaging materials.
Figure 2
______________________________________________________________________________
Effect of Temperature and Time on Seal Strength
The design specification for the packaging calls for a peel strength of >15.5 N/mm2. Therefore, the acceptable
operating window can be defined as a dwell time of 1.5-2 seconds and a temperature of 130-145°C, corresponding
to the box in the upper right hand corner of Figure 2. This operating window provides for a safety margin in
the product specification. Hence, the DOE method of investigating the effect of process variables allows the
levels of the operating limits to be evidence-based and ensures that the process is being operated under
optimal conditions.
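A hedged sketch of how such a window can be read off a fitted response surface follows: predicted seal strength is evaluated over a grid of temperature and dwell time, and only settings exceeding the 15.5 N/mm2 specification are retained. The quadratic coefficients below are invented for illustration and do not reproduce the fit behind Figure 2.

```python
# Illustrative operating-window search over a fitted (hypothetical) response surface.
import numpy as np

def predicted_strength(temp_c, dwell_s):
    # invented quadratic model in natural units (N/mm2)
    return (-120.0 + 1.6 * temp_c + 12.0 * dwell_s
            - 0.004 * temp_c ** 2 - 2.0 * dwell_s ** 2 - 0.01 * temp_c * dwell_s)

temps = np.linspace(110, 150, 41)
dwells = np.linspace(0.5, 2.0, 16)
window = [(t, d) for t in temps for d in dwells if predicted_strength(t, d) > 15.5]

print(f"{len(window)} grid settings meet the >15.5 N/mm2 specification")
```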
Capability analysis is often used as part of the OQ stage of process validation. This is conducted to prove that
the process is capable of consistently operating within the set limits (e.g.: temperature between 130°C and
145°C). As mentioned earlier, a Cpk value of >1.33 is often defined as the acceptance criterion. The normal
operating conditions should be set at the centre point of the operating window, in this case 1.75s and
137.5°C. SPC could then be used to monitor temperature and time on a continuous basis to ensure that the
process remains in control.
Case Study 2.
DOE for Determination of the Worst-Case Operating Conditions during Performance Qualification (PQ)
The IQ and OQ phases of a validation demonstrate that equipment is installed correctly, functions as
designed, and can operate consistently within the required limits. It is often necessary, however, to challenge
the process under worst-case conditions during validation; this is commonly conducted as part of the PQ
stage. The GHTF states that:
“the challenge should include the range of conditions as defined by operators' practices and procedures as
established in the OQ phase.”10
Challenges should include the range of conditions allowed in the written standard operating procedures
(SOPs) and should be repeated enough times to ensure that the results are statistically meaningful.
Challenges may need to include forcing the preceding process to operate at its allowed upper and lower
limits. The PQ phase proves that an acceptable product is produced under all foreseeable operating
conditions.
It is not always necessary to manufacture product during the OQ procedure, but it is a prerequisite for
performance qualification (PQ). For example, the operation of an autoclave used for sterilisation could be
demonstrated as part of an OQ by recording chamber temperature without actually sterilising any equipment.
The performance qualification study presented here concerns the reflow solder process used in the manufacture of
PCBs (printed circuit boards) for medical device applications. Solder paste is first applied to blank PCBs;
components are then robotically placed in the correct locations onto the paste. The board then travels
through a reflow solder oven, where the paste melts and then solidifies on exiting the heated zone to hold
the components in place and form the electrical connections. The reflow oven, therefore, consists of an air
oven with a number of temperature zones; PCBs are fed through the system on a conveyor belt and
experience a temperature profile.
A trial was not necessary to establish process limits for this process, as industry standards exist regarding
the required temperature profile. The specification states the maximum and minimum heating rate, peak
temperature, cooling rate, and time above 183°C (the solder melting point). During the capability analysis in the
OQ phase, a Datapaq™ data recording system was used. Thermocouples are connected to the data recorder
which is placed in a thermally insulated box. This is then sent through the oven in place of a PCB and stores
data from a number of thermocouples; the temperature profiles are then downloaded onto a PC.
Figure 3 is an example of a typical temperature profile from the reflow oven used. Data was logged from six
thermocouples placed across the width of the conveyor system corresponding to the six traces provided in
Figure 3. This is to ensure that all points on the PCB experience the correct profile. Thirty such profiles were
obtained for this capability study to demonstrate that the oven operated consistently within the required
limits. Acceptance was based on a Cpk value of greater than 1.33 for each of the profile specifications (e.g.:
heating rate, peak temperature, etc.).
Figure 3
______________________________________________________________________________
Reflow Oven Temperature Profile
To ensure that the process continually produces product to the required specification under all foreseeable
conditions, PCBs were produced under worst-case conditions during the PQ stage. Depending upon the
nature of the process and its sensitivity, causes of variation may include changes in environmental conditions
(ambient temperature, humidity, etc.), drifting in process parameters, human factors (ergonomic factors,
training), and material variability. A brainstorming session could be used to create a list of possible sources
of variability. This should be based on knowledge of the system and historical data. The challenge may need
to include forcing the preceding process to operate at its allowed upper and lower limits.
Brainstorming produced the three possible sources of variation illustrated in Figure 4.
A two-level DOE screening experiment was used to demonstrate that an acceptable product is
produced under challenged conditions, to determine the important sources of variability, and to highlight the
worst-case conditions. This approach allows the trial to be conducted with the minimum number of runs and gives
information on interactions that would not be available if each possible cause of variation was investigated
individually. The worst-case conditions may occur at such interactions, e.g.: a high loading rate combined with
process start-up. A simple two-level factorial experiment can rapidly increase the user's knowledge about the
behaviour of the process being studied.16
Testing at every combination of the three factors shown in Figure 4 would require an experiment with eight
runs. Alternatively, a 1/2 fractional design could be used, halving the number of runs required. A general rule
of thumb for two-level experiments is that a minimum of twice as many runs as factors is required, e.g.: ten
runs to investigate five factors. A disadvantage of such two-level designs is that a linear response is assumed
between the low and high points. Conducting an extra run at the centre level of all the variables can be used
to check that the response is linear.16
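These designs are straightforward to enumerate, as the sketch below shows: the full 2^3 factorial for the three candidate factors, the half-fraction defined by the generator C = AB, and an added centre point for the curvature check. The factor names are loosely based on Figure 4 and should be read as assumptions.

```python
# Illustrative 2^3 screening designs: full factorial, half-fraction (I = ABC),
# and a centre point for checking linearity.
import itertools
import pandas as pd

factors = ["ambient_temp", "loading_rate", "startup"]

full = pd.DataFrame(list(itertools.product([-1, 1], repeat=3)), columns=factors)

# Half-fraction: keep runs where the third factor equals the product of the first two
half = full[full.ambient_temp * full.loading_rate == full.startup].reset_index(drop=True)

centre_point = pd.DataFrame([[0, 0, 0]], columns=factors)

print(len(full), "full-factorial runs;", len(half), "half-fraction runs")
print(pd.concat([half, centre_point], ignore_index=True))
```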
Figure 4
______________________________________________________________________________
Possible Sources of Variation in Reflow Oven Operation
The number of samples required for each run will depend upon the confidence level required (e.g., 95%), the
size of effect one is trying to measure, and the variability inherent in the process.
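As a purely illustrative calculation (not one reported in the article), a standard two-sample power analysis shows how these three quantities fix the sample size for each run; the effect size, significance level, and power below are assumed values.

```python
# Illustrative sample-size calculation using a two-sample t-test power analysis.
from statsmodels.stats.power import TTestIndPower

n = TTestIndPower().solve_power(
    effect_size=0.8,   # smallest change worth detecting, in standard-deviation units
    alpha=0.05,        # 95% confidence level
    power=0.80,        # 80% chance of detecting a real effect of that size
)
print(f"~{n:.0f} samples per condition")   # roughly 26
```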
Once boards had been manufactured under each set of conditions, they were tested visually and by flying
probe. The analysis of these trials revealed which, if any, of the possible sources of variance had a
statistically significant effect on product quality. To fulfil the purpose of validation, steps can then be taken to
reduce or remove these sources, e.g.: by installing air-conditioning to maintain ambient temperature or by
altering the standard operating procedures (SOP) to change operator behaviour.
The acceptance criteria could be based on clinical or user needs. The acceptance criteria for the PQ were
based on the maximum number of minor defects allowable. This limit could have been based on normal
defect rates from historical production data. In this case, the maximum number of defects allowable was
defined as:
UCL = A + √(3A)
Where:
A = the average defect rate for all trial runs
UCL = the maximum allowable defect rate for any run (upper control limit)
This commonly used acceptance criterion is based on a Poisson distribution.17 The average defect rate was
calculated for each run, and the result is acceptable if no run's average exceeds the upper control limit.
The defect rates for all runs were found to be below the upper control limit. This provides evidence that none
of the sources of variance investigated had a large detrimental effect on product quality. If any of the
variables had a significant effect on the defect rate, then the process could be altered to reduce these
sources of variability. If, for example, loading rate was found to significantly affect the defect rate, then new
SOPs could be written to change the operators' loading methods. The acceptance criteria must be set out in
the protocol for the qualification before any tests are conducted.
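A minimal sketch of this acceptance check follows, applying the UCL formula exactly as quoted above; the defect counts per run are invented for illustration.

```python
# Illustrative PQ acceptance check using UCL = A + sqrt(3A).
import math

defects_per_run = [3, 5, 2, 4, 6, 3, 4, 5]           # hypothetical minor-defect counts
A = sum(defects_per_run) / len(defects_per_run)      # average defect rate over all runs
UCL = A + math.sqrt(3 * A)

print(f"A = {A:.2f}, UCL = {UCL:.2f}")
print("PQ acceptance:", "pass" if max(defects_per_run) <= UCL else "fail")
```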
SUMMARY
Validation is a regulatory requirement in both FDA and European medical device regulations. Statistical
techniques such as DOE and SPC, which are taught in six-sigma training, can be usefully applied to process
validation. The use of DOE methods, in particular, ensures that the necessary trials are conducted in the
most resource efficient manner possible. It is, however, vital that staff have sufficient understanding of the
assumptions and modelling methods used in DOE. This knowledge could, for example, be gained by
attending a short course and working through several practical examples.
DOE methods allow a large number of variables to be investigated in a compact trial, enable outliers in the
data to be identified, and provide detailed process knowledge.
The two case studies presented here show that there are indeed tangible benefits to be gained from such
approaches. These benefits can result in better production control, i.e.: an increase in product yield whilst at
the same time contributing to regulatory compliance.
Article Acronym Listing
C Centigrade
CFR Code of Federal Regulations
DOE Design of Experiment
DPMO Defects Per Million Opportunities
EU European Union
FDA Food and Drug Administration
GHTF Global Harmonization Task Force
GMP Good Manufacturing Practice
IQ Installation Qualification
ISO International Organization for Standardization
OQ Operational Qualification
PCB Printed Circuit Board
PQ Performance Qualification
PSI Pounds per Square Inch
QFD Quality Function Deployment
QSR Quality System Regulations
RSM Response Surface Model
SOP Standard Operating Procedure
SPC Statistical Process Control
U.S. United States
UK United Kingdom
REFERENCES
1. Johnston R., 1995, “Validation Proof or Document Elegance,” Medical Device and
Diagnostic Industry, Vol. 17, Part 6, pp. 20-22.
2. Booker J.D., 2003, “Industrial Practice in Designing for Quality,” International Journal of
Quality Reliability and Management, Vol. 20, No. 3, pp. 388-203.
3. Pande, P.S., Neuman, R.P., and Cavanagh, R.R., 2000, The Six Sigma Way, (McGraw Hill,
ISBN 0-07-135806-4).
4. Coronado, R.B. and Antony, J, 2002, “Critical Success Factors for the Implementation of Six
Sigma Projects,” The TQM Magazine, Vol. 14, No. 2, pp. 92-99.
5. Mark, J. et al, 1999, “Design of Experiment for Process Validation,” Medical Device and
Diagnostic Industry, Vol. 21, pp. 193-199.
6. Antony, J., 2000, “Improving the Manufacturing Process Quality and Capability Using
Experimental Design: A Case Study,” International Journal of Production Research, Vol. 38,
No 12, pp. 2607-2618.
7. Reddy, R.B.S. and Babu, A.S., 1998, “Taguchi Methodology for Multi Response Optimisation,”
International Journal of Quality Reliability and Management, Vol. 15, pp.646-668.
8. Montgomery, D.C., 1996, Introduction to Statistical Quality Control, 3rd Edition, (New York:
John Wiley & Son), p. 478.
9. Good Manufacturing Procedures, U.S. Food and Drug Administration, Process Validation,
Section-820, Part 4.
10. Quality Management Systems, Process Validation Requirements, Jan. 2004, Edition 2,
Global Harmonization Task Force.
11. Guidance on the General Principles of Process Validation, May 1987, Guidance Document,
FDA Centre for Devices and Radiological Health.
12. Weese, D. L., 1998, “Conducting Process Validations with Confidence,” Medical Device and
Diagnostic Industry, Vol. 20, pp. 107-112.
13. Alexander, K., 2000, “Good Design Practice for Medical Devices and Equipment,” Journal of
Medical Engineering and Technology, Vol. 24, No. 1, pp. 5-13.
14. Dixon, D. et al, 2002, “The Effect of Sealing Properties on the Fracture Mechanism and Peel
Properties of Medical Packaging Materials,” Journal of Applied Medical Polymers, Vol. 6, No
1, pp. 30-34.
15. Barcan D., 1995, “Using a Seal Matrix to Optimise Package Sealing Variables,” Medical
Device and Diagnostic Industry, Vol. 17, No. 9, pp. 112-122.
16. Kim, J.S. and Kalb, J.W., 1996, “Design of Experiment, an Overview and Application
Example,” Medical Device and Diagnostic Industry, Vol. 18, Part 3, pp. 78-88.
17. Montgomery, D.C., 1997, Design and Analysis of Experiments, (New York: John Wiley &
Son, ISBN 0-471-15746-5), pp. 41-48.
ABOUT THE AUTHORS
Dr. Dorian Dixon is a Research Fellow on the MATCH program at the University of Ulster in N. Ireland. MATCH
is a UK-wide program with five universities and a number of industrial partners. The aim of the program is to
enhance value assessment in the medical device sector and to improve new product development methods.
Dr. Dixon has eight years of industrially focused research and consultancy experience in conjunction with a
range of regional and international companies. His research has focused on the areas of polymer processing,
surface modification, medical packaging systems, adhesives, and anti-microbial surface coatings. Other
research interests include design of experiment techniques, regulatory affairs and quality systems. He
obtained a BEng in Mechanical Engineering from Queen's University Belfast (1996) and has since completed
an MSc in Polymer Engineering and Science (1997) and a Ph.D. on Microcellular Polymeric Foams (2000).
Dr Dixon is the corresponding author and may be reached by email at: [email protected]
Prof. B.J. Meenan and M. Morgan are co-investigators on the UK MATCH project, NIBEC, University of Ulster,
Shore Rd, Newtownabbey, N. Ireland, BT37 0QB.
Dr. J. Eatock is a MATCH researcher in the Department of Information Systems, Brunel University, Uxbridge,
Middlesex, UB8 3PH
The authors acknowledge support of this work through the MATCH Programme (EPSRC Grant
GR/S29874/01), although the views expressed are entirely their own.
Copyright 2006 Advanstar Communications Inc. All rights reserved. No part of this article may be reproduced
or transmitted in any form or by any means, electronic or mechanical including by photocopy, recording, or
information storage and retrieval without permission in writing from the publisher, Advanstar
Communications Inc. Authorization to photocopy items for internal, educational or personal use, or the
internal, educational, or personal use of specific clients is granted by Advanstar Communications Inc., for
libraries and other users registered with the Copyright Clearance Center, 222 Rosewood Dr., Danvers, MA
01923, 978-750-8400 fax 968-750-4470. For uses beyond those listed above, please direct your written
request to Permission Dept. fax 440-891-2650. Microfilm or microfiche copies of issues are available through
Advanstar Marketing Services, phone: 440-891-2742 and fax 440-891-2650.