Accreditation and Quality Assurance: Journal for Quality, Comparability and Reliability in Chemical Measurement (Vol. 17, No. 4)
The challenges and benefits of implementing the requirements of ISO/IEC 17043 by PT/EQA providers
by Jane Gun-Munro (pp. 363-370).
ISO/IEC 17043 Conformity assessment—General requirements for proficiency testing is intended to replace the previous international guides used to assess the competence of proficiency testing (PT) providers. It expands on the requirements of those guides and is intended to accommodate PT providers serving calibration laboratories and testing laboratories handling both qualitative and quantitative data. QMP-LS is an office-based external quality assessment provider for medical laboratories in Ontario, Canada, and operates 46 different PT schemes for approximately 250 diagnostic tests. In 2010, these schemes were accredited to ISO/IEC 17043. The schemes included tests from the following disciplines: chemistry, hematology, microbiology, transfusion medicine, cytology, histology and genetics; thirty of the schemes were qualitative. The challenges and benefits of implementing ISO/IEC 17043 are discussed, with particular emphasis on clauses addressing the following requirements: statistical design, determination of the assigned value, homogeneity and stability testing, packaging, labeling and distribution, performance evaluation and subcontracting of services.
Keywords: ISO/IEC 17043; Accreditation; External quality assessment; Proficiency testing; Medical laboratory
Accrediting PT/EQA providers to ISO/IEC 17043
by Christian Lehmann (pp. 371-374).
After many years and several different attempts at accrediting proficiency testing providers (PT providers), there is finally a stand-alone standard defining the requirements for the competence of PT providers, and therefore an internationally harmonised basis for their accreditation. Since February 2010, ISO/IEC 17043:2010 has replaced ISO/IEC Guide 43:1997 and ILAC G 13:2007. The standard's philosophy on subcontracting differs from that of the standards most commonly used for accreditation, such as ISO/IEC 17025:2005 or ISO/IEC 17020:2004. Apart from the planning of proficiency tests (PTs), the performance evaluation and the authorisation of PT reports, ISO/IEC 17043:2010 allows the rest of the work involved in providing PTs to be subcontracted, which makes it challenging for assessors to judge the competence of a PT provider. In numerous paragraphs, the standard sets very detailed requirements; nevertheless, there is room for interpretation. For such cases — for example, contracts with subcontractors, the procedure for the advisory board and minimum requirements for PT certificates — some proposals are given to enable a harmonised approach to the assessment of PT providers.
Keywords: Accreditation; Proficiency testing provider; ISO/IEC 17043; Requirements
Evaluating participant performance in qualitative PT/EQA schemes
by Vivienne L. A. James (pp. 375-378).
A basic principle that needs to be satisfied before a proficiency testing (PT)/external quality assessment (EQA) scheme is first introduced is that measurement and assessment of performance are possible. For qualitative analyses, data can be presented as ‘percentage correct’, and these data can be aggregated to show trends and to facilitate monitoring of performance over time. Alternatively, data can be analysed to identify significant patterns or changes in practice, or to compare different categories. However, the most useful tool is the application of a numerical score to the results. A number of different scoring strategies can be used, varying in complexity depending on the nature and significance of the type of results generated. Examples from selected microbiological EQA schemes are presented.
Keywords: EQA; Qualitative; Performance; Score
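As an illustration of the scoring strategies the abstract describes, the sketch below computes a ‘percentage correct’ aggregate and a simple numerical score for qualitative results. The score weights and example results are invented for illustration and are not taken from any actual EQA scheme.

```python
# Sketch of a simple numerical scoring approach for qualitative PT results.
# The score weights below are illustrative assumptions, not those of any
# specific EQA scheme.

def score_result(reported: str, expected: str, weights: dict) -> int:
    """Return a numerical score for one qualitative result."""
    if reported == expected:
        return weights["correct"]
    if reported == "not detected" and expected != "not detected":
        return weights["missed"]      # clinically significant miss
    return weights["partial"]         # e.g. wrong species, right genus

WEIGHTS = {"correct": 2, "partial": 1, "missed": 0}

def percentage_correct(results: list) -> float:
    """Aggregate 'percentage correct' over (reported, expected) pairs."""
    correct = sum(1 for rep, exp in results if rep == exp)
    return 100.0 * correct / len(results)

results = [("E. coli", "E. coli"),
           ("not detected", "Salmonella"),
           ("S. aureus", "S. aureus")]
print(percentage_correct(results))                            # ~66.7
print(sum(score_result(r, e, WEIGHTS) for r, e in results))   # 4
```

Weighted scores of this kind allow a clinically serious miss to be penalised more heavily than a near-miss, which a plain percentage cannot express.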
Establishing PT schemes in developing countries: examples from Africa
by Kezia Mbwambo; Michael Koch (pp. 379-382).
Developing countries face challenges in meeting the requirements of importers and regulators due to inadequate resources for conformity assessment. Laboratories in developing countries often cannot participate in proficiency testing (PT) schemes from abroad because the cost is unaffordable. In the southern and eastern African regions, with support from the German national metrology institute, PTB, PT schemes for water, wheat flour, edible salt and edible vegetable oil have been established to support regional laboratories in improving their competence. There have been great achievements in terms of knowledge exchange, capacity building and skills improvement. Development projects are, however, always short- or mid-term projects, so the schemes currently face a sustainability challenge as external support is ending, and the standard deviation of participants’ results is still high. To overcome these challenges, technical support from another development partner would be a short-term solution, but in the long term, financial support from the governments in the regions is required as the permanent solution.
Keywords: Proficiency testing; Developing countries; Africa; Water analysis; Food analysis
Establishing the standard deviation for proficiency assessment ($\hat{\sigma}$) in microbiology PT/EQA schemes
by Tracey Noblett (pp. 383-388).
Microbiological testing can be challenging due to the unique nature of microorganisms. Studies have shown that values for reproducibility in microbiology are difficult to establish as they can vary depending upon a number of factors such as the organism under test, the matrix and the test method. The relative lack of consistent data on reproducibility means it can be difficult for providers of microbiology PT schemes to establish a suitable standard deviation for proficiency assessment ($\hat{\sigma}$). Methods for establishing $\hat{\sigma}$ are described in BS ISO 13528:2005, but these are not all appropriate in microbiology, given the lack of available data and the large range of possible variables. Using the robust standard deviation of participant results is not recommended as this can vary greatly from round to round and means that performance is not comparable over time. Use of a fixed standard deviation depends very much on the perception of the PT scheme organiser as to what constitutes fitness-for-purpose. In microbiology, is it really necessary to obtain precise results? Should the emphasis be on the correct isolation and identification of organisms, and on performance over time to detect laboratory bias, rather than on enumeration results alone?
Keywords: Microbiology; Reproducibility; Proficiency testing
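The argument against a round-by-round robust standard deviation can be shown in a short sketch. The counts, the MAD-based robust estimator and the fixed fitness-for-purpose value below are all invented for illustration.

```python
# Illustration (with made-up counts) of why a round-specific robust standard
# deviation makes z-scores incomparable between rounds, whereas a fixed
# fitness-for-purpose sigma keeps the scale constant over time.
import statistics

def robust_sd(values):
    """Robust SD estimate: 1.4826 * median absolute deviation (MAD)."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    return 1.4826 * mad

def z_score(x, assigned, sigma_pt):
    return (x - assigned) / sigma_pt

round_a = [5.1, 5.3, 4.9, 5.0, 5.2, 5.4, 4.8]   # log10 cfu/g, round A
round_b = [5.1, 5.6, 4.4, 5.0, 5.9, 4.2, 5.3]   # same labs, round B (more spread)

print(robust_sd(round_a), robust_sd(round_b))    # round-dependent spread
FIXED_SIGMA = 0.25                               # assumed fitness-for-purpose value
# The same 0.4 log deviation is scored leniently in the noisy round when a
# round-based sigma is used, but identically in every round with FIXED_SIGMA:
print(z_score(5.4, 5.0, robust_sd(round_b)), z_score(5.4, 5.0, FIXED_SIGMA))
```

With the fixed sigma, a given deviation always earns the same z-score, so a laboratory's chart of scores over time reflects its own performance rather than the spread of the particular round.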
Determination of the standard deviation for proficiency assessment from past participant’s performances
by Isabelle Côté; Piotr Robouch; Benjamin Robouch; David Bisson; Philippe Gamache; Alain LeBlanc; Pierre Dumas; Mikaël Pedneault (pp. 389-393).
The “uncertainty function” introduced by Thompson et al. estimates the reproducibility standard deviation as a function of concentration or mass fraction. This model was successfully applied to data derived from three proficiency testing schemes aiming at the quantification of cadmium, lead and mercury in blood and urine. The model allows the standard deviation for performance assessment to be estimated for proficiency testing rounds.
Keywords: Horwitz equation; Thompson-modified equation; Uncertainty function; Limit of detection; Reproducibility; Proficiency testing
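The abstract does not reproduce the functional form; a commonly used two-parameter form of the Thompson uncertainty/characteristic function is σ(c) = √(α² + (βc)²), which can be fitted by linearising to σ² = α² + β²c². A minimal sketch with invented data:

```python
# Sketch of fitting the two-parameter uncertainty/characteristic function
# sigma(c) = sqrt(alpha^2 + (beta*c)^2) to (concentration, reproducibility SD)
# pairs. Linearising as sigma^2 = alpha^2 + beta^2 * c^2 allows ordinary
# least squares. The data points below are invented for illustration.
import math

def fit_characteristic(conc, sd):
    """Least-squares fit of sd^2 = a + b * conc^2; returns (alpha, beta)."""
    x = [c * c for c in conc]
    y = [s * s for s in sd]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
        sum((xi - xbar) ** 2 for xi in x)
    a = ybar - b * xbar
    return math.sqrt(max(a, 0.0)), math.sqrt(max(b, 0.0))

# Invented example: SD flattens at 0.05 at low concentration, ~5 % of c at high.
conc = [0.1, 0.5, 1.0, 5.0, 10.0, 50.0]
sd   = [math.sqrt(0.05 ** 2 + (0.05 * c) ** 2) for c in conc]

alpha, beta = fit_characteristic(conc, sd)
print(alpha, beta)   # recovers the generating parameters, ~0.05 and ~0.05
```

Once α and β are in hand, σ(c) can be evaluated at the concentration of any future PT round to set the standard deviation for performance assessment.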
On the use of consensus means as assigned values
by Michael Koch; Frank Baumeister (pp. 395-398).
An assigned value can be derived by using either a consensus mean or a reference value; it is up to the proficiency testing provider to decide which is used. If the consensus mean is used, it must be ensured that a consensus really exists. This requirement is fulfilled if the participants’ results are not biased on average and agree with a precision that is fit for the intended use. The best way to avoid potentially “biased” assigned values is to use reference values, thus ensuring that the assigned value is close to the “true” value.
Keywords: Consensus mean; Reference value; Standard deviation for proficiency assessment; Assigned value; ISO/IEC 17043
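A short sketch of the risk the authors describe: when all participants share a bias, a robust consensus statistic (here simply the median) agrees well between laboratories yet still differs from an independent reference value. All numbers are invented.

```python
# Invented example: every laboratory carries the same positive method bias.
# The consensus value (median) then looks well-supported but is itself biased
# relative to a metrologically traceable reference value.
import statistics

reference_value = 10.0                                # assumed traceable value
results = [10.4, 10.5, 10.3, 10.6, 10.4, 10.5, 10.2]  # invented: all biased high

consensus = statistics.median(results)
print(consensus)                     # 10.4 — tight agreement between labs...
print(consensus - reference_value)   # ...but ~0.4 above the reference value
```

Scoring against the consensus here would flag no laboratory, even though every result is high; scoring against the reference value exposes the common bias.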
Use of characteristic functions derived from proficiency testing data to evaluate measurement uncertainties
by Michael Koch; Bertil Magnusson (pp. 399-403).
Interlaboratory comparisons show that reproducibility standard deviations depend on the concentration of the analyte, and many attempts have been made to model this. In this paper, ‘characteristic’ functions are used to model the concentration dependence of the reproducibility standard deviation, based on data from proficiency tests for water analysis. The resulting functions can be used to estimate measurement uncertainties at different concentration levels. They are especially useful for determining the concentration levels below which absolute uncertainties tend to be constant and above which relative uncertainties are approximately constant. By comparing the characteristic functions of different analytical procedures for the determination of the same analyte, the performance of these procedures under routine application can be compared. Finally, these functions may be used to get an indication of the average quality of analytical results in a specific field, which regulators can use to formulate legislative requirements in accordance with current measurement quality.
Keywords: Characteristic function; Proficiency testing; Measurement uncertainty; Reproducibility standard deviation; Variation coefficient
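Assuming the same two-parameter form σ(c) = √(α² + (βc)²) commonly used for such functions, the sketch below illustrates the point about concentration regimes: below the crossover c = α/β the absolute uncertainty is roughly constant, and above it the relative uncertainty is. The parameter values are invented.

```python
# Using fitted characteristic-function parameters, compute the standard
# uncertainty at several concentrations and the crossover concentration
# c = alpha/beta at which the constant (absolute) and proportional (relative)
# contributions are equal. Parameter values are invented for illustration.
import math

def u_char(c, alpha, beta):
    """Characteristic function u(c) = sqrt(alpha^2 + (beta*c)^2)."""
    return math.sqrt(alpha ** 2 + (beta * c) ** 2)

alpha, beta = 0.02, 0.08        # e.g. mg/L and a relative fraction (invented)
crossover = alpha / beta        # below this, the constant term dominates
print(crossover)                # 0.25

for c in (0.01, crossover, 10.0):
    print(c, u_char(c, alpha, beta))
# Well below the crossover, u(c) ~ alpha (constant absolute uncertainty);
# well above it, u(c)/c ~ beta (constant relative uncertainty).
```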
Anthelmintics in bovine milk and muscle: interlaboratory studies among EU National Reference Laboratories
by Manfred Stoyke; Wolfgang Radeck; Petra Gowik (pp. 405-412).
The paper describes in detail how to evaluate interlaboratory studies for the measurement of authorised and non-permitted veterinary drugs. Examples of different kinds of evaluation are presented, including the results of two interlaboratory studies for anthelmintics in bovine milk and muscle. Compared to an interlaboratory study in 2008, a clear development in the proficiency of the laboratories of the European Residue Control System could be noted in 2009; as a result, the percentage of false-negative and false-positive results decreased considerably. The described improvements in the analysis of the included anthelmintics were also reflected in the overall assessment: while in 2008 ten (38.5 %) out of 26 NRLs fulfilled the proficiency criteria of the interlaboratory study, this number had increased to fourteen (53.8 %) by 2009. In some cases, quantification still needed to be improved.
Keywords: Interlaboratory studies; Anthelmintics; European Residue Control System; Proficiency criteria
The effect of the choice of method for determining assigned value on the assessment of performance in melting point analysis
by M. Whetton; K. Baryla; H. Finch (pp. 413-417).
The determination of melting point is a fundamental test in the pharmaceutical industry, since it is one of the simplest techniques for the identification of a chemical substance. The melting point provides information on both the identity and the purity of a chemical substance, and for that reason is a key test in the PHARMASSURE proficiency testing (PT) scheme. The scheme assesses participants’ determination of melting point using chemicals of high purity, basing the assigned value on the robust consensus mean (median). In recent rounds, melting point reference standards have been provided as the test material and a reference value used as the assigned value for PT assessment. Comparison of the PT results over a number of rounds, using test materials with a wide range of melting points, shows that the overall performance of the participant group is worse in rounds where a reference material and its associated reference assigned value are used for performance assessment. When participants were assessed against the reference assigned value, a positive bias was observed in their results. Detailed information on the methodology used showed that the majority of participants use the same analytical method, EU.Ph.2.2.14 (Council of Europe, Strasbourg, 2011), for the determination of melting point, although this procedure allows flexibility in key methodological parameters, such as the heating ramp rate, which may fail to ensure consistent performance across the group of participant laboratories.
Keywords: Melting point; PHARMASSURE; Proficiency testing; Method bias
Role of proficiency testing in monitoring of standardization of hemoglobin A1c methods
by Berna Aslan; Jane Gun-Munro; Gregory J. Flynn (pp. 419-424).
After hemoglobin A1c (HbA1c) therapeutic targets for monitoring diabetes therapy were recommended, first the National Glycohemoglobin Standardization Program (NGSP) and then the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) developed standardization initiatives. The aim of this article is to demonstrate the role of a proficiency testing (PT) program in monitoring the long-term effect of these initiatives and the current status of HbA1c measurement. Measurement precision as a coefficient of variation (CV), measurement bias and rates of satisfactory HbA1c results in PT surveys were evaluated using fresh single-donor whole blood PT items and assigned values from an NGSP-certified secondary reference laboratory. Between 2000 and 2010, both the CV and the bias of the IC measurement method showed a decreasing trend. While the CV of the HPLC measurement method decreased, no significant change was observed in its bias. The rates of satisfactory HbA1c results in PT surveys were higher for HPLC users than for IC users. In 2010, the average CVs in the HPLC and IC groups were 2.6 and 3.4 %, the biases were 2.7 and 1.8 %, and the corresponding total error (TE) estimates were 7.8 and 8.5 %, respectively. These TE values were higher than the maximum permissible measurement error of 7 %, developed on the basis of clinical use of the test. The NGSP and IFCC networks have promoted improvements in HbA1c testing; however, tightening of the NGSP method certification criteria seems necessary to achieve a maximum permissible measurement error of 7 %.
Keywords: HbA1c; Method standardization; External quality assessment; Proficiency testing
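The TE figures quoted in the abstract are consistent with the commonly used estimate TE = |bias| + 1.96 × CV (an assumption here, since the abstract does not state its formula). A sketch reproducing the comparison against the 7 % limit:

```python
# Total error estimated as |bias| + 1.96 * CV (a common formulation; the
# abstract's 7.8 % and 8.5 % figures match it, but the exact formula used
# by the authors is an assumption here).

def total_error(bias_pct: float, cv_pct: float, z: float = 1.96) -> float:
    """Total error estimate in percent: |bias| + z * CV."""
    return abs(bias_pct) + z * cv_pct

LIMIT = 7.0  # maximum permissible measurement error, %

for method, cv, bias in [("HPLC", 2.6, 2.7), ("IC", 3.4, 1.8)]:
    te = total_error(bias, cv)
    verdict = "exceeds" if te > LIMIT else "within"
    print(f"{method}: TE = {te:.1f} % ({verdict} the {LIMIT} % limit)")
```

Both groups exceed the 7 % limit, which is the basis for the article's call to tighten the NGSP certification criteria.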
Proficiency testing in food microbiology: experience from implementation of ISO/IEC 17043 and ISO/TS 22117
by Marzia Mancin; Maria Grimaldi; Lisa Barco; Romina Trevisan; Marco Ruffa; Renzo Mioni; Antonia Ricci (pp. 425-430).
The proficiency testing (PT) scheme “AQUA” for food microbiology was organised by the Istituto Zooprofilattico Sperimentale delle Venezie (IZSVe) according to ISO/IEC 17043 and ISO/TS 22117. This paper describes the IZSVe experience in applying the above-mentioned standards to its PTs, with a focus on the Enterobacteriaceae enumeration PT. Freeze-dried food matrices contaminated with American Type Culture Collection bacterial strains were used as test samples for each microbiological PT organised by IZSVe. Sample homogeneity and stability were verified prior to distribution to participants and throughout the PT, respectively. The participating laboratories analysed the samples using their routine methods, and the results were transmitted to IZSVe. Data and methods used by each participating laboratory were analysed in order to evaluate laboratory performance. With reference to the Enterobacteriaceae PT, the test samples were homogeneous and stable, and most laboratory results were obtained using equivalent test methods. The statistical approaches applied to the data from all participating laboratories revealed similar outcomes: no significant outlying counts and only 5 % unacceptable results were observed. Finally, the z-score, with a standard deviation that does not vary from round to round, was applied to compare and evaluate the performance of each laboratory over time, highlighting possible persistent trends over several rounds.
Keywords: Proficiency testing; Food microbiology; ISO/IEC 17043; ISO/TS 22117; Statistical evaluation
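A minimal sketch of the scoring approach described: z-scores with the standard deviation held fixed from round to round, so that one laboratory's scores are comparable over time and a persistent trend becomes visible. All numbers are invented.

```python
# z-scores per round with a fixed standard deviation for proficiency
# assessment, as described for the AQUA scheme. Counts are invented.

def z_scores(results, assigned, sigma_pt):
    """z = (x - X) / sigma_pt per round, with sigma_pt fixed for all rounds."""
    return [(x - X) / sigma_pt for x, X in zip(results, assigned)]

SIGMA_PT = 0.25                              # fixed SD for proficiency assessment
lab_results   = [3.1, 3.3, 3.4, 3.5, 3.6]    # one lab, log10 cfu/g, five rounds
assigned_vals = [3.0, 3.0, 3.1, 3.1, 3.1]

zs = z_scores(lab_results, assigned_vals, SIGMA_PT)
print([round(z, 1) for z in zs])             # steadily rising positive z-scores
# No single score reaches the usual action limit of |z| = 3, but the
# uninterrupted run of positive scores suggests a persistent trend.
print(all(z > 0 for z in zs))
```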
Proficiency testing schemes for the assessment of Legionella PCR methodologies
by Raquel Múrtula; Elena Soria; M. Adela Yáñez; Vicente Catalán (pp. 431-437).
Standard operating procedures used for the detection of bacteria in environmental samples are primarily based on bacterial growth on specific culture media and confirmation by biochemical and/or immunological tests. In the case of Legionella, isolation on BCYE-α medium is the standard method, although it presents a number of drawbacks; for this reason, the implementation of molecular methods, mainly those based on PCR, has increased in recent years. Under ISO/IEC 17025, laboratories need an external evaluation of their work to assure the quality of the results they produce, and participation in proficiency testing (PT) schemes is compulsory. For water-testing laboratories using PCR methods for Legionella, we have developed a PT scheme accredited according to ISO/IEC 17043; the preparation and the statistical analysis of the results are performed following this standard and ISO 13528. The samples have a rapid and easy-to-use format, consisting of tablets with inactivated freeze-dried Legionella cells or freeze-dried Legionella DNA. In this PT scheme, participants evaluate both Legionella pneumophila and Legionella spp. detection systems and control the whole PCR process, from water sample concentration through to the PCR results.
Keywords: Proficiency testing; Legionella; Molecular methods; qPCR
Determination of brominated flame retardants: a proficiency test
by Fernando Cordeiro; Piotr Robouch; Thomas Linsinger; Beatriz de la Calle (pp. 439-444).
This manuscript presents the results of a proficiency test for the determination of total bromine and several polybrominated biphenyls and diphenyl ethers (PBB and PBDE) in plastic. The test material used was a poly(ethylene terephthalate) (PET) granulate fortified with a mixture of PBB and PBDE. Up to twenty laboratories from 15 countries registered for the exercise and reported results. Homogeneity and stability were investigated to assess the adequacy of the selected test material. Laboratory results were rated with z and ζ scores according to ISO 13528, with the standard deviation for proficiency assessment set to 25 % of the assigned values. The exercise highlights the difficulties laboratories have in providing consistent values for the investigated measurands: many participants reported underestimated measurand values, and the fraction of satisfactory z scores ranged from 61 to 88 %. The critical experimental parameters are identified and discussed.
Keywords: Proficiency test; External quality assessment; Flame retardants; Polymers; Scoring
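The two scores used in the exercise, as defined in ISO 13528, can be sketched as follows: z weighs the deviation against the standard deviation for proficiency assessment (here 25 % of the assigned value, as in this exercise), while ζ weighs it against the combined standard uncertainties of the participant and the assigned value. The example numbers are invented.

```python
# z and zeta scores as defined in ISO 13528. The assigned value, its
# uncertainty and the laboratory result below are invented for illustration.
import math

def z_score(x, x_assigned, sigma_pt):
    return (x - x_assigned) / sigma_pt

def zeta_score(x, x_assigned, u_lab, u_assigned):
    return (x - x_assigned) / math.sqrt(u_lab ** 2 + u_assigned ** 2)

X, u_X = 100.0, 2.0              # assigned value (mg/kg) and its uncertainty
sigma_pt = 0.25 * X              # 25 % of the assigned value, as in the exercise

x, u_x = 70.0, 5.0               # an underestimating lab's result
print(z_score(x, X, sigma_pt))   # -1.2: "satisfactory" on the lenient z scale
print(zeta_score(x, X, u_x, u_X))  # ~ -5.6: the lab's stated uncertainty
                                   # cannot explain the deviation
```

The pair of scores is informative precisely for the situation the abstract describes: with a broad sigma a 30 % underestimate still passes on z, while ζ reveals that the reported uncertainty is inconsistent with the bias.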
Proficiency testing in analytical chemistry, microbiology and laboratory medicine: working discussions on current practice and future directions
by Brian Brookman; Ewa Bulska; Owen Butler; Michael Koch; Tracey Noblett; Kees van Putten; Piotr Robouch (pp. 445-451).
A summary of the working group (WG) discussions on proficiency testing (PT) and external quality assessment (EQA) held at the EURACHEM Workshop, Istanbul, 3–6 October 2011, is provided. The six WGs covered a range of issues concerned with current practice and future directions; implementing the requirements of ISO/IEC 17043 by PT/EQA providers (WG1); accrediting PT/EQA providers to ISO/IEC 17043 (WG2); pre- and post-analytical aspects in PT/EQA (WG3); evaluating participant performance in qualitative PT/EQA schemes (WG4); establishing PT/EQA schemes in developing countries (WG5); and establishing acceptability criteria in microbiology PT/EQA schemes (WG6). Delegates with different backgrounds were on each WG in order to capture a range of views and experience from a number of different sectors. Working group representatives included PT/EQA providers, participants in PT/EQA schemes and end-users of PT results such as accreditation bodies and regulatory authorities, from countries around the world.
Keywords: Proficiency testing; External quality assessment; Accreditation
Causes of error in analytical chemistry: results of a web-based survey of proficiency testing participants
by Stephen L. R. Ellison; William A. Hardcastle (pp. 453-464).
Results of a voluntary-response survey of respondent-identified causes of unacceptable results in nine proficiency testing schemes are reported. The PT schemes were predominantly environmental and food analysis schemes. In total, 111 respondents reported 230 identified causes of error. Sample preparation (16 % of causes reported), equipment failures (13 %), ‘human error’ (13 %) and calibration (10 %) were the top four general causes of poor analytical results. Among sample preparation errors, sample extraction or recovery problems were the most important causes reported. Most calibration errors were related to errors in calculation and dilution, not to the availability or quality of calibration materials. No failures were attributed to faults in commercial software; software-related problems were largely associated with user input errors. Corrective actions were generally specific to the particular problem identified. A review of all reported causes indicated that about 44 % could be attributed to simple operator errors.
Keywords: Proficiency testing; Causes of error; Survey
Proficiency testing for the improvement of analytical practice
by Mark Sykes (pp. 467-471).
The results of an individual laboratory’s participation in proficiency testing are often taken in isolation: a single poor assessment may be investigated, or a trend of participation over time may be charted. However, the overall results of proficiency testing (over all participants and over time) may also provide insights into aspects of the analysis being undertaken. Two examples are summarised here. Analysis of sodium in various foods appears to be difficult, with no obvious method trends. Analysis of vitamin B2 in liquid dietary supplements requires enzymatic dephosphorylation as well as acidic digestion; in this case, some participants appear to have changed their method since the problem was first reported. Investigating proficiency testing results and their implications for the analysis takes time and requires data sets to be retrieved from the archive, which may not be the highest priority in the workload of a busy proficiency testing provider. However, the benefits to the analytical community from such investigations are great, and their reporting is to be encouraged.
Keywords: Proficiency testing; Food; Analytical chemistry; Improvement