Best Practices for Single Laboratory Validation (SLV) of Chemical Methods for Trace Elements in Foods

Cory J. Murphy1, James D. MacNeil1

1Canadian Food Inspection Agency, Dartmouth Laboratory, 1992 Agency Drive, Dartmouth, Nova Scotia, B3B 1Y9, Canada

Introduction

The use of analytical methods within a regulatory analysis or accredited laboratory framework imposes certain requirements on both the analyst and the laboratory. It is expected that regulatory analyses will be conducted according to what may generally be described as “best practices” to ensure the reliability of findings leading to regulatory action. In some situations, such analyses and the sampling associated with them must also be conducted in a manner that meets requirements for legal proceedings, including presentation as evidence in court. Under the general requirements of the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC), accredited laboratories are expected to demonstrate both “fitness for purpose” of the methods for which they are accredited and the competency of their assigned analysts in performance of those methods[1]. There is, therefore, activity in many areas of regulatory analysis to develop consensus on best practices associated with particular types of analyses.

In 1997 (amended in 2006), the Codex Alimentarius Commission (CAC) issued a general guideline for analytical laboratories involved in the import and export testing of foods which contains four principles[2]:

  • The laboratory should have in place internal quality control procedures which meet the requirements of the Harmonised Guidelines for Internal Quality Control in Analytical Chemistry[3];
  • The laboratory should participate regularly in any available proficiency testing schemes, appropriate to their area of testing, which have been designed and conducted as per the requirements of the International Harmonized Protocol for Proficiency Testing of (Chemical) Analytical Laboratories[4];
  • The laboratory should become accredited according to ISO/IEC-17025:1999 General requirements for the competence of testing and calibration laboratories (now ISO/IEC-17025:2005)1 for tests routinely performed; and
  • The laboratory should use methods which have been validated according to the principles laid down by the Codex Alimentarius Commission whenever such methods are available.

General requirements for validation of analytical methods according to principles laid down by the Codex Alimentarius Commission are provided in the Codex Manual of Procedures, including provision for “single laboratory” validation of analytical methods[5]. Additional guidance is provided through a number of general guidelines issued by a consensus process in international scientific organizations and subsequently adopted as CAC guidelines[6],[7],[8],[9],[10]. The CAC has also issued guidelines related to the validation of methods used for the analysis of pesticide residues[11], mass spectrometric analysis of pesticide residues[12], the estimation of uncertainty of measurements[13] and the analysis of veterinary drug residues in foods[14]. A recent CAC guideline on the settlement of disputes over analytical test results also makes reference to method validation requirements[15]. However, considerable misunderstanding remains among analysts and laboratory managers as to precisely what is meant by “method validation” and what is required to demonstrate it. Furthermore, no specific guidance on the validation of methods used for the determination of elemental composition or element speciation is provided within Codex documents to supplement the general guidance provided in other documents or in guidance from independent international scientific organizations. Additional guidance on method validation for future inclusion in the CAC Manual of Procedures and CAC guidelines is currently under discussion in the Codex Committee on Methods of Analysis and Sampling (CCMAS) and other Codex Alimentarius committees, but does not address this specific issue[16].

A new project was established by the Analytical Chemistry Division of the International Union of Pure and Applied Chemistry (IUPAC) in 2009 to provide guidance on experimental designs suitable for use in method validation[17], supplementing the general guidance previously provided by IUPAC on single laboratory validation requirements[18]. It may reasonably be anticipated that any such guidance will also be adopted by the CAC. While compliance with CAC standards and guidelines is voluntary for member states, subject to World Trade Organization (WTO) agreements, they do reflect international scientific consensus on issues related to the analysis of foods. These guidelines can therefore be informative for the development of guidance documents to be used within AOAC International for issues such as single laboratory validation of analytical methods for trace elements, whether in foods or in other matrices.

Validation was defined by ISO in 1994 as “confirmation by examination and provision of objective evidence that the particular requirements for a specified intended use are fulfilled”[19]. In analytical chemistry, method validation was defined by Eurachem in 1998 as a process of “establishing the performance characteristics and limitations of a method and the identification of the influences which may change these characteristics and to what extent” and thereby “verifying that a method is fit for purpose, i.e., for use for solving a particular analytical problem.”[20] A recent guideline issued by CAC[21] defines a validated method as an “accepted test method for which validation studies have been completed to determine the accuracy and reliability of this method for a specific purpose”[22] and validation as “verification, where the specified requirements are adequate for an intended use”[23]. The process includes identification of the method scope and method performance characteristics. The scope defines the analytes and the matrices in which they can be determined, the concentration range and any known effects from interferences, while the expected performance characteristics are usually stated in terms of precision and accuracy. The IUPAC Harmonized Guidelines for Single Laboratory Validation of Methods of Analysis state that “strictly speaking, validation should refer to an ‘analytical system’ rather than an ‘analytical method’, the analytical system comprising a defined method protocol, a defined concentration range for the analyte, and a specified type of test material.”18 An AOAC International guidance document defines validation as “the process of demonstrating or confirming the performance characteristics of a method of analysis.”[24] Similarly, the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) guidance states that the “objective of validation of an analytical procedure is to demonstrate that it is suitable for its intended purpose.”[25]

Method validation can therefore be practically defined as a set of experiments which confirm that an analytical method is suitable for its intended purpose when performed with the specific instrumentation and in the specific laboratory environment in which those experiments were conducted. An inter-laboratory collaborative study is considered to provide a more reliable indicator of the statistical performance characteristics of a method because it requires testing of the method in multiple laboratories, by different analysts using different reagents, supplies and equipment and working in different laboratory environments[26]. Validation of a method, even through collaborative study, does not, however, guarantee method performance in any laboratory performing the method. This is where a second term, verification, is sometimes used18. In this context, verification may be defined as a set of experiments conducted by a different analyst or laboratory on a previously validated method to demonstrate that, in their hands, the performance standards established in the original validation are attained. Verification has been described as part of internal quality control (QC) procedures[27]. That is, the verification experiments demonstrate that the performance achieved meets the requirements for attributes such as scope (analytes/matrices), analytical range, freedom from interferences, precision and accuracy that were identified for suitable application of the method to its intended use in the initial method validation.

The guidelines for conduct of an inter-laboratory collaborative study stress that the performance of the method should first be well-characterized in the developing laboratory (or the laboratory sponsoring the study) before the method is tested in multiple laboratories in the collaborative trial[28],[29]. Current guidance from AOAC International for conduct of a collaborative study stresses the importance of optimizing the performance of the method (usually demonstrated through completion and reporting of a “single laboratory validation”) before attempting the collaborative study[30]. Thus, following a recognized approach based on scientific consensus to method validation within a single laboratory is important not only to demonstrate “fitness for purpose” as required by accrediting bodies, but also to lay the proper base when methods are proposed to be tested in an inter-laboratory method trial.

In contrast, method development is the series of experiments conducted to develop and optimize a specific analytical method for an analyte or group of analytes. This can involve investigations into detection/extraction of the analyte, stability of the analyte, analytical range, selectivity, ruggedness, etc. It is important to note that method validation experiments always take place after method development, including ruggedness testing, is complete; that is, validation studies are intended to confirm the method performance parameters demonstrated during development. A ruggedness design should identify whether small changes at certain steps of the analytical method, such as might occur when other analysts use the method, affect the results. A common approach is to vary seven factors simultaneously in a balanced set of eight determinations and assess how each change affects method performance[31]. Once method development and ruggedness experiments are complete, the method should not be further modified or changed during the validation process.
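As an illustration of such a design, the sketch below lays out the classic seven-factor, eight-run ruggedness matrix and estimates each factor's effect as the difference between the mean results obtained at the nominal and altered conditions. The factor names and numerical results are hypothetical, and this design is one common choice rather than the only acceptable one.

```python
# Illustrative sketch of a seven-factor, eight-run (Youden-type) ruggedness
# design. Factor names and results below are invented for illustration.
import numpy as np

# +1 = nominal condition, -1 = deliberately altered condition
DESIGN = np.array([
    [+1, +1, +1, +1, +1, +1, +1],
    [+1, +1, -1, +1, -1, -1, -1],
    [+1, -1, +1, -1, +1, -1, -1],
    [+1, -1, -1, -1, -1, +1, +1],
    [-1, +1, +1, -1, -1, +1, -1],
    [-1, +1, -1, -1, +1, -1, +1],
    [-1, -1, +1, +1, -1, -1, +1],
    [-1, -1, -1, +1, +1, +1, -1],
])

# Hypothetical factors for an acid-digestion/ICP-MS method
factors = ["digestion temp", "digestion time", "acid volume",
           "dilution volume", "rinse cycles", "standing time", "rf power"]

def factor_effects(results):
    """Effect of each factor = mean(runs at +1) - mean(runs at -1)."""
    results = np.asarray(results, dtype=float)
    return DESIGN.T @ results / 4.0

# Eight determinations (e.g., mg/kg Cd), one per run -- invented numbers
results = [0.52, 0.50, 0.53, 0.51, 0.49, 0.52, 0.50, 0.51]
for name, effect in zip(factors, factor_effects(results)):
    print(f"{name:>15s}: {effect:+.4f}")

# Rule of thumb from ruggedness testing: an |effect| much larger than
# sqrt(2) * s (s = the method's repeatability standard deviation) flags
# a factor to which the method is not rugged.
```

Because the design is balanced, each factor's effect is estimated from all eight runs, so seven factors can be screened with the effort of eight determinations rather than separate one-factor-at-a-time experiments.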

When validating a method for elements in food products, many factors should be considered during the planning phase of the validation experimental design. For example, it should be determined if the method is to be used in a regulatory environment, and if the analyte(s) of interest have a maximum level (ML) against which compliance is to be assessed. In some cases, such as for analytes for which no safe limits have been established, the purpose of the method may be to achieve the lowest possible detection limit. The method may be intended for use in the determination of a single element in a particular matrix, or it may require capability for multi-analyte analyses in various matrices. The availability of an authentic blank matrix to be used as the analytical sample for method characterization should also be considered. For example, many elements are naturally present in some intended test matrices (such as arsenic or cadmium in shellfish tissue). The inability to obtain authentic blank test sample material can therefore pose many validation challenges when assessing parameters such as matrix effects and limits of detection and quantification, particularly when attempting to use the signal of the “blank” as a basis for the latter determinations.

Although food testing programs frequently include testing for a range of elements (predominantly metals), there are actually few formally established MLs or other action limits for these analytes. The Codex Alimentarius Commission has established limits for arsenic (total), cadmium and lead in a variety of foods, total mercury in mineral waters and salt, methylmercury in fish and tin in canned goods[32]. Similarly, the European Union (EU) has established regulatory limits for cadmium, lead, mercury and tin in a variety of foods[33]. Requirements for analytical methods to enforce EU standards for lead, cadmium and mercury in foodstuffs are the subject of another EU regulation[34]. Canada has established maximum limits for arsenic, lead and tin in various foods[35] and standards for mercury in seafood have been set by both Canada[36] and the United States[37].

Table 1: Toxic elements regulated by the Codex Alimentarius Commission and various countries

Organization/Country / Regulated element(s)
Codex Alimentarius Commission32 / As, Cd, Pb, Hg and methylmercury in a variety of foods
EU and member states33 / Hg, Cd, Pb and Sn in some foods
Canada35,36 / Hg in fish; Cd, Pb and Sn in some foods
USA37 / Hg in fish
Japan / Hg and methylmercury in some fish

The aim of this paper on single laboratory validation (SLV) is to provide guidance for the scientist when validating a method for trace elements in food as “fit-for-purpose” for an element or a group of elements in those products. Definitions for common analytical chemistry terms used in food analysis are taken from contemporary references, and the procedures proposed for method validation are based on available technical guidelines and recommended approaches. An example of an SLV experimental plan to implement the proposed approach for methods used in elemental analysis of food samples is provided. The proposed approach is intended to address the specific requirements currently provided in Codex Alimentarius guidance documents or in regulations or guidelines for the analysis of trace elements in foods set by national or regional authorities, and so is intended to be generally applicable for a variety of potential users.

Definitions

In general, it is recommended that definitions included in the Codex Alimentarius Commission “Guidelines on Analytical Terminology”21 should be used as a primary source for methods used in the analysis of foods as these have been adopted after extensive international consultation and are taken from authoritative sources, such as the Joint Committee for Guides in Metrology (JCGM), ISO, IUPAC and AOAC International. Definitions of key terms used in method validation recommendations contained in this document are contained in Appendix I, with a reference to the source. Adherence to these definitions when reporting the validation of an analytical method will provide transparency to the process and should eliminate the misunderstandings that can occur when different laboratories use different definitions for the same analytical terminology. When definitions are available from multiple sources and there are differences in the wording, accredited laboratories should use definitions contained in the International Vocabulary of Metrology (VIM)23 as the primary source of definitions for analytical terms, as national bodies performing laboratory accreditation under ISO/IEC-17025 refer to this source. The VIM is the source of many of the definitions cited by the CAC.

For terms related to “sample”, the analyst should use the nomenclature recommended for analytical chemistry by IUPAC [Horwitz, W. (1990) Nomenclature for Sampling in Analytical Chemistry, Pure Appl. Chem. 62, 1193-1208], which is based upon ISO recommendations. This terminology is also supported by AOAC International [Official Methods of Analysis of AOAC INTERNATIONAL (2005), AOAC INTERNATIONAL, Gaithersburg, MD, USA, Definition of Terms and Explanatory Notes, Sample (23); OMA Online, accessed August 2, 2010]. The terms most frequently applicable to element analysis of foods are the following and will be used throughout this document:

  • Laboratory sample—sample or subsample sent to or received by the laboratory
  • Analytical (or test) sample—sample, prepared from the laboratory sample (by homogenization, grinding, blending, etc.), from which analytical portions are removed for analysis.
  • Analytical (or test) portion—quantity of material removed from the analytical sample for analysis.
  • Analytical (or test) solution—solution prepared by dissolving (with or without reaction) of an analytical portion in a liquid.

Concern has been expressed that the limit of detection (LOD) and the limit of quantification (LOQ) should not always be used as mandatory fixed performance limits for validated methods, due to the inherent variability which may be observed in the determination of these limits by different analysts using different instruments. For example, an expert meeting on the validation of analytical methods noted in its report that:

“LOD and LOQ are estimates of variable parameters, the values of which depend on various factors, including the conditions of measurement and the experience of the analyst. The use of these estimates in client reports can be misleading. In view of this, it was requested that the FAO/IAEA expert consultation following the Workshop would consider that the lowest calibrated level of the analysis be recommended to be used in client reports as an alternative to the LOD and LOQ.”[38]

The report of the subsequent expert consultation defined two terms to reflect the performance characteristics which may be required of analytical methods used in a regulatory setting, the accepted limit and the lowest calibrated level (See Appendix I)27. More recently, the IUPAC Guidelines for Single Laboratory Validation of Methods of Analysis advised that “the detection limit need not be part of validation” when the actual concentration range measured by the method “does not include or approach” this limit and also, regarding the limit of quantification, recommended that the measurement uncertainty “as a function of concentration” should be assessed with regard to fitness for purpose, rather than using a “fixed multiple” of the detection limit to establish a limit of quantification18.
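To make the source of this variability concrete, the sketch below illustrates one common blank-based convention for estimating the LOD and LOQ (3x and 10x the standard deviation of replicate blank signals, converted to concentration through the calibration slope). The replicate values and slope are invented for illustration; other conventions, such as those based on low-level spikes, are equally defensible, which is precisely why fixed LOD/LOQ values can be misleading.

```python
# Illustrative sketch of a common blank-based LOD/LOQ convention.
# All numbers below are invented for illustration.
import statistics

blank_signals = [0.0012, 0.0009, 0.0015, 0.0011, 0.0010,
                 0.0014, 0.0008, 0.0013, 0.0012, 0.0011]  # instrument response
slope = 0.045   # hypothetical calibration slope: response per ug/L

sd_blank = statistics.stdev(blank_signals)   # sample (n-1) standard deviation

lod = 3 * sd_blank / slope    # ug/L
loq = 10 * sd_blank / slope   # ug/L
print(f"LOD ~ {lod:.3f} ug/L, LOQ ~ {loq:.3f} ug/L")

# Repeating this with a different set of blanks (another day, analyst or
# instrument) will generally yield different estimates, which is the
# variability the report quoted above cautions against treating as a
# fixed performance limit.
```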

It is also important to note that while many analytical chemistry texts and older papers in scientific journals use the term “specificity” for “selectivity”, the term “selectivity” is now recommended and use of the term “specificity” is discouraged21. A method is considered to be either “specific” or “non-specific”, whereas selectivity can exist in varying degrees.

Performance Criteria

The Codex Committee on Methods of Analysis and Sampling (CCMAS) has recommended new guidance on method performance with respect to implementation of the criteria approach for analytical methods, which has been included in the 19th Edition of the Codex Manual of Procedures5. This guidance is based on accepted approaches to the establishment of performance criteria for analytical methods[39],[40],[41] and was subject to extensive consultation by representatives of major international organizations and national regulatory authorities prior to acceptance and implementation. It is therefore recommended that this guidance be followed, particularly with regard to the performance expected for recovery and precision at various concentrations of analyte(s), which may be found in Table 1, titled “Guidelines for establishing numeric values for the criteria”, on page 53 of the CAC Manual5.
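For orientation, the precision criteria in such tables are conventionally derived from the Horwitz equation, with the Thompson modification at low concentrations. The sketch below shows this predicted reproducibility RSD as a function of analyte mass fraction; the example concentrations are illustrative only, and the Manual's table remains the authoritative source for acceptance values.

```python
# Illustrative sketch of the Horwitz equation with the Thompson
# modification; consult the CAC Manual's criteria table for the
# authoritative numeric criteria.
def predicted_rsd_percent(c):
    """Predicted reproducibility RSD (%) for mass fraction c
    (e.g., 1 mg/kg corresponds to c = 1e-6)."""
    if c < 1.2e-7:            # below ~0.12 mg/kg: constant 22 % (Thompson)
        sigma = 0.22 * c
    elif c <= 0.138:          # Horwitz power law in its usual range
        sigma = 0.02 * c ** 0.8495
    else:                     # high concentrations
        sigma = 0.01 * c ** 0.5
    return 100 * sigma / c

# e.g., hypothetical MLs spanning three decades
for mg_per_kg in (0.05, 0.5, 5.0):
    c = mg_per_kg * 1e-6
    print(f"{mg_per_kg} mg/kg -> predicted RSD ~ {predicted_rsd_percent(c):.1f} %")

# The HorRat (observed RSD / predicted RSD) is conventionally judged
# acceptable between roughly 0.5 and 2 for reproducibility data.
```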

Performance Characteristics

In order for a method to be considered “fit-for-purpose”, certain performance requirements should be evaluated and met. Listed below are the requirements typically considered in the validation of a quantitative method of chemical analysis18. A screening or confirmation method may require different, usually fewer, parameters.