CNAS-CL01 "Accreditation Criteria for Testing and Calibration Laboratories" is a standard that software testing laboratories must study when applying for CNAS accreditation. This article covers the clause on ensuring the validity of results.
7 Process Requirements
7.7 Ensure the validity of the results.
Overview of the clause structure:
Procedure requirements (7.7.1)
Internal control methods (7.7.1)
External monitoring methods (7.7.2)
Use of monitoring results (7.7.3)
The laboratory shall have a procedure for monitoring the validity of results. The resulting data shall be recorded in such a way that trends are detectable and, where practicable, statistical techniques shall be applied to review the results. This monitoring shall be planned and reviewed and shall include, where appropriate, but not be limited to, the following:
a) use of reference materials or quality control materials;
b) use of alternative instrumentation that has been calibrated to provide traceable results;
c) functional checks of measuring and testing equipment;
d) use of check or working standards with control charts, where applicable;
e) intermediate checks on measuring equipment;
f) replicate tests or calibrations using the same or different methods;
g) retesting or recalibration of retained items;
h) correlation of results for different characteristics of an item;
i) review of reported results;
j) intralaboratory comparisons;
k) testing of blind samples.
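Item d) above mentions control charts, and the clause calls for recording results so that trends are detectable. As a minimal illustration of one common statistical technique, the sketch below applies a control-chart "run rule" (a run of points on one side of the center line signals a trend). All data, the function name, and the run length are hypothetical, not prescribed by CNAS-CL01.

```python
# Minimal sketch of a control-chart "run rule" for trend detection.
# All data, the function name, and the run length are illustrative.

def detect_trend(results, center, run_length=7):
    """Flag a trend when `run_length` consecutive points fall on the
    same side of the historical center line (a common run rule)."""
    run = 0
    prev_side = 0
    for x in results:
        side = 1 if x > center else (-1 if x < center else 0)
        run = run + 1 if side != 0 and side == prev_side else (1 if side != 0 else 0)
        prev_side = side
        if run >= run_length:
            return True
    return False

# Hypothetical monthly QC measurements drifting upward from a center of 10.0:
qc = [10.1, 10.0, 9.9, 10.0, 10.2, 10.3, 10.4, 10.5, 10.6, 10.7, 10.8, 10.9]
print(detect_trend(qc, center=10.0))  # True: 8 consecutive points above center
```

A check like this makes drift visible long before any single point breaches a control limit, which is exactly the "identify trends" intent of the clause.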
Analysis:
This clause focuses on the use of internal laboratory means to ensure the validity of the results.
This clause requires laboratories to have a procedure for monitoring the validity of results. CNAS-RL02 likewise requires laboratories to have a proficiency testing procedure; when formulating procedures, a laboratory may merge the two into a single procedure document or develop them separately.
The laboratory should record the measurement results in a way that makes it easy to identify trends in the measurement results and, if feasible, use statistical techniques to review the measurement results so that potential problems can be identified in a timely manner.
The monitoring of measurement results should be planned in advance. When planning, the laboratory should ensure that the monitoring of results covers all technical capabilities within the scope of the management system. Where a method standard specifies requirements for monitoring results (or quality control), the laboratory should implement and meet those requirements.
The results of the monitoring are reviewed by the laboratory and used as input for the management review.
Monitoring here means oversight and control, so the means used are diverse. This clause lists the 11 most basic internal control methods, and the wording "including, but not limited to" means that a laboratory with other, better ways of monitoring its measurement results may also use them. Note that not all 11 methods suit every laboratory; a laboratory should choose appropriate methods according to the type and volume of the work it carries out and other actual conditions, so as to ensure the validity of its results.
CNAS-CL01-G001 specifies the factors to be considered by laboratories when formulating monitoring protocols:
7.7.1 a) Where the test method or other documents specify a quality control scheme for the corresponding test or calibration method, it should be followed. When developing an internal quality control plan, a laboratory should consider the following factors:
The volume of testing or calibration work;
The intended use of the test or calibration results;
The stability and complexity of the test or calibration method itself;
Degree of dependence on the experience of the technician;
Frequency and results of participation in external comparisons (including proficiency testing);
Competence and experience of personnel, number and changes in personnel;
New methods or changed methods, etc.
For cases where test results cannot be reproduced, CNAS-CL01-G001 specifies:
7.7.1 c) For some special testing activities whose results cannot be reproduced, where quality control according to 7.7.1 a) is difficult, the laboratory should focus on personnel competence, training, supervision, and technical exchanges with peers.
In addition, most of the CNAS-CL01 application notes for special areas are specified to ensure the validity of the results, and laboratories should follow them.
Case 1: A "Test Results Quality Control Record" dated May 10, 2021, with a laptop computer as the control object, shows for one test item that the control method was personnel comparison, but the laboratory could not provide the original comparison records; another "Test Results Quality Control Record" dated May 10, 2021 shows that the control method was retesting of retained samples, but the laboratory could not provide the original retest records.
Case 2: In the internal quality control activities of a project carried out in 2021, only the evaluation records of the test results of the quality control items were available, and the original test records of the quality control items could not be provided.
Analysis: In both cases above, only the quality control results exist; the original records of the quality control testing cannot be provided. A record is a document that states results achieved or provides evidence of activities performed; without the original records, it cannot be proved that the testing activities were carried out. A test report need not be produced for quality control activities, but the original test records must be kept.
Case 3: In October 2022, the laboratory carried out an intra-laboratory comparison. The original records of the comparison test show that two experimenters each tested a different sample, and the comparison result report contained no comparison or analysis of the data.
Analysis: If a laboratory conducts an internal comparison but does not analyze the resulting data, the comparison loses its significance. By comparing and analyzing the result data, whether or not the results agree, the laboratory can identify development trends, discover actual or potential risks, and better eliminate or reduce the hidden dangers those risks pose.
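One conventional way to analyze comparison data is the En score used for bilateral comparisons (cf. GB/T 28043 / ISO 13528). The sketch below is illustrative; all values are hypothetical, and U_lab and U_ref denote the two laboratories' expanded uncertainties (k = 2).

```python
# Sketch of the En score for a bilateral comparison (cf. GB/T 28043 /
# ISO 13528). All numeric values are hypothetical.
import math

def en_number(x_lab, x_ref, U_lab, U_ref):
    """|En| <= 1 is conventionally regarded as satisfactory."""
    return (x_lab - x_ref) / math.sqrt(U_lab**2 + U_ref**2)

en = en_number(x_lab=10.05, x_ref=10.00, U_lab=0.06, U_ref=0.05)
print(f"En = {en:.2f} -> {'satisfactory' if abs(en) <= 1 else 'unsatisfactory'}")
# En = 0.64 -> satisfactory
```

Even a "satisfactory" En close to 1 is worth tracking over successive comparisons, since a consistent sign or magnitude hints at a systematic bias.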
Case 4: The original records for test report No. SRT-03 are missing the instrument performance verification required by the quality control provisions of HJ 834-2016.
Analysis: When there are requirements for quality control in the test method standard, the laboratory should implement quality control in accordance with the requirements of the method.
Case 5: In a personnel comparison test for a project conducted on May 30, 2018, the comparison results were not evaluated in accordance with the monitoring criteria specified in the 2018 quality control plan.
Analysis: If the quality control method standard specifies a basis for evaluating quality control results, that basis should be used; if not, the laboratory may select an appropriate evaluation method. Therefore, when preparing the quality control plan, the laboratory should fully consider and stipulate the evaluation basis: provisions that are stipulated but not implemented lose their significance, and provisions that do not match the actual situation cannot be guaranteed effective implementation. For evaluating quality control results with statistical methods, refer to CNAS-GL002 "Guidelines for Statistical Processing and Performance Evaluation of Proficiency Testing Results" and GB/T 28043 "Statistical methods for use in proficiency testing by interlaboratory comparison".
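As a concrete illustration of the statistical evaluation just mentioned, a z-score calculation in the style of GB/T 28043 might look like the sketch below; the assigned value, sigma_pt, and participant result are all hypothetical.

```python
# Sketch of z-score evaluation in the style of GB/T 28043 / ISO 13528.
# The assigned value and sigma_pt below are hypothetical.

def z_score(x, assigned_value, sigma_pt):
    """z = (participant result - assigned value) /
    standard deviation for proficiency assessment."""
    return (x - assigned_value) / sigma_pt

def grade(z):
    """Conventional grading: |z| <= 2 satisfactory,
    2 < |z| < 3 questionable, |z| >= 3 unsatisfactory."""
    if abs(z) <= 2.0:
        return "satisfactory"
    if abs(z) < 3.0:
        return "questionable"
    return "unsatisfactory"

z = z_score(x=5.23, assigned_value=5.00, sigma_pt=0.10)
print(f"z = {z:.1f} -> {grade(z)}")  # z = 2.3 -> questionable
```

Writing the grading thresholds into the quality control plan in advance, as this case recommends, is precisely what makes such an evaluation reproducible.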
Case 6: The 2018 quality control plan does not specify concrete products or testing parameters to be monitored, nor the content of the external quality control program, and the 5 quality control parameters actually implemented by the two laboratories involve only one product.
Analysis: This case reveals two problems. First, the quality control plan is incomplete: with no specific products and parameters, it becomes a mere formality. Second, the laboratory holds many accredited technical capabilities, yet by the end of 2018 only 5 parameters had undergone quality control, so coverage was far too low. The documents supporting the old accreditation criteria required that "the quality monitoring plan should cover all testing or calibration (including internal calibration) items within the scope of accreditation"; the new criteria and related documents took effect in September 2018, and CNAS-CL01-G001 stipulates that "the laboratory's monitoring of results should cover all testing or calibration (including internal calibration) items within the scope of accreditation", essentially the same requirement. Since the monitoring methods in the new documents allow more choice than the old ones, laboratories should consider their situation comprehensively and choose appropriate ways to monitor results so as to meet CNAS requirements.
The laboratory shall monitor its performance by comparison with results of other laboratories, where available and appropriate. This monitoring shall be planned and reviewed and shall include, but not be limited to, either or both of the following:
a) participation in proficiency testing;
Note: GB/T 27043 contains detailed information on proficiency testing and proficiency testing providers. Proficiency testing providers that meet the requirements of GB/T 27043 are considered to be competent.
b) participation in interlaboratory comparisons other than proficiency testing.
Interpretation:
This article focuses on the use of external proficiency testing and inter-laboratory comparisons to ensure the validity of the results.
Internal monitoring mainly uncovers random problems in laboratory measurements, such as equipment issues or personnel misunderstanding or operational errors. Even when internal monitoring results are good, a laboratory cannot detect systemic problems, such as whether a method is appropriate or correctly understood, without comparison with external laboratories (or participation in proficiency testing); this poses a risk. Internal and external monitoring therefore play different roles: both are indispensable, and neither can replace the other. When conditions permit, laboratories should, in addition to internal monitoring, ensure the validity of results by comparison with external laboratories (or participation in proficiency testing) so as to avoid systematic bias.
The requirements of CNAS-RL02 "Proficiency Testing Rules" for laboratories to participate in proficiency testing are only the minimum requirements, and laboratories should determine the proficiency testing or inter-laboratory comparison that they need to participate in according to their own competence scope and business volume.
After all, the fields covered by CNAS-recognized proficiency testing providers are limited, so when proficiency testing is unavailable laboratories should participate in interlaboratory comparisons as far as possible, and may organize such comparisons themselves if conditions permit. Note, however, that when organizing or joining an interlaboratory comparison, the participating laboratories should be of comparable competence, to avoid deviations in the statistical results caused by technical factors.
On the basis of comprehensive consideration of internal quality control level, personnel capacity, equipment status, risk, operating cost and other factors, the laboratory reasonably plans and seeks appropriate proficiency testing. For the selection of proficiency testing, please refer to CNAS-GL032 "Guidelines for the Selection, Verification and Utilization of Proficiency Testing". While it is good to make inter-laboratory comparisons with accredited laboratories, it is sometimes difficult to find a comparison mechanism in accredited laboratories due to the different number of laboratories carrying out measurement activities in different technical fields.
CNAS-GL032:2018 main contents:
Selection of proficiency testing
4.1 Developing a proficiency testing participation plan
4.2 Selection approaches
4.3 Selection basis
Verification of proficiency testing
5.1 Proficiency testing design scheme
5.2 Sample type and measurement method
5.3 Proficiency testing work instructions
5.4 Homogeneity and stability of samples
5.5 Result analysis
5.6 Assigned value
5.7 Standard deviation for proficiency assessment
5.8 Performance evaluation
5.9 Proficiency testing report
5.10 Verification of measurement audits
Utilization of proficiency testing results
6.1 Use of results from a single proficiency test
6.2 Use of results from multiple proficiency tests
Case 1: For a certain field, the laboratory has not monitored the level of competence by comparing the results with other laboratories since 2019.
Note: Review time: November 2021.
Case 2: Check the quality monitoring records of the laboratory in 2020-2021, and fail to compare the results with other laboratories to monitor the ability level of the test items applied for accreditation.
Note: Preliminary evaluation laboratories for 2021 review.
Analysis: A laboratory's internal monitoring methods uncover random problems in its measurement results, while external monitoring methods, such as interlaboratory comparison and proficiency testing, uncover possible systemic problems. CNAS-RL02 also stipulates that in fields where adequate proficiency testing is not available, laboratories should ensure competence by strengthening other means of quality assurance. Therefore, where possible, laboratories should ensure competence through external monitoring. In the two cases above, although the items fell in sub-fields where CNAS-RL02 does not mandate participation, the laboratories also did not take part in interlaboratory comparisons, which makes it harder for them to find possible systemic problems.
Case 3: The 2021 proficiency testing plan was not implemented into a specific project.
Analysis: The purpose of monitoring the validity of results is to ensure the validity of laboratory activities and reduce the laboratory's risk. When formulating the relevant plans, it is therefore necessary not only to meet the requirements of the corresponding CNAS documents but also to combine them with the laboratory's own actual needs, clarify the purpose of using external monitoring methods, and determine and participate in proficiency testing or interlaboratory comparison in a targeted manner.
Data from monitoring activities shall be analysed and used to control and, if applicable, improve the laboratory's activities. If the results of the analysis of data from monitoring activities are found to be outside pre-defined criteria, appropriate action shall be taken to prevent incorrect results from being reported.
Interpretation:
The data obtained in the course of monitoring activities, whether obtained by means of internal or external control, is analyzed by the laboratory and used by the laboratory to control the laboratory activities and improve the laboratory activities.
The most common problem in laboratories is monitoring for the sake of monitoring, with the data from monitoring activities never used to control and improve laboratory activities. For example, after an interlaboratory comparison, only two test reports are archived; the results are neither analyzed nor used to control laboratory activities, so the monitoring loses its significance.
In the event that monitoring data are found to be outside pre-determined criteria, such as the warning or control lines pre-set by the laboratory on a quality control chart, the laboratory should take steps to prevent the reporting of erroneous results.
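A minimal sketch of how such warning (±2s) and action (±3s) lines might be derived from baseline QC data and applied to a new result follows; the baseline values, function names, and the tiered responses are hypothetical illustrations, not requirements of the standard.

```python
# Sketch of warning (+/-2s) and action (+/-3s) lines on a Shewhart-style
# QC chart. Baseline data and the response wording are hypothetical.
import statistics

def control_limits(baseline):
    """Center line plus warning and action limits from baseline QC data."""
    m = statistics.mean(baseline)
    s = statistics.stdev(baseline)
    return {"center": m,
            "warning": (m - 2 * s, m + 2 * s),
            "action": (m - 3 * s, m + 3 * s)}

def check(value, limits):
    """Classify a new QC result against the pre-set lines."""
    lo_a, hi_a = limits["action"]
    lo_w, hi_w = limits["warning"]
    if not lo_a <= value <= hi_a:
        return "out of control: withhold results and investigate"
    if not lo_w <= value <= hi_w:
        return "warning zone: review before reporting"
    return "in control"

# Hypothetical baseline of 20 QC measurements:
baseline = [10.0, 10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9, 10.1,
            10.0, 10.2, 9.9, 10.0, 10.1, 9.8, 10.0, 10.1, 9.9, 10.0]
limits = control_limits(baseline)
print(check(10.05, limits))  # in control
```

The point of the tiered response is exactly what the clause demands: a result beyond the action line must not simply be reported as if nothing happened.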
Case 1: A quality control record of a laboratory from July 2021 shows that the difference in the measurement results of an interlaboratory comparison (0.02) had reached the quality control threshold set in the quality control plan, yet the laboratory took no countermeasures.
Case 2: In 2022, the laboratory participated in a software functional testing measurement audit numbered xxx. The sample contained 13 expected software defects; the laboratory detected 10 of them, and it could not provide records of its analysis of the proficiency testing result or of corrective action taken.
Analysis: In both cases, the proficiency testing or interlaboratory comparison results were near the threshold or unsatisfactory, and the laboratory took no corresponding measures. This indicates a risk to the laboratory's ability to carry out its activities; if such risks are not identified and acted on, monitoring the validity of results loses its meaning. Only by formulating and implementing appropriate measures on the basis of risk analysis can risks be eliminated or reduced and the laboratory's work improved. This also shows that clause 8.5 "Actions to address risks and opportunities" and clause 8.6 "Improvement" of the criteria do not exist in isolation; they are realized through the effective operation of the individual elements.
Case 3: A quality control record, "Comparison Result Analysis Report", provided by the laboratory and dated October x, 2019, shows that the comparison results exceeded the specified limit, yet the laboratory's comparison conclusion was "satisfactory".
Analysis: The problem in this case is that the laboratory did not analyze the monitoring data carefully enough to achieve the purpose of monitoring. A result "exceeding the specified limit without being noticed" may mean that the person analyzing the comparison results lacked the ability to find the problem, or it may reflect a perfunctory work attitude. The causes differ, and so do the corrective actions required. It should be noted that no single remedy can "cure all diseases"; only by carefully identifying the cause and formulating and implementing effective corrective measures targeted at that cause can the laboratory solve the problem fundamentally.
Case 4: From 2015 to 2019, the laboratory participated five consecutive times in an international proficiency testing activity organized by IIS. On verification, the laboratory had not analyzed the result feedback from the organizer, nor used the results to control laboratory activities and implement improvements where applicable.
Analysis: The problem with this case is that the laboratory did not analyze and use the monitoring data. "Data from monitoring activities shall be analysed and used to control and, if applicable, improve the laboratory's activities" applies not only to unsatisfactory or problematic monitoring results (those outside pre-defined criteria) but also to satisfactory ones, in order to identify trends and, where necessary, take appropriate measures to improve laboratory activities and prevent problems. The data can also serve as input for future monitoring planning, providing a basis for adjusting monitoring methods and frequency.
Case 5: The 2021 quality monitoring analysis report evaluated only individual test results; there was no overall analysis of the monitoring results or evaluation of their effectiveness as a whole.
Analysis: Clause 7.7.3 of the accreditation criteria requires that data from monitoring activities be analysed and used to control laboratory activities. Although it does not state whether this means analysis of individual monitoring data (results) or a comprehensive analysis of all monitoring data (results) over a period, clause 8.9 Management review requires the "output from the assurance of the validity of results" as a management review input. The laboratory should therefore conduct a comprehensive analysis of all monitoring data (results) over a period, discover from it the development trend of the quality of laboratory activities, and identify opportunities for improvement, helping the laboratory control risks and better ensure the validity of results.
The above is the official CNAS interpretation and case analysis of the "ensuring the validity of results" part of CNAS-CL01 "Accreditation Criteria for Testing and Calibration Laboratories".