UKAS’s latest technical bulletin gives further insight into the muddled thinking that lies behind the ISO scheme.
Medical Royal College examinations, Continuing Professional Development, GMC registration, revalidation and the rest are, it seems, no longer enough to render doctors competent.
And UKAS want their cut of the bureaucratic profit through ISO 15189:2012.
“As shown by the range of indicators that might contribute to the assessment of competence, good performance in EQA/iEQA does not automatically demonstrate competence and equally, poor performance does not automatically mean lack of competence.”
So, even though performance in EQA schemes proves little, doctors must still submit to UKAS assessing them through a method they admit confirms nothing.
The old BSI strategy is to maximize the potential points of failure and pretend this is meaningful or helpful.
This self-contradiction will protect UKAS from being sued for defective pathology.
Though note that BSI is in trouble again for continuing with its deceptive advertising after the Advertising Standards Authority told it not to.
UKAS will play its part in squeezing the life out of healthcare purely for its own profit.
The bulletin in full:
28 November, 2017
This paper is intended to clarify requirements of ISO 15189:2012 and UKAS’ expectations of how a medical laboratory demonstrates the competence of its clinical staff. This paper also includes information on the use of External Quality Assurance (EQA) programmes.
This paper has been developed by the UKAS Medical Laboratory Technical Advisory Committee following some questions concerning the approach to the assessment of clinical staff competence and seeks to provide clarification to laboratories of the information that a laboratory needs to provide to UKAS as part of its assessments.
UKAS accreditation provides independent confirmation of a laboratory’s competence to carry out specified activities. UKAS assessments are conducted to gather objective evidence that a laboratory is competent; assessing the competence of a laboratory will include the competence of the personnel, the validity of test methodologies and the validity of outcomes/test results.
It follows that UKAS must assess the procedures a laboratory has in place to evaluate the on-going competence of all staff and the associated evidence and records that demonstrate implementation. As with all areas of assessment, this will be assessed through a sampling exercise on assessment visits. As stated in previous guidance in this area (Assessor Update April 2015), competence in the provision of clinical advice/opinion is likely to be based on multiple components.
UKAS considers that EQA provides valuable information for laboratories and accreditation bodies that gives assurance about the medical laboratory service, including assurance on clinical interpretations. Participation in EQA is not a compulsory component of the assessment of the competence of clinical staff. However, as evidence of conformity to good medical and scientific practice and continuing professional development, where appropriate schemes exist, laboratories might reasonably expect their clinical staff to participate.
It is the laboratory’s responsibility to define the criteria that it uses to determine the competence of its staff. UKAS uses competent peer assessors that have the demonstrable knowledge, skills and experience to evaluate the laboratory’s approach and assess examples of records to determine that the approach is followed and is effective.
ISO 15189 provides some examples (Clause 5.1.6 Note 1) of how a laboratory can assess the competence of its staff, both initially and on an ongoing basis. A laboratory may choose to use different mechanisms to provide information about the ongoing competence of its staff and this might include the use of EQA where it exists and is appropriate. UKAS understands that there are some limitations to the use of EQA in this way, particularly ‘interpretive’ EQA schemes (see below).
ISO 15189 clause 5.1.9 specifies the content of the records to be held for all personnel and requires that the records are readily available. The laboratory should consider carefully how it will retain access to this information, taking into account that some competence information may be held as part of confidential appraisal records or performance reviews. The assessment team needs to assess the evidence that is used to demonstrate competence and the laboratory should seek to provide this without compromising confidential information that is not relevant to the assessment.
The assessment of the laboratory’s approach to evaluating clinical staff competence may include assessment of, but not necessarily be limited to, the following:
- Qualification records, experience, knowledge, appointment process, induction, training sign off
- Records of EQA participation
- Mechanisms to monitor on-going competency internally and associated records
- A competency assessment programme with defined acceptance criteria, including for clinical staff; such an on-going programme would be expected to be suitably robust to cover all of the staff member’s scope of activity, at sufficient frequency
- Records of knowledge sharing, for example MDT involvement, case review discussions, case handovers, on call involvement
- Suitability of competency programme acceptance criteria
- CPD (e.g. College CPD, external meetings, course evaluations, iEQA)
- Review of test reports
- Coverage of all areas by internal audit
- Minutes of meetings aimed at service improvement
It is acknowledged that information relating to a doctor’s competence is essential for the GMC revalidation process and will be gathered and reviewed as part of the regular appraisal process. Competence records held by the laboratory may be used by clinical staff to generate evidence for appraisal and vice versa. This should be conducted in such a way as to preserve the confidentiality of information that is not relevant to the assessment.
UKAS Position on External Quality Assurance Participation
Technical EQA schemes
ISO 15189:2012 (clause 5.6.3.1) requires the laboratory to participate in interlaboratory comparison programmes (such as EQA or proficiency testing) appropriate to the examination and interpretation of results. UKAS Publication TPS 47 Policy on participation in proficiency testing contains more information to support the implementation of ISO 15189 and recognises that it is for the laboratory to judge the suitability of an EQA scheme to meet its needs. A laboratory may be able to justify non-participation in an available scheme based on its suitability and/or other factors. The laboratory can also consider alternative approaches to providing assurance. It may be necessary to look beyond the UK for EQA scheme availability and laboratories may refer to the EPTIS database to assist with this process [http://www.eptis.bam.de/].
Interpretive EQA Schemes
UKAS understands that not all interpretive EQA (iEQA) schemes are suitable for use in demonstrating the performance of the laboratory and/or individuals, and that participation may solely be for the purpose of CPD. It is for the laboratory to determine which schemes it will participate in and for what purpose. Where a laboratory is not participating in an existing scheme there needs to be justification for this; where a valid justification is not provided, a suitable finding will be raised.
Where individual staff participate in iEQA for the purposes of their own CPD, evidence should be included in their CPD records.
Assessment and Reporting
UKAS understands that participation in EQA/iEQA is primarily to provide information to develop and improve processes, knowledge and understanding, and to assure outcomes, and does not simply aim to achieve a specific result or answer. However, where unexpected results are encountered and any necessary action is identified, this should be progressed through the laboratory’s own governance system to consider opportunities for improvement.
While participation in EQA/iEQA may provide evidence of a positive approach to quality assurance, it is recognised that there is a strong educational component and EQA/iEQA scheme reports will often contain advisory and educational components to achieve this aim. As shown by the range of indicators that might contribute to the assessment of competence, good performance in EQA/iEQA does not automatically demonstrate competence and equally, poor performance does not automatically mean lack of competence.
Where UKAS identifies nonconformities, it is the responsibility of the laboratory to specify its improvement/corrective action; this is not something that is specified by the assessment team. It is usual for the laboratory to discuss its proposals with the assessor(s), including the possible routes to take and the impact of each. Should the laboratory reconsider its original proposal and develop an alternative solution that addresses the finding, such evidence can be submitted and would be considered by UKAS.