Re-engineering EA’s peer-evaluation system


As changes to EA's peer-evaluation system turn challenges into a success story, we interviewed Maria Papatzikou, Chair of the EA MLA Council, and Daniela Ionescu, Secretary of the EA MLA Council and manager of the peer-evaluation (PE) system.

Accreditation is developing in many new areas, digitalization is booming in more and more sectors of daily operations, and cybersecurity issues, Artificial Intelligence (AI)-related discussions, and the intensive development of internet-based processes are on the rise. These are a few of the major changes that organizations face, and EA and its members, the National Accreditation Bodies (NABs), are no exception. In particular, changes in accreditation standards, as well as changes in the NABs' processes, prompted EA to revisit its practices regarding peer evaluations.

“As members of EA, we continuously collect and analyze feedback from the market and national authorities, our main customers, and prescribers. It provides useful input for constant improvement and a real incentive to review our way of doing things. The same applies when we work in the EA MLA Council (MAC) to improve the peer-evaluation system of EA NABs”, says Maria.

A survey of the EA members highlighted five major routes for improving the peer-evaluation system: ensuring better coverage of NAB activities; improving how the composition of evaluation teams and the options for evaluation are determined; improving the reporting system; enabling the decision-making system to respond quickly when NABs are required to develop a new scheme at short notice; and revising the policy for planning witnessing activities.

What measures have been taken to address the issues identified?

In general, the Secretariat’s support to the evaluation team has been considerably reinforced: more tasks have been transferred from the MAC Management Group to the Secretariat, while the Secretariat’s resources and competencies have been increased. Experience already shows that the change is a good one; it has been very well received by the NABs and the PE teams.

“For example, the online system set up by the Secretariat, through which the PE teams can share information and interact with the Secretariat and the NAB under evaluation about the PE and its reports, has been reported to be a real improvement,” says Daniela.

More flexibility has been introduced into the planning of witnessing activities, which can now be disconnected from the main evaluation week. This provides better coverage of the evaluation and makes it easier for the NAB to plan the relevant activities.

An online decision-making system has been set up. It can be used whenever the peer-evaluation report does not raise critical issues, which so far applies in 90% of cases. The principle is that the Secretariat, together with the MAC Management Group and the Task Force Group assigned to review the peer-evaluation report, comes up with a recommendation that is submitted to the MAC members for voting through the EA intranet. There is no need to wait until the next MAC meeting: the decision can be made and published shortly after the ballot closes. Consequently, only critical cases remain on the MAC meeting agenda for discussion, leaving more room for other topics, e.g. policies, workshops, etc.

Maria Papatzikou and Daniela Ionescu present the risk-based approach in peer evaluations, from in-the-field experience to a documented system, HERE.