ETF Forum for Quality Assurance: international peer review helping vocational education

Quality assurance (QA) experts from the EU and surrounding countries met recently at the ETF to discuss ways to improve QA in vocational education and training (VET).

Since 2017, the ETF has been building up a network of partners from the Southern and Eastern Mediterranean, the Western Balkans, the Eastern Partnership and the European Union. Yearly meetings have been organised to identify and disseminate QA best practice in a collaborative, inclusive way.

Each year the meeting is hosted by a different country (last year Moldova, this year Armenia).

“The host country”, explains the ETF’s Mounir Baati, “opens up its quality assurance system, describes it in a way that is understandable to others, and then people provide feedback and answer questions posed by the host. When you take part in a peer visit the participation is very active.”

“It’s about the transparency of the system. Transparency increases trust. If an employee or employer is holding a qualification, you can easily get information about what is behind that piece of paper. It’s very important.”

With Italy hosting, Laura Evangelista and Concetta Fonzo, respectively National Coordinator and Deputy Coordinator of EQAVET (European Quality Assurance in Vocational Education and Training) Italy, briefed members on the Italian QA system. After lunch, members visited a vocational school, CNOS-FAP, better known by the name of its iconic 19th-century founder, Don Bosco. The school was the setting for a discussion of the methodologies used to develop qualifications, standards and programmes, and for members to learn more about accreditation in Italy.

Baati and his colleagues are now proposing a QA diagnostic tool based on the EQAVET framework. It will be piloted in two countries in 2024 before being rolled out to forum countries in subsequent years. It is particularly aimed at an area that often goes under the radar in VET: apprenticeships and work-based learning.

“It’s a handbook for monitoring and evaluating work-based learning,” said the ETF’s Stefan Thomas. “Many of you are familiar with monitoring school-based VET: we look at enrolment rates, teacher-student ratios, completion rates, how many drop out… We might also look at financial issues, how much is invested in buildings and textbooks, and then at what the students are actually doing, and so on.

“But the story becomes slightly different when you introduce a new type of programme – apprenticeships. Companies come in and play a major role. We need to lead them to be reflective about how they do evaluation. One might want to monitor the participation of companies and enquire as to why certain companies aren’t participating.”

In keeping with the forum’s emphasis on inclusivity, Baati underlined that this new tool is about informing rather than finding fault, and about getting to know each other’s systems.

"That’s the basis of cooperation: to find out what areas of common interest there are, to get an overview of the whole system to understand what areas need support. If we start to compete and punish one system against another, it’s not working…”

Collaboration was one of the key themes that emerged from the two-day gathering.

“Quality assurance can’t only be done by a ministry,” said Thomas. “Social partners, and chambers or guilds, need to come in too. They need to be a part of it, they should have an interest in the supply and demand [of future employees]. The associations that represent companies should drive the system with the ministry – that’s what we call the collective skills formation system.”

Thomas also stressed the importance of statistical evidence and the use of databases.

“With work-based programmes, always start with a few indicators. If you have 200 placements for cooks and only 20 students who want to do it, there’s something wrong. If you don’t have that information, it would be very difficult to make policy decisions.”

There was a frank discussion about the challenges of implementation. Tina Saric, of the Education Reform Initiative of South-Eastern Europe (ERI-SEE), described a study in the Western Balkans that revealed how “external evaluation is seen as an administrative burden, more a copy and paste exercise”. The study suggested that external evaluators are seen merely as civil servants whose own skills and competences are either insufficient or unrecognised. “The system of training is sporadic and not continuously organised,” said Saric.

The point about the additional administrative burden of QA was echoed by the ETF’s Julian Stanley:

“There’s a lot of innovation in education, and everyone is very, very busy. We feel like we’re saying: ‘You’re doing a lot and would you like to do something extra?’ Even if there’s a benefit, [evaluation] is an additional ask, that’s the reality.”

After break-out groups discussed the proposed new tool, feedback was offered: one group suggested ways in which the new tool might differ from EQAVET, which it described as “focusing on outputs and not so much on processes. We feel that this [tool] should be focused on processes, not only on aspects of management, governance and so on, that it could be wider…”

Houssem El Hajj from Lebanon drew attention to some of the blind spots in tracer studies:

“How long do they go on for? If I find work, will I actually go and report back ‘hey guys I found a job’? Who tracks me and what trend would that give, and what decisions would be based on it? Türkiye is a good example where they have a specialised person [tracing career paths], but for how long?”

Anette Curth, from ICF, also raised the key question of conceptual understanding:

“Is there a clear and common understanding of the related concepts?” – emphasising the importance of “a pedagogical concept and vision that accompanies new tools and means of learning”, rather than simply using them as add-ons: “A lot was achieved during Covid, but everything had to be done very quickly. We need to ensure that technological innovation doesn’t bypass everything that we have to develop in terms of pedagogy…”

She also urged clarity when introducing any new form of teaching, suggesting clear guiding questions be asked:

“Why are we doing this? What is the gain and what are the risks? What are the strategies and goals? What do we want to achieve, and what are the clear, measurable targets?”

In bringing the event to a close, the ETF’s director, Pilvi Torsti, expressed admiration for the members’ courage in committing to evaluation and assessment:

“When I came here in April last year,” she said, “I was inspired by quality assurance peer visits. It’s a brave thing to subject yourself to others’ criticisms. But you’re also subjecting yourself to admiration and respect. Once you open yourself up, you learn from others. You gain more than you ever thought you would.”