Ten years of maturity exams: it is time to change the discourse

State maturity examinations should no longer serve merely as a source of statistics but as a tool for improving the quality of education.
Was the confidentiality of the examinations protected, did the last question “leak” on the internet, how many students were caught cheating, how many students failed, how many got excellent scores, was there a typo in any of the test questions or were the instructions misleading… This is what students, parents, government experts, politicians, the media, and the majority of citizens worry about at the end of May. Every year for ten years now.
When in 2008 the then government and parliament finally managed to introduce mandatory maturity exams at the end of twelfth grade, the expectation was that this would establish uniform standards of subject knowledge applicable across the entire country, would test the quality of teaching, and would serve as the basis for reforms – at both government and school level. The public discourse over the past ten years, however, has been far from these goals. And if until recently we could justify this by saying that there hadn’t been enough time to ensure quality analysis and stronger measures, we cannot keep making the same excuse in 2018. If we want the 2008 reform to meet the expectations the public had for it, we need to remember how the examination results of the twelfth-graders can help us accomplish our big goal – improving the quality of education in Bulgaria. Here are some ideas.
By paying attention to every school and every student
We see the same stable trend in the results of the maturity exams every year: there is a big gap in the quality of education across different types of schools – language, humanities, mathematics, and vocational. The gap also depends on the region and the size of the town. The same conclusion emerges from the results of the national external assessments at the end of seventh grade, as well as from PISA, according to which the achievements of students in Bulgaria depend mostly on their family environment, the school they attend, and the region they live in. This unequal access to quality education prevents a huge number of children in Bulgaria from developing their potential and limits their chances of success in the future, with a negative impact on the economy and the living environment in Bulgaria.
If the results of the maturity exams are analyzed by taking into consideration all factors relevant to the specific school, this could give us a better idea of the conditions in which students learn and teachers and principals work. It could help us get closer to the approach which has been proven to bring better results in education – measures according to the individual needs of every student, teacher, and school, instead of top-down centralized reforms. The Ministry of Education and Science took some steps in this direction in the past year by introducing a new formula for the distribution of the delegated school budgets which takes into account some specific characteristics of the environment. Linking financial measures to the quality of education would be the next logical step towards better results at school.
By showing encouragement and support and not by finger-pointing
Publishing a list of the top 10 best-performing schools based on the results of the maturity exams (or, even worse, the bottom 10, as we used to do a few years ago) cannot help improve the quality of education in Bulgaria, especially if it only fuels mass hysteria about getting into the “good” schools and finger-pointing at the “bad” ones.
In Finland, which has one of the best education systems in the world, the results from the exams at the end of twelfth grade are not public but are sent individually to each school with requirements for improvement and opportunities for support.
It is time the analysis of the maturity examination results included the added value a school creates for its students. For example, has the school helped them go from an average of 2.7 (out of 6) to 4.4, or from 5.4 to 5.6? The top 10 best-performing schools most likely represent the latter group of students (who usually also take extra private classes), even though the progress the former have made is far more significant.
The true value of external assessments would be to share best practices from the top performing schools. What are the teaching methods in Smolyan, for example, where students have good results in their Bulgarian language assessment? Which are the schools located in small towns and villages which have managed to keep their students and what methods do teachers use there? Sharing such good examples shows that students are able to achieve better results and provides schools with ideas for their practice.
By reevaluating what we want from the maturity exams
Are they measuring the knowledge and skills young people today need in order to be successful? This is another question we need to be asking ourselves. Even though some changes in the format have been made recently, the maturity exams still mostly measure what students have memorized and not key 21st-century skills such as analytical thinking, problem-solving, and communication skills.
The content is important, but the assessment rubric also matters. It currently sets the bar quite low and falls short of the standards of education in developed countries.
Tackling these questions would provide guidance for what we need to change in the maturity exams so that they could help improve the quality of education in Bulgaria. If maturity exams start serving as the basis for effective reforms, we probably won’t have to talk about whether students have cheated or about the mistakes and typos in the tests in another ten years.
This text was written for The Economist (Bulgaria), issue 22, published on June 1, 2018.