PISA 2018 showed that only 1 in 10 students who took the tests could differentiate between a fact and an opinion. The study also found that students struggled to identify misinformation on the internet. PISA 2018 made drastic changes compared with 2009, including many more computer-based tasks.
The OECD’s Programme for International Student Assessment (PISA) tests are held every three years and evaluate students’ maths, reading and science skills in dozens of countries. The latest tests, held in 2018, placed an emphasis on reading comprehension. This was the first time since 2009 that reading skills were given top priority.
The dynamics of reading have changed since 2009, with many adults and children now using online resources for reading. In 2009, students could choose whether to test their ability to read online resources. These optional tests, called electronic reading tasks, evaluated learners on using the internet as a source of information.
PISA 2018 tests and results
Fast forward to 2018, and a lot has changed in this digital era. The internet is easily accessible, and almost everyone is using it. Fake news has emerged, and searching for information online has become much harder, with unverified and inaccurate information spreading across the internet.
The 2018 PISA assessment had to find a way to capture this radical change in how it tested students. It introduced many more computer-based reading tasks that reflect how students access information today. This change was important for assessing digital natives, who have grown up in the digital era and are more comfortable navigating the digital world.
The results from these tests showed that students are still lagging behind. For instance, only 1 in 10 OECD students were able to classify online materials as either facts or opinions. Students also performed poorly when required to analyse information from different sources.
Tasks such as determining the author’s perspective in a given story were also evaluated. Students were given three different stories and asked to identify the author’s viewpoint. Answering this question correctly corresponded to level 5 proficiency, a level that only 8.7 percent of students reached.
The level of performance in PISA 2018 may be due to students’ inability to read and comprehend. It may also be due to challenging tasks that adults would struggle with too. Whichever the case, questions remain about whether digital skills are the right way to assess reading. Teaching methodology may also have contributed to how some countries performed.
Featured image by Pixabay