
On-screen Assessment in England’s Exam System

Exploring what on-screen assessment could mean for GCSE and A-level students in England.

Executive Summary
1. Introduction
2. What is on-screen assessment?
3. What are the benefits of on-screen assessment?
4. What are the issues in introducing more on-screen assessment into GCSEs and A-levels?
5. Conclusion

Date: 06/05/22
Author: Dr Jamie Pickering

Executive Summary

Introduction

There is growing interest in the use of on-screen assessment in England’s GCSE and A-level exam system. The Secretary of State for Education recently told an education technology conference:

"It’s possible that more digital assessment could bring significant benefits to students, teachers and schools and I want to start carefully considering the potential opportunities in this area.

A significant shift away from traditional pen-and-paper assessment in England’s exam system would represent one of the most dramatic changes to GCSEs and A-levels since their introduction as national exams.

What is on-screen assessment?

On-screen assessments can take a number of forms, usually imitating an existing paper assessment or comprising new modes of assessment designed to take advantage of the on-screen format.

Within England’s exam system, on-screen assessment is already used for vocational and technical qualifications, as well as for GCSEs in Computer Science, Food Preparation and Nutrition, and Geology, and for A-level Computer Science.

What are the benefits of on-screen assessment?

A number of benefits of moving assessment formats away from pen and paper have been identified: resilience; environmental impact; innovation in assessment and content; student engagement; feedback on student performance; and coherence with the ‘real world’, including the use of IT in teaching, Higher Education and the workplace.

What are the issues in introducing more on-screen assessment into GCSEs and A-levels?

Policymakers, regulators and exams boards will need to consider:

  • Assessment design – should on-screen assessments simply reproduce existing papers, or should qualifications taken on-screen be redesigned?
  • Impact on student performance – evidence is mixed as to whether on-screen assessment leads to a performance increase, a decrease or no difference at all. Furthermore, little research directly relevant to GCSEs and A-levels is available.
  • Digital literacy – everyday digital literacy does not necessarily prepare students for on-screen assessment.
  • Availability of suitable devices – DfE research suggests 61% of secondary schools have a device-to-student ratio of greater than 1:5, implying that many would be able to consider on-screen assessment for an entire Year 11 cohort. However, centres will also need to consider rooming, device condition, moderation, IT support and the impact on computer-based learning across the school during an examination period.
  • Malpractice – given perceptions that on-screen assessment may lead to increased malpractice, exam boards and regulators will need to react to any potential new opportunities for malpractice as they emerge.

1. Introduction

There is growing interest in the use of on-screen assessment in England’s GCSE and A-level exam system. This reflects:

  • the increasing proportion of young people who are ‘digital natives’ and may be more receptive to technology’s role in assessment design
  • growing use of information technology (IT) in the classroom
  • the dramatic shift to online, remote learning that took place as a result of COVID-19 lockdowns.

The Secretary of State for Education recently told an education technology conference:[1]

"It’s possible that more digital assessment could bring significant benefits to students, teachers and schools and I want to start carefully considering the potential opportunities in this area."

Several GCSE and A-level exam boards have announced research projects into on-screen assessment:

  • AQA has launched a pilot involving up to 2,500 students and 100 schools[2]
  • OCR is exploring the possibility of ‘digital first’ qualifications, with computer science, humanities and social science options being considered.[3]

A significant shift away from traditional pen-and-paper toward the use of on-screen assessment in England’s exam system would represent one of the most dramatic changes to GCSEs and A-levels since their introduction as national exams.

This policy briefing explores:

  • what is on-screen assessment?
  • what are the benefits of on-screen assessment?
  • what are the issues in introducing more on-screen assessment into GCSEs and A-levels?

2. What is on-screen assessment?

On-screen assessments can take a number of forms, usually either imitating an existing paper assessment or comprising a new style of assessment that takes advantage of on-screen technology.

In the case of the former, on-screen assessments may look largely identical to exam scripts, but with some changes that exploit the on-screen element, for example, on-screen multiple-choice functions.

In the case of the latter, on-screen assessment has the potential to look very different from pen-and-paper exam scripts, not least because assessments designed from the ground up to be taken on-screen can potentially assess new concepts, or approach them in novel ways. For example, in A-level computer science on-screen assessments, the candidate is able to have a more dynamic interaction with a programming environment.

Some on-screen assessments may require Internet connections. However, this is not a defining feature of on-screen assessment, and some current on-screen assessments are taken offline, with student responses uploaded at the end.

What high-stakes on-screen assessments are currently available in England?

On-screen assessment in high-stakes external assessment should be distinguished from several other terms used interchangeably in discussions of assessment using computers instead of pen and paper:
 
  • Digital assessment – refers to the increasing digitisation of assessment, and could relate to anything that involves moving from traditional pen-and-paper assessment to the use of digital technology in assessment.
  • Electronic assessment – can be used to describe any electronic methods to support the delivery of assessment.
  • Online assessment – can describe elements of assessment provision and support that have moved online, e.g. online results analysis and past paper downloads.
  • Adaptive assessment – refers to a specific type of on-screen assessment that uses computer algorithms to adjust the questions posed to the candidate in real time, depending on how the candidate is performing. This results in an assessment that is better tailored to the candidate’s ability, with students responding to different sets of questions, but is not without its challenges.

Outside the education system, many people have already undertaken on-screen assessments, such as driving theory or citizenship tests.

Within England’s education system, there is already a significant amount of high-stakes on-screen assessment, in vocational and technical qualifications as well as in GCSEs and A-levels. The latter include:

  • GCSE Computer Science
  • GCSE Food Preparation and Nutrition
  • GCSE Geology
  • A-level Computer Science

On-screen assessment is also used by a number of providers to deliver assessments of Functional Skills Qualifications.

The use of on-screen assessment in external exams in England is regulated by Ofqual. The most recent regulatory principles relevant to on-screen assessment were published in 2007 by the Qualifications and Curriculum Authority (QCA), a predecessor body to the current regulator,[4] and are outlined in Appendix 2. Overall, the principles are designed to ensure design continuity and comparability between electronic and traditional assessment.

3. What are the benefits of on-screen assessment?

A range of potential benefits of adopting on-screen assessment in external exams such as GCSEs and A-levels has been identified:

  • resilience - improved resilience in the exam system to timetable disruption and other challenges, e.g. as a result of a pandemic
  • environment - reduced environmental impact of exam paper printing and distribution
  • innovation - opportunity to assess existing content in new ways, or new content that cannot currently be assessed using pen-and-paper
  • engagement - student engagement may be improved by taking exams on-screen
  • feedback - implications for richer post-exam feedback and wider use of external exams as a formative tool for teachers and schools
  • coherence with ‘real world’ – i.e. on-screen assessment is consistent with the widespread usage of computers in teaching, higher education and the workplace
  • adaptive - in some subjects, assessments can be tailored to individual candidates through adaptive assessment, which adjusts question difficulty, potentially making them more inclusive and fairer.

4. What are the issues in introducing more on-screen assessment into GCSEs and A-levels?

There are a number of key issues that policymakers will need to consider:

  • assessment design
  • student performance
  • digital literacy among students and staff
  • availability of suitable devices
  • malpractice.

Assessment design

Transitioning more qualifications to on-screen assessment will require careful decisions about assessment design. 

One option for exam boards is to simply convert existing papers to on-screen, or at least closely imitate them. However, this is not without its difficulties. For example, academic research has highlighted that when digitising existing papers, exam boards need to be mindful of language,[5] given that phrasing adapted into technical or digital language may result in invalid tests, which ultimately assess how familiar the candidate is with on-screen assessment.[6]

However, on-screen assessment also presents assessment organisations with the opportunity to completely redesign their approach to assessing specifications. Recent research suggests that test setters will need time and training to develop new on-screen test questions.[7]

One of the opportunities to rethink assessment design in light of on-screen assessment is the use of adaptive assessment, which tailors a test to each candidate by selecting the next test item based on their performance up to that point in the assessment. However, while its benefits in tailoring content and ensuring a more engaging experience for all candidates are clear, this may make it less suitable for assessment that is designed with comparability in mind, especially when assessing a discrete body of knowledge as part of a high-stakes assessment. For example, it has been suggested that adaptive assessment is better suited to formative assessments, and particularly those that gauge a learner’s progression along a spectrum of learning.
Source: Masters, G. (2021). Computer Adaptive Testing – challenging traditional thinking. Teacher Magazine (teachermagazine.com).
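
To make the item-selection loop concrete, the sketch below shows a minimal adaptive test in Python: after each response, an ability estimate moves up or down (a simple staircase), and the next question is the unused item whose difficulty best matches the estimate. The item bank, scale and update rule are all illustrative assumptions; operational adaptive tests typically rest on item response theory rather than this toy heuristic.

```python
import random

# Hypothetical item bank: question IDs mapped to a difficulty on a 0-10 scale.
ITEM_BANK = {f"Q{i}": float(d) for i, d in enumerate([1, 2, 3, 3, 4, 5, 5, 6, 7, 8, 9])}

def run_adaptive_test(answer_fn, num_items=5):
    """Administer num_items questions, adapting difficulty to performance.

    answer_fn(question_id) -> bool, True if the candidate answers correctly.
    Returns the final ability estimate on the same 0-10 scale.
    """
    ability = 5.0      # start from the middle of the scale
    step = 2.0         # initial adjustment size
    unused = dict(ITEM_BANK)
    for _ in range(num_items):
        # Pick the unused item whose difficulty best matches the current estimate.
        qid = min(unused, key=lambda q: abs(unused[q] - ability))
        del unused[qid]
        ability += step if answer_fn(qid) else -step   # staircase update
        ability = max(0.0, min(10.0, ability))
        step = max(0.5, step / 2)                      # smaller moves as estimate settles
    return ability

# Example: a simulated candidate who answers correctly 70% of the time.
print(run_adaptive_test(lambda qid: random.random() < 0.7))
```

Because the sequence of items depends on the candidate’s responses, two candidates can legitimately see different questions, which is exactly the comparability concern noted above.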

Student performance

Any impact on the overall performance of the first cohorts taking on-screen assessments will need to be considered.

There are a variety of research findings regarding whether there is a performance increase, decrease or no difference at all as a result of switching to on-screen assessment from pen-and-paper. Examples include:

  • In Germany, there was no difference in attainment between on-screen and paper assessments for fourth-year medical students, although on-screen assessments tended to be completed faster and to encourage guessing behaviour.[8]
  • In the USA, studies of the Partnership for Assessment of Readiness for College and Careers (PARCC)[9] assessment found that students performed better in the pen-and-paper assessment than on-screen, although researchers argue this finding should be used to improve on-screen provision rather than abandon it,[10] and that the difference begins to decrease in the second year of on-screen testing.[11]

Studies investigating performance at a more granular level, e.g. understanding the performance differences between comprehension, reading speed and accuracy, also show variation across different contexts.[12]

Relatively few studies are directly comparable with on-screen assessment at GCSE and A-level, so the existing evidence does not reliably indicate the potential impact of on-screen assessment in England.

Digital literacy

Alongside student performance, policymakers will need to consider the specific issue of digital literacy, and the preparedness of students for on-screen assessment.  

It is often assumed that the current generation of students are ‘digital natives’, i.e. that their familiarity with technology in everyday life means they would be capable of, and confident in, accessing an on-screen assessment. However, for many, everyday technology use involves a smartphone or a tablet rather than PC-specific skills, and everyday digital literacy does not necessarily prepare students for on-screen assessment.

As such, variations in PC-specific digital literacy, in particular typing skills and speed, may create challenges to perceptions of fairness if on-screen assessment is adopted more widely in GCSEs and A-levels in England – although it should be noted that not all on-screen assessments involve extended written tasks.

In exploring the scope to move more GCSE and A-level assessments on-screen, policymakers will need more evidence than is currently available on issues such as:

  • Do students possess a typing speed sufficient to demonstrate their knowledge?
  • Will students with higher typing speeds be at a significant advantage?
  • How does this advantage compare with the advantage that faster and clearer handwriting currently confers?

The digital literacy of staff is also a challenge to the national rollout of on-screen assessment. It is likely that teachers, IT support staff and invigilators alike would require further training to ensure a consistent level of digital literacy across the country, which in turn would be costly and time-consuming.

Availability of suitable devices

A key consideration for increasing the amount of on-screen assessment in GCSE and A-level exams in England will be ensuring that centres are equipped to deliver them.

The number of suitable devices available in a centre will determine:

  • the number of students able to take on-screen assessments simultaneously
  • the qualifications that can be taken on-screen, e.g. large-entry subjects such as GCSE maths and English would require a centre to be able to accommodate an entire cohort simultaneously
  • the impact on wider schooling, since deploying some or all of a centre’s IT resources (including networking and support staff) in support of external assessments for the duration of the exam season will have implications for day-to-day IT provision and teaching.

In July 2021 the DfE published a report into the use of technology in schools which provides some relevant insights.[13] It found that 61% of secondary schools have a device-to-student ratio of greater than 1:5 – implying that many would be able to consider on-screen assessment for an entire Year 11 cohort.[14]
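
To illustrate the arithmetic with a hypothetical example: in a secondary school of 1,000 students with no sixth form, Year 11 represents roughly 20% of students, or around 200 candidates, so a device-to-student ratio of 1:5 implies around 200 devices – in principle just enough to seat the whole cohort simultaneously, before accounting for spares and device condition.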

Nevertheless, as well as the number of devices, centres have to consider rooming (ensuring sufficient spacing for assessments to take place under exam conditions), device condition, moderation, IT support and the impact on computer-based learning across the school during the summer examination period.

Indeed, variation among centres makes estimating the hardware readiness of England’s secondary schools challenging without further evidence. Ofqual’s 2020 report into on-screen assessment noted similar concerns, while highlighting Finland’s solution of downloading assessments well ahead of the scheduled exam time.[15]
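
Ofqual’s report describes Finland distributing encrypted assessment materials ahead of time (see also Appendix 1). As an illustration of that general pattern – the ciphertext travels early, the key is withheld until the exam starts – the Python sketch below uses the cryptography library’s Fernet scheme; the file names and key-release mechanism are hypothetical, and this is not a description of Finland’s actual system.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# --- Days before the exam (awarding body side) ---
key = Fernet.generate_key()                 # withheld until the exam starts
with open("exam_paper.pdf", "rb") as f:     # hypothetical question paper
    ciphertext = Fernet(key).encrypt(f.read())
with open("exam_paper.enc", "wb") as f:     # safe to distribute in advance
    f.write(ciphertext)

# --- At the scheduled start time (centre side) ---
# Only the short key needs to arrive at exam time (or be entered manually),
# so a slow connection or brief outage does not hold up a large download.
with open("exam_paper.enc", "rb") as f:
    paper = Fernet(key).decrypt(f.read())
with open("exam_paper_decrypted.pdf", "wb") as f:
    f.write(paper)
```

The appeal of this design is that exam-time connectivity requirements shrink to delivering a key of a few dozen bytes, at the cost of managing key release securely.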

In considering the availability of necessary devices in schools and colleges, it is also important to note issues around the characteristics of the devices that students would use in their assessments. For example, research has explored what impact screen colour and lighting have on test performance,[16] and some research has suggested that higher screen resolution[17] has relatively little impact on maths test performance, but may improve verbal test performance.[18]

It is for this reason that stakeholders such as Ofqual have expressed scepticism about a ‘Bring Your Own Device’ (BYOD) approach to on-screen assessment in high-stakes exams, given the challenges of technical difficulties, ensuring equity and the risks around installing executable files. Nevertheless, BYOD has been implemented in Finland by ensuring that devices pass a compatibility test, and in New Zealand it has been partially implemented in the form of a school-by-school ‘opt-in’ scheme.

Malpractice

The risk of malpractice in England’s exam system has evolved in recent years with the advent of the mobile phone, smartphone and smartwatch.[19] As a result, the rules and guidance for centres to mitigate this risk have also changed.

Much greater use of on-screen assessment in England’s exam system could theoretically pose new or different risks of malpractice – although it is notable that existing GCSEs and A-levels that deploy on-screen assessment have not been characterised by higher levels of malpractice. 

Ultimately, the nature of the evolving malpractice risk arising from much greater on-screen assessment will depend on how on-screen assessment is deployed. For example, adopting a BYOD approach with a downloadable but offline assessment will require different steps to reduce malpractice, compared with an online assessment taken on a centre device.

As Ofqual has pointed out, malpractice opportunities will simply need to be reacted to as they emerge, and the education sector will need to be mindful of the diverse ways in which malpractice opportunities will present themselves.[20]

5. Conclusion

In summary, there is significant interest in the scope and potential for the greater usage of on-screen assessment in England’s exam system.

However, it is clear that more research is needed across a number of issues to support policymakers. In the meantime, a number of key observations can be made:

  • on-screen assessment is already a feature of high-stakes examination in England, both in vocational and technical qualifications (VTQs) and in some GCSE and A-level subjects
  • the readiness of schools in the UK to deliver on-screen assessment is affected by a range of factors, the true extent of which is difficult to evaluate given the absence of a comprehensive review – and the lack of data – regarding performance outcomes and the ease of implementing on-screen assessment
  • there is a substantial body of research literature that considers the implications of on-screen assessment – in terms of both design and broader implications – although this is focused on international or higher education case studies. This will likely change as more on-screen assessment is adopted in England.

Given the limited research into on-screen assessment directly relevant to GCSEs and A-levels, policymakers can also study those countries that have successfully introduced on-screen assessment.

A recent Ofqual report drew upon evidence from three countries leading the way in terms of on-screen assessment provision: Finland, Israel and New Zealand. A detailed description is available in Appendix 1, but key lessons from all three jurisdictions include:

  • the importance of a strong political commitment to re-design assessment to reflect society’s needs and objectives
  • an acceptance that implementation will feature setbacks, and a resolve to overcome them.

APPENDICES

Appendix 1: Lessons learnt from other jurisdictions that have successful on-screen and online assessment systems. Source: Ofqual[21]

Israel

  • Libraries and community centres used if schools lack devices.
  • Use of scientific notation (e.g. formulae) currently being considered regarding speed and convenience.
  • Pre-exam system test is required.
  • Students can choose to listen to questions and record answers by voice.
  • The exam is sent to the school network 24 hours before, along with an encrypted backup two days before.

Finland

  • Motivated by a desire to assess new constructs.
  • Political decision driven by a desire to ensure that assessment supported real ways of working after education.
  • Social media used to create communities for those involved in the rollout.
  • Encrypted assessments distributed ahead of time to tackle rural power-cuts.
  • BYOD permitted if the laptop passes a compatibility test; schools provide devices for those who cannot afford one.
  • AOs send a USB stick to candidates, which converts each device to have identical functionality.
  • Once a subject converts, there is no paper option.

New Zealand

  • Opted for online assessment as the government also committed to ensuring a minimum 1GB internet connection across schools.
  • A gradual, voluntary school-by-school approach was adopted to build up trust and spread enthusiasm via word of mouth.
  • Recommendation that only schools using digital learning systems in the classroom make the leap to on-screen assessment.
  • BYOD policy in schools that ‘opt in’ and are therefore able to support this approach.
  • Support offered to help students improve typing skills.
  • Students can revert to paper at any point, including during the exam itself.

Across jurisdictions

  • Israel and Finland distribute directly to local networks.
  • In New Zealand and Finland there was an acceptance by sponsors that things could go wrong at any point.
  • New Zealand and Finland emphasised a ‘comprehensive communications and engagement plan’.

Appendix 2: Summary of key regulatory principles of e-assessment according to QCA’s ‘Regulatory principles for e-assessment’.[22]

The principles adopt a broad conceptualisation of e-assessment (as described above), covering ‘electronic systems for the development, operation and delivery of accredited qualification assessment’. They address:

  • Validity
  • Security
  • Data Integrity
  • Operation and integrity of e-assessment systems
  • Access to e-assessment
  • Business continuity
  • Automatically generated online tests
  • Test conditions and environment
  • System familiarisation for centres and assessors
  • Adaptive testing
  • The use of e-portfolios for assessment.

[1] Department for Education. Education Secretary delivers speech at Bett Show. https://www.gov.uk/government/speeches/education-secretary-delivers-speech-at-bett-show

[2] AQA. (2022). AQA to launch major on-screen assessment pilot. AQA. https://www.aqa.org.uk/news/aqa-to-launch-major-on-screen-assessment-pilot

[3] Lough, C. (2021). Revealed: How online GCSE trials have already begun. TES Magazine. https://www.tes.com/magazine/news/general/revealed-how-online-gcse-trials-have-already-begun

[4] Qualifications and Curriculum Authority. (2007). Regulatory principles for e-assessment. https://publications.parliament.uk/pa/cm200607/cmselect/cmeduski/memo/test&ass/ucm3102paper4.pdf

[5] Khan, R. & Shaw, S. (2018). To "Click" or to "Choose"? Investigating the language used in on-screen assessment. Research Matters. 26. 1-9.

[6] It is also worth noting that Khan and Shaw recommend digital literacy assessments of candidates to establish a baseline, and requiring that on-screen assessments meet that baseline in terms of their technical complexity in order to become accredited.

[7] Crisp, V. & Shaw, S. (2020). Writing and reviewing assessment questions on-screen: issues and challenges. Research Matters. 30. 19-26.

[8] Computer versus paper – does it make any difference in test performance? PubMed (nih.gov).

[9] The PARCC test also differs substantially from high-stakes examinations in the UK, in that it seeks to assess a variety of skills and knowledge simultaneously in a snapshot of ‘readiness’ for progression.

[10] Future-Ed. (2018). Paper vs. Online Testing: What's the Impact on Test Scores? https://www.future-ed.org/work/paper-vs-online-testing-whats-the-impact-on-test-scores/

[11] Backes, B. & Cowan, J. (2019) Is the pen mightier than the keyboard? The effect of online testing on measured student achievement. Economics of Education Review. 68. 89-103.

[12] Ceka, B., & O’Geen, A. (2019). Evaluating Student Performance on Computer-Based versus Handwritten Exams: Evidence from a Field Experiment in the Classroom. PS: Political Science & Politics. 52(4). 757-762.

[13] CooperGibson Research. (2021). Education Technology (EdTech) Survey 2020-21: Research report. Department for Education. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/996470/Education_Technology__EdTech__Survey_2020-21__1_.pdf

[14] Since Year 11 would represent 20% of students in a school without a sixth form.

[15] Ofqual. (2020). Online and on-screen assessment in high stakes, sessional qualifications.  https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/943382/Barriers_to_online_111220.pdf

[16] Houselog, M.E. (2019). Effect of Computer Screen Color and Brightness on student Achievement during Computer-Based Reading Comprehension Assessment. [Thesis] https://repository.stcloudstate.edu/edad_etds/64/

[17] I.e. the number of pixels displayed horizontally and vertically on a screen.

[18] Bridgeman, B., Lennon, M.L. & Jackenthal, A. (2003). Effects of Screen Size, Screen Resolution and Display Rate on Computer-Based Test Performance. Applied Measurement in Education. 16(3). 191-205.

[19] Joint Council for Qualifications. (2021). Suspected Malpractice Policies and Procedures. https://www.jcq.org.uk/wp-content/uploads/2021/09/Malpractice_21-22_FINAL.pdf

[20] Ofqual. (2020). Online and on-screen assessment in high stakes, sessional qualifications.  https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/943382/Barriers_to_online_111220.pdf

[21] Ofqual. (2020). Online and on-screen assessment in high stakes, sessional qualifications.  https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/943382/Barriers_to_online_111220.pdf

[22] Qualifications and Curriculum Authority. (2007). Regulatory principles for e-assessment. https://publications.parliament.uk/pa/cm200607/cmselect/cmeduski/memo/test&ass/ucm3102paper4.pdf
