UCL Exams on Laptops project summary - 2009-10

Benefits

The following benefits of running exams on laptops as a full service at UCL were anticipated at the outset:

  • UCL will demonstrate that it is a forward-thinking institution and that it recognises that assessment practices need to align with modern day study and working practices.
  • Anonymity will be preserved as staff will not be able to identify scripts by the handwriting.
  • Problems of physical discomfort caused by writing by hand for extended periods will be removed.
  • Staff will no longer have to struggle to read illegible handwriting, which may lead either to unfair penalisation where writing cannot be read, or to unfairly giving the benefit of the doubt.
  • A handwritten exam requires the candidate to start at the beginning and write linearly to the end. Few people work in this way these days; most written work instead involves ‘outlining’, creating sections and then building content in a non-linear way, with extensive copying, pasting and inserting of text. It can be argued that it is unfair to expect students to write linearly in exams when they do not normally do so.
  • UCL may gain a competitive edge for student recruitment.

Issues and recommendations

The following issues and recommendations emerged from the Exams on Laptops pilot project.


1. Student demand
   Issue: The true level of student demand and enthusiasm for this mode of examination should have been established at the outset of the project. The ISD survey results suggest that this is not a priority for students.
   Recommendation: Conduct a survey of students to establish levels of enthusiasm for, and resistance to, this approach.

2. Lack of student engagement in the pilot
   Issue: Possible reasons include:
   • Not a priority for students
   • Not wanting to be a guinea pig
   • No long-term benefit
   • Lack of promotion
   • Lack of awareness
   • Only a limited range of modules was offered
   • Not the preferred method for all
   Recommendation: More effort is needed to promote the pilot to students, explain the benefits and answer concerns. Approaches include information and presentations during the induction period; online guidance; briefings for academics; promotion from within the department; drop-in surgeries; and Q&A sessions.

3. High stakes
   Issue: The summative exam is worth 15% of the student's overall Master's grade, so is very high stakes. This may be a reason for the lack of take-up.
   Recommendation: Consider whether piloting with students in lower-stakes exams would be more suitable.

4. Staff engagement
   Issue: It is unclear how enthusiastic academics may be, especially given the additional work required to:
   • attend project briefings
   • promote the system
   • brief and advise students
   • set and mark mock exams
   Recommendation: Consider surveying teaching staff across UCL to explore attitudes to this approach. Any pilot or implementation will need engagement and time from the academics who teach on affected modules to perform this additional work.

5. Software
   Issue: The Exam4 system is cumbersome and the interface falls short of the word-processing environment that students are used to, especially in terms of screen size. Mac laptops have problems initially submitting to the server over wireless, and the vendor has indicated that resolving these issues is not a priority. There is also a risk that the software could become unsupported in future; this is more significant than with many systems because the vendor is very small (only 2+ staff).
   Recommendation: Run a larger-scale pilot to investigate whether students are happy to use the software as it stands, and determine whether its limitations will cause students to opt out. Better understand and document the risk UCL would be exposed to should the software become unsupported in future.

6. Venue
   Issue: If a module exceeds the capacity of the exam room, either another venue or a supervised area to isolate each cohort of students (for consecutive sittings) would be required. Running exams simultaneously in different rooms would increase the cost of invigilation (a minimum of 3 invigilators per room is required), and assigning students to different rooms would increase the administrative effort involved in organising the mocks and exams.
   Recommendation: Given the restrictions on suitable venues and the issues with running consecutive exam sittings, only modules of up to ~55 students can be considered for piloting at this stage.
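The invigilation cost described above scales with the number of rooms. A minimal sketch of that arithmetic (the 55-seat capacity and 3-invigilator minimum are the report's figures; the function name and module sizes are illustrative):

```python
import math

def staffing(students, room_capacity=55, invigilators_per_room=3):
    """Rooms and invigilators needed for a simultaneous sitting.

    The 55-seat capacity and 3-invigilator minimum are taken from the
    report; the function itself is just illustrative arithmetic.
    """
    rooms = math.ceil(students / room_capacity)
    return rooms, rooms * invigilators_per_room

# A 55-student module fits in one room with 3 invigilators,
# but a 70-student module needs two rooms and 6 invigilators:
print(staffing(55))  # (1, 3)
print(staffing(70))  # (2, 6)
```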

7. Typing noise
   Issue: It needs to be established whether students sitting exams on laptops can be in the same venue as students writing by hand.
   • Separate locations require more invigilators.
   • With co-location, typing may distract those writing by hand; would students have grounds for complaint if they claim they were distracted?
   • Where will students sit the exam if they change their mind at the last minute? In the same room or elsewhere? (Holding spare rooms would affect invigilation, costs and logistics.)
   Note: Edinburgh runs the laptop and handwritten exams at different ends of the same exam hall, but does not have the majority of students taking exams on laptops, so the noise is limited.
   Recommendation: Further research and investigation is required into whether students could, or should, sit their exams on laptops in the same venue as those who write by hand.

8. Technical set-up/submission
   Issue: The set-up and finishing/submitting process is time-consuming, adding 30 minutes to both the start and end of each exam (so a two-hour paper occupies a three-hour slot).
   Recommendation: Exam scheduling needs to account for this.

9. Scheduling and invigilation
   Issue: The need to organise computer-based exams, and to fund and resource invigilators, at departmental level is a significant barrier. If exams on laptops became accepted practice, the Examinations Office would need to schedule the exams and provide invigilators, and ISD would need to provide technical support.
   Recommendation: The Examinations Office should be resourced to support, manage and provide invigilation for computer-based exams across UCL, and ISD should be resourced to provide technical support.

10. Scheduling mocks
   Issue: Each student's timetable is unique, so scheduling mock exams involves checking every student's individual timetable; doing this manually is not feasible for more than a handful of students. A student timetable availability search linked to the Common Timetable is available at www.ucl.ac.uk/ct/support, but some postgraduate modules are not currently in the Common Timetable. Mocks for these modules can only be scheduled by polling students via doodle.com, and if any students fail to fill in the poll it will not be possible to schedule mock exams during teaching time without clashes with individual timetables.
   Recommendation: Any modules to be included in a pilot or subsequent implementation need to be using the Common Timetable in order to allow scheduling of mock exams.
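The availability search described above amounts to intersecting every student's free time. A minimal sketch of that check, assuming busy slots could be exported per student (the student names, hours and data structure here are purely illustrative, not the Common Timetable's actual format):

```python
# Hypothetical busy teaching slots for one day, per student, as
# (start_hour, end_hour) pairs. In a real system these would come
# from the Common Timetable export.
busy = {
    "student_a": [(9, 11), (15, 17)],
    "student_b": [(10, 12)],
    "student_c": [(9, 10), (16, 17)],
}

def free_for_all(busy_slots, day_start=9, day_end=18, exam_hours=3):
    """Return the start hours at which every student is free for
    exam_hours consecutive hours (a mock needs a clash-free slot)."""
    candidates = []
    for start in range(day_start, day_end - exam_hours + 1):
        end = start + exam_hours
        # Two intervals [s, e) and [start, end) overlap iff s < end and e > start.
        clash = any(
            s < end and e > start
            for slots in busy_slots.values()
            for (s, e) in slots
        )
        if not clash:
            candidates.append(start)
    return candidates

print(free_for_all(busy))  # [12]  (only a 12:00-15:00 mock avoids all clashes)
```

With complete data this is a quick mechanical check; the report's point is that when some students' timetables are missing, no such exhaustive check is possible.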

11. Breaking new ground
   Issue: The vendor and staff at Edinburgh University advise that universities do not make it compulsory to sit exams on laptops. Such an approach (even with an opt-out option) would be untested.
   Recommendation: Further research, consultation with UCLBE/Education Committee and a larger-scale pilot may be needed before laptop exams can be made compulsory.

12. Staff resource
   Issue: The staff resource is significant and greater than anticipated, in particular for scheduling technical drop-ins and mock exams (admin) and setting up the software on student laptops (tech support). The small-scale pilot has taken ~300 hours of staff time to date. It is estimated that running a pilot, including mock examinations, for a single 40-student module will involve 110 staff hours; economies of scale mean that a second mock would require only an additional 60 hours.
   Recommendation: Consideration needs to be given to exactly how much time is required from all staff to manage exams on laptops.
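The estimates above can be turned into a rough planning model. This is only a sketch built from the report's two figures (110 hours for a 40-student module with one mock; ~60 hours per additional mock); the function name and the assumption that extra mocks scale linearly are my own:

```python
def pilot_hours(mocks=1, first_mock_hours=110, extra_mock_hours=60):
    """Rough staff-hours estimate for one 40-student module.

    Uses the report's figures: 110 hours including the first mock,
    and ~60 hours for each additional mock (economies of scale).
    Linear scaling beyond the second mock is an assumption.
    """
    if mocks < 1:
        raise ValueError("baseline figure assumes at least one mock")
    return first_mock_hours + (mocks - 1) * extra_mock_hours

print(pilot_hours(mocks=1))  # 110
print(pilot_hours(mocks=2))  # 170
```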

Student feedback from questionnaire

Positives

  • 3 students found the format allowed them to write faster
  • 3 students felt their answers were better because they could type them
  • 1 student felt it allowed more time to think about the answers
  • 1 student felt typing the answers alleviated their concern that their handwriting is illegible
  • 1 mentioned they liked being able to write directly into their outline and edit the text

Negatives

  • 1 student found the noise of typing a little distracting
  • 1 student found typing for a long time uncomfortable, though they also have this problem with handwritten exams (the others found typing more comfortable)
  • 1 student found it more difficult to structure ideas on the computer than by hand (and has now dropped out of the pilot)

How the Exam4 software was perceived

  • The students found the software easy to set up and use, and 3 would use it again
  • The students wanted a larger screen (the software only allows the window to be made taller, not wider)
  • 2 candidates had confidence in the system and 2 remained neutral. This may be related to the fact that most students used Macs, which would not submit to the server at the first attempt because reconnecting to the wireless network took much longer than on a PC

General Points of Interest

  • 1 student found they still needed to write their outline on paper before typing

Lessons Learned

The following should be considered for future pilots and for launching a full service:

Promote the service/pilot using:

  • Info in induction packs
  • Info presented at student inductions
  • Info on departmental website
  • Brief academics: show them how the software works and explain the benefits, so they know how to sell it to the students
  • Ask academics to recommend the pilot/service in one of the first sessions of the teaching term
  • Info on module Moodle pages (and link to Exam4 service/pilot page)
  • Run induction drop-in clinic/s for further explanation to students and a chance for Q&As
  • Students may be anxious about whether their exams have been received. It would be good to notify them when their exam scripts have been received and printed, especially if they have to submit on USB, as USB submission does not produce a receipt the way electronic submission does.