Posted: Sunday 9th July 2017
In today's post, we'll be talking about the data we collected while the exams were going on:
Alternative Exam Submissions
Technology is by no means foolproof, and this was a consistent concern voiced by students in our exam practice sessions and by staff from other institutes at our first event, held back in March. A number of failsafes were put in place, both by UNIwise and by us, to anticipate any issues while still enabling students to submit their exam "papers" online:
- An option to manually reconnect the device if the wifi connection is lost
- Autosaving of students' progress at regular intervals, removing the risk of lost scripts if the device crashes or runs out of power
- The ability to download a local copy of the students' work if the wifi connection cannot be re-established (allowing the student to continue to work offline and upload the submission at the end)
- The option to disable the locked exam environment and increase monitoring by invigilators, if the locking feature is not compatible with the device
- A reserve of emergency devices at the exam venue, to which students can be switched if their own device fails
However, the best-laid plans of mice and men often go awry, and so we also made provision for traditional paper submissions as an absolute last resort. Just 0.2% of submissions (4 in total) were made on paper, and it should be noted that these all occurred during the first few days of examinations. A review of events suggests that allowing these paper submissions was a snap decision made in the early days of the exam period, and that on reflection one of the other options outlined above could have been used to allow these students to continue their exams on a computer.
Additionally, all students in the 18 digital exams were provided with booklets for rough work and content creation (e.g. drawing diagrams or equations), which could be added to their exams using the software's webcam function to capture images. Any paper used for rough work was collected at the end and treated as part of the submission, as a fallback. Though this was more taxing on administration during and immediately after the exams, we believe this hybrid digital/paper system was a necessary step in determining how many precautions need to be taken when running digital exams. A cursory review suggests that very few students relied on paper to any extent, so going forward we feel confident in distributing paper for rough working only, which will not be collected or assessed post-exam.
BYOD/Device Allocation
Students were advised that their laptops should meet the following criteria in order to be suitable for the exams:
- Capable of lasting for at least 3 hours on battery power (unplugged)
- Has the FLOWlock browser installed
- Is able to connect to Brunel's wifi
98 students notified us that they did not have a suitable device to use for the digital exams. We made alternative arrangements for these students by seating some in a lab of desktop computers in the John Crank building, which had been set up as an alternative exam venue, and by allocating the remainder an emergency laptop from our reserve for use at the Sports Centre (the main venue). Across all exams, about 80% were successfully completed using the student's own device (range 73-100%) and 8% using the University's emergency laptops (range 0-15%). About 12% of students were assigned to the alternative venue for the duration of their exams (range 0-17%). Note that in the table below the total uses of University PCs/laptops is higher than 98, because our students often had multiple examinations.
Given that portable technology is becoming more and more accessible, it may be the case that a lot of this issue will simply be solved by time - but it's still crucial for us to know how we can best support these students.
This was the first time 3-hour BYOD exams had been conducted, and there was uncertainty around laptop battery life. After the first 4 exams, additional power sockets were installed in the Sports Centre to ensure that power was available for 30% of the total venue capacity. Overall, 19% of students (range 7-35%) moved to power sockets, and there was a clear difference between 1.5-2.5 hour exams (7-10% of students required power) and 3-hour exams (21-35% of students required power). An analysis of when students moved to power sockets, conducted on 10th May 2017, revealed that 7 students were plugged into sockets from the beginning of the exam. Of the 8 students who moved during the exam, 5 did so in the third hour, and 3 within the last few minutes of the exam.
Our biggest digital exam had 218 students, and power is one of the main challenges we will need to tackle as part of our eventual goal of scaling up to larger cohorts. Some of our largest exams at Brunel accommodate over 350 students and run side by side in venues which can seat 1000+, a considerable increase in numbers.
In the third and final instalment of the Post Mortem series, we'll cover the aftermath! Have student opinions changed between the concept and the reality of sitting digital exams? Has there been a noticeable impact on existing practices for administrative and academic staff?