
Digital Examinations Event March 2017

Posted: Wednesday 19th April 2017

Author: Claudia Cox

The Digital Examinations Project team hosted the first ever Digital Examinations Event at the Hamilton Centre on 17th March 2017. Attendees from over 20 different institutions joined us to explore the potential that digital assessment can unlock, and it went fantastically well. Conversation was flowing both in person and on a Twitter live feed under #bruneldigitalexams, as speakers covered a range of topics including digital exam software in general, personal experiences at Brunel University London, and the pros and pitfalls of embracing the change.


As well as open discussions, we got guests engaged in a practical demo which covered setting up digital exams, authoring exam content, and sitting a short exam just like a student does. The demo exam doubled as a feedback questionnaire, the results of which you can see below:

1. Institutional role of participants (18 responses)

Technology-related       61% (11)
Academic                 22% (4)
Admin (Exams)            11% (2)
Senior Management         6% (1)

2. Motivation for attending the event (20 responses)

25% (5) of respondents indicated that their institutions were already doing something with digital examinations, but appeared to be at an early stage. 50% (10) attended the event to find out more about digital examinations, with the remaining 25% (5) claiming a general interest.

3. Is your institution currently running summative digital exams? (19 responses)

79% (15) of institutions are already using computers to run summative exams (21% are not), of which 80% (12) are carried out using PCs and 20% (3) using BYOD (bring your own device). A lockdown browser or alternative is used by 47% (7), whilst 40% (6) do not use a lockdown browser.

4. Type of summative exams (15 responses)

Of those running summative digital exams, 100% (15) ran MCQs; 87% (13) written short answers; and 40% (6) essays.

40% (6) also responded ‘other’, but we did not ask what type of assessments these were.

5. Formative online quizzes/class tests (18 responses)

Formative online quizzes/class tests were run by 100% (18) of respondents, with the vast majority (94%; 17) using the VLE. 33% (6) of respondents also used a separate platform. Only 17% (3) of respondents reported students using their own devices (see 3 above).

17% (3) reported students having off-campus access to quizzes/tests, but this seems rather low; respondents may have interpreted this question as being about distance learning.

6. Online submission/marking (16 responses)

Online marking is lagging behind online submission for coursework, with 75% of institutions claiming ≥50% online submission rates, but only 44% claiming ≥50% of marking being done online.

                          Online submission    Online marking
Average ± SD              61.25 ± 28.48        43.75 ± 26.9

Most frequent responses: 60 and 80


7. Institutional guidance for students using own device (17 responses)

The vast majority of institutions (71%; 12) did not have any guidance for students using their own devices. BYOD was used for in-class quizzes in 3 cases (18%), but these respondents did not know whether any particular guidance existed. 12% (2) reported it was part of the IT policy.

8. Is it reasonable to expect students to have own device? (18 responses)

Two-thirds (67%; 12) believed it is reasonable to expect students to have their own devices, but a significant minority said it was not reasonable (33%; 6).

9. Barriers to adopting digital examinations (13-15 responses)

Whilst there were significant differences between respondents as to what they perceived the greatest barriers to be, academic adoption was regarded as the most significant, followed by IT infrastructure and cost.






Barrier                                           Average rank* ± SD

Academic adoption                                 2 ± 1.32
IT infrastructure                                 2.67 ± 1.19
Cost                                              3.13 ± 1.41   (most frequent ranks: 3 and 5)
Lack of support staff                             3.71 ± 1.53
Lack of SMG buy-in                                4.08 ± 1.77
Provision for students/staff with disabilities    5 ± 0.93      (most frequent ranks: 5 and 6)

* most significant = 1; least significant = 6

10. Benefits of adopting digital examinations (42 comments)

The top five answers were:

Efficient workflows (7)

Opportunities for analytics (6)

Student satisfaction (5)

Easier marking (4)

Exam feedback and transparency (3)

Perhaps reflecting the participant demographic, a third (33.3%) of comments related to benefits for students; a third (33.3%) to administrative benefits; and only 12% to benefits for academic staff.

One respondent commented there was “no demand at my institution”, but overall it is clear that the potential benefits for workflows and analytics are attractive.

UPDATE 23/06/17: UCL (University College London) has also written its own post about the event; see what they have to say here!