Part 3: May 2017 Exam Post Mortem (After)

Posted: Monday 10th July 2017

Author: Claudia Cox 

Exploring Students Without Laptops in Depth

A great deal of data interpretation went into our report on students without laptops, and we are finally able to share our findings with you here. First, we'll look at exactly how the requests from 98 students came through to us, between the first and last dates on which any notifications were received:


Students were told who to contact if they were unable to bring their own laptops to exams via the Blackboard Learn resource, reminder emails and texts, and face to face at the end of every practice session. Large spikes in activity can be observed for all students shortly before their first exams, but it's especially notable that two of the largest spikes followed the two practice sessions (aimed at second and final year undergraduates) that had the highest attendance rates. This is good news for us, since it suggests that the practice sessions have been helpful for the students who engaged with them in some capacity, and that it's definitely worth continuing to run these sessions prior to digital exams.

Looking at the graph, it would also appear that first year undergraduates were the group most likely to leave their requests fairly late - there's a huge rise in requests the day before their first exam, and a couple were even made on the day! A factor to bear in mind is that it was always made clear that the practice sessions were optional; in order to target groups with lower engagement rates, it may also be effective to filter some of the key practice session information into their more compulsory engagements (e.g. at the end of lectures/seminars) to prompt more timely responses.
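For anyone curious how a timeline like this can be put together, here is a minimal sketch in Python. It assumes a hypothetical CSV export ("loan_requests.csv") with one row per loan device request and columns "request_date" and "student_group"; the actual export and field names used by the project may differ.

```python
# Minimal sketch (hypothetical data): count loan device requests per day,
# broken down by student group, so that spikes before exams stand out.
import pandas as pd
import matplotlib.pyplot as plt

requests = pd.read_csv("loan_requests.csv", parse_dates=["request_date"])

daily = (
    requests
    .assign(day=requests["request_date"].dt.normalize())
    .groupby(["day", "student_group"])
    .size()
    .unstack(fill_value=0)
)

# Include days on which no requests arrived at all, so spikes are seen in context.
full_range = pd.date_range(daily.index.min(), daily.index.max(), freq="D")
daily = daily.reindex(full_range, fill_value=0)

daily.plot(kind="bar", stacked=True, figsize=(12, 4))
plt.xlabel("Date request received")
plt.ylabel("Number of loan device requests")
plt.title("Loan device requests per day, by student group")
plt.tight_layout()
plt.show()
```

Aggregating to daily counts per group is what makes the pattern described above visible: the rises before each group's first exam, and the jumps following the best-attended practice sessions.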

Another important bit of information can be more readily observed in this graph:

Based on a previous report on the digital examinations held for Sports Science students in January this year, it was anticipated that undergraduates in their first year of study might find it more difficult to bring their own devices. This exam period has been no different - once again they account for the largest group of students unable to participate in BYOD. Moreover, we found that 23% of first year undergraduates unable to participate in BYOD had applied for additional funding in the form of scholarships, bursaries aimed at those from low income backgrounds, or emergency loans.
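To illustrate how figures like these can be derived, the short sketch below computes the breakdown of non-BYOD requests by study level and the share of first year undergraduates who had also applied for extra funding. The DataFrame, its column names and the example rows are purely hypothetical.

```python
# Minimal sketch (hypothetical data): study-level breakdown of non-BYOD
# students and the proportion of first years who applied for extra funding.
import pandas as pd

non_byod = pd.DataFrame({
    "study_level": ["First year UG", "First year UG", "Final year UG", "PG"],
    "applied_for_funding": [True, False, False, False],
})  # illustrative rows only; the real dataset covers 98 students

# Largest group of non-BYOD students by study level.
by_level = non_byod["study_level"].value_counts()
print(by_level)

# Share of first year undergraduates who also applied for extra funding.
first_years = non_byod[non_byod["study_level"] == "First year UG"]
funding_share = first_years["applied_for_funding"].mean() * 100
print(f"{funding_share:.0f}% of first year non-BYOD students applied for funding")
```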

The report's findings begin to suggest that finance could be an issue which prevents some students from obtaining a suitable device. This is reflected, in part, by the fact that only 3 postgraduate students made requests for loan PCs/laptops. As a more mature demographic with an average age of 30 (compared to the undergraduates' early 20s), our Computer Science postgraduates are more likely to be in employment and to have had more time to accumulate savings and personal funds - and could therefore be in a better position to own an appropriate device.

While the report hints at what issues may need to be tackled in order to minimise the number of students that require a loan device from the University, we also thought it prudent to get some answers straight from the source. So that's what we did:

This question was part of a feedback survey which is discussed in more detail in the next section, but it's interesting to note that, of the responses we gathered, only two students perceived money as an immediate barrier to BYOD. The primary issue from their viewpoint appears to be that their existing devices did not meet the minimum suitability criteria for exams, or otherwise had technical failings that were revealed on the day. By extension it could still be said that finance is an issue, given that it may not be affordable or practical to replace a current laptop for the sole purpose of upcoming examinations. Since it appears that most of these students did own a device in the first place, perhaps non-BYOD numbers will fall in future as laptops come to be viewed as essential equipment and the minimum specifications for exams become an influential factor in purchasing decisions.

Student Performance and Feedback

A common concern among students was the fear that they would be disadvantaged by digital examinations because they couldn't type as fast as they could handwrite, or that other students might be able to type faster than them. One wonders whether this was ever a worry when handwritten exams were the only option available - handwriting speeds are just as varied, and there is a good chance that somebody can outpace you there as well! To this end, we took a sample of WISEflow's data on students' progressing character counts for the duration of a 3 hour exam* with a long essay style question and compared it to the grades these students received. As this is only a sample of students, it is not indicative of the average grade achieved for this exam. Click on the image below or here to access the interactive graph, where you can hover over individual lines to see the grades:

*Note that some students had extra time requirements.

The data appears to bear out the phrase "quality over quantity": the student who wrote significantly more than anyone else achieved a C- grade, while one of the students who achieved the highest grades wrote only half as much. Given that there is so much variance in character count and grades achieved, it seems difficult (or perhaps not even relevant) to say how much students should be writing in order to achieve top marks, although there is clearly a threshold below which lower character counts are more likely to be associated with lower grades.
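For readers interested in reproducing this kind of comparison, here is a minimal sketch, again assuming a hypothetical export ("char_counts.csv", in long format with "student_id", "minutes_elapsed", "char_count" and "grade" columns); WISEflow's actual data export will differ.

```python
# Minimal sketch (hypothetical data): one cumulative character-count curve per
# student over the exam, labelled with the grade they went on to receive.
import pandas as pd
import matplotlib.pyplot as plt

counts = pd.read_csv("char_counts.csv")

fig, ax = plt.subplots(figsize=(10, 5))
for (student, grade), progress in counts.groupby(["student_id", "grade"]):
    progress = progress.sort_values("minutes_elapsed")
    ax.plot(progress["minutes_elapsed"], progress["char_count"], label=str(grade))

ax.set_xlabel("Minutes into exam")
ax.set_ylabel("Cumulative character count")
ax.set_title("Character count progression vs final grade (sample)")
ax.legend(title="Grade", fontsize="small")
plt.tight_layout()
plt.show()
```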

Our eventual hope is to collate further samples like the one above and determine whether any distinct activity patterns emerge, allowing us to build a picture of the average digital profile of students that achieve certain grades. For example, is the typical A grade student writing in a constant burst of energy until the end of the exam, or are there frequent plateaus in activity while they stop to think or review what they've just typed? Is there a point where a lull in activity can be an indicator of poor performance?
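As a starting point for spotting those lulls, one option is to flag any window in which a student's character count barely moves. The sketch below assumes the same hypothetical per-minute character-count samples as above; the window length and threshold are arbitrary illustrative values, not figures from our data.

```python
# Minimal sketch (hypothetical data and thresholds): flag plateaus in typing
# activity, i.e. stretches where very few characters are added.
import pandas as pd

def find_plateaus(progress: pd.DataFrame, window: int = 10, min_chars: int = 20):
    """Return the minutes_elapsed values at which fewer than `min_chars`
    characters were added over the previous `window` samples (a lull)."""
    progress = progress.sort_values("minutes_elapsed")
    added = progress["char_count"].diff().fillna(0)
    rolling_added = added.rolling(window).sum()
    lulls = progress.loc[rolling_added < min_chars, "minutes_elapsed"]
    return lulls.tolist()

# Usage: flag lulls for each student, then compare against grades.
# for student, progress in counts.groupby("student_id"):
#     print(student, find_plateaus(progress))
```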

As mentioned in an earlier post, (admittedly anecdotal) feedback from students leaving the exam hall suggests that digital exams are actually "alright!"; those that we managed to ask seemed to be more concerned with how long they spent answering question 5 than the format in which it was delivered. We also followed the exams up with another survey - though the response rate was low as students had spent the last 3 weeks answering questions and were understandably quite fatigued by it. Still, we were particularly interested to see whether there had been a shift in student opinions and perceptions between the practice session and their real exams.

See the WISEflow student survey results here.

It is likely not a coincidence that first year students had the lowest attendance rates during practice sessions and felt more divided about the information and opportunities they had to practice using WISEflow before exams. It is also reassuring that where there were issues, these related more to other aspects of the examinations (content, environment, etc.) than to the software itself. The biggest surprise by far has been the largely positive response from final year undergraduates, given that before the exams they were among the most vocal in their concerns.

Staff Feedback

The post mortem write up has been very student-centric so far, but we cannot overlook the experiences of Brunel's administrative and academic staff, who have had much more to adapt to in this ongoing process. As one of the first higher education institutions in the UK to make strides within this field, we knew that the transition to digital exams would not be without its teething problems. Administrative processes were slowed somewhat by the fact that we provided students with paper for rough working and drawing diagrams during the exams, and that these papers still needed to be collected at the end and treated as traditional paper submissions.

Although initially inefficient, having a hybrid system of digital and paper materials was an essential step in determining how heavily we would need to continue to rely on paper for Computer Science examinations, and the answer is barely at all. For the 2017/8 examinations period, we are confident that students can continue to make use of this rough paper if needed, but that it can be disposed of at the end of exams instead of being collected in for marking.

In spite of these hurdles, the administrative response to using WISEflow has been extremely positive. The Senior Programmes Administrator for Computer Science, who helped lead the administrative changes at ground level, said:

For me, I think it's gone very well. I think at the start [October 2016] a bit more direction would have been helpful... but it's easier for us in that there aren't [physical] paper hand-ins anymore. I think the exams went really well, much easier for administrators.

I like the fact that I have Supporter permissions so I've got the ability to log in and see the progress of marking; it gives me a clearer picture of what's going on so that I can better advise students with any queries. It's easy to set up coursework and exams too. But yeah, I love it. I enjoy working with it.

Academic issues will require a few more steps to address. In theory, WISEflow's Assessor tool streamlines the marking and grading process, and makes co-assessing considerably easier by removing the need to exchange physical scripts or wait for your colleague to finish marking before you can begin. It also provides a host of data analytics tools that can be used to inform the content of future assessments, and a points-based assessment feature which speeds up the process of grading. In practice, several academics missed the ability to incorporate rubrics. Some felt that WISEflow's interface could be made more efficient by minimising the number of clicks needed to perform a task, and that marking for large cohorts of students could be tricky - though it should be noted that all of these issues were relatively minor and didn't prevent any academics from successfully completing their marking in WISEflow.

The Digital Examinations Project Team is in constant contact with UNIwise while implementation is ongoing, and the WISEflow development team has thus far been very responsive to any suggestions and requested changes that accommodate Brunel's needs - they have confirmed that the facility for rubrics will be available in their next key update, in time for the start of the 2017/8 academic year. We have also recently published a detailed staff user guide and are scheduling further WISEflow training sessions, and so anticipate that these issues will diminish over time.

One academic in particular wrote some very comprehensive feedback in a multitude of areas, and noted the change in behaviour of students during exams:

Software Project Management was the third or fourth exam that the cohort had taken in this way, so any bugs with the system had been ironed out.  About a third either started plugged in or moved desks to be near the power points during the exam. This was done quietly and without fuss.

For the half hour or so that I sat with the students at the start of the exam, I would say that levels of anxiety were noticeably lower than in previous years.  Normally, I get asked maybe a dozen trivial questions about the paper in that first half hour – this year there was only one.

In terms of what was produced, the most obvious observation was that it was all readable.  Not everyone presented the same sized font and some submissions strained the eyes a little.  On the whole, students appeared to be able to write a lot more – 2,000-3,000 words was typical (from memory: I have not analysed this). 

Moving Forward

We're now preparing for the August resit period, where any students in need of reassessment will be taking WISEflow exams once again - though we are pleased to say that these are few in number. Additionally, following extensive discussions we'll begin enacting the practical parts of expanding the project to the rest of the College of Engineering, Design and Physical Sciences, whose staff are optimistic about the positive changes that will be brought about by its implementation. The Student Associate Learning Technologists (SALTs) who were indispensable during the main examination period will be working closely with us once again as we prepare further training sessions, providing WISEflow support to students and staff alike.

We hope you've found this post mortem series insightful, and anticipate a further series of posts after the resit examination period.