Ahead of the release of the 2018 Durham Student Digital Experience Tracker survey results, Malcolm and Candace were invited to speak about how we used the Tracker at Jisc’s Connect More event in Newcastle on 10 July 2018.
Our presentation was part of a session entitled How are students’ expectations and experiences of their digital environment changing? and we highlighted how different data analysis techniques revealed some key findings from the Tracker.
Open to first- and second-year undergraduates and taught Master’s students, the survey received 877 responses. This was a representative number for this group (with a confidence level of 95% and a confidence interval of 3.19), and the response data reveal that this total was largely driven by two emails that went out while the survey was open.
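As a rough check of that figure, the margin of error for a survey proportion can be computed with a finite population correction. This is a minimal sketch; the population size `N` below is an assumed figure for illustration, not Durham’s actual eligible cohort.

```python
import math

# Margin of error for a proportion at 95% confidence, with a finite
# population correction. n = 877 comes from the survey; N is an
# ASSUMED eligible-population size used only for illustration.
def margin_of_error(n, N, z=1.96, p=0.5):
    se = z * math.sqrt(p * (1 - p) / n)     # worst-case proportion p = 0.5
    fpc = math.sqrt((N - n) / (N - 1))      # finite population correction
    return 100 * se * fpc                   # expressed as a percentage

moe = margin_of_error(n=877, N=12000)       # N = 12,000 is hypothetical
print(round(moe, 2))
```

With an assumed eligible population of around 12,000 students, 877 responses gives a margin of error close to the 3.19 reported above.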
SPSS was used to analyse the quantitative data. Even simple frequency tables were useful in gauging student attitude toward digital. For example, here is how students responded when asked ‘How much would you like digital technologies to be used on your course?’
The Tracker allowed each institution to create its own questions as well, and we asked students about online assessment. This revealed that the majority of students liked almost all types of online assessment, although some types are rarely used at Durham.
This question also provided an unanticipated insight into the types of assessment that students expect: many students who had not submitted an assessment online considered it inappropriate to their subject.
The Tracker survey allowed us to benchmark student responses against last year’s Durham survey and against this year’s responses from all participating universities and from Russell Group institutions. Significance tests revealed where Durham was ahead or behind for every multiple-choice and Likert question in the Tracker. For example, when asked whether they had access to reliable wifi whenever they needed it, Durham students were significantly less positive than last year, but more positive than students at other UK universities.
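As a sketch of how one such benchmark comparison could be tested (the Tracker analysis itself was done in SPSS), here is a two-proportion z-test on entirely made-up counts; the split of students agreeing they had reliable wifi is invented for illustration.

```python
from statistics import NormalDist
import math

# Two-proportion z-test: one way to check whether Durham's rate of
# positive responses differs from the benchmark group's rate.
def two_prop_ztest(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided
    return z, p_value

# HYPOTHETICAL counts: 684 of 877 Durham respondents agreeing,
# versus 14,000 of 20,000 respondents elsewhere.
z, p = two_prop_ztest(684, 877, 14000, 20000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A |z| above 1.96 corresponds to significance at the 95% level, which is how "significantly more/less positive" statements like the one above are typically grounded.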
Correlations for composites
SPSS made it easy to get overall impressions of student responses. For example, if multiple questions that measured a similar attitude were all significantly correlated (p < .01), they could be combined into a single composite to represent that attitude generally.
We used NVivo to analyse the free-text responses, coding over 2,500 discrete comments into cascading categories. This was extremely useful both in drawing out individual students’ experience narratives and in quantifying trends.
The most popular free-response topics could then be mapped back onto student responses in SPSS, allowing for correlations between multiple-choice responses and free-text responses to be discovered. A striking example is the strong correlation between attitude toward the use of digital technologies on courses (as shown above) and free-text mentions of in-class polling as a useful digital activity: those who did not mention in-class polling as useful had a significantly less positive attitude toward technology use in courses than those who did.
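As an illustration of how a group difference like this could be tested, here is Welch’s t-statistic computed on made-up attitude scores; the real comparison was run in SPSS on the Tracker data.

```python
from statistics import mean, stdev
import math

# HYPOTHETICAL attitude scores (1-5) for respondents who mentioned
# in-class polling in their free-text comments and those who did not.
mentioned     = [5, 4, 5, 4, 4, 5, 3, 5, 4, 4]
not_mentioned = [3, 4, 2, 3, 4, 3, 2, 4, 3, 3]

# Welch's t-statistic: compares group means without assuming
# the two groups have equal variances.
def welch_t(a, b):
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

t = welch_t(mentioned, not_mentioned)
print(round(t, 2))
```

A t-statistic well above ~2 for samples of this kind indicates the mean attitudes of the two groups genuinely differ, matching the "significantly less positive" finding described above.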
Look out for the full Durham report, which will be available on this site soon! Stakeholders across the University will also be approached to discuss deeper analysis that would be particularly useful to them.
Jisc’s analysis of the data from all participating institutions was published today on the Digital Experience Insights site.