Durham students’ top picks for productivity and organisation tools 2019


Which digital tools do students use to organise themselves and their work? Durham students told us which tools they use in the 2019 Digital Insights survey. We count down the top ten answers here:

10. Forest

Forest is an app that encourages you to stay off your phone for a specified amount of time…and plants real trees if you succeed!

9. Mendeley

This reference manager was popular with students, and Mendeley is free to download.

8. Google Drive

Students particularly liked being able to instantly save their notes to the cloud and the free storage space that Google Drive provides.


Another option for online storage is Office 365, which is available to every undergraduate and taught post-grad for free. This includes 1TB of space on OneDrive and the ability to share files with anyone else at the University.


7. Notability

With the ability to use free-hand writing and drawing, photos and typed text, Notability was a favourite with many students.

If you’re interested in note-taking apps, have a look at OneNote: it comes in at number 1 and is free for Durham students!


6. Google Docs

Students especially mentioned Google Docs as helpful in collaborative work, from presentations to translating texts.


You can also share and co-edit documents in Office 365 without having to create a new account for everyone.


5. Excel

Students reported that they used Microsoft Excel for coursework as well as general study purposes. It’s available for free as part of the Office 365 suite.

4. Google Calendar

Making its first appearance in the top ten is Google Calendar, which students reported using to keep track of their timetable.

3. Cite This For Me

Another frequently mentioned referencing aid was Cite This For Me, which extracts reference information from webpages and produces citations in different formats.

2. Word

Proving that the newest tools are not always the most popular, Microsoft Word was cited as useful for notetaking, writing essays and referencing. It’s also available for free as part of the Office 365 suite.

1. OneNote

Microsoft OneNote garnered more mentions than Word and Notability combined. Part of the Office 365 suite available to all students and staff members, it has been the most popular tool three years in a row.


Please note that this list is drawn from the student survey and is not necessarily endorsed by Durham University. When using any third-party tool, please read the Terms & Conditions carefully. Durham University is not able to provide support for third-party tools used by students or staff.

Durham students’ top picks for productivity and organisation apps


Which digital tools do students use to organise themselves and their work? This was one of many questions Durham students answered in the 2018 Student Digital Experience survey. We count down the top ten answers here:

10. Audio recording apps

Students reported using various audio recording apps, including Audio Notetaker, for making their own notes as well as recording lectures.

9. Google Drive

Students particularly liked being able to instantly save their notes to the cloud and the free storage space that Google Drive provides.


Another option for online storage is Office 365, which is available to every undergraduate and taught post-grad for free. This includes 1TB of space on OneDrive and the ability to share files with anyone else at the University.


8. Excel

Students reported that they used Microsoft Excel for specific coursework as well as for more general study purposes.

7. Mendeley

This free reference manager was popular with students, with one explaining how Mendeley ‘revolutionised how I consume literature, as well as how I write and reference’.

6. Cite This For Me

Another frequently mentioned referencing aid was Cite This For Me, which extracts reference information from webpages and produces citations in different formats.

5. Apps for focus and productivity

This category included several different digital aids, but garnering the most mentions was Forest, an app that encourages you to stay off your phone for a specified amount of time.

4. Evernote

Students mentioned the Evernote app in relation to both note-taking and general organisation.


If you’re interested in note-taking apps, have a look at OneNote: it comes in at number 1 and is free for Durham students!


3. Google Docs

Students especially mentioned Google Docs as helpful in collaborative work, from presentations to translating texts.


You can also share and co-edit documents in Office 365 without having to create a new account for everyone.


2. Word

Proving that the newest tools are not always the most popular, Microsoft Word was cited as useful for notetaking and ‘all aspects’.

1. OneNote

Microsoft OneNote garnered more mentions than Word and Google Docs combined. Part of the Office 365 suite available to all students and staff members, OneNote was described by one student as ‘amazing’.


Please note that this list is drawn from the student survey and is not necessarily endorsed by Durham University. When using any third-party tool, please read the Terms & Conditions carefully. Durham University is not able to provide support for third-party tools used by students or staff.

Xerte facts and figures 2018


Xerte is a fully online tool for creating interactive learning content, available to all University staff and students.

What’s new?

Xerte underwent a major upgrade in summer 2016, which greatly enhanced its functionality and ease of use. It was upgraded again in autumn 2017, with a number of new features including flash cards, interactive text and better integration with the duo Grade Centre. We plan to review the latest version to decide whether to upgrade again this year.

How much is it used?

215 new projects have been created since June 2016.

53 new people have begun using Xerte since June 2016.

So far in 2018, the Xerte Guides & Videos page has been visited an average of 27 times per month, three times as often as in 2016.

What is it used for?

90% of projects were created by staff and include learning and training resources.

10% of projects were created by students, mostly as assessments.

What do students say?
Online quizzes with immediate feedback on where I went wrong are useful - Second-year in Science

While students didn’t mention Xerte specifically in the 2018 Student Digital Experience Tracker survey, they were positive about the types of features that it offers:

32 Durham students named online quizzes as useful course activities.

94 Durham students said that online tools for learning were useful in their study.

100 Durham students cited online video as useful for learning.

Have there been any issues?

Xerte has a few minor bugs that are flagged in the user guides.

In 2017-18, there was only one IT service desk incident involving Xerte.

Insights into Durham students’ digital experience

Students with digital devices

Ahead of the release of the 2018 Durham Student Digital Experience Tracker survey results, Malcolm and Candace were invited to speak about how we used the Tracker at Jisc’s Connect More event in Newcastle on 10 July 2018.

Our presentation was part of a session entitled How are students’ expectations and experiences of their digital environment changing? and we highlighted how different data analysis techniques revealed some key findings from the Tracker.


Open to first- and second-year undergraduates and taught Master’s students, the survey received 877 responses. This was a representative number for this group (with a 95% confidence level and a confidence interval of 3.19), and the data show that responses were largely driven by two emails sent while the survey was open.

Graph showing responses peaked when emails were sent
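The representativeness claim above follows from a standard margin-of-error calculation for a sample proportion. A minimal sketch is below; note that the size of the eligible population is an assumption (roughly 12,300 produces the quoted interval of 3.19), as only the 877 responses are stated in the post.

```python
import math

def margin_of_error(n, population=None, p=0.5, z=1.96):
    """Margin of error (in percentage points) for a sample proportion.

    n: sample size; population: eligible population size (None = treat as
    infinite); p: assumed proportion (0.5 is the most conservative choice);
    z: z-score for the confidence level (1.96 for 95%).
    """
    moe = z * math.sqrt(p * (1 - p) / n)
    if population is not None:
        # Finite population correction: sampling a large share of a finite
        # group shrinks the interval slightly
        moe *= math.sqrt((population - n) / (population - 1))
    return 100 * moe

# 877 responses; an assumed eligible population of ~12,300 students
print(round(margin_of_error(877, population=12_300), 2))  # ≈ 3.19
```

Without the finite population correction the same sample gives roughly 3.31, so the correction matters only marginally at this sample size.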


SPSS was used to analyse the quantitative data. Even simple frequency tables were useful in gauging student attitudes toward digital technologies. For example, here is how students responded when asked ‘How much would you like digital technologies to be used on your course?’

44% said more, 53% said same, 3% said less

Custom questions

The Tracker allowed each institution to create its own questions as well, and we asked students about online assessment. This revealed that the majority of students liked almost all types of online assessment, although some types are rarely used at Durham.

Of students who submitted the following online: Essay: 80% liked, 6% didn't; Presentation: 25% liked, 3% didn't; Portfolio: 10% liked, 3% didn't; Audio: 5% liked, 3% didn't; Video: 6% liked, 2% didn't; Quiz: 17% liked, 3% didn't; Other: 11% liked, 2% didn't

This question also provided an unanticipated insight into the types of assessment that students expect: many students who had not submitted an assessment online considered it inappropriate to their subject.

Results in this order: No but would like to, No and wouldn't like to, No not appropriate to my subject. Essay: 4%, 2%, 8%; Presentation: 20%, 16%, 8%; Portfolio: 18%, 8%, 60%; Audio: 13%, 16%, 63%; Video: 15%, 16%, 62%; Quiz: 14%, 10%, 56%; Other: 10%, 7%, 70%


The Tracker survey allowed us to benchmark student responses against last year’s Durham survey and against this year’s responses from all participating universities and from Russell Group institutions. Significance tests revealed where Durham was ahead or behind for every multiple-choice and Likert question in the Tracker. For example, when asked whether they had access to reliable wifi whenever they needed it, Durham students were significantly less positive than last year, but more positive than students at other UK universities.

90% at DU in 2017, 86% at DU in 2018, 83% at Russell Group in 2018, 82% in all UK HEIs in 2018
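The year-on-year comparisons described above are standard two-proportion significance tests. The sketch below illustrates the idea using the wifi figures; the 2017 sample size is an assumption (only the 2018 figure of 877 responses is given in this post), so the resulting z value is illustrative only.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference between two independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 90% positive of an assumed 700 respondents in 2017
# vs 86% positive of 877 respondents in 2018
z = two_proportion_z(0.90, 700, 0.86, 877)
print(round(z, 2))  # with these assumed sample sizes, z ≈ 2.41
```

A |z| above 1.96 corresponds to significance at the 5% level, which is consistent with the post's finding that the 2018 wifi result was significantly less positive than 2017's.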

Correlations for composites

SPSS made it easy to get overall impressions of student responses. For example, if multiple questions that measured a similar attitude were all strongly correlated (p < .01), they could be combined into a single composite to represent that attitude generally.

Composite answers to 'When digital technologies are used on my course...' questions were 42% positive, 51% neutral, 7% negative
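The composite-building step described above can be sketched in plain Python. The data and item names here are synthetic stand-ins (the actual survey items are not reproduced in this post): three Likert items that share an underlying attitude are checked for pairwise correlation and then averaged per respondent.

```python
import random
import statistics as st

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two sequences."""
    mx, my = st.mean(xs), st.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)
n = 200
base = [random.randint(1, 5) for _ in range(n)]          # shared attitude
items = [
    [min(5, max(1, b + random.randint(-1, 1))) for b in base]
    for _ in range(3)                                    # three 1-5 Likert items
]

# For n = 200, |r| > 0.182 corresponds to p < .01 (two-tailed)
CRITICAL_R = 0.182
all_correlated = all(
    pearson_r(items[i], items[j]) > CRITICAL_R
    for i in range(3) for j in range(i + 1, 3)
)

# Only if every pair is strongly correlated, average the items into one
# composite score per respondent
if all_correlated:
    composite = [st.mean(vals) for vals in zip(*items)]
```

Averaging only strongly correlated items is what makes the composite defensible: it represents a single attitude rather than mixing unrelated responses.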

Free-text analysis

We used NVivo to analyse the free-text responses, coding over 2,500 discrete comments into cascading categories. This was extremely useful in drawing out individual student experience narratives and in quantifying trends.

‘consider embedding digital tools more in lecture and seminar content’ - PGT in Social Science & Health; ‘Embrace digital learning as much as you already do’ - First-year in Arts & Humanities; ‘often time is spent inefficiently using digital teaching’ - First-year in Science


The most popular free-response topics could then be mapped back onto student responses in SPSS, allowing for correlations between multiple-choice responses and free-text responses to be discovered. A striking example is the strong correlation between attitude toward the use of digital technologies on courses (as shown above) and free-text mentions of in-class polling as a useful digital activity. Those who did not mention in-class polling as useful…

Of those who didn't mention in-class polling, attitude was: 40% positive, 52% neutral, 7% negative

…had a significantly less positive attitude toward technology use in courses than those who did.

Of those who did mention polling, 62% positive, 37% neutral, 1% negative
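A chi-square test of independence is one standard way to confirm an association like the one above. The counts below are hypothetical, chosen to match the stated percentage splits (the actual group sizes are not published in this post).

```python
# Hypothetical counts consistent with the percentages above: 792 respondents
# who did not mention polling vs 77 who did, split positive/neutral/negative
observed = [
    [320, 416, 56],   # did not mention polling: ~40% / 52% / 7%
    [48, 28, 1],      # mentioned polling:       ~62% / 37% / 1%
]

def chi_square(table):
    """Pearson chi-square statistic for a 2-D contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (obs - expected) ** 2 / expected
    return stat

# df = (rows - 1) * (cols - 1) = 2; the 1% critical value is 9.21
print(chi_square(observed) > 9.21)
```

With these assumed counts the statistic comfortably exceeds the 1% critical value, consistent with the significant difference in attitude reported above.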


Look out for the full Durham report, which will be available on this site soon! Stakeholders across the University will also be approached to discuss deeper analysis that would be particularly useful to them.

Jisc’s analysis of the data from all participating institutions was published today on the Digital Experience Insights site.

Turnitin facts and figures 2017-18


As we prepare for a new academic year, we’ve been looking at how different learning technologies have been used in 2017-18 across the University. First up is Turnitin, our application for assessment submission, originality checking, marking and feedback.

What’s new?

In August 2017 Turnitin introduced Feedback Studio, an upgrade to the marking and feedback interface. Over a quarter of all submissions at Durham were marked with Feedback Studio. A few people reported experiencing issues, and Turnitin responded by adding a button to toggle the High Resolution view on or off.
Turnitin works well and makes a stressful time a lot easier. - Master’s student Social Science & Health

How much is it used?

113,071 documents were submitted to Turnitin in 2017-18.

This represents an average of 7 submissions per taught student.

Submissions to Turnitin have been slowly but steadily increasing over the past five years:

Turnitin submissions 2013 to 2018

What is it used for?

31,704 scripts were marked in Feedback Studio in 2017-18, 28% of all submissions.

75% of these included general feedback.

Markers used an average of 6 bubble comments and 3 QuickMarks per script.

Originality reports were run for almost all submissions.
Turn it in is simple and easy to use - First-year undergrad Social Science & Health

What do students say?

(All student feedback from 2018 Student Digital Experience Tracker survey)

93% of Durham students who submitted essays online liked it.

Only 9% of Durham students disagreed that online assessments are delivered and managed well.
"Turnitin works well, it's a good system" Master's student Social Sciences & Health

Have there been any issues?

Turnitin experienced service degradation several times in 2017-18, totalling 8.5 hours, and unscheduled outages totalling 7.5 hours. There were also approximately 30 hours of scheduled maintenance.

The IT service desk received 98 calls from staff and students about Turnitin in 2017-18, which translates to roughly one call for every 62 duo sites that use Turnitin submissions.

Social Media in HE Conference 2017


The annual Social Media in HE Conference (#SocMedHE17) was held at Sheffield Hallam University on 19 December. Universities from across the UK and beyond were represented by both staff and students, with presentations on a wide variety of uses of social media in higher education. Highlighted here are just a few key topics that emerged throughout the day:

  • Social media is used across many different streams of HE activity, including: marketing to potential students; institutional and departmental communications with current students; dissemination of research; student support and retention; careers news and advice; library updates; and alumni relations.
  • Universities’ social media policies should aim to address all uses of social media. While most institutions’ policies focus on academics’ personal use of social media and on marketing campaigns, many do not provide guidance on other uses, such as recommending/requiring student engagement. This leaves some staff unsure of the extent to which they can use social media for teaching or communications with students.
  • Student privacy is a key concern when implementing social media initiatives. Expecting students to engage with social media at any level involves, at the very least, requiring them to create public online profiles. This, and any activity that follows, has implications for students’ privacy and online personae. Care should always be taken to ensure that students are clear about exactly what they are posting online and that, wherever possible, alternatives to public involvement are offered.
  • Student preference for when and how social media is used should be carefully considered. Institutions can fall into the trap of assuming that students want to use social media to engage with the university because they choose to use it outside of their studies. This isn’t always the case, and consideration should be given to individual student preferences as well as those of the student body as a whole.
  • Both staff and students should have opportunities to learn how to best use social media. Safe, responsible and effective use of social media is increasingly important for students to grapple with, and this can be both explicitly and implicitly embedded into the curriculum as well as taught as part of a larger ‘digital literacies’ initiative. Staff members should also have the opportunity to develop their own social media knowledge and skills, which will in turn enable them to take the lead in deepening student understanding and use.
  • Both staff and students should have opportunities to develop meta-skills for adapting to new technologies generally. As the digital world changes so rapidly, development like that suggested above should ensure that students and staff possess the skills to evaluate and experiment with any new platforms and technologies–including social media–that might emerge in the future.

Xerte upgrade


Xerte, the online platform which allows Durham staff and students to create multi-media, interactive learning objects, has recently been upgraded to the latest version. The upgrade fixes a number of bugs and offers the following new features:

‘Flash cards’ page type: help students learn a language or remember terminology by creating flash cards–use text only or combine text and images.

‘Word search’ page type: define your own words to create a search game that changes every time.

SCORM tracking in duo: upload Xerte projects to duo to record user progress and scores with increased accuracy.

Hiding pages: hide pages from users without deleting them from your project.

If you’re interested in getting started with Xerte, come along to the Creating interactive content with Xerte workshop on 13 December 2017.

We provide more information on Xerte, plus how-to guides and videos, on our Xerte Product page.


Association for Learning Technologies Conference 2016


The ALT conference was held at Warwick University this year, bringing together learning technologists, academics, PhD students and a broad range of others from the UK and beyond. Presentations were based around the theme of ‘Connect, Collaborate, Create’. The following are notes and observations from the keynotes and a selection of the conference presentations.


Day 1

Keynote: In the Valley of the Trolls, Josie Fraser

This opening plenary tackled the issue of online trolling, looking at a few case studies and the part that the media plays. Fraser discussed motivations for trolling, explaining that many online trolls do not believe their own inflammatory rants, but instead are entertained by others’ anger or offence (termed ‘lulz’ by Whitney Phillips in This is Why We Can’t Have Nice Things) regardless of the subject. She argued that this kind of behaviour should not push online contributors into ‘safer’ walled gardens, nor is it a reason to abandon anonymity on the web. Rather, educators should ensure that digital literacy directly addresses online behaviour and its implications for the individual and for society.

Invoking Helen Beetham’s model of digital wellbeing as encompassing all aspects of an individual’s digital capabilities, she suggested that combating trolling requires a long-term cultural shift that should begin with education.

Collaborate session

Learning the Hard Way: Lessons in Designing Open Educational Resources in, for and through Partnership, Anna Page

This presentation introduced the Open Educational Practices in Scotland project and its evolution as project members worked with external organisations to develop Open Educational Resources (OERs). Learning points ranged from the administrative (e.g. asset registers to ensure that content is copyright-cleared and easily retrieved) to the philosophical (what happens if people use open resources in ways that were not intended?).

The OEPScotland project has also produced its own open online course: Becoming an open educator.

Collaborative technologies, higher order thinking and self-sufficient learning: a case study of adult learners, Clare Johnson

In this study, course lecturers used an amalgamation of Salmon’s Five-Stage Model for online interaction, Gunawardena et al’s Social networking spiral and Garrison’s Community of Inquiry model to develop an online platform to supplement face-to-face learning. Students were observed to reach the higher levels of Salmon’s model, answering each other’s questions and taking responsibility for their own understanding. Learning points included ensuring that tutors posted introductory messages and maintained a presence throughout, and signposting activities and deadlines.

Trends in on-line peer-review, Helen Purchase

This presentation centred around Aropä, a free, online peer review tool. Designed by two academics at the University of Glasgow, it automates the process of distributing assignments for peer review and/or marking. The designers analysed system data and email correspondence to identify trends in peer review, concluding that it is important that students are able to respond to feedback (both informally in reply to comments and formally in submitting revised work). It was also evident that students often required extrinsic reward to engage with peer review.

Aropä looks to be a useful and effective tool, but Durham academics interested in using it would need to ensure that data protection was not compromised, especially where students are concerned.

An online resource to support research students: issues of collaboration, viability and design, Michael Hammond

To help PhD students grapple with the difficult concepts involved in social research theory, academics developed a website where students could move from viewing information (e.g. video interviews), to discussing topics introduced in face-to-face sessions and on the site, to actively curating new content for the benefit of their fellow students. The project seemed to be a success, and it was noted that, while the videos that the team had produced were useful in a number of contexts, the academics’ optimal role (as well as the students’) was to curate rather than to create.

‘Wildcard’ session

University teachers’ experiences, and impact on academic practice, of a course in technology-enhanced learning, Vicki Dale

The University of Glasgow piloted an optional PGCAP module which sought to help academics to evaluate different learning technologies for their teaching. Also addressing themes like digital literacies and pedagogies, the aim was to introduce lecturers to new technologies in a thoughtful and reflective way that would have a long-term impact on their practice. The module leaders felt that bite-sized sessions worked well, giving academics the opportunity to try out new ideas in their teaching over the course of an academic year.

Computing, Chemistry and Business…Oh My! Learning Technology is everywhere, Lisa Donaldson and Mark Glynn

This presentation introduced the What works and why? project in Ireland, focused on helping educators and students to evaluate effective use of technology in discipline-specific contexts. The project had a number of streams, including traditional workshops and ‘exploration sessions’, but also innovative teaching projects and the formation and development of Teaching Groups. Student perception was seriously considered, and student-produced videos can be found on the website: What works for students. Participants in the Teaching Groups and innovation projects share their discipline-specific findings online as well: What works for teachers.

The project team explained that Teaching Groups were particularly successful in helping academics to share good practice in a more holistic manner. They emphasised the success that lecturers had when evaluating technology pedagogically, and then embedding it into the design of a course from the beginning.

Create session

Creating a k-fffufffl: fast flipped feedback using feed-forward for learning in labs and assessments, Guy Saward

In this study, electronic voting systems (i.e. ‘clickers’) were used to provide students with quick feedback on summative work. Students completed a multiple-choice test individually and then answered the same questions again via the voting system. This allowed students to gain immediate feedback and to discuss their answers with their peers. A similar scenario was also used with lab and tutorial exercises that small groups or individuals completed outside of the classroom. The lecturers found that students were much more interested in their feedback when received immediately after the assessment, and when they had a chance to discuss it with their peers.

Impact of visualization and learning environment on the effectiveness of interactive simulation, Niels Walet

To investigate the effectiveness of student sketching when working with interactive computer simulations, lecturers in a physics module had some students write about their observations and some draw sketches. Using screen capture, observation notes, interviews and the results of assignments and tests, the lecturers concluded that sketching did have a positive effect on student understanding, and even on their stress levels.

Examining the role of ‘Carpe Diem’ learning design in improving the learner experience in a Western Australian context, Astrid Davine

This presentation reviewed how the University of Western Australia has been using the Carpe Diem learning design process to bring academics, learning technologists and librarians together from the beginning when (re)designing modules. UWA actually took the decision to stop running ‘how-to’ sessions about learning technologies in favour of this holistic approach. The current study investigating the impact of Carpe Diem indicates that academics find value in the process, and will be published when the data analysis is complete.

It is interesting to note that here (as in the What works and why? project), learning technologies were seen as part of the bigger picture from the start. An equally important facet of both of these projects was the opportunity for academics to work together to share good practice and develop innovations in teaching together. This is a theme that emerged at the Inaugural Learning and Teaching conference at Durham a week later, in Contrasting experiences of postgraduate and staff education forums in Earth Sciences (Dr Christopher Saville, Earth Sciences).

Keynote: Education and Neuroscience: Issues and Opportunities, Lia Commissar

Introducing the Education and Neuroscience Initiative, this presentation explored the developing links between research in neuroscience, psychology and education. It also flagged up popular ‘neuromyths’ that have little or no evidence base, such as individual learning styles and the right brain / left brain divide. Commissar encouraged educators to be careful to adopt learning theories that were the result of rigorous research, pointing to sources of information such as the Education Endowment Foundation and the Digital Promise project.

Day 2

Keynote: Code Create Collaborate, Ian Livingstone

Co-founder of the Games Workshop and author of the Fighting Fantasy gamebook series Ian Livingstone discussed his work in the evolving world of gaming and its relationship to education. He emphasised how games in education help learners to develop problem-solving skills in a risk-free environment where they can receive instant feedback. The social nature of gaming, Livingstone suggested, and its experiential nature, make the skills learnt in this context transferable to other environments. He also discussed the importance of teaching coding in schools, as per the Livingstone-Hope Next Gen review.

Connect session

The implementation of Blackboard Analytics: A partnership with academics to improve the Student Experience, Chris Bell

This presentation reviewed how one university implemented Blackboard Analytics and a dashboard application (Cognos) to improve the student experience (defined as ‘attainment, satisfaction, engagement, retention, personal tutoring and progression’). Academic staff opinion was garnered to ensure that useful data would be provided, informing the development of the dashboard. Initial findings showed that academics appreciated the tool and suggested that engaging with analytics could have a positive effect on online course design.

A multivariate exploration for potential predictors of educational achievement in a technology enhanced environment for learning computer programming, Nick Day

This study looked at a number of variables to predict student attainment in undergraduate computer programming modules. It was found that factors like UCAS points were not good predictors, but that attendance and previous resits were correlated with overall marks.

Evaluating Evaluation! – A four tiered approach encapsulating evaluation techniques and methods in staff training and delivery, peer review, participant experience and formal feedback in Higher Education, Rebecca Vickerstaff, Emma Purnell & Liz Mcgregor

Presenters explained how the Academic, Support, Technology and Innovation team at Plymouth University overhauled their training programme using a four-stage model:

  • ensuring consistency across resources
  • introducing a new, long-term participant engagement process
  • evaluating course numbers, feedback and team reflection to make necessary changes
  • reviewing the above process and planning its next iteration

Connections between theory and practice: rhizomatic teaching with digital technologies, Louise Drumm

In this study, academics were interviewed about their use of technology in teaching. The researcher analysed the interviews in terms of how theory was applied to practice, finding that most interviewees drew from a number of theoretical frameworks (some more robust than others). She suggested that it was helpful to draw on multiple theories as well as personal experience, but that academics should be supported in finding evidence-based principles on which to base digital teaching practice.

E-portfolios as communication and sharing tool: students’ perspective, Eman Ghallab

Based in activity theory, this longitudinal study investigated students’ perspectives on the extensive use of an online portfolio tool in a healthcare education setting. The tool allowed students to decide with whom they would share each portfolio object and how that person could interact (e.g. view-only, comment, edit).

Lessons learned for e-portfolios included:

  • Tutors found it useful to be able to give students instant feedback, but needed to manage student expectations about when to expect feedback and how detailed it would be.
  • Both staff and students required training in using the system.
  • The ‘three-click rule’ seemed to apply to e-portfolios as well–material could get lost if buried too deeply.
  • Group submissions needed to be carefully handled to avoid confusion.

Create session

Exploring the educational implications of ‘making construals’, William Beynon, Steve Russ, Piet Kommers, Hamish Macleod, Rene Alimisi, Ilkka Jormanainen, Russell Boyatt and Emma King

This presentation introduced construals, a broadly-defined term for digital artefacts that allow the user to manipulate multiple aspects of a simulation, model, visualisation, etc. The Construit! project has developed an environment in which those with some programming knowledge can create construals, and where users can engage with them. The intention is that practitioners will be able to easily share their construals as Open Educational Resources.

Connect session

Tracking students’ digital experience: development and use of a cross sector benchmarking tool, Tabetha Newman, Rob Howe, Gunter Sanders, Andy Taggart and Helen Beetham

Representatives from JISC gave an update on the pilot of the Student digital experience tracker, a short survey instrument developed to garner information on students’ expectations and experiences of technology in learning and teaching. The tracker allows institutions to better understand their own students and students across the sector. Several representatives from participating universities and colleges spoke about implementing the survey and their findings (case studies are also available online: Tracker case studies).

The report from the pilot is now available, and interestingly echoed several conference topics including lecture capture, digital literacy and internet safety. JISC are currently recruiting institutions to take part in the next phase.

Day 3

Keynote: Copyright and e-learning: understanding our privileges and freedoms, Jane Secker

Dr Secker, the Copyright and Digital Literacy Adviser at LSE, provided a nuanced perspective on protected content in a digital environment. While likening some breaches of copyright to theft, she emphasised that copyright does not commoditise ideas themselves, but rather the unique ways in which they are expressed. She also addressed the issue of copyright as gatekeeper, applauding the move to open publishing in academia as an important step forward.

She suggested that educators think about copyright in terms of the following:

  1. Attribution and credit: think of attribution just as you would citations in a piece of academic writing
  2. Value and empathy: remember that every digital artefact originated with a real person
  3. Collaboration and communities of practice: engage with colleagues inside and outside your institution to ensure that you understand your own rights and how to protect your intellectual property whilst being as open as possible

‘Wildcard’ session

How best should a VLE be designed to enhance learners’ experience?, Emmanuel Isibor

This study looked at how an institution customised its virtual learning environment (VLE). Interviews with staff and students revealed that, while tailoring the VLE to the university’s needs was beneficial, customisation needed to occur at the departmental level and within individual subject areas as well.

Learning Spaces: Roles and Responsibilities of the Learning Technologist, Kristian Roger and Sarah Ney

Learning technologists at LSE discussed how they were involved in the design of new learning spaces at the university. Academics, other learning and teaching staff, estates and buildings, the audio-visual team and learning technologists worked together to create learning spaces that met a diversity of needs from the outset.

Those involved in the project found the following book to be helpful in framing their discussions: Learning Spaces in Higher Education: Positive Outcomes by Design, eds D. Radcliffe, H. Wilson, D. Powell & B. Tibbetts, University of Queensland and the Australian Learning and Teaching Council, Brisbane.

Evaluating Webinars as a Tool for Delivering Lectures and Seminars at Distance in a Healthcare Setting, Daniel Metcalfe

This study compared webinars to traditional lectures using a student survey. Students overwhelmingly agreed that webinars were as good as or better than face-to-face lectures. Although students in the study were dispersed across a fairly large geographic area, making webinars more convenient for them, they also identified several other advantages: the ability to re-watch recordings, a more relaxed atmosphere and varied opportunities for interaction.

Designing for Flow, Leonard Houx

Tasked with tailoring a standard VLE for an online programme, instructional designers customised the out-of-the-box platform to improve and streamline the user experience. The following issues were addressed:

  • Clutter and redundancy: unnecessary and repeated information and navigation was removed
  • Your metadata is showing: information that only helped academics and developers was hidden
  • Too many choices: confusing and distracting navigation options were taken out

Strategies for supporting effective student engagement with lecture recordings, Matt Cornock

This study looked at how students at the University of York use recorded lectures, and identified several different types of workflows that they employ (e.g. self-checking, preparation for tasks and revision). It also raised difficult questions about: what lectures are for; what students are meant to be doing during lectures; how lecturers expect students to engage with recordings of lectures; and how lectures relate to other module components and assessment. To help students address these questions, staff at York produced resources to put recorded lectures in context. However, it soon became apparent that the answers were different for every discipline, and potentially for every individual lecturer.

It was concluded that each lecturer should explicitly communicate to students how they are expected to engage during a lecture and with the lecture recording, and how this relates to the rest of the module.

Collaborate session

Gone in a Flash: Adapting to New Technologies, Cherry Poussa, Mike Taylor, Aaron Fecowycz and James Henderson

Learning technologists from the Health e-Learning and Media (HELM) team explained their project of converting over 200 open digital resources from the increasingly obsolete Adobe Flash. The team described the process of reproducing the materials using HTML5, CSS3 and JavaScript, then testing, piloting and evaluating them using feedback questionnaires and data analytics. The project also raised the question of whether, and to what extent, it is possible to future-proof online resources.

Into the Open – a critical overview of open education policy and practice in Scotland, Lorna Campbell

This presentation reviewed current Open Educational Resource (OER) provision in Scotland as part of the Open Scotland initiative. Any organisation that produces digital educational resources is encouraged to adopt policies to make their resources publicly available. Several Scottish universities have developed OER policies, and the University of Edinburgh currently provides the platform for Open Scotland. Other institutions choose to host their materials on their own sites, still fully open to the public.

Secrets of Scale and Adoption: The Value and Impact of Open, Common Data Definitions in Student Success Research, Evaluation and Implementation, Ellen Wagner

The PAR Framework allows the aggregation of data from multiple institutions to attempt to identify variables that are likely to negatively affect student attainment. The goal is for institutions to be better able to support at-risk students as early as possible.

Developing literacies of ‘open’ across an institution, and beyond… Stuart Nicol

This presentation highlighted the work that the University of Edinburgh has done around OERs. The platform itself (Open.Ed) was discussed, but also the work undertaken to educate the university community. This included workshops, integration into the institutional Learning Design framework, and provision of a media management platform to make sharing and licensing simple for staff and students.

Love, hate and online collaboration, Gerald Evans and Rebecca Galley

Based around investigations into student collaboration in online modules, this presentation shared findings from research and case studies that informed a guide for Open University staff. The presenters noted that, while students often complained about group work, modules with collaborative activities tended to have higher retention rates.

Some of the key recommendations are:

  • If introducing a new tool or platform, embed it into the module throughout so that students are confident in using it when it comes to group work.
  • Lead students through simple online engagement, working up to the collaborative task.
  • Ensure that the group activity is linked to module assessment, and has some degree of authenticity (e.g. the type of task that a group of researchers in this field would really do).
  • Support students before, during and after the task, clearly communicating your expectations throughout.
  • Consider how the work will be assessed. Will you mark the product/result, the process, how well the group worked together and/or individual contributions?
  • Evaluate the project while it’s running and when it is finished (data analytics, student surveys, etc.).

Keynote: Being human is your problem, Donna Lanclos and David White

In this slightly subversive double-act, the presenters argued that the role of technology in education has tended to be administrative rather than transformative. That is, the same types of technologies that deal with data and transactions (fee payment, enrolment, marks) are used for teaching and learning. They suggested that this leads to low-risk, input-output scenarios rather than ‘messy’, interpersonal, complex learning.

The presenters challenged the audience to engage with technologies that enhance the nuanced, complicated aspects of transformational learning, and not simply the tools that might make teaching seem ‘easy’.

Online marking workshop


In Easter term 2016, the Arts & Humanities Faculty held an online marking workshop. Colleagues from Philosophy and Archaeology gave presentations on how their departments had implemented fully online marking, followed by questions and discussion on the educational and practical benefits and drawbacks of marking online.

Several key themes and questions emerged which could be useful to any department considering the use of duo to receive, mark and return assignments.

Reasons to try online marking

Effective feedback

  • Maintain feedback standards across a department
  • Incorporate rubrics or forms
  • University requirements for typed feedback and timely return to students

Efficient administration

  • Paper-free assessment
  • Marks recorded digitally
  • Streamlined workflow from student to marker(s) and back again

Prevent academic misconduct

  • Easy to check evidence of plagiarism if something looks suspicious
  • Catch collusion among students
  • Identify instances of essay re-use

Questions to ask when considering marking on duo

Quality assurance

  • How does your department handle anonymity? For example, do you require anonymity for all summative work? Would identifying scripts with student Z-codes be appropriate? At what point would it be reasonable to de-anonymise the data for administrative purposes?
  • What requirements do you have for moderation or (blind) second marking? For example, do students see the markers’ names? Do moderators need marking data? What evidence is required of these processes?
  • What particular processes do you use for external examining? For example, would your external examiner be happy to view assignments online? Does the external examiner expect assessments to be anonymous?
The medium

  • Would most staff be able to mark online from an accessibility standpoint?
  • Are the online tools sufficient for script annotation in your field?
  • Would your students need support and encouragement to engage with online feedback?
The big picture

  • Does online marking suit every step of the assessment process for your department?
  • What kind of training or support would students and staff require?
  • Would a pilot of online marking be appropriate? Would the department consider implementing one element of online assessment at a time?

If your department is interested in investigating online marking further, please contact your faculty learning technologist.

Further reading

Durham University Learning and Teaching Handbook, Section 6: Examination and Assessment

Boud, D. and Molloy, E. (2013) ‘Rethinking models of feedback for learning: the challenge of design’, Assessment & Evaluation in Higher Education, 38(6), pp. 698-712.

Buckley, E. and Cowap, L. (2013) ‘An evaluation of the use of Turnitin for electronic submission and marking and as a formative feedback tool from an educator’s perspective’, British Journal of Educational Technology, 44, pp. 562-570.

Carless, D. (2007) ‘Learning-oriented assessment: Conceptual bases and practical implications’, Innovations in Education and Teaching International, 44(1), pp. 57-66.

Fawcett, H. and Oldfield, J. (2016) ‘Investigating expectations and experiences of audio and written assignment feedback in first-year undergraduate students’, Teaching in Higher Education, 21(1) pp. 79-93.

Higher Education Academy (2012) A Marked Improvement: Transforming Assessment in Higher Education, York: HEA.

Hounsell, D. Enhancing Feedback (website).

O’Shea, C. and Fawns, T. (2014) ‘Disruptions and Dialogues’, in Kreber, C. and Anderson, C. (eds.) Advances and Innovations in University Assessment and Feedback. EUP, pp. 225-45.

Sopina, E. and McNeill, R. (2015) ‘Investigating the relationship between quality, format and delivery of feedback for written assignments in higher education’, Assessment & Evaluation in Higher Education, 40(5), pp. 666-80.

West, J. and Turner, W. (2015) ‘Enhancing the assessment experience: improving student perceptions, engagement and understanding using online video feedback’, Innovations in Education and Teaching International, 21 January, pp. 1-11.