
Coventry University’s mission is to be a global education group with a reputation for equity and innovation that empowers students and communities to transform their lives and society through teaching, learning, research and enterprise. The University has around 30,000 students across its UK campuses (Coventry, London and Scarborough), and around 26,000 students overseas at international collaborative partners.
Coventry University’s journey with data analytics began in 2011, at a time when there was very little understanding of institutional metrics or data at either a strategic or operational level. Quality systems were primarily focused on report writing and retrospective analysis rather than action planning and forward-looking improvement – and the University experienced declining performance across key performance metrics.
In response, the University made a series of strategic interventions focused on the student experience. Among these, better insight into programme performance, including student feedback gathered through Module Evaluation Questionnaire (MEQ) surveys, was a significant development need.
Professor Andrew Turner, Deputy Vice-Chancellor (Education) at Coventry University, said: “We originally set up our course evaluation survey processes in 2012. At that point, everything was paper-based, and we issued 80,000 questionnaires per year. We then introduced a central system and gradually evolved our processes, eventually moving fully digital. Initially, we were using online logistics tools and anonymous surveys.
“However, we reached a point where we needed to be more sophisticated in how we understood feedback – particularly across different student demographics – and have a better solution for confidential surveys. We wanted to move beyond anonymous responses and be able to analyse how specific groups were experiencing teaching, learning and assessment, while still maintaining confidentiality. There was also no integration with our Virtual Learning Environment (VLE). That became a key driver for change.
“We needed a more robust, integrated solution that would allow confidential analysis and demographic insight. That is what led us to Explorance. We reviewed other competing tools, but Blue offered greater flexibility than the alternatives.”
Coventry University implemented Explorance Blue in 2020, further enhancing its strategic approach to using student feedback. The University conducts MEQ surveys for every module mid-way through each semester, integrating them into teaching sessions. Tutors respond to feedback with a “You said, we did” update within three weeks, which is embedded in the learning experience platform templates. Questionnaires are aligned with the National Student Survey (NSS) so that students are familiar with the feedback rubrics, and actions are taken at module, course, school and faculty levels.
In the 2024-25 academic year, the University surveyed 3,060 modules and issued nearly 148,000 evaluations. Response rates sit between 41% and 49%. Dr Douglas Howat, Dean of Students at Coventry University, who has responsibility for overseeing all surveys, reflected: “Anything above 40% for questionnaires is good. Given we have over 30,000 students, most taking six modules per year, the scale is significant. We have also rationalised modules over time, introducing bigger shared modules. One of the real strengths of Explorance is the ability to analyse responses within those shared modules by course. That mitigates risk and ensures all student groups are equally well served.”
However, for Dr Howat, it is Explorance’s support for intelligent use of data at all institutional levels that is especially valuable. “What I use extensively is the output from these surveys,” he shared. “We have a business intelligence dashboard that updates every time we complete a tranche of module evaluations. Once the data is uploaded, I can interrogate it – identifying hotspots, cold spots, trends across departments, and performance by demographic group – flagging up particular issues.
“We analyse data by ethnicity, gender, disability, and course. That ability to disaggregate is absolutely essential. If a module serves multiple courses, it must serve all students equally – whether they are in the majority or minority on that module. Being able to break down results by course and demographic ensures equity in delivery.
“Moving from anonymous to confidential surveys was a big step. There was debate about whether students would feel comfortable responding. We have worked hard to explain how data is used, reassure students about confidentiality, and emphasise that feedback – positive or negative – has no bearing on academic outcomes.”
The University has experienced a number of strategic, operational and technical benefits from its five-year collaboration with Explorance to date.
Dr Howat commented: “We are a very data-driven institution. Aggregated dashboards allow us to identify issues quickly and intervene early. Local managers can drill down to individual modules and take action immediately rather than months later. Strategically, this informs our Access and Participation Plan work, demographic analysis, and institutional performance monitoring. We are piloting additional course-level questions, which are attached to the MEQ, and refining our approach to gain better alignment. Having that flexibility to ask questions in line with our strategic priorities is absolutely essential.
“The next step for us is not whether the system works – it does. It is about using it more strategically to drive institutional performance, including predictive insight; Blue can provide an early warning system and a more realistic expectation of NSS outcomes. This is key because the NSS contributes 50% of the Teaching Excellence Framework (TEF) assessment, and so is really important to our ‘Gold’ rating.”
Professor Turner added: “The integration of Blue with our in-house VLE has been transformative. It makes survey distribution simpler, improves visibility, and helps us close the feedback loop more effectively. The key benefit is speed and automation – clear schedules for when surveys open, when results are released, and when responses are required. Our deliberately formative approach to MEQ surveys is all about informing actions to improve the student experience.
“Support from the Explorance team has been strong, and the openness of the Blue platform allows us to reach beyond the institution – for example, extending it to external partners including our apprenticeships, healthcare practice educators, and teacher training provision.”