
Published on October 28, 2025 | 11 min read

The Student Voices in Higher Education Conference, which Explorance hosted in the UK in May, created a unique space for universities to share knowledge, experiences, best practices, insight, and lessons learned about their student voice initiatives.

The conference aimed to address the question: How are universities using student surveys to deliver teaching and learning outcomes? Explorance was delighted to welcome around 80 delegates at Conference Aston in Birmingham.

Academics, professional staff, and student representatives benefited from 12 university-led sessions, three panel discussions, two keynote speakers, and two Explorance-led sessions. Speakers included senior HE sector representatives from Advance HE, the Quality Assurance Agency for Higher Education (QAA), and Universities UK.

10 things we learned at the Student Voices Conference

1. Students value the opportunity to be heard

The opportunity to have their voices heard is essential to students. The day one keynote speaker, Jonathan Neves, Head of Business Intelligence and Surveys at Advance HE, noted that national surveys show 8 out of 10 students are satisfied with their experience, even in what is perceived as a consumer environment where they can be relatively critical of the value they receive. He explained that meeting or exceeding expectations can be challenging: the student experience can be inherently daunting, and levels of student well-being remain well below those of the general population.

2. Why representing student voices matters

Students and Higher Education Institutions have recently faced three significant challenges: Covid-19, strike action, and the cost-of-living crisis, all of which have affected the student experience. With students attending campus less frequently and considering more fundamental changes to their studies, Neves explained that broader perceptions would likely suffer, since campus life is a crucial factor in the student experience, and lower attendance and participation among some cohorts will likely weaken students’ sense of belonging. However, many students recognise Higher Education as a transformational experience, and a commitment from universities to represent the student voice in surveys will help to maximise this.

3. “Strategic and authentic” institutional mission

The day two keynote speaker, Professor Ruth Ayres, Pro Vice-Chancellor Education at Aston University, reflected on moving from “the customer-consumer relationship to how we work with students to co-create.” She asked delegates to rate their success using student voice activities such as national surveys, Module Evaluation Questionnaires (MEQs), and end-of-programme surveys. Beyond these methods, at the heart of Aston’s story is a “strategic and authentic commitment” by the University to making a difference through student voices, and to evidencing when and where student feedback has shaped institutional decisions, projects, and initiatives. Attendees also learned how the University of Sussex had developed a holistic approach to capturing data across the student journey as part of its student voice strategy.

4. Adopting ‘student as partner’ approaches

Universities are increasingly co-developing student voice activities in partnership with their Student Unions, as discussed in the Leaders’ Forum titled How Can We Ensure That Student Voice Supports Continuous Improvements in Our Institutions? The overarching principles cited were embedding student voices in decision-making on projects that affect the student experience, and offering different forums, digital and in person, through which students can engage. Just as important were co-designing and co-creating projects (ranging from education strategy to curricula and learning spaces) and facilitating the journey from student feedback to genuine student voice.

5. Engaging the full diversity of learners

While universities are committed to ensuring student voices are heard and represented, specific groups (including mature students, commuter students, and degree apprenticeship students) remain disproportionately challenging to engage in student voice activity. In the Students’ Forum, Is Our Feedback Being Listened To and Acted On?, Yemi Gbajobi, Chief Executive of the Arts Students’ Union at the University of the Arts London, argued that these groups of students are not “hard to reach”; rather, universities must find the best ways to reach them. Targeted surveys and other focused student voice activities were among the solutions institutions presented.

6. More data analysis of existing surveys

The Students’ Forum explored the issue of survey fatigue. Student Union representatives on the panel agreed that students are ‘over-surveyed,’ and a discussion followed on what could practically be done to mitigate this. “We don’t need more surveys; we need to spend more time analysing the feedback we’ve got” was the collective view expressed, with deeper analysis of both quantitative and qualitative feedback suggested. University presentations also noted that data analysis was the most complex and time-intensive aspect of surveys, both in the number of surveys to process and in understanding the themes arising. Explorance MLY can help transform student comments into actionable insights on specific themes with Machine Learning.
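
To make the idea of automated theme analysis concrete, here is a minimal, generic Python sketch of tagging open comments against a theme lexicon. This is purely illustrative and is not MLY’s API or methodology: the theme keywords, function names, and sample comments are all hypothetical, and a production tool would use trained language models rather than hand-written keyword lists.

```python
from collections import Counter

# Hypothetical theme lexicon for illustration only; a real ML tool learns
# themes from data rather than relying on fixed keyword lists.
THEMES = {
    "assessment": ["exam", "marking", "grade", "assessment"],
    "teaching": ["lecture", "teaching", "explained", "tutor"],
    "resources": ["library", "textbook", "slides", "recording"],
}

def tag_comment(comment):
    """Return the set of themes whose keywords appear in the comment."""
    text = comment.lower()
    return {theme for theme, words in THEMES.items()
            if any(word in text for word in words)}

def theme_counts(comments):
    """Count how many comments touch on each theme."""
    counts = Counter()
    for comment in comments:
        counts.update(tag_comment(comment))
    return counts

# Hypothetical sample comments; real input would come from survey exports.
sample = [
    "The lectures were clear but exam marking felt inconsistent.",
    "More recordings of the lecture slides would help revision.",
]
for theme, n in theme_counts(sample).most_common():
    print(f"{theme}: {n}")
```

Even a toy pass like this shows why the panel’s point holds: the hard, time-intensive work is not collecting more comments but surfacing and interpreting the themes already present in the feedback institutions hold.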

7. Prioritise actions, manage expectations

“Students tell you various things in feedback, and you must take it seriously.” This quote comes from a presentation by Cardiff University, whose representative described how, following an £8 million investment in the student experience supported by the University’s Pro Vice-Chancellor Education and Students, a ‘trial and error’ approach to enhancing how they work with students has been underway. The presenters recognised that, having given their feedback, students want to know what their institution will do about it, so managing expectations and identifying follow-up actions is an essential next step. “You can’t do everything at once, so it’s how you prioritise,” they said.

8. “I’ve given my feedback. Now what?”

In my own experience of course evaluation surveys over more than ten years, closing the feedback loop has been the biggest challenge for universities. It came up organically in a number of university-led presentations as well as in the Students’ Forum. Fundamentally, students do not always feel that their feedback is being acted on (that “no one is listening”) or that they will benefit from giving it. Pulse and mid-term surveys are a nod to the need to make changes that affect students currently in session, but being effective and agile enough to close the loop at a local level is a must.

9. Increasing response rates is still a priority

Every university seeks to increase its survey response rates, primarily against its own internal baseline. Students suggested that one reason for low response rates is that they struggle to prioritise which surveys to respond to; Cardiff University suggested there is also a need to decide which surveys are most impactful in order to “reduce the noise.” Institutions that have implemented Student Champion or Student Ambassador initiatives revealed that these are among their most successful student voice mechanisms, and encouraging survey completion can be one of the champions’ tasks.

10. Professional development for individual instructors

Last, what about institutions that have worked with Explorance for many years? How do they keep innovating? In 2023, Durham University renewed Explorance Blue for another five years and is now developing a new Evaluation Excellence toolkit for every new faculty member. This draws on student feedback, including contextual Module Evaluation Questionnaires (MEQs), in-class feedback, student-led interviews, and focus groups, alongside other sources. With the MEQs, the module leader writes a reflection piece, based on their scores, on how the module went, together with an action plan for improvements. This speaks to the broader development question of how instructors engage with students to improve their teaching.

In summary, end-of-module and mid-module evaluation surveys remain a hugely valuable way of capturing student feedback and are here to stay. Alongside these, universities are now deploying surveys around teaching effectiveness, learning excellence, staff engagement, and student experience. One example is Student 360s, in which universities examine student competencies and workforce readiness. It will be interesting to see how these develop between now and next year’s second edition of the Explorance Student Voices Conference.

Download our Post-conference Guide, Student Voice: The Complete Guide on How to Increase Student Engagement in Evaluations


