ELESIG Evaluation and Innovation in Education: The Path to Continuous Improvement
Written by Dr Jim Turner (LJMU), current Chair of ELESIG
Imagine a world where evaluation becomes the guiding light in educational innovation. This ELESIG event offered some insights into how that might be achieved, through a presentation by Louise Goulsbra, the Digital Education Manager, Evaluation and Innovation at the University of Leeds. In this blog, we delve into her perspective on evaluation's role in transforming education, the opportunities it presents, and the challenges we must overcome.
Who was speaking
Louise Goulsbra is an experienced professional with over 20 years of expertise in the publishing and e-learning industry. As the Digital Education Manager, Evaluation and Innovation at the University of Leeds, she leads a team of data analysts and qualitative researchers, driving effective evaluation and innovation strategies. With a strong background in market research, strategy, and content development, Louise is passionate about leveraging sound evaluation practices to better serve learners in the dynamic landscape of digital education.
What were the main points
Mindset and method
In the realm of evaluation, it is time to treat our mindset as just as critical as any method we might select. Conducting a successful evaluation goes beyond simply employing the right tools or following a prescribed approach. It requires an open mindset that embraces the opportunity to look afresh at each problem in an unbiased way. This mindset encourages us to approach evaluations with new eyes, free from assumptions about the solutions. It compels us to challenge preconceived notions and embrace curiosity, allowing us to uncover insights that may previously have been overlooked. By prioritising mindset alongside method, we create an environment that fosters innovation, adaptability, and continuous learning, so that evaluation becomes a standard and integral part of our everyday work.
Data dashboard improvements
Improving the data dashboard emerged as a key theme, with an emphasis on making it relevant and user-friendly. The goal is to ensure that the dashboard provides actionable insights and answers the critical questions that drive decision-making. By enhancing the usability and relevance of the data dashboard, organisations can put the right information in front of stakeholders at the right time, enabling them to make informed choices and drive effective evaluation and innovation strategies.
Qualitative data
One notable theme was the importance of giving equal weight to qualitative data alongside quantitative data, despite a tendency to rely heavily on the latter. While quantitative data may be more easily accessible and readily available, qualitative data provides valuable insights that quantitative data alone cannot capture. However, collecting and processing qualitative data brings its own challenges, and dedicated effort and resources are needed to integrate it effectively into evaluation and innovation practices.
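To make that processing challenge a little more concrete, here is a minimal sketch (my own illustration, not from the talk) of one common first step: tallying how often pre-agreed themes appear in free-text feedback. The theme keywords, the CSV file name, and its "comment" column are all assumptions made up for the example; in practice a coding frame is developed from the data itself.

```python
import csv
from collections import Counter

# Hypothetical theme keywords; a real coding frame is built
# inductively from the comments, not assumed up front.
THEMES = {
    "workload": ["workload", "too much", "overloaded"],
    "feedback": ["feedback", "marking", "comments"],
    "community": ["peers", "group", "discussion", "belonging"],
}

def tag_comment(text):
    """Return the set of themes whose keywords appear in a comment."""
    lowered = text.lower()
    return {theme for theme, words in THEMES.items()
            if any(word in lowered for word in words)}

def theme_counts(path):
    """Count theme occurrences across a CSV with a 'comment' column."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts.update(tag_comment(row["comment"]))
    return counts

if __name__ == "__main__":
    # "module_feedback.csv" is a hypothetical export of survey comments.
    for theme, n in theme_counts("module_feedback.csv").most_common():
        print(f"{theme}: {n}")
```

Even this toy version hints at why qualitative work is resource-intensive: keyword matching misses nuance and sarcasm, so real analysis still needs a human reading behind it.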
Tips for thinking through evaluation
Here is a summary of some of the tips for those working in this area.
- Baseline data: This is crucial for evaluation because it provides a starting point for comparison and for tracking progress over time. With a baseline as the reference point, evaluators can measure the effectiveness of interventions or changes against the initial state (a small worked example follows this list).
- Iteration is essential: Iterative evaluation fosters continuous improvement and adaptation based on feedback and new insights, allowing adjustments and refinements throughout the process and enhancing the accuracy and relevance of the evaluation outcomes.
- Establishing key data categories: Agreed data categories ensure consistency and comparability across different evaluation efforts, enabling meaningful analysis and the identification of patterns or trends. They also support a systematic approach to evaluation, facilitating the collection and analysis of relevant data across different projects or initiatives.
- Evaluation culture: Embed these practices in standard processes and the content-creation culture so that evaluation is an integral, ongoing part of the overall workflow. Collaboration and partnership with wider teams and academic stakeholders are essential to gather diverse perspectives, break down silos, and ensure a holistic evaluation approach that draws on varied viewpoints and expertise.
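To illustrate the baseline point above, here is a minimal sketch (my own worked example with made-up cohort scores, not data from the talk) showing how the same post-intervention figures read very differently once a starting point has been recorded.

```python
# Hypothetical pre- and post-intervention mean scores for two cohorts.
baseline = {"cohort_a": 62.0, "cohort_b": 71.0}
latest = {"cohort_a": 68.0, "cohort_b": 70.0}

for cohort, start in baseline.items():
    change = latest[cohort] - start
    pct = 100 * change / start
    print(f"{cohort}: {start:.1f} -> {latest[cohort]:.1f} "
          f"({change:+.1f} points, {pct:+.1f}%)")
```

Cohort A's 68 is a clear gain from a baseline of 62, while cohort B's 70 is a slight decline from 71; without the baselines, both cohorts would simply read as "about 70".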
Recommendations for follow-on courses
Here are some of Louise's recommended follow-on courses and resources.
Options with a qualitative focus
- LinkedIn Learning – Infographic Design
- The Datalabs Agency – Infographics and Report Design workshop
- The Datalabs Agency – An Introduction to Data Visualisation and Storytelling course
- The Datalabs Agency – Creative Data Presentations with Microsoft PowerPoint
- University of the Arts – Infographics Design Online Short Course
- Annual membership of the Interaction Design Foundation
Options with a quantitative focus
- The Knowledge Academy – Scrum Master Certification Training Course
- PRINCE2
- Annual subscription to DataCamp for areas such as sentiment analysis
- Microsoft Certified: Power BI Data Analyst Associate
- The Knowledge Academy – Jira Masterclass
- The Information Lab – Alteryx Certification Progression: Core, Advanced, Expert
Options for a higher-level focus
- The Datalabs Agency – Designing Great Data Dashboards
- LinkedIn Learning – Data Science, Managing your Team
- For content design and evaluation training: Content Design London
- For higher-level strategy training: Imperial College Business School runs a short course, Digital Transformation Strategy – Innovation: A Design Thinking Approach