Session Description
This paper describes the outcomes of a UK-wide project to investigate students’ digital experiences and to identify organisational variables that correlate significantly with key metrics from the student survey data.
Since the turn of the century it has been standard practice in the US to survey students about their experiences with IT (Brooks and Pomerantz 2017, Brooks 2016 and earlier) and to report on this data nationally. Consultations with the UK sector in 2014 found little appetite for a survey instrument under centralised control, but interest in a locally administered survey with actionable outcomes to support digital initiatives on the ground. The opportunity to use aggregated sector data to support research, national intelligence and policy-making (redacted) has arisen only because the survey instrument and process have proved popular at a local level: this was not the original or main purpose.
The primary instrument used for the present study was the Jisc ‘Student digital experience’ survey, developed by the authors from an earlier research project and through three cycles of piloting and validation. The survey is short, locally administered, and customisable. It does not assume that digital technology is ‘good’ for education (Selwyn 2016), but asks a range of questions about how students respond to the digital environment, how they use their own devices and services for learning, and how they respond to digital aspects of the curriculum. For this analysis, over 30,000 sets of responses were available, collected from over 60 UK higher education (HE) and further education (FE) institutions between December 2017 and April 2018. Clear guidance was provided to ensure users collected a reliable and valid sample from their student populations.
The primary data were subjected to multivariate statistical analysis to explore relationships among the student variables and to identify key variables for describing the overall student digital experience. Significant differences between the HE and FE sectors were also investigated.
The secondary instrument was a survey of the organisational leads for the ‘Student digital experience’ project. The variables used in this survey were identified by the authors via a desk review of existing surveys and of key papers (e.g. King & Boyatt 2014, Walker et al. 2017). A number of additional variables from publicly available data were added to this data set.
Statistical analysis was carried out to identify the nature of the relationships between the organisational variables and the key variables from the student data.
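The kind of relationship testing described above can be illustrated with a minimal sketch. The variable names and figures below are entirely hypothetical (they are not drawn from the Jisc dataset); the example simply shows one standard approach, a Pearson correlation with a permutation test of significance, applied to an institution-level organisational variable and an aggregated student-survey metric.

```python
import random
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def permutation_p(xs, ys, n=5000, seed=0):
    """Two-sided permutation p-value for the observed correlation:
    shuffle one variable repeatedly and count how often a correlation
    at least as strong as the observed one arises by chance."""
    rng = random.Random(seed)
    observed = abs(pearson_r(xs, ys))
    ys = list(ys)
    hits = 0
    for _ in range(n):
        rng.shuffle(ys)
        if abs(pearson_r(xs, ys)) >= observed:
            hits += 1
    return hits / n

# Hypothetical data for eight institutions: an organisational variable
# (e.g. annual hours of staff digital-skills training) against a mean
# student-survey metric for the same institutions.
training_hours = [5, 12, 8, 20, 15, 3, 18, 10]
mean_student_score = [3.1, 3.8, 3.4, 4.2, 3.9, 2.9, 4.0, 3.5]

r = pearson_r(training_hours, mean_student_score)
p = permutation_p(training_hours, mean_student_score)
print(f"r = {r:.3f}, permutation p = {p:.4f}")
```

A permutation test is used here rather than a parametric test because it makes no distributional assumptions, which suits small institution-level samples; the actual study's choice of tests is not specified in this abstract.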
The results of both analyses will be of scholarly interest and of practical value to ALT-C participants.
– The digital environment and digital curriculum are of increasing significance to the overall student experience. Student-facing staff need to know how students respond to aspects of this provision.
– Scholarly work on student outcomes of digital provision has focused separately on pedagogical aspects (e.g. Higgins et al. 2012, Caird & Lane 2013, Kirkwood & Price 2014, Limniou et al. 2015, Henderson et al. 2015) and on the digital environment for learning (e.g. Magen-Nagar & Steinberger 2017, Savin-Baden et al. 2017). In contrast, our analysis explores the relationships among variables relating to both digital pedagogy and the digital environment, and will be of interest to researchers in both areas.
– National and institutional policy makers will be interested in understanding the impact of different investments in digital infrastructure, policy initiatives, and teaching staff skills.
– Organisational representatives will be interested in the validity of the primary research instrument as a tool for investigating the student digital experience (redacted) and in how to apply its results to effective interventions at a local level.
Session content: evaluation and reflection
The session is based on a large-scale project that was continuously evaluated over three cycles of piloting, through:
– surveys of organisational leads
– analysis and validation of the data collected from students
– a series of FE and HE expert panels, convened to peer review and support analysis of the survey findings (2018)
– presentation and consultation at research conferences and workshops
The analysis of the survey data will be shared in full with participants. The outcomes of the evaluation process at organisational level will be discussed as appropriate.
References
Brooks, D. C. (2016) ECAR Study of Undergraduate Students and Information Technology, 2016. Research report. Louisville, CO: ECAR.
Brooks, D. C. & J. Pomerantz (2017) ECAR Study of Undergraduate Students and Information Technology, 2017. Research report. Louisville, CO: ECAR.
Caird, S. & A. Lane (2013) Conceptualising the role of information and communication technologies in the design of higher education teaching models used in the UK. British Journal of Educational Technology 46 (1).
Henderson, M., N. Selwyn & R. Aston (2015) What works and why? Student perceptions of ‘useful’ digital technology in university teaching and learning. Studies in Higher Education 42 (8).
Higgins, S., Z. Xiao & M. Katsipataki (2012) The Impact of Digital Technology on Learning: A Summary for the Education Endowment Foundation. EEF and Durham University.
King, E. and R. Boyatt (2014) Exploring factors that influence adoption of e-learning within higher education. British Journal of Educational Technology 46 (6).
Kirkwood, A. & L. Price (2014) Technology-enhanced learning and teaching in higher education: what is ‘enhanced’ and how do we know? A critical literature review. Learning, Media and Technology 39 (1).
Limniou, M., J. J. Downes & S. Maskell (2015) Datasets reflecting students’ and teachers’ views on the use of learning technology in a UK university. British Journal of Educational Technology 46 (5).
Magen-Nagar, N. & P. Steinberger (2017) Characteristics of an innovative learning environment according to students’ perceptions. Learning Environments Research 20 (3).
Savin-Baden, M., L. Falconer, K. Wimpenny & M. Callaghan (2017) Virtual Worlds for Learning. In Duval, E., M. Sharples & R. Sutherland (eds) Technology Enhanced Learning. Springer: Cham.
Selwyn, N. (2016) Is Technology Good for Education? Cambridge: Polity Press.
Walker, R., J. Voce, E. Swift, J. Ahmed, M. Jenkins & P. Vincent (2017) 2016 Survey of Technology Enhanced Learning for Higher Education in the UK. UCISA.
Resources for participants
https://digitalstudent.jiscinvolve.org/wp/category/tracker/