    South African Journal of Higher Education

    On-line version ISSN 1753-5913

    S. Afr. J. High. Educ. vol. 34 n. 1 Stellenbosch 2020

    https://doi.org/10.20853/34-1-3661 

    GENERAL ARTICLES

    Development of a contextualised data analytics framework in South African higher education: evolvement of teacher (teaching) analytics as an indispensable component

    E. C. Janse van Vuuren

    Faculty of Economic and Management Sciences, University of the Free State, Bloemfontein, South Africa. e-mail: jansevanvuurenec@ufs.ac.za


    ABSTRACT

    Data analytics in higher education aims to address, amongst other matters, student success and the factors related thereto. Faced with persistently poor student success rates, the Faculty of Economic and Management Sciences at a South African university embarked on the development and implementation of a contextualised data analytics framework to address this problem. Implementation of the framework highlighted the need for the inclusion of teaching (teacher) analytics as an integral part of the framework. Including teaching analytics not only ensured a more comprehensive understanding of the teaching and learning process, but also resulted in unexpected extensions of the framework for the scholarly development of teachers. Features of the adapted data analytics framework, with a specific focus on the teaching (teacher) analytics component, are presented in this article.

    Keywords: data analytics, teacher (teaching) analytics, learner (learning) analytics, academic analytics, student success

    INTRODUCTION

    The application of data analytics in higher education has been evolving rapidly since its introduction in the sector during the early 2000s. Worldwide, higher education institutions have been seeking ways to improve student performance and enhance teaching and learning practice. In this pursuit, the use and analysis of available data on student success (predominantly numerical/quantitative) has become a valuable basis for informed decision-making regarding, among others, at-risk student analysis and/or teaching and learning practices, in order to improve student performance and success (Long and Siemens 2011, 36; Lourens and Bleazard 2016, 139). Student-centredness is, however, regarded by many as the main driver of the use of data analytics in higher education, as it is directly linked to student success and high-quality teaching and learning practice (Campbell, DeBlois and Oblinger 2007, 44; Clow 2013, 685; Kajitani 2014, 26; Wright et al. 2014, 29). In acknowledging this striving towards enhanced quality and the associated improvement in student success, Norris and Baer (2013, 5) describe data analytics as essential for achieving the aim of student-centredness. They furthermore point out that data analytics has the potential to identify effective and efficient teaching and learning practices, as well as models and innovations, in order to sustain higher education for the future.

    In order to achieve these goals, higher education institutions worldwide employ various frameworks to analyse available datasets (Dietz-Uhler and Hurn 2013, 18-19), which are mostly built on two pillars, namely academic and learning analytics. Academic analytics, on the one hand, makes use of data in a broader environment, such as institutional, national and international educational environments, with a focus on institutional management and decision-making. Data sets linked to academic analytics typically include data on demographics, academics (including ability, performance history, and effort), financials, participation or engagement, as well as module/course-specific data (Campbell and Oblinger 2007, 4). Learning analytics, on the other hand, is more specifically related to issues at departmental or course/module level, with data usually focusing on the learner, learning behaviours and the learning process (Long and Siemens 2011, 34). Even though it seems as if these two pillars of data analytics in higher education are separate, Hutchings, Huber and Ciccone (2011) share examples where the gap between academic and learning analytics has been bridged successfully, demonstrating the importance and relevance of such practices. However, Campbell and Oblinger (2007, 9-10), as well as Long and Siemens (2011, 34), state that the gap between academic analytics and learning analytics can only be bridged if academic staff are involved in the process. They furthermore point out that the involvement of academic staff in data analytics may contribute to improving student success, implementing more effective teaching practices and enhancing the scholarship of teaching and learning. In support, Chatti et al. (2012, 10) and Johnson et al. (2011, 29) also highlight the importance of teachers reflecting on teaching, learning, and assessment through the effective use of data analytics, if they are to improve student success rates. Chatti et al. (2012, 4) also put special emphasis on the importance of placing the "teacher" in the domain of data analyses. They suggest that a teacher focus might be of particular value in stimulating academic staff to reflect on their current teaching practices. Such self-reflection may assist academic staff to adapt their current teaching practices and potentially enhance the effectiveness - and thus the quality - of their teaching.

    Despite the widely recognised usefulness of data analytics and the need for the involvement of academic staff within this analytics domain, Macfadyen et al. (2014, 17) found that teachers still used data ineffectively to improve teaching and learning practices. Norton (2016, 1) is of the opinion that staff in the higher education environment are not sufficiently knowledgeable about the optimal use of the analysis of available datasets in their teaching. This view corresponds with that of Ellis (2013, 662), who points out that the benefits of using data analytics in higher education contexts in relation to student learning and teaching practices remain unclear to teachers, who could, therefore, ignore the benefits of and resist using data analytics. According to the Council on Higher Education (CHE), evidence-based (i.e. data-driven) curriculum reform and the adaptation of current teaching and learning practices in the South African higher education context are of the utmost importance to address continuing low student success rates (CHE 2016, 53). Even though data analytics is regarded as "critical" to improving student success and identifying effective teaching and learning practices (Norris and Baer 2013, 5; Chatti et al. 2012, 4; Campbell and Oblinger 2007, 1), ways should be sought to get academic staff more involved in this process if success is to be achieved in the South African higher education context.

    Given the background provided, the question arose whether data analytics could become a viable, value-adding endeavour for teachers, should they be provided with a structured pathway and the necessary support structures for the development of their data analytics skills. With this in mind, and faced with continuously low student success rates and limited participation by teachers in the analysis of their available student data, the author - an experienced higher education lecturer and teaching and learning manager - developed an operational data analytics framework for application in the Faculty of Economic and Management Sciences (FoEMS) at a South African university. The framework was to provide an innovative, structured platform for teachers to become involved in data analytics, as a launching pad for actions that had the potential to improve the quality of teaching and learning practices and, ultimately, student success outcomes.

    The aim of this article is consequently twofold: firstly, to share the development and basic features of our operational data analytics framework and, secondly, to highlight some of the value ensuing from its implementation, with the main focus on the evolvement of teacher (teaching) analytics as an indispensable component of the framework.

    The operational FoEMS framework was based on the theoretical underpinnings of data analytics frameworks currently utilised within the higher education context and, therefore, the discussion commences with an overview of data analytics in higher education and the nature of existing data analytics frameworks. This is followed by an overview of the development and adaptation of the operational FoEMS framework, showing how the new framework evolved from features in existing frameworks and other developments in the faculty during implementation. This article, however, does not aim to identify and address all the factors that could influence student success, but rather focuses on the innovative use of existing, available datasets through a focused, goal-orientated approach to motivate students, staff and management to collectively identify problems and search for possible solutions to improve student success. The discussion concludes with an overview of the possible effects the inclusion of teacher (teaching) analytics (as an extension to existing data analytics frameworks in higher education) could have on student success, the quality of teaching and learning practices and teacher development.

    DATA ANALYTICS IN HIGHER EDUCATION

    Data analytics rests on a set of data sources from which data can be collected, integrated and analysed. This range of sources enables more meaningful interpretations than statistics in isolation (Ali et al. 2012, 484). The inclusion of a range of data sources also enables stakeholders to "choose from a flexible and extendable set of indicators" (Dyckhoff et al. 2012, 61). Since the inception of data analytics in the higher education environment, two distinct components have been identified, namely, academic analytics and learning analytics (see summary in Figure 1).

    Academic analytics is the foundational aspect of data analytics in higher education. Demographic data, academic data, admission data, and financial data are included under academic analytics (Long and Siemens 2011, 34; Campbell and Oblinger 2007, 4) and are mainly utilised to assist institutional decision-making with regard to student success, retention and graduation rates (i.e. analysing, reporting and predicting), but also in decision-making with regard to financial aspects, human resources and institutional research (Campbell and Oblinger 2007, 1; Goldstein 2005, 7). Due to the nature of the application of academic analytics, Goldstein (2005, 5) noted limited use by deans, heads of departments and teachers, even though Van Barneveld, Arnold, and Campbell (2012, 3-4) are of the opinion that the overlapping area between academic analytics and learning analytics could be of benefit to the student, teacher, and department.

    The second component of the twofold data analytics frameworks, as applied in higher education contexts, is learning analytics (see Figure 1). This component is of more direct importance to students and teachers, as it clearly focuses on the learning process and includes aspects such as networking, discourse, conceptualisation, personalisation, and patterns of student performance (Siemens 2013, 1380; Van Barneveld et al. 2012, 4; Long and Siemens 2011, 34). Van Barneveld et al. (2012, 4), however, point to the limited use of learning analytics, specifically on an institutional level.

    Initially, data collection or mining within the higher education context was pedagogically neutral (Ferguson 2012, 309), as it focused on student data captured on institutional (computerised) systems, i.e. academic analytics (Campbell and Oblinger 2007, 4). The aim of the data mining initiatives was mainly to assess student progress, predict student success and identify challenges facing student learning (Johnson et al. 2011, 28; Fynn and Adamiak 2018, 82).

    However, since 2008, more pedagogically-oriented research has been introduced, presenting an additional focus area within the research field of data analytics in higher education. The research focused, more specifically, on understanding the learning process, i.e. learning analytics (Ferguson 2012, 309) in relation to academic analytics. Following this additional focus on the integration of academic and learning analytics, numerous authors highlight the importance of student-centredness in this process (Norton 2016, 1; Kajitani 2014, 24; Wright et al. 2014, 28; Johnson et al. 2011, 28). Johnson et al. (2011, 28), therefore, rightfully state that, in order to take data analytics forward effectively on an integrated pathway within the higher education environment, teachers should understand the technical possibilities as well as the pedagogical value of analysing their available datasets. Long and Siemens (2011, 34) predict that data analytics will be used more often on course (and departmental) level; they emphasise the application of data analytics by individual teachers and highlight the need for teachers to have the appropriate skills needed for using data analytics.

    Two areas of concern for taking data analytics forward (i.e. incorporating it on individual staff level) are the question of "big data" versus smaller data sets, and the question of teacher focus within learning analytics. Natek and Zwilling (2014, 6406-6407) allude to the potential positive value of smaller data sets for providing usable data, and the contribution they could make to institutional knowledge management systems. Accordingly, they encourage teachers to incorporate data analyses into their daily work. Clow (2013, 683) supports this notion by indicating that teachers should utilise the new opportunities data analytics offers to achieve "richer conceptions of student learning" and to change current teaching practices. With regard to teacher focus, Kajitani (2014, 26) and Campbell and Oblinger (2007, 10) warn that teachers could focus so strongly on their own performance when dealing with data analytics that they are distracted from the overarching aim and miss realising the full potential of the analysis of available datasets. They propose that the focus, when implementing data analytics frameworks on a wider scale, should remain on the use of data to enhance collaboration among teachers (and administration), support student understanding (with data) and, lastly, to set goals. Correspondingly, Johnson et al. (2011, 29) foresee data analytics in higher education offering a system that supports teachers in identifying students' needs but also assists them to adapt their teaching. From the above views, the expectation is thus that data analytics has the potential to change teachers' perceptions about teaching, learning, and assessment.

    EXISTING DATA ANALYTICS FRAMEWORKS

    Initially, the framework suggested by Campbell and Oblinger (2007, 3-8) followed a cyclical approach to learning analytics, which included the phases of capture, report, predict, act and refine. Building on this approach, Clow (2012, 134) also presented a cyclical learning analytics framework, which showed a great deal of similarity to the framework of Campbell and Oblinger. Clow's framework also emphasised the importance of closing the feedback loop in learning analytics by means of meaningful interventions that impact positively on learners. Elaborating on these cyclical frameworks, Elias (2011, 17) recognised the importance of other elements in learning analytics by strategically positioning organisations, computers, people and theory in the centre of her learning analytics framework. Even though these central aspects are still encapsulated by the initial cyclical framework, her framework reflects the importance of integrating high-quality data, teaching and learning theory, human innovation, as well as appropriate data management systems, in an effort to derive a successful learning analytics framework that could effectively address the needs of learners, teachers and administrators.

    A further dimension was added to existing learning analytics frameworks by Greller and Drachsler (2012, 44), who included internal limitations and external constraints as part of their learning analytics framework. The internal limitations referred to "competencies" and "acceptance", whilst external constraints referred to "conventions" and "norms". The internal limitations mentioned by Greller and Drachsler (2012, 44) correspond to debates about the data competence of teachers and the acceptance of data analyses as part of their daily work. The authors, furthermore, relate external constraints to ethical practices within the field of learning analytics, which has now become another focal point in the evolvement of data analytics in higher education. The ethical focus can be linked to aspects such as privacy/confidentiality of student data, institutional policies about the use of student data, as well as legal implications for learners, teachers, and institutions in relation to data use in data analytics (Greller and Drachsler 2012, 50-51).

    The above-mentioned frameworks present a broad view of learning analytics, but with limited specificity. Even though these frameworks could be translated into different contexts, I am of the opinion that the framework of Norris and Baer (2013, 22) offers a more operational approach. Originally, the Davenport/Harris framework (in Norris and Baer 2013, 21) suggested four progressive levels of data analytics, namely, statistical analysis, forecasting, predictive modelling and, ultimately, optimisation. By adapting this framework for the higher education environment in particular, Norris and Baer (2013, 22) suggest that optimisation (as the highest level in the Davenport/Harris framework) can be equated with student success. Their framework consists of seven levels that are directly linked to student success, namely, student pipeline management, elimination of barriers to student retention and success, managing at-risk students by means of predictive analytics, evolving learning management systems, creating personalised learning environments and learning analytics applications, encouraging large-scale data mining, and extending student success measures beyond the formal training years (Norris and Baer 2013, 23).

    DEVELOPMENT OF AN OPERATIONAL FOEMS DATA ANALYTICS FRAMEWORK

    The starting point in the development of the operational data analytics framework was linking factors related to student success and institutional data sets to aspects already included in existing data analytics frameworks (as discussed above). The components of academic analytics and learning analytics recognised by authors such as Long and Siemens (2011, 34) as well as Campbell and Oblinger (2007, 4) were therefore maintained in the framework. Academic analytics in the initial FoEMS framework included the following (a schematic sketch of how such data might be consolidated follows the list):

    pre-admission data (e.g. type of secondary school attended),

    academic performance at secondary school,

    demographic data,

    registration data (e.g. bursary holder),

    benchmark student data (e.g. performance in national benchmark tests before admission),

    academic performance (e.g. participation in learning activities),

    examination admission data (e.g. performance in formal, formative assessments),

    examination data (e.g. admission mark, examination mark, final mark), and

    academic advising/credit load data.

    All the information above is regarded as being linked directly to the broader factors that influence student success in South African higher education (including student enrolment management, non-academic support needs and development, and academic support needs and development) (DHET 2013, 32).
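    By way of illustration, the sketch below shows how data fields of this kind could be consolidated into a single record per student per module before analysis. It is a minimal sketch only: the file names, column names and the admission threshold are illustrative assumptions and do not represent the faculty's actual systems.

```python
# Minimal, illustrative consolidation of academic analytics fields.
# All file and column names are assumptions, not the FoEMS systems.
import pandas as pd

admissions = pd.read_csv("admissions.csv")      # student_id, school_type, nbt_score
registration = pd.read_csv("registration.csv")  # student_id, module, bursary_holder
assessment = pd.read_csv("assessment.csv")      # student_id, module, admission_mark,
                                                #   exam_mark, final_mark

# One consolidated record per student per module.
records = (
    registration
    .merge(admissions, on="student_id", how="left")
    .merge(assessment, on=["student_id", "module"], how="left")
)

# Example indicator: flag students below a hypothetical examination
# admission threshold of 40 per cent (an assumed faculty rule).
ADMISSION_THRESHOLD = 40
records["below_admission_threshold"] = records["admission_mark"] < ADMISSION_THRESHOLD
print(records.head())
```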

    Learning analytics in existing data analytics frameworks typically focuses on student engagement, student performance and the learning process of students (Siemens 2013, 1380; Van Barneveld et al. 2012, 4; Long and Siemens 2011, 34), and this focus was maintained in the initial, operational data analytics framework for the FoEMS.

    As the purpose of the data analytics framework was to serve as a new, contextualised operational system in the FoEMS, it focused on only the first three levels of the Norris/Baer framework, though a further extension of the framework towards the other levels is foreseen. The three features of the Norris/Baer framework included in our framework (see Figure 2) were, therefore, the following:

    (1) Student pipeline management (by means of managing "at-risk" students),

    (2) Elimination of barriers to student retention and success (through identifying and addressing barriers to learning), and

    (3) Management of at-risk students (by means of monitoring and addressing student engagement in academic support and development activities).

    These features spelled out the goals for the implementation of an operational data analytics framework for the FoEMS. These goals, in themselves, ensured a collective focus and clarity with regard to the implementation of the framework amongst all stakeholders (i.e. teachers, learners, managers) in the FoEMS. Figure 2 portrays the initial, operational data analytics framework, which formed the starting point for the implementation of data analytics in the FoEMS.
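    To make goals (1) and (3) concrete, the sketch below illustrates one common form of predictive at-risk analytics: estimating each student's probability of failing a module from early engagement and performance data, and ranking students for follow-up support. It is a hedged illustration under assumed data and column names (and an assumed 50 per cent pass mark); the article does not prescribe a particular model.

```python
# Illustrative at-risk flagging with a simple predictive model.
# The data file, feature names and pass mark are assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

records = pd.read_csv("module_records.csv")  # hypothetical consolidated extract

features = ["nbt_score", "lms_logins", "tutorial_attendance", "test1_mark"]
X = records[features].fillna(records[features].median())
y = (records["final_mark"] < 50).astype(int)  # 1 = failed the module

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Hold-out accuracy:", round(model.score(X_test, y_test), 2))

# Rank students by estimated probability of failure to prioritise support.
records["risk_score"] = model.predict_proba(X)[:, 1]
print(records.sort_values("risk_score", ascending=False)
      [["student_id", "risk_score"]].head(20))
```

    In a setting such as the one described here, such scores would serve only to prioritise referral to academic support and development activities, not to label students.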

    The implementation of the initial FoEMS framework provided us with a structured base, from which we could gradually involve teachers in the analyses of their available datasets. It also became a starting point for further contextualisation and adaptation of the framework to address the problems I faced in the FoEMS, such as the low student success rate.

    Since its introduction in 2014, the FoEMS framework has been implemented progressively in 72 undergraduate modules across all five departments of the FoEMS (School of Accountancy, Departments of Business Management, Economics, Industrial Psychology, and Public Administration and Management). All undergraduate modules in the FoEMS are included in a three-year rolling cycle. The schedule for inclusion in the framework provides for a balanced mix between modules from the different departments as well as from the different study years. This approach attempts to provide the faculty management with a broader view of student success annually, even though not all undergraduate modules are included on an annual basis.

    A clear, structured pathway is followed in the operational process: Upon completion of data collection in a specific module, the data is captured and the teacher involved in the teaching of that module is invited to an individual session with the faculty's teaching and learning manager and data analyst. During this session, the teacher is introduced to the available data sets and offered the opportunity to present practical classroom-based hypotheses. These hypotheses are then "tested" against the available data, a process that serves as a platform for critical engagement, reflection, and discussion with regard to teaching and learning in that specific module. If more in-depth statistical analysis is required, the FoEMS teaching and learning data analyst is involved in the process.
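    As a concrete illustration of the "testing" step, the sketch below works through one hypothetical classroom hypothesis - that students who attended tutorials regularly obtained higher final marks - using a simple two-sample test. The hypothesis, data file, column names and attendance cut-off are assumptions made for the sketch, not details of the actual sessions described above.

```python
# Testing a practical classroom hypothesis against module data (illustrative).
import pandas as pd
from scipy import stats

data = pd.read_csv("module_data.csv")  # hypothetical per-student extract

# Hypothesis: regular tutorial attendees obtain higher final marks.
attended = data.loc[data["tutorial_attendance"] >= 0.8, "final_mark"].dropna()
absent = data.loc[data["tutorial_attendance"] < 0.8, "final_mark"].dropna()

t_stat, p_value = stats.ttest_ind(attended, absent, equal_var=False)  # Welch's t-test
print(f"Mean final mark (attended): {attended.mean():.1f}")
print(f"Mean final mark (absent):   {absent.mean():.1f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value would support the hypothesis and prompt further reflection;
# it establishes association, not causation.
```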

    Adaptation of the initial data analytics framework in the FoEMS

    After the initial implementation of the data analytics framework in a number of modules in the FoEMS and my subsequent critical engagement with teachers, it became clear that a number of adaptations were essential. The desired changes mostly related to the learning analytics component of the framework, and the need to address a more comprehensive set of factors that influence student success in the FoEMS. Learning analytics in the initial framework typically focused on student engagement data, student performance and the learning processes of students. Even though these aspects are important, they do not directly address certain crucial factors related to student success, including curriculum compilation, teaching and assessment activities and/or practices, learning activities and/or practices, learning resources or academic support and development, graduate attributes, stakeholder involvement (e.g. experiential learning, practical-based learning) and employability. These "gaps" necessitated the adaptation of the learning analytics component of the initial data analytics framework, as explained below:

    Changes were made to the learning activities (learning process) component by including learning resources and curriculum design/compilation. This change underlines the importance of sound curriculum design and the utilisation of appropriate learning resources in learning activities in order to support student success.

    Assessment (student performance) was adapted to include the teaching activities (e.g. presentations, class activities, practical sessions, tutorial sessions) preceding the assessment, as self-reported by the relevant teacher and through content analysis of the module guides - highlighting the importance of constructive alignment in the learning process.

    Student engagement was expanded to focus on academic student development and student support as part of the learning process. In our resource-restricted environment, the majority of students needed extra academic support and development, and "engagement" as a single entity was therefore not sufficient but required an additional focus on students' engagement with academic support and development activities.

    Other features included in our contextualised data analytics framework, which have not been included in data analytics frameworks before, are employability, graduate attributes, and stakeholder involvement. Employability builds on the work of Kuh (2008, 21), who refers to the importance of exposing undergraduate students to authentic or real-world problems. It also links with the accountability of higher education institutions for building partnerships with the workplace and being responsive to existing needs (DHET 2013, xi). Graduate attributes, in addition, are specifically alluded to as part of student success (CHE 2014, 14) and could not be ignored in our context.

    The nature of the changed and newly included features necessitated the inclusion of qualitative data collection methods in order to obtain a more in-depth understanding of the associated factors - adding another distinctive feature, contrary to previous data analytics frameworks (which focused mainly on quantitative data sets). Figure 3 portrays the changes made to the initial data analytics framework to create a more comprehensive "picture" of student success in the FoEMS.

    The most important realisation after the implementation of the new framework and the initial interaction with the teachers critically reflecting on the data was that the nexus between the learner (learning) and teacher (teaching) cannot be ignored when dealing with student success. This awareness provided an invaluable opportunity to make provision for the teacher (teaching) in the analyses of data. Teaching (teacher) analytics was therefore included as a unique feature in our data analytics framework. Due to the clear link between teaching and learning, features in the teaching (teacher) analytics segment are firmly related to the features of the learning (learner) analytics, although with some differences in emphasis. In the teaching (teacher) analytics component, curriculum and assessment were separated. Curriculum focuses on constructive alignment, learning resources, teaching activities and student academic support and development. Assessment focuses on evidence of barriers to learning as well as the link between student performance in assessments and their engagement with academic support and development activities. Due to the importance of the development of graduate attributes and stakeholder involvement, the two features were also included as part of the teaching (teacher) analytics component (see Figure 4).
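    A minimal sketch of the assessment focus just described - relating student performance in assessments to engagement with academic support and development activities - might look as follows; the data file, column names, banding and pass mark are illustrative assumptions.

```python
# Illustrative link between support engagement and assessment performance.
import pandas as pd

data = pd.read_csv("module_data.csv")  # hypothetical per-student extract

# Band students by attendance of academic support/development sessions.
data["support_band"] = pd.cut(
    data["support_sessions_attended"],
    bins=[-1, 0, 3, 6, 100],
    labels=["none", "1-3", "4-6", "7+"],
)

# Pass rate and mean final mark per engagement band (50% pass mark assumed).
summary = data.groupby("support_band", observed=True).agg(
    students=("student_id", "count"),
    pass_rate=("final_mark", lambda m: (m >= 50).mean()),
    mean_mark=("final_mark", "mean"),
)
print(summary)
```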

    The addition of teaching analytics to our operational data analytics framework (see Figure 4) made a major contribution to our understanding of student success in the FoEMS, but more importantly, it resulted in a number of other unexpected benefits and opportunities for the further extension of the framework.

    EVOLVEMENT OF TEACHING (TEACHER) ANALYTICS

    The unexpected benefits and opportunities, which are presented in the next section of this article, led to a richer version of teaching (teacher) analytics in our operational data analytics framework and provide motivation for the future inclusion of teaching (teacher) analytics as an indispensable component of higher education data analytics frameworks.

    The introduction of teachers to data analytics as an impetus for the development of a richer model

    The value of including teachers in the data analytics environment in higher education is evident, but questions have been raised about the competency and commitment of teachers to deal with this involvement (Norton 2016, 1; Macfadyen et al. 2014, 17; Ellis 2013, 662). The FoEMS framework that was implemented, in this case, provided a structured introduction for teachers to engage with teaching and learning data. Data was collected from the module(s) the teachers presented, which created an appropriate setting for initial data discussion and introduction to the analyses of available datasets in their own module(s). In the introductory data discussion session, the teacher became the "learner", whilst the FoEMS teaching and learning data analyst became the "teacher" when discussing the module-specific data.

    It was interesting to note how the involvement of teachers with data from their "own" modules motivated them - it resembled the four motivational conditions of Wlodkowski (2003, 40) for fostering motivation for professional development. Inclusion was established, as both these partners (i.e. FoEMS teaching and learning data analyst and teacher) felt respected and connected - through either their expertise in the data analytics environment or their disciplinary environment. Data was used as a basis to promote the communication needed to understand the objectives and desired outcomes from both separate, yet interconnected, environments. Personal relevance was relatively high for both groupings, due to the relevance of the data ("own" module data versus data for institutional reporting), which created a positive attitude towards the session. Hypotheses relating to classroom interactions could be tested against available data, which created a platform for more critical reflection from those involved, resulting in the creation of more meaning. Doing so induced, almost spontaneously, Wlodkowski's (2003, 40) third motivational condition, namely, enhancing meaning through challenging, thoughtful learning experiences. Lastly, in the initial session, a seed was planted and the process of engendering competence (the fourth motivational condition) commenced, through more productive use of the data available to inform teaching and learning practices. This type of motivation led to the creation of a continuous, collaborative relationship between the teacher and the FoEMS teaching and learning data analyst.

    The implementation of our adapted data analytics framework, therefore, provided the initial platform from which teachers could embark on a road towards more scholarly teaching. This is in line with the views of Clow (2013, 683), who urges teachers to become involved in data analytics, as it could provide them with "richer" ideas about learning that could inevitably influence their teaching practice. Kajitani (2014, 26), however, warns that teachers should understand that the purpose of their involvement with data analytics is not to compare themselves with other teachers (i.e. not specifically aimed at showcasing their teaching excellence), but rather to assist them to collaborate with other teachers, assist students to understand their learning better and to set teaching (teacher) and learning (learner) goals.

    Resultant scholarly approaches to teaching and learning

    The initial introduction of teachers to the analyses of available datasets in their module(s), and the possibilities it offered for further personal and professional development, was a promising first step on the path to teacher development, as suggested by Kern et al. (2015, 5-7). Kern and co-authors present four quadrants for the development of teachers, namely, the practice of teaching, sharing about teaching, scholarly teaching and the scholarship of teaching and learning. As teachers were provided the opportunity to test practical, classroom-based hypotheses against available data in the initial session after the implementation of the adapted data analytics framework, critical reflection on their "practice of teaching" emerged. Even though it was done on a small scale, teachers could share information about their teaching with the FoEMS teaching and learning data analyst.

    As more individual teachers from the FoEMS were introduced to the data analytics environment, a need emerged for an additional support structure, where further discussions between teachers could take place. A faculty-based, interdisciplinary interest group was formed. Page, Edwards and Wilson (2012, 32-33) allude to the benefits of working in a group, namely, increased accountability, structure, collaboration, motivation and an increase in scholarly output. They point out that such groups could even shift the culture in an educational unit, but for this to happen, participants need to be proactive and act with intent. Quinnell et al. (2010, 25) add that these group interactions are valuable if performed within disciplinary boundaries, as they support discipline-based teaching and learning, though they could be supplemented by interdisciplinary discussions as teachers become more critical reflectors on their own and others' work.

    The value of group interactions is taken a step further by Pope-Ruark (2012, 364-365), who presents a continuum of how scholarly engagements can develop teachers within their specific discipline-based communities. In addition to developing personally and professionally, such interactions can support the development of teaching and research frameworks within specific professional communities, influence theory-building challenges within professional communities, and influence disciplinary identities (Pope-Ruark 2012, 371-373). As suggested by Pope-Ruark, the implementation of the adapted data analytics framework in our faculty supported the development of teachers on a discipline-specific, but also interdisciplinary, level. The critical engagement between teachers, from an interdisciplinary view, was motivating and collaborative and assisted individuals with their discipline-specific development as scholarly teachers. It also acted as the incentive for their individual development paths towards the scholarship of teaching and learning.

    Even though the development of teachers on the suggested continuums of teacher (teaching) development takes place in a number of cases, a study by Vajoczki et al. (2011, 12) reports on the limited evidence of publications/outputs by teachers, suggesting that alternative support structures should be put in place to help these teachers to share their work. An increase in scholarly outputs (i.e. conference presentations) from our interest group after the implementation of the data analytics framework highlights the value of this interdisciplinary space that had been created for teachers to collaborate with and motivate each other, but also to improve accountability in relation to research outputs, within the group.

    Against the background of teacher development, it should, however, be emphasised that defining teaching excellence remains challenging, especially when it is linked to a reward system. Such reward systems might result in a striving for excellence being reduced to a mere "evidence-gathering process" (Wood and Su 2017, 460). Kern et al. (2015, 5-6), therefore, rightfully state that excellent teachers are highly likely to engage in reflection on their practices (practice of teaching), and might share it informally (sharing about teaching), which would still impact positively on the students' learning experiences. The question is then whether teachers who are not formally involved in scholarly teaching or the scholarship of teaching and learning - and therefore do not have "evidence" - are excluded from being rated as excellent teachers. Nonetheless, some parallels could be drawn between the continuum/quadrants of teacher development (Kern et al. 2015, 5-6; Vajoczki et al. 2011, 12) and the definitions of teacher excellence from the study of Wood and Su (2017, 461) (see Table 1).

    The implementation of our data analytics framework afforded teachers the opportunity to move from the earlier levels of practice of teaching/sharing of teaching/good teaching, towards a more scholarly approach to teaching. In some cases, resistance to or fear of the transition to more scholarly teaching is caused by a lack of competence: teachers are unable to integrate data analytics or big data sets that are available at institutions with their daily teaching practice. The implementation of the FoEMS framework provided a structured, demarcated, yet integrated, platform to introduce teachers to data analyses and provide continuous support structures for their development in this field, and thus to becoming scholarly teachers.

    The influence of the operational FoEMS data analytics framework on student and teacher success

    The initial impetus for the development and implementation of the operational data analytics framework in the FoEMS was persistently low student success rates, together with the expectation that the framework would lead to an improvement in the quality of teaching and learning. Natek and Zwilling (2014, 6407) are of the opinion that the inclusion of data analyses of available datasets in the daily activities of teachers could improve student success rates. In addition to improving student success rates, analyses of these small data sets could provide additional useful results, and assist higher education institutions to develop their larger data analytics management systems further.

    The implementation of our framework was thus initially based on the viewpoint of Natek and Zwilling (2014, 6407) that teachers should be involved in data analytics, as it could lead to improved student success rates. However, the implementation of the framework had further (un)intended outcomes, such as increased engagement of teachers with learners. This outcome can be attributed to critical reflection by teachers on the available data, which raised further critical questions about the learning process. The reflection also contributed to the adoption of a more scholarly approach by the teachers, in which critical reflection became central to their daily teaching practices. A steady improvement in student success rates in the FoEMS has been observed since the implementation of the data analytics framework. It is acknowledged that other variables might have contributed too, and, therefore, the improvement cannot be solely attributed to the implementation of the framework, even though it is believed that the framework played a significant role. Figure 5 portrays the current operational FoEMS data analytics framework, featuring the (teaching) values specifically associated with teaching (teacher) analytics as a second, but inseparable, "field" of the framework.

    CONCLUSION

    Existing data analytics frameworks (inclusive of both academic and learning analytics) in higher education have not given sufficient recognition to the teacher as a central element, even though the possible benefits thereof have been described (Long and Siemens 2011, 34; Clow 2013, 683; Natek and Zwilling 2014, 6406-6407). In order to provide a structured platform for the introduction of teachers to the field of data analytics in higher education, a contextualised, operational data analytics framework was developed and presented in this article. Through the implementation of this framework, the value of data analytics was highlighted, also in a South African context, through the steady improvement in student success within the faculty.

    Another unique feature of the framework is the particular emphasis placed on the teacher, by adding teaching (teacher) analytics (focusing on the teacher and teaching process) to the framework. This resulted in the framework serving as a successful platform for the introduction of teachers to data analytics, eliciting critical reflection amongst teachers and even initiating a move towards scholarly development of teachers. Although done on a small scale, the scholarly development of teachers has already resulted in several research outputs within the field of teaching and learning that had not previously been noted in the FoEMS. These positive teaching (teacher) outcomes highlight the value being added to the current knowledge base on data analytics in higher education by the addition of teaching (teacher) analytics as an indispensable component of our data analytics frameworks. The value of using (richer) qualitative datasets within this domain, in particular, has also come to the fore.

    In addition to the practical detail shared, the adapted data analytics framework presented in this article provides a basis from which further research, with a specific focus on including the teacher in the data analytics environment, can be performed. The resultant impact of teaching (teacher) analytics on the planning, adaptation, and implementation of large-scale data analytics frameworks within higher education also needs to be researched further and tracked in the future.

    REFERENCES

    Ali, L., M. Hatala, D. Gasevic and J. Jovanovic. 2012. A qualitative evaluation of evolution of a learning analytics tool. Computers & Education 58: 470-489.

    Campbell, J. P., P. B. DeBlois and D. G. Oblinger. 2007. Academic analytics. A new tool for a new era. EDUCAUSE Review Jul/Aug: 40-57. https://er.educause.edu/articles/2007/7/academic-analytics-a-new-tool-for-a-new-era

    Campbell, J. P. and D. G. Oblinger. 2007. Academic analytics. EDUCAUSE Oct: 1-20. https://net.educause.edu/ir/library/pdf/PUB6101.pdf

    Chatti, M. A., A. L. Dyckhoff, U. Schroeder and H. Thüs. 2012. A reference model for learning analytics. International Journal of Technology Enhanced Learning 4(5/6): 1-22. http://www.thues.com/upload/pdf/2012/CDST12_IJTEL.pdf

    CHE see Council on Higher Education.

    Clow, D. 2013. An overview of learning analytics. Teaching in Higher Education 18(6): 683-695.

    Clow, D. 2012. The learning analytics cycle: Closing the loop effectively. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge - LAK, 134.

    Council on Higher Education. 2016. South African Higher Education Reviewed. Two Decades of Democracy. Pretoria: CHE.

    Department of Higher Education and Training. 2013. White Paper for Post-School Education and Training. Building an Expanded, Effective and Integrated Post-School System. Pretoria: DHET.

    DHET see Department of Higher Education and Training.

    Dietz-Uhler, B. and J. E. Hurn. 2013. Using learning analytics to predict (and improve) student success: A faculty perspective. Journal of Interactive Online Learning 12(1): 17-26.

    Dyckhoff, A. L., D. Zielke, M. Bultmann, M. A. Chatti and U. Schroeder. 2012. Design and implementation of a learning analytics toolkit for teachers. Educational Technology & Society 15(3): 58-76.

    Elias, T. 2011. Learning analytics: Definitions, processes and potential. http://learninganalytics.net/LearningAnalyticsDefinitionsProcessesPotential.pdf

    Ellis, C. 2013. Broadening the scope and increasing the usefulness of learning analytics: The case for assessment analytics. British Journal of Educational Technology 44(4): 662-664.

    Ferguson, R. 2012. Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning 4(5/6): 304-317.

    Fynn, A. and J. Adamiak. 2018. A comparison of the utility of data mining algorithms in an open distance learning context. South African Journal of Higher Education 32(4): 81-95.

    Goldstein, P. J. 2005. Academic analytics: The uses of management information and technology in higher education. ECAR Research Study Volume 8. http://www.educause.edu/ers0508

    Greller, W. and H. Drachsler. 2012. Translating learning into numbers: A generic framework for learning analytics. Educational Technology & Society 15(3): 42-57.

    Hutchings, P., M. T. Huber and A. Ciccone. 2011. The scholarship of teaching and learning reconsidered. Institutional integration and impact. California: Jossey-Bass.

    Johnson, L., R. Smith, H. Willis, A. Levine and K. Haywood. 2011. The 2011 Horizon Report. Austin: The New Media Consortium.

    Kajitani, A. 2014. Make friends with student data. Educational Horizons 93(2): 24-26.

    Kern, B., G. Mettetal, M. D. Dixson and R. K. Morgan. 2015. The role of SoTL in the academy: Upon the 25th anniversary of Boyer's scholarship reconsidered. Journal of the Scholarship of Teaching and Learning 15(3): 1-14.

    Kuh, G. D. 2008. High-impact educational practices: What they are, who has access to them, and why they matter. Washington: Association of American Colleges and Universities.

    Long, P. and G. Siemens. 2011. Penetrating the fog. Analytics in learning and education. EDUCAUSE Review Sept/Oct: 31-40. https://net.educause.edu/ir/library/pdf/erm1151.pdf

    Lourens, A. and D. Bleazard. 2016. Applying predictive analytics in identifying students at risk: A case study. South African Journal of Higher Education 30(2): 129-142.

    Macfadyen, L. P., S. Dawson, A. Pardo and D. Gasevic. 2014. Embracing big data in complex educational systems: The learning analytics imperative and the policy challenge. Research & Practice in Assessment 9: 17-28.

    Natek, S. and M. Zwilling. 2014. Student data mining solution-knowledge management system related to higher education institutions. Expert Systems with Applications 41: 6400-6407.

    Norris, D. M. and L. L. Baer. 2013. Building organizational capacity for analytics. Louisville, CO: EDUCAUSE Publications. https://library.educause.edu/resources/2013/2/building-organizational-capacity-for-analytics

    Norton, L. 2016. What student data will mean for higher education. Press release on the Inquiry report: From bricks to clicks: The potential of data analytics in higher education. Higher Education Commission. http://www.policyconnect.org.uk/hec/sites/site_hec/files/he_commission_press_release_-_from_bricks_to_clicks.pdf

    Page, C. S., S. Edwards and J. H. Wilson. 2012. Writing groups in teacher education: A method to increase scholarly productivity. SRATE Journal 22(1): 29-35.

    Pope-Ruark, R. 2012. Back to our roots: An invitation to strengthen disciplinary arguments via the scholarship of teaching and learning. Business Communication Quarterly 75(4): 357-376.

    Quinnell, R., C. Russell, R. Thompson, N. Marshall and J. Cowley. 2010. Evidence-based narratives to reconcile teaching practices in academic disciplines with the scholarship of teaching and learning. Journal of the Scholarship of Teaching and Learning 10(3): 20-30.

    Siemens, G. 2013. Learning analytics: The emergence of a discipline. American Behavioral Scientist 57(10): 1380-1400.

    Vajoczki, S., P. Savage, L. Martin, P. Borin and E. D. H. Kustra. 2011. Good teachers, scholarly teachers and teachers engaged in scholarship of teaching and learning: A case study from McMaster University, Hamilton, Canada. The Canadian Journal for the Scholarship of Teaching and Learning 2(1): online. http://ir.lib.uwo.ca/cjsotl_rcacea/vol2/iss1/2

    Van Barneveld, A., K. E. Arnold and J. P. Campbell. 2012. Analytics in higher education: Establishing a common language. EDUCAUSE Learning Initiative 1. http://net.educause.edu/ir/library/pdf/ELI3026.pdf

    Wlodkowski, R. J. 2003. Fostering motivation in professional development programs. New Directions for Adult and Continuing Education 98: 39-47.

    Wood, M. and F. Su. 2017. What makes an excellent lecturer? Academics' perspectives on the discourse of "teaching excellence" in higher education. Teaching in Higher Education 22(4): 451-466.

    Wright, M. C., T. McKay, C. Hershock, K. Miller and J. Tritz. 2014. Better than expected. Using learning analytics to promote student success in gateway science. Change 43(1): 28-34.