    South African Journal of Higher Education

    On-line version ISSN 1753-5913

    S. Afr. J. High. Educ. vol.38 n.6 Stellenbosch Nov./Dec. 2024

    https://doi.org/10.20853/38-6-5970 

    GENERAL ARTICLES

     

    Blended, flipped and lit: student perceptions and performance under blended learning with a flipped classroom and a lightboard

     

     

    J. Winfield I; E. Whitelaw II

    I College of Accounting, University of Cape Town, Cape Town, South Africa. https://orcid.org/0009-0002-0968-8668
    II Southern Africa Labour and Development Research Unit, University of Cape Town, Cape Town, South Africa. https://orcid.org/0000-0002-1466-3580

     

     


    ABSTRACT

    Promoting student success is a key objective for higher education institutions in South Africa, and the post-pandemic era presents the sector with a novel opportunity to take advantage of improved online learning infrastructure and skills. As institutions face critical decisions regarding teaching modalities, it is imperative to establish and enhance evidence on the extent to which blended learning can - or cannot - facilitate student success in a post-pandemic setting.
    This article examines student perceptions of and academic performance in a large first-year accounting class which recently implemented a blended learning model that involved a flipped classroom approach and video lessons filmed with a lightboard. Using a combination of qualitative and quantitative analyses, we find that students perceive the blended model as advantageous to their learning experience and perceive video lessons filmed with a lightboard to be especially valuable. These positive sentiments are reflected in improved performance across the entire grade distribution. Using regression analysis that allows us to account for observable differences in the characteristics of those taught in person and those taught via the blended model, we find a statistically significant 7 per cent improvement in students' final grades. We find an even stronger association between the blended model and improved performance for students who have experienced gaps and disparities in education and life experiences. Ultimately, this study, conducted at a contact university in South Africa, makes a strong case for the benefits of blended learning, and reveals one way to structure the learning activities in a large class to take full advantage of these benefits.

    Keywords: blended learning, flipped classroom, lightboard, regression analysis, higher education, student success


     

     

    INTRODUCTION

    Like many higher education institutions around the globe, a key priority of South African universities is to promote student success (Department of Higher Education and Training 2013). The post-pandemic era has presented educators with a novel opportunity to take advantage of instructors' and students' newfound facility with online learning activities, fostered during Covid-19 closures, to enhance students' learning experience and academic success. By strategically resuming in-person learning activities to supplement the most effective online activities, courses can combine the strengths of both online and in-person modalities, in the way posited by much blended learning research (Lapuh Bele and Rugelj 2007; Muxtorjonovna 2020).

    That said, some early anecdotal evidence from South African universities suggests hesitancy in this regard: many courses seem to have simply returned to the largely in-person style of teaching and learning that was in effect before the pandemic, perhaps supplemented by just a handful more online learning activities than before. At this critical juncture, when such decisions about teaching modalities are being made, it is crucial to establish and enhance evidence on the extent to which blended learning can - or cannot - facilitate student success.

    This article evaluates student perceptions of and academic performance in a large first-year accounting class at a contact university in South Africa, which recently implemented a blended learning model of instruction. We evaluate the perceptions and performance of students under the course's blended model (whom we have called the "blended group") in direct comparison with those enrolled under the in-person, pre-pandemic model (the "in-person group"). Because the two groups shared consistent curricula, desired learning outcomes, course material, and examination standards, and because the staff responsible for delivering the blended iteration were also involved in delivering the in-person iterations, we are able to isolate the impact of the blended learning model relative to the in-person model.

    Although we evaluate an instance of blended learning, we cannot claim to assess blended learning in general terms. Any two courses using a blended learning model may vary not only as to the ratio of in-person to online learning activities, but also as to which instructional means and methods those activities employ (Graham 2021). This is as it should be, given that different courses require different forms of blended learning to suit the content and students' needs (Poon 2013). The particular form of blended learning assessed in this study is combined with a flipped classroom approach, in which students learn foundational concepts before coming to class. The means by which they learn the foundational concepts is via asynchronous viewing of videos filmed using an innovative teaching tool called a lightboard. This article is therefore an evaluation of the combination of the three new features of the course: blended learning, flipped classroom, and lightboard videos.

    The article proceeds by reviewing literature on the theory and benefits of blended learning, flipped classrooms, and lightboard videos, before turning to discuss the research context and methodology. Findings are presented in two parts - we first discuss results on student perceptions, and thereafter discuss the findings on student performance. Our study's findings contribute to current discourse in South Africa on student perceptions of accounting education and pedagogies (e.g., Sexton and Rudman 2022; Steenkamp and van Schalkwyk 2023); flipped classrooms (e.g., Gerber and Eybers 2021); blended learning (Janse van Rensburg and Oguttu 2022); and student academic performance, especially in accounting (e.g., Bruwer and Ontong 2020; Rossouw and Brink 2021; Papageorgiou 2022).

     

    REVIEW OF THEORY AND RELATED WORK

    Blended learning

    Blended learning has been defined in a wide variety of ways (see, for example, Garrison and Kanuka 2004; Picciano 2006; Hrastinski 2019). For our purposes, we use a definition that focuses purely on the blending of modalities, such as Graham's (2021, 196) parsimonious "blended learning is the strategic combination of online and in-person learning". However, this definition needs a small refinement to ensure that courses that are delivered mostly using an in-person modality, but also include a smattering of online learning resources - for example, readings and lecture recordings accessed via a learning management system (LMS) - are not included in the definition. Similarly, a course where all learning happens online, except for a few in-person workshops, ought not to be considered truly blended. For these reasons, Allen and Seaman (2016) proposed that blended learning involves no less than 30 per cent, and no more than 80 per cent, of learning activities being online.
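    As a concrete illustration of the Allen and Seaman (2016) thresholds, a small helper could classify a course from the share of its learning activities that are online; the function and cut-off handling below are illustrative, not a published instrument:

```python
def classify_modality(online_activities: int, total_activities: int) -> str:
    """Classify a course's delivery mode from the share of learning activities
    that are online, using the 30 and 80 per cent thresholds proposed by
    Allen and Seaman (2016). Names and boundaries here are illustrative."""
    share = online_activities / total_activities
    if share < 0.30:
        return "in-person (with supplementary online resources)"
    elif share <= 0.80:
        return "blended"
    else:
        return "online"

# Example: 6 of 14 weekly learning activities delivered online -> "blended"
print(classify_modality(6, 14))
```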

    Numerous studies have sought to evaluate the overall benefits of blended learning, with many finding positive results. For example, the United States Department of Education (Means et al. 2013) commissioned a meta-analysis of schools, higher education institutions, and professional training, which finds that students who are educated through blended modalities outperform students who are taught only in person. Another meta-analysis focused on higher education finds similar effects (Bernard et al. 2014). Some studies do not find these positive effects, however. For example, Keller et al. (2009) observe no statistically significant difference in academic performance between a blended group and an in-person group.

    Other studies investigate student perceptions of blended learning. Martínez-Caro and Bolarín (2011) conduct a study of 21 undergraduate and postgraduate courses, establishing that student satisfaction is higher in blended courses than in in-person courses. Again, the positive effects are not universal, with some studies showing lower rates of student satisfaction in blended courses (e.g., Strayer 2012).

    In addition to student satisfaction, other benefits of blended learning have been hypothesized, including that: it develops digital skills necessary for today's world of work (Patmanthara and Hidayat 2018); it saves on operational cost by reducing instructor time (Vaughan 2007); and it makes education more accessible by allowing students to study after working hours and from a greater distance (Poon 2013). This expanded access would be particularly of value in a country like South Africa, where the government's National Development Plan calls for approximately 300 000 more enrolments in higher education institutions between 2021 and 2030, and almost two million more enrolments in Technical and Vocational Education and Training Colleges (Department of Higher Education and Training 2023).

    Existing research also reflects on barriers and challenges to successful blended learning delivery, which are much the same as for online learning more generally. For example, blended learning can have negative psycho-social effects because it generally places higher demands than in-person learning on student self-motivation and time management, and introduces perceptions of psychological distance between instructors and fellow students (So and Brush 2008; Sharaievska et al. 2022). However, such feelings are typically stronger under purely online learning conditions. The authors of a study on student perceptions of an online module in postgraduate accounting in South Africa, initiated during the pandemic, suggest that "Such disconnect could be alleviated by applying a blended learning approach in future, using the advantages of both the F2F and the online environment" (Steenkamp and van Schalkwyk 2023, 233).

    Additionally, resource constraints are often a substantial barrier because a switch to blended learning calls for wholesale instructional redesign, including reworking of the course overview, lesson plans, lesson materials and resources, and continuous review (Cheung et al. 2010). Another common barrier is created by a lack of, or intermittent, access to the internet, which prevents or inhibits students' engagement with online content (Almahasees, Mohsen, and Amin 2021). In developing countries like South Africa, these problems can be exacerbated by the digital divide (Mhlanga 2021).

    The flipped classroom

    Also known as the inverted classroom, the flipped classroom is an instructional model in which the activities that have traditionally taken place in a classroom, such as the learning of foundational content, are not performed in the classroom. Instead, students complete them before they arrive at class, where they do activities more closely related to traditional homework, such as learning to apply the content that they encountered before class (Bergmann and Sams 2012; Sohrabi and Iraj 2016). In this way, students are responsible for their own initial learning process, and the instructor instead helps them with more complex tasks (Lai and Hwang 2016).

    Flipped classrooms do not necessarily employ a blended learning model, because the flipped classroom need not include any online learning activities. However, the flipped approach lends itself to a blended model, because - especially with technological advances - asynchronous online learning activities are a potentially effective means by which students can access the foundational content (Fisher, Perényi, and Birdthistle 2021).

    Flipped classroom research has shown favourable effects. For example, a meta-analysis of 71 studies revealed generally positive academic performance, and several other advantages, including increased student motivation and more positive student attitudes (Akçayir and Akçayir 2018). A small-scale study in South Africa on an online flipped classroom reveals that more than 65 per cent of the 35 students surveyed commented that the ability to revisit discussions in the videos was one of the most valuable aspects (Gerber and Eybers 2021). This is extremely important in the South African context, where many students are not first-language speakers of English (the medium of instruction at the majority of institutions, including the one in our study).

    In the Netherlands, Van der Velde et al. (2021) find that students appreciate the flexibility of the flipped model, and report achieving a better understanding through active application and peer learning, in a large first-year class. However, the authors also recommended that students receive close guidance about how to prepare for class, that expectations be made explicit, and that external incentives be considered to keep students motivated.

    Lightboard videos

    A lightboard, also known as a glassboard or learning glass, is a glass screen placed between a camera and an online instructor, on which the instructor can write or draw with a fluorescent marker (Northwestern University 2016). Due to LEDs recessed into the frame of the lightboard, the glowing ink appears to float in front of the instructor. The lightboard is thus used in much the same way as a whiteboard in an in-person class, although the instructor never obscures any part of the board, and does not have to turn around to write, so can always look toward the camera lens. The lightboard can be used in a live online class using livestreaming software, or it can be used to create video recordings for asynchronous learning. In either case, the footage must be reflected horizontally in order for text handwritten on the lightboard to appear correctly to the viewer. Video editing techniques also allow for typewritten text to be overlaid on the screen, economising on time and space. Suitable images and videos can also be added, taking advantage of a wide range of multimedia tools.
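    As a rough illustration of the horizontal-reflection step described above, the sketch below mirrors each frame of a recording with OpenCV; the file names are placeholders, and real lightboard workflows may instead flip the footage in the camera, the streaming software, or a video editor:

```python
import cv2  # OpenCV; pip install opencv-python

# Placeholder file names for illustration only.
cap = cv2.VideoCapture("lightboard_raw.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

fourcc = cv2.VideoWriter_fourcc(*"mp4v")
out = cv2.VideoWriter("lightboard_mirrored.mp4", fourcc, fps, (width, height))

while True:
    ret, frame = cap.read()
    if not ret:
        break
    # Flip around the vertical axis so handwriting reads correctly to the viewer.
    out.write(cv2.flip(frame, 1))

cap.release()
out.release()
```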

    Since the lightboard is a relatively recent innovation, there has not yet been much research about it. The few extant studies indicate mostly favourable effects. For example, Rogers and Botnaru (2019) find that students who watch lightboard videos experience a modest performance increase and report improved understanding. Smith, Knight, and Penumetcha's (2017) analysis of student feedback commends it as innovative and engaging, and VanderMolen, Vu, and Melick (2018) point to its effectiveness for teaching foundational concepts in a flipped classroom environment. These positive effects may result from the way that the instructor's gaze and gestures are more visible to students in a lightboard video than they are in picture-in-picture videos, thereby improving the instructor's social presence (Bhat, Chinprutthiwong, and Perry 2015; Lubrick, Zhou, and Zhang 2019).

    Mixed results were reported by Stull et al. (2018), who claim to have conducted the first systematic investigation that compares learning through lightboard videos and conventional whiteboard lessons. The study observed that the lightboard improved both students' ratings of the experience and their test scores immediately after viewing a lightboard video, but the performance benefits did not persist a week later. They did not suggest a reason that learning acquired via the lightboard might fade but hypothesised that perhaps the control group learned enough while taking the initial test to equalise the scores on the later test.

     

    COURSE CONTEXT AND RESEARCH METHODOLOGY

    The two groups

    This study compares the academic performance and perceptions of two groups of students enrolled in a second semester, first-year accounting course at the University of Cape Town. The in-person group consists of 1 871 students from the two student cohorts enrolled in 2018 and 2019, before changes to the learning model were brought about by Covid-19. We use two years because the learning model employed was virtually identical, and so this gives a bigger data set to improve the statistical power of the study. We do not include years prior to 2018, because the results could be distorted by the fact that 2018 was the first year that a new financial aid policy was implemented in South Africa, and also the first in the recent past that the university did not experience shut-downs due to protest action. The blended group consists of the 963 students enrolled in 2022, which was the year in which pandemic restrictions were lifted sufficiently to resume in-person activities. In the intervening years - 2020 and 2021 - the course used an online model.1

    Some students (27% of the blended group and 37% of the in-person group) are in the university's Academic Development Programme (ADP). The primary purpose of ADP is to attract and retain students who have experienced gaps and disparities in education and life experiences. By providing a range of academic and psycho-social support, the programme aims to improve rates of graduation, sometimes over a deliberately extended period. Students not in ADP are often referred to as "mainstream" students.

    There were minimal differences between the blended learning programme offered to mainstream and ADP students in 2022, but for in-person learning in 2019 there were more considerable differences: the classes were taught separately by different individuals, and there was one extra lecture per week and several other optional learning interventions for ADP. We discuss this further when analysing student performance. Our key motivation in recognising these two streams is that the benefits of alternate learning models might differ depending on differences in previous education and life experiences.

    The two learning models

    Table 1 summarises the differences between the learning models experienced by the two groups in this study. The major features are indicated with an asterisk, and the in-person activities (for both groups) are shaded.

    The blended group's flipped classroom approach is evident in the section entitled "Activities related to learning how to apply the concepts". The in-person group had tutorial sessions, but more in-person instruction time is spent in the lectures shown at the top of the table, where the foundational concepts are taught. For the blended group, no in-person instruction time is spent on teaching the foundational concepts, and an extra 45-minute flipped classroom lecture is added each week to bolster the instruction time spent on applying concepts. Thus, this group experienced more in-person instruction to help them apply concepts than the in-person group, despite considerably less in-person teaching time overall.

    The blended group experienced more, and a greater variety of, learning activities. This is largely because of the online features available to the blended group. For example, because there were between five and seven lightboard video lessons each week, all on their own lesson page in the LMS, additional content could be added to each lesson page in a targeted way, e.g., additional examples related specifically to the relevant concepts, links to a dedicated topic about each lesson in the online Q&A tool, and detailed section references to the textbook. Also, because analytics are available to inform instructors about students' engagement with online material (e.g., completion rates, quiz scores, and upload and download counts), instructors are able to create personalised feedback for students, such as the mid-semester progress reports and weekly badges.

    The nature and structure of these learning activities conforms to key recommendations emerging from the literature about blended learning and the flipped classroom approach. For example, to mitigate time management risks, they are delivered according to a consistent weekly plan with explicit guidance about what work to do each day. The lightboard videos, knowledge checks and Q&A are designed to encourage student engagement despite being online. The flipped classroom lectures include group work to reduce the psychological distance between students. Lastly, the mid-semester reports and weekly badges are intended to counteract the lower self-motivation that can accompany blended learning.

    Methodologies

    To assess students' perceptions of the course, a comprehensive course evaluation was made available via the LMS to all students in the final week of the semester. This survey instrument3 included 27 questions, many of which used a five-point Likert scale (1 for strongly agree through to 5 for strongly disagree), which provided quantitative feedback. Also, every question included the option to add a comment, producing a large volume of qualitative feedback.
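    A minimal sketch of how such Likert responses might be tabulated, assuming a hypothetical CSV export with one row per respondent and columns Q1 to Q27 coded 1 (strongly agree) to 5 (strongly disagree); the file and column names are illustrative, not the course's actual export:

```python
import pandas as pd

# Hypothetical export: one row per respondent, one column per Likert item (1-5).
responses = pd.read_csv("course_evaluation_2022.csv")

likert_cols = [c for c in responses.columns if c.startswith("Q")]
summary = pd.DataFrame({
    "strongly_agree_%": (responses[likert_cols] == 1).mean() * 100,
    "agree_%": (responses[likert_cols] == 2).mean() * 100,
    "mean_score": responses[likert_cols].mean(),
})
print(summary.round(1))
```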

    In total, 830 students from the blended group responded (a response rate of 86.7%). The analysis of responses to this evaluation indicates how students perceived the blended learning model generally, and also how effective they perceived particular features of the model to be. In some cases, it is possible to make comparisons to the in-person group's perceptions, using data from course evaluations given to the 2018 and 2019 classes, which had a collective response rate of 81.1 per cent. In both groups, a high response rate was incentivised by linking completion of the survey to a small benefit in the way the quiz component of the final grade was calculated.

    To assess the effect on academic performance, we fit students' final grade data to an ordinary least squares (OLS) regression model of the blended and in-person groups. The precision of our estimated association between the blended learning model and academic performance is enhanced by controlling for the observable ability and background characteristics of both groups, which may have also influenced academic performance. These characteristics are students' gender, age, whether their home language is English, NBT4 grades, and school-leaving exam grades reflected in an Admission Point Score (APS).5

    The sample comprises first-year students who enrolled for the course for the first time in our years of interest (N = 1 962). We do not include students who are repeating the course,6 because their performance may be somewhat affected by their experience of a different learning model in a previous year, which could compromise our conclusions. Owing to the anonymous nature of responses to the survey instrument, responses cannot be linked to student performance.

     

    FINDINGS AND DISCUSSION

    Assessment of student perceptions

    Overall perceptions

    Perhaps the most fundamental question on the survey instrument asked students which delivery mode (blended, online, in-person) they felt was best for them, with explanations to ensure they understood the differences. Of the 830 respondents, 79 per cent preferred the blended model (657 students), 11 per cent preferred an online model (91 students), and 7 per cent preferred an in-person model (61 students), with 3 per cent responding "I don't know" (21 students). Comments in response to this question included:

    [1] "I prefer blended learning because you can access material time that suites your study plan and you also we are able to physically interact with [the lecturer] and the tutor."

    [2] "blended learning helps me to have time to try and understand work on my own without being pressurised by others understanding, and the lectures on Friday help to consolidate any information."

    At the time that students responded to the survey instrument, they had experienced online learning models in the first semester, before Covid-19 restrictions were reduced at the mid-year point. In the vast majority of cases, they had also experienced in-person models, which was the form of instruction that a few of their other courses resumed as soon as they were permitted to do so. Students were thus able to judge the different models from a reasonably well-informed perspective. That the blended model was preferred by seven times as many respondents as any other model is an unequivocal indication that their perceptions of the model were extremely positive.

    Perceptions of learning model features

    The survey instrument also included questions about specific features of the course. The prompt was "Please indicate whether you think each of the following ... learning activities has been valuable for your learning. If you have not engaged with a learning activity, please leave it blank." Available response options were based on a five-point Likert scale. Figure 1 summarises the responses.

    Comments in response to the prompt about the video lessons included the quotes below. There were many other responses which reported the same idea as these quotes.

    [3] "You can access material at a time that suits your study plan"

    [4] "I can complete the week's work at my own pace"

    [5] "Helps me to try and understand work on my own without being pressurized by others"

    [6] "You're able to rewind and rewatch sections"

    [7] "You can always go back to the video when there's something you don't understand"

    [8] "Allows students to absorb the course content from short concise videos rather than long lectures"

    Since the video lessons were offered to the blended group as a substitute for the lectures that were offered to the in-person group, we can compare responses about the video lessons with the responses about lectures in the in-person group's course evaluations. No equivalent general question was asked about lectures, but specific questions were asked about the individual instructors. Responding to "Lectures with this lecturer have contributed to my learning", the in-person group gave quite varied responses for the eight lecturers who lectured this group, with the percentage strongly agreeing ranging between 21 per cent and 69 per cent. The blended group's responses about the video lessons are above the top of this range (82% strongly agreeing), suggesting that the blended group appreciated the video lessons more than the in-person group appreciated lectures by any of their eight lecturers, despite the fact that one of them was the instructor who created the lightboard videos for the blended group.

    A follow-up question asked the blended group to rate their level of agreement with the statement "recorded videos are more valuable for my learning than they would be if they were created without a lightboard." Of the 589 respondents to this prompt, 460 (78%) strongly agreed, and 81 (14%) agreed. Comments commonly included remarks like the following:

    [9] "By being able to see the lecturer and watch them write out the work with you during the lecturer is much more effective than the traditional way recorded videos are done where a screen / presentation is recorded with audio only"

    [10] "The lightboard makes it easier to follow along and understand the concepts being taught. It makes it feel like I'm in a classroom environment"

    [11] "The lightboard used makes it incredibly easy to feel as though [the lecturer] is engaging with you despite it being a video"

    [12] "I prefer lightboard videos because we can see [the lecturer's] facial expressions and hand gestures, when he is explaining"

    [13] "The lightboard makes it easy for the lecturer to illustrate what they are speaking about while they can be seen by the student watching the lecture video. This helps because these are two essential parts in forming an understanding"

    [14] "I am a visual learner, so I like being able to see things step by step instead of being overwhelmed by slides full of information"

    As indicated by Figure 1, the knowledge checks were also seen by the blended group as making an important contribution to their learning. Comments in response to the question about the knowledge checks included:

    [15] "The knowledge checks helps with knowing how much information you have grasped and understood and the feedback helps"

    [16] "These helped to ensure that I pay attention while watching the videos"

    The weekly quizzes for both groups are intended to assess the students' understanding of foundational concepts. It is noteworthy that the blended group's responses to the question about the weekly quizzes were substantially more positive than the in-person group's responses to the equivalent question, which asked whether the weekly quizzes completed in tutorial sessions contributed to their learning (34 per cent strongly agreeing). The difference in perceived value is most likely explained by the timing: since the blended group's quiz was online, it could be inserted into the learning process at the most appropriate time, that is, when students had finished learning the concepts, but before they began learning to apply them. Since the in-person group's quiz was conducted in person, it had to be delayed until an appropriate in-person event, the tutorial session, which took place several days after the students had already begun to apply their conceptual understanding by answering the weekly assignment questions.

    The flipped classroom lecture, which took place on a Friday after students had completed the video lessons and the quiz, and before they attempted their assignments, was perceived less positively by the blended group than the other features mentioned so far. The quantitative data here is somewhat at odds with the qualitative comments received in response to the question about this activity, which were almost all positive, with students identifying its role as helping them to obtain a deep understanding, improving their critical thinking about the topics, helping with difficult concepts, assisting with learning how to manage time when answering questions, and giving them an opportunity to ask the lecturer questions directly. Also, in responses to the open-ended question "What aspect(s) of the course do you most appreciate?", 91 students mentioned the flipped classroom lectures. The only three negative comments were:

    [ 17] "By Friday we should have already completed the week's work because of the Thursday quiz, so i personally find going to a flipped friday lecture somewhat pointless"

    [ 18] "There should be more explaining on how to get to the answers"

    [19] "Questions that we are required to answer are so easy they do not help us to prepare for tests as complex questions are asked there"

    The above comments suggest that the flipped classroom lectures were not perceived as positively partly because some students did not understand the proper function of the class, and also because these classes, which were introduced for the first time for this group, could be refined by the teaching team to enhance their effectiveness. The design of this activity would probably benefit from attention to the recommendations of Van der Velde et al. (2021), e.g., that students receive close guidance about how to prepare for class, and that expectations be made explicit.

    Of all the features introduced specifically for the blended group, the least popular was the online Q&A tool. The instructors had aimed to make it as helpful as possible, creating forums for each lesson and question, and responding within a few hours to questions posted. Nonetheless, very few questions were asked. Even in the first week of the semester, only 15 students asked questions (less than 2% of the class). By the last week of the semester, no questions were asked. Part of the advantage of such a tool is that it gives students a chance to read their peers' questions, and the answers, but this was under-utilised too, with most questions being viewed by fewer than 10 students. It is difficult to say why this tool was not perceived more positively, as no qualitative feedback identified any problems with it. According to the university's digital learning consultants, the low usage rates conform with the experience of other courses, many of which introduced this tool during Covid-19 but found that it was used by very few students, even in the purely online environment. It seems that students prefer different ways of asking questions, especially in a blended context. As one commenter stated: "If I have the question I usually ask my tutor".

    The last feature we analyse is the weekly badges. These were added to the course design as a way to mitigate a challenge of both blended learning and the flipped classroom approach: that they can place higher demands on student self-motivation. Badges were congratulatory emails with fun GIFs sent to students who had performed well in the relevant week by completing all the knowledge checks, achieving at least 60 per cent on the weekly quiz, submitting a complete assignment, and attending their tutorial session. Recording this engagement with the week's activities was possible because students' online activities were tracked by analytics tools (see the sketch after the student comments below). As is shown in Figure 1, the badges were perceived very positively by students. This is perhaps surprising, because 48 per cent of the class earned fewer than four of the total eleven badges. Although there was one comment which suggested the system risks demotivating students, there were 21 comments which expressed enthusiasm for the system, including:

    [20] "The motivation from getting the badge is unbelievable. It makes you eager to do all the work as soon as it's released"

    [21] "It makes me feel like my efforts are acknowledged which is very important"

    [22] "Really gives us that extra push when we really don't feel like going the extra mile. And it just makes the whole thing fun and hence more learning takes place"

    [23] "It was really nice to be held accountable"

    Assessment of student performance

    Table 2 displays a summary of the data used in our analysis. The first two columns show average characteristics of students in the in-person and blended groups. The first row shows students' average final grade for the course. For each student, their final grade is a weighted average of two tests, an exam, and course work.7 The subsequent rows indicate the averages for the control factors we use.

    Some students (typically international students who do not write South African school examinations) do not have NBT and school-leaving exam/APS information. Additional rows indicate the proportion of students who are missing this data. In total, approximately 8 per cent of students are missing (some of) this information. Controls for students' observable academic ability are important for isolating the effect of the blended learning model, and therefore, students with missing APS and/or NBT scores are necessarily excluded from the regression analysis. This is unlikely to affect our findings, as there is no reason to believe that missing information is correlated with academic performance, and there are no statistically significant differences in missing information between the two groups.

    The final three rows in the table relate to non-numerical grades, which are awarded for various reasons. Students are coded "DPR" if they have underperformed during the semester to the extent that they are not permitted to write the final exam. Other students who are ill on the day of the final exam are granted a deferred exam (DE) to be written the following year. Other non-numerical grades include absences without the granting of a deferred exam (AB), an official leave of absence (LOA), and rare codes for incomplete (INC) and outstanding (OS). None of these non-numerical grades can be used in the determination of an average numerical final grade, and students with such grades are thus excluded from the regression sample.

    Non-numerical grades could cause a problem for the validity of the analysis if, for example, many more students are coded DPR in the blended group than in the in-person group. Specifically, it may make it appear as if the blended group performed better, but only because a lower share of low-performing students was eligible to write the exam. In practice, the rates of students with non-numerical grades are similar in both groups, with differences not statistically different from zero. The summary statistics of students who comprise the regression sample (i.e. those who are not missing data and who have numerical grades) are shown in the third and fourth columns of Table 2.
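    A minimal sketch of how the regression sample described above might be constructed with pandas, assuming a hypothetical extract of grade records; the file name, column names, and repeat-enrolment flag are illustrative:

```python
import pandas as pd

# Hypothetical grade records for 2018, 2019 and 2022.
grades = pd.read_csv("first_year_accounting_2018_2019_2022.csv")

non_numeric_codes = {"DPR", "DE", "AB", "LOA", "INC", "OS"}

sample = grades[
    ~grades["final_grade"].isin(non_numeric_codes)  # numerical grades only
    & grades["nbt_score"].notna()                   # NBT scores present
    & grades["aps_score"].notna()                   # APS present
    & ~grades["repeating_course"]                   # first attempt at the course
].copy()
sample["final_grade"] = sample["final_grade"].astype(float)
```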

    Figure 2 shows how similar the final grade distribution was for each year included in the in-person group (2018 and 2019) and shows the higher grades of the blended group across the distribution of grades. The average course grades correspond to those shown in the top row of Table 2: 70.84 for the blended group versus 66.29 for the in-person group. As noted, the 4.55 difference in mean course grades does not account for observable ability and background characteristics, which - independent of the learning model - may influence academic performance. We therefore run a regression analysis, with the specification as follows:

    yi = α + β·blendedlearningi + γ'Xi + εi

    The dependent variable, yi, is a student's final grade for the course. The variable blendedlearningi is an indicator equal to one if the student took the course in 2022, and zero otherwise. β is the regression coefficient of interest: the association between the 2022 learning model and the final course grade. Xi is a vector of individual characteristics: whether the student is in the ADP stream, English home language, gender, age, NBT grades, and APS. Finally, α is a constant term - the average final grade, yi, when all regressors are set to zero - and εi is the individual-specific error term, which captures all unobserved factors that affect final grades and is assumed to be uncorrelated with all included regressors.
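    As a minimal sketch of how this specification could be estimated (assuming a cleaned data frame like the illustrative `sample` constructed above, with hypothetical column names, and using the statsmodels formula interface rather than any software the authors name):

```python
import statsmodels.formula.api as smf

# Blended learning indicator plus background and ability controls
# (analogous to the most complete non-interacted specification in Table 3).
model = smf.ols(
    "final_grade ~ blended_learning + adp + female + english_home_language"
    " + age + nbt_score + aps_score",
    data=sample,
).fit(cov_type="HC1")  # robust standard errors: an assumption, not stated in the article
print(model.summary())
```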

    Table 3 presents the regression coefficients from the OLS regression. The first row, labelled "blended learning", reports the average difference in grades between the in-person and blended groups. In the first column, this coefficient is positive, reflecting the blended group's higher average final grade before any controls are added. The three asterisks next to the coefficient indicate that, like all of the results in this row, this coefficient is statistically significant at the 1 per cent level.

    The second column shows the blended learning coefficient when controls are added for the student's stream being ADP, whether they identify as female, whether English is their home language, and their age. This coefficient of 4.32 is slightly lower than 4.55, but nonetheless indicates that the higher average grades in the blended group cannot be explained by differences in these characteristics between the cohorts.

    Another set of controls - the NBT and matric scores, which reflect a combination of academic preparedness and ability - is added in the third column. Effectively, this column addresses whether differences in students' academic ability when they were admitted could account for the stronger performance of the blended learning group. The result here indicates that the statistically significant positive effect remains, and even grows slightly, after controlling for the influence of observable academic ability on final grades. The fact that the blended group performed worse, on average, than the in-person group in the first-semester prerequisite accounting course (which was not blended; see Table 2), but better in the second semester, points to the strength of the blended learning model.

    Next, we include an interaction effect between the blended learning indicator and the ADP indicator to test whether there is a differential effect for ADP students in 2022. Rather than a weaker effect for ADP students, a positive and statistically significant differential effect of 2.49 percentage points indicates a stronger effect of blended learning for ADP students than for mainstream students in 2022. Overall, the effect of blended learning for ADP students is a 6.11 (3.62 + 2.49) percentage point improvement, on average, compared to a 3.62 percentage point improvement for mainstream students. This is a powerful finding, given that under the in-person model ADP students were offered more support than mainstream students, whereas under the blended model there were very few differences between the two streams. This suggests that blended learning represents an example of an inclusive and accessible pedagogy (e.g., Nyoni 2022).
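    A hedged sketch of the interaction specification, reusing the illustrative `sample` and column names from the earlier sketches; the printed quantities are analogous to the 3.62 and 3.62 + 2.49 = 6.11 percentage point effects reported above:

```python
# Does the blended learning effect differ for ADP students?
interaction = smf.ols(
    "final_grade ~ blended_learning * adp + female + english_home_language"
    " + age + nbt_score + aps_score",
    data=sample,
).fit(cov_type="HC1")

b = interaction.params
print(b["blended_learning"])                              # effect for mainstream students
print(b["blended_learning"] + b["blended_learning:adp"])  # total effect for ADP students
```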

    The last column shows the coefficient interpreted as a percentage change, rather than a percentage point change, which is achieved by applying a logarithmic scale to the dependent variable. This shows that the blended learning effect, when controlling for all of the characteristics, represents a significant 7 per cent increase in grades. This is a practically large and relevant change, pointing to a strong benefit of the learning model used in this course. We run several robustness checks on these findings, with the effect sizes summarised in Appendix A. Coefficients on the blended learning variable are substantively similar.
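    The percentage interpretation follows from re-estimating the same specification with the natural log of the final grade as the outcome; a brief sketch under the same illustrative assumptions:

```python
import numpy as np

# Re-estimate with log(final_grade) as the outcome (the last column of Table 3).
sample["log_final_grade"] = np.log(sample["final_grade"])
log_model = smf.ols(
    "log_final_grade ~ blended_learning + adp + female + english_home_language"
    " + age + nbt_score + aps_score",
    data=sample,
).fit(cov_type="HC1")

beta = log_model.params["blended_learning"]
# For small coefficients, 100*beta approximates the percentage change;
# the exact conversion is 100*(exp(beta) - 1).
print(100 * (np.exp(beta) - 1))
```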

     

    CAVEATS AND CONSIDERATIONS

    There are several caveats to this study. First, the survey instrument used to assess students' perceptions was an online questionnaire, which limited the potential for deep exploration of the qualitative feedback. Second, in our analysis of academic performance there is the potential that unobservable characteristics are correlated with performance in ways that cause us to over- or under-estimate the association between blended learning and student performance. For example, if students are more resilient in 2022, and resilience positively impacts academic performance, then we would overestimate the effect of blended learning. That said, given that we can only control for what we can observe, the size of the blended learning coefficient in this case makes a strong argument for a blended model. Third, while our analysis of academic performance controls for many factors, it cannot identify the contribution of each of the different learning activities offered to the blended learning group. The survey instrument does go some way to assessing the impact of different activities, but students' perception of the value of a learning activity is only a proxy for its true value - it is possible that students do not correctly perceive the learning benefits of certain activities. Similarly, although these findings reflect positively on the combination of blended learning, the flipped classroom and a lightboard, we cannot draw conclusions about the learning effect of any one of these three features on its own.

     

    CONCLUSION

    Our study finds that blended learning, when combined with a flipped classroom approach and recorded lightboard videos as the primary means of teaching foundational concepts, can be highly effective. Not only do students prefer this combination over a traditional contact model or a purely online model, but their academic grades also increase, on average, by up to five percentage points. For a higher education sector grappling with student success and uncertainty surrounding the extent to which online teaching features at contact universities should be retained, our findings make a compelling argument for blended learning. Unlike purely online learning, which can widen inequalities in higher education, our study shows that blended learning has strong and positive effects for ADP students, even exceeding those for mainstream students.

    To be effective, however, these features should be implemented in ways that mitigate the risks associated with blended learning, for example by ensuring that all students have the resources to easily access the learning activities, by providing guidance and structure to help students with time management, by using group work to reduce psychological distance between students, and by harnessing online analytics to create reward structures that boost motivation. For maximum benefit from a similar learning model, instructors should therefore be prepared to make wholesale changes to course design, and especially consider including the learning activities we have found to be most effective: lightboard videos, knowledge checks, weekly online quizzes, and weekly badges.

    Primary data availability: On request.

    Secondary data

    University of Cape Town. 2018-2022 [dataset]. University of Cape Town, 2023. [privately distributed]

     

    NOTES

    1. We do not analyse these online groups because the distribution of grades in these years is peculiar. Abnormal performance improvements in 2020 have been established across the university, likely owing to assessment leniency and reduced course loads (Whitelaw, Branson, and Leibbrandt 2022). On the other hand, there is anecdotal evidence of unprecedented declines in performance in 2021. This cohort of students was impacted by learning loss in their final year of schooling, and additional screening tests for admission were not written in 2021.

    2. For completeness, this table lists all non-negligible learning activities. For succinctness, only the activities most pertinent to the learning model are analysed in this study.

    3. Available on request.

    4. The National Benchmark Test is a set of three tests (academic literacy, quantitative literacy, mathematics) used each year by many South African universities to assess applicants' preparedness for tertiary studies. The grade is one of the factors used to determine admission.

    5. South Africans who complete twelve years of school write a standardised set of school-leaving examinations, results of which comprise an Admission Points Score (APS) used by the university to determine admissibility. APS is the summation of students' six (best) grades (%), excluding Life Orientation.
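    For illustration, an APS under this description could be computed as follows; the subject names and grades are invented, and this is a sketch of the stated rule rather than the university's official calculation:

```python
def admission_point_score(results: dict[str, float]) -> float:
    """Sum of the best six subject percentages, excluding Life Orientation,
    per the APS description above (illustrative implementation)."""
    eligible = [grade for subject, grade in results.items() if subject != "Life Orientation"]
    return sum(sorted(eligible, reverse=True)[:6])

# Invented example results
print(admission_point_score({
    "English": 78, "Mathematics": 85, "Accounting": 90, "Physical Sciences": 72,
    "Life Sciences": 68, "Geography": 74, "Life Orientation": 95,
}))  # -> 467
```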

    6. This implies that the sample of students who respond to the survey may not directly correspond to those whose performance we assess, although there will be a high degree of overlap.

    7. Test 1 is weighted 10 per cent, test 2 is weighted 15 per cent, the examination is weighted 65 per cent and course work is weighted 10 per cent.

     

    REFERENCES

    Akçayir, G. and M. Akçayir. 2018. "The flipped classroom: A review of its advantages and challenges." Computers & Education 126: 334-345. https://doi.org/10.1016/j.compedu.2018.07.021.

    Allen, I. E. and J. Seaman. 2016. Online Report Card - Tracking Online Education in the United States. https://files.eric.ed.gov/fulltext/ED572777.pdf.

    Almahasees, Z., K. Mohsen, and M. O. Amin. 2021. "Faculty's and Students' Perceptions of Online Learning During COVID-19." Frontiers in Education 6: 638470. https://doi.org/10.3389/feduc.2021.638470.

    Bergmann, J. and A. Sams. 2012. Flip your classroom: Reach every student in every class every day. International Society for Technology in Education.

    Bernard, R., E. Borokhovski, R. Schmid, R. Tamim, and P. Abrami. 2014. "A meta-analysis of blended learning and technology use in higher education: From the general to the applied." Journal of Computing in Higher Education 26(1): 87-122. https://doi.org/10.1007/s12528-013-9077-3.

    Bhat, S., P. Chinprutthiwong, and M. Perry. 2015. "Seeing the Instructor in Two Video Styles: Preferences and Patterns." In Proceedings of the 8th International Conference on Educational Data Mining, 305-312. Madrid, Spain.

    Bruwer, A. and J. M. Ontong. 2020. "Early assessment as a predictor of academic performance: an analysis of the interaction between early assessment and academic performance by first-year accounting students at a South African university." South African Journal of Higher Education 34(4): 11-26.

    Cheung, K. S., J. Lam, N. Lau, and C. Shim. 2010. "Instructional Design Practices for Blended Learning." In Proceedings of 2010 International Conference on Computational Intelligence and Software Engineering, 1-4. Wuhan, China. https://doi.org/10.1109/CISE.2010.5676762.

    Department of Higher Education and Training. 2013. "White paper for post-school education and training. Building an expanded, effective and integrated post-school system." Government Gazette no 37229.

    Department of Higher Education and Training. 2023. Statistics on Post-School Education and Training in South Africa: 2021. Published by the Department of Higher Education and Training.

    Fisher, R., Á. Perényi, and N. Birdthistle. 2021. "The positive relationship between flipped and blended learning and student engagement, performance and satisfaction." Active Learning in Higher Education 22(2): 97-113. https://doi.org/10.1177/1469787418801702.

    Garrison, D. and H. Kanuka. 2004. "Blended Learning: Uncovering Its Transformative Potential in Higher Education." The Internet and Higher Education 7(2): 95-105. https://doi.org/10.1016/j.iheduc.2004.02.001.

    Gerber, A. and S. Eybers. 2021. "Converting to inclusive online flipped classrooms in response to Covid-19 lockdown." South African Journal of Higher Education 35(4): 34-57.

    Graham, C. R. 2021. "Exploring Definitions, Models, Frameworks, and Theory for Blended Learning Research." In Blended Learning, edited by A. G. Picciano, C. D. Dziuban, C. R. Graham, and P. D. Moskal. 1st Edition. Routledge. https://doi.org/10.4324/9781003037736-3.

    Hrastinski, S. 2019. "What Do We Mean by Blended Learning?" TechTrends 63(5): 564-569. https://doi.org/10.1007/s11528-019-00375-5.

    Janse van Rensburg, E. D. and J. W. Oguttu. 2022. "Blended teaching and learning: Exploring the concept, barriers to implementation and designing of learning resources." South African Journal of Higher Education 36(6): 285-298.

    Keller, J. H., J. M. Hassell, S. A. Webber, and J. N. Johnson. 2009. "A comparison of academic performance in traditional and hybrid sections of introductory managerial accounting." Journal of Accounting Education 27(3): 147-154.

    Lai, C.-L. and G.-J. Hwang. 2016. "A self-regulated flipped classroom approach to improving students' learning performance in a mathematics course." Computers & Education 100: 126-140.

    Lapuh Bele, J. and J. Rugelj. 2007. "Blended learning - An opportunity to take the best of both worlds." International Journal of Emerging Technologies in Learning (IJET) 2(3).

    Lubrick, M., G. Zhou, and J. Zhang. 2019. "Is the Future Bright? The Potential of Lightboard Videos for Student Achievement and Engagement in Learning." EURASIA Journal of Mathematics Science and Technology Education 15(8): em1735. https://doi.org/10.29333/ejmste/108437.

    Martínez-Caro, E. and F. Bolarín. 2011. "Factors affecting students' satisfaction in engineering disciplines: Traditional vs. Blended approaches." European Journal of Engineering Education 36(5): 473-483. https://doi.org/10.1080/03043797.2011.619647.

    Means, B., Y. Toyama, R. Murphy, and M. Bakia. 2013. "The Effectiveness of Online and Blended Learning: A Meta-Analysis of the Empirical Literature." Teachers College Record 115(3): 1-47.

    Mhlanga, D. 2021. "The Fourth Industrial Revolution and COVID-19 Pandemic in South Africa: The Opportunities and Challenges of Introducing Blended Learning in Education." Journal of African Education 2(2): 15-42.

    Muxtorjonovna, A. M. 2020. "Significance of Blended Learning in Education System." The American Journal of Social Science and Education Innovations 02(08): 507-511. https://doi.org/10.37547/tajssei/Volume02Issue08-82.

    Northwestern University. 2016. Lightboard Studio. https://digitallearning.northwestern.edu/article/2016/04/12/lightboard-studio.

    Nyoni, P. 2022. "Pedagogies of access and success among South African university students in the extended curriculum programmes amidst Covid-19 disruptions." South African Journal of Higher Education 36(4): 137-153.

    Papageorgiou, E. 2022. "Self-Regulated learning strategies and academic performance of accounting students at a South African university." South African Journal of Higher Education 36(1): 251-278.

    Patmanthara, S. and W. N. Hidayat. 2018. "Improving Vocational High School Students Digital Literacy Skill through Blended Learning Model." Journal of Physics: Conference Series 1028: 012076. https://doi.org/10.1088/1742-6596/1028/1/012076.

    Picciano, A. 2006. "Blended learning: Implications for growth and access." Journal of Asynchronous Learning Networks 10(3). https://doi.org/10.24059/olj.v10i3.1758.

    Poon, J. 2013. "Blended learning: An institutional approach for enhancing students' learning experiences." Journal of Online Learning and Teaching 9(2): 271-288.

    Rogers, P. D. and D. T. Botnaru. 2019. "Shedding Light on Student Learning through the Use of Lightboard Videos." International Journal for the Scholarship of Teaching and Learning 13(3): 6.

    Rossouw, M. and S. M. Brink. 2021. "An investigation into the success rates of students with no prior accounting knowledge in obtaining a professional accounting degree." South African Journal of Higher Education 35(2): 230-245.

    Sexton, N. D. and R. Rudman. 2022. "Program renewal: Students perception on changes to teaching pedagogy in auditing." South African Journal of Higher Education 36(3): 249-268.

    Sharaievska, I., O. McAnirlin, M. H. E. M. Browning, L. R. Larson, L. Mullenbach, A. Rigolon, A. D'Antonio, S. Cloutier, J. Thomsen, E. C. Metcalf, and N. Reigner. 2022. "'Messy transitions': Students' perspectives on the impacts of the COVID-19 pandemic on higher education." Higher Education (2022): 1-18. https://doi.org/10.1007/s10734-022-00843-7.

    Smith, T., C. Knight, and M. Penumetcha. 2017. "Lightboard, Camera, Nutrition!" Journal of the Academy of Nutrition and Dietetics 117(9): A70.

    So, H.-J. and T. A. Brush. 2008. "Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical factors." Computers & Education 51(1): 318-336. https://doi.org/10.1016/j.compedu.2007.05.009.

    Sohrabi, B. and H. Iraj. 2016. "Implementing flipped classroom using digital media: A comparison of two demographically different groups perceptions." Computers in Human Behavior 60: 514-524.

    Steenkamp, G. and O. van Schalkwyk. 2023. "Active learning in an online postgraduate research module: Perceptions of accounting students and lecturers." South African Journal of Higher Education 37(2): 233-250.

    Strayer, J. F. 2012. "How learning in an inverted classroom influences cooperation, innovation and task orientation." Learning Environments Research 15(2): 171-193. https://doi.org/10.1007/s10984-012-9108-4.

    Stull, A. T., L. Fiorella, M. J. Gainer, and R. E. Mayer. 2018. "Using transparent whiteboards to boost learning from online STEM lectures." Computers & Education 120: 146-159.

    Van der Velde, R., N. Blignaut-Van Westrhenen, N. H. M. Labrie, and M. B. Zweekhorst. 2021. "'The idea is nice ... but not for me': First-year students' readiness for large-scale 'flipped lectures' - what (de)motivates them?" Higher Education 81(6): 1157-1175. https://doi.org/10.1007/s10734-020-00604-4.

    VanderMolen, J., K. Vu, and J. Melick. 2018. "Use of Lightboard Video Technology to Address Medical Dosimetry Concepts: Field Notes." Current Issues in Emerging eLearning 4(1): Article 6. https://scholarworks.umb.edu/ciee/vol4/iss1/6.

    Vaughan, N. 2007. "Perspectives on Blended Learning in Higher Education." International Journal on E-Learning 6(1): 81-94.

    Whitelaw, E., N. Branson, and M. Leibbrandt. 2022. Learning in lockdown: University students' academic performance during COVID-19 closures. SALDRU Working Paper No. 289. University of Cape Town.

     

     

    APPENDIX A

    Robustness checks

    In the first instance, we consider that in 2019 and 2022 the same educator was responsible for setting the assessments, whereas in 2018 this was done by a different individual, who was not involved in 2019 or 2022. Although exam papers are externally moderated, it is plausible that different individuals may set assessments with different degrees of difficulty. We therefore check the robustness of our findings to the exclusion of the 2018 cohort and compare student performance in 2019 and 2022 only. The resulting difference in the performance of the 2019 and 2022 cohorts of 4.66, on average, reveals that the strong, positive blended learning effect is preserved under this alternative sample.

    Next, instead of basing the regression analysis on final course grades, we base it on exam grades (using the full regression sample that compares 2022 with 2018 and 2019). Although the components of the final course grade were broadly the same for both cohorts (65% for the exam, 25% for the tests and 10% for coursework), the detailed composition of the 10% course work differed. This component in 2022 was simply the student's average of their weekly quizzes, whereas in 2018 and 2019 it included a project submission component.1 This defends the analysis against a possible claim that we are not comparing like with like. Again, the findings are shown to be robust: focussing only on exam grades, the blended learning effect, at a 5.44 percentage point improvement, appears to be even bigger than it is for the overall course grade.

    Lastly, we include controls for the share of tutorials students submitted and attended. This information is omitted in our main specification since we expect that submission and attendance rates are likely influenced by blended learning. However, it may be that lower submission and attendance reflect something other than a blended learning effect (e.g., higher rates of illness). If we believe that submission and attendance are not related to the type of learning model, the blended learning effect grows by 1.5 percentage points when these controls are added. This finding suggests that, on average, students' performance improves by 5.83 percentage points under a blended learning approach, regardless of how many tutorials they submit and attend. However, not shown here is a significant and positive effect of tutorial submission and attendance, suggesting that students who attend and submit more tutorials perform better regardless of the approach.

     

     

    NOTE

    1. The project was a take-home submission that required approximately 8 hours of students' time to complete. It was removed from the set of course assessments after 2019. Other than the learning models, this is one of the few inevitable small differences between the courses experienced by the two groups under study. It is unlikely that on its own it could explain the differential academic performance: any benefit students experienced in 2022 from the small time saving of not having to do the project is likely to be offset by having to make up for the lost learning opportunity (offered by the project) by spending time on other learning activities.