## Abstract

As the COVID-19 pandemic forced a sudden shift to online teaching and learning in April 2020, one of the more significant challenges faced by instructors was encouraging and maintaining student engagement in their online classes. This paper describes my experience of flipping an online classroom for a core Chemical Engineering Fluid Mechanics class to promote student engagement and collaboration in an online setting. Comparing exam scores with prior semesters of in-person, traditional lecture-style classes suggests that students needed a certain degree of adjustment to adapt to this new learning mode. A decrease in student rating of teaching (SRT) scores indicates that students largely prefer in-person, traditional lectures over an online flipped class, even though written comments in the SRT contained several responses favorable to flipping the class in an online setting. SRT scores at the department level showed a similar decrease, which suggests that students were less satisfied with the quality of teaching throughout the department, with this flipped method of instruction neither improving nor worsening student sentiment toward online learning. In addition, whereas most students liked the prerecorded lecture videos, they were less enthusiastic about using breakout rooms for student collaboration and discussion. Further thought and discussion on best practices to facilitate online student interaction and collaboration are recommended, as online learning will likely continue to grow in popularity even when in-person instruction resumes after the pandemic.

## Introduction

Since the COVID-19 pandemic abruptly shifted higher education to fully online delivery modes in March 2020, educators and students alike have grappled with numerous online teaching and learning challenges. In general, online teaching can be classified broadly in the following ways: (1) fully online and synchronous, where students log in to a video conferencing platform at set class times during the week; (2) fully online but asynchronous, where students access posted lecture materials on their own time; or (3) a hybrid model that mixes synchronous and asynchronous classes, e.g., students meet online at a set time only once a week and watch prerecorded videos for the rest of the week. Note that these classifications are for fully online lecture classes only and do not include teaching models that involve some form of in-person instruction or courses with a laboratory component. From students' perspectives, "Zoom fatigue" set in rapidly; anecdotal conversations with students indicated that they spent as much as 5 h a day on exhausting Zoom classes. In addition, students missed out on interpersonal interactions with faculty and other students, which are essential to students' learning through collaborative discussions and perhaps even more crucial in developing their interpersonal and communication skills. These jarring adjustments to learning, coupled with social isolation and various personal complications that arose from the pandemic, led to increased anxiety and depression among students [1,2]. Likewise, instructors faced an abrupt adjustment with little time to adapt course materials to an online platform, on top of personal and family complications such as childcare and a myriad of work-from-home challenges. Technological issues such as poor internet and software proficiency aside, one of the biggest challenges for instructors was maintaining student interest and engagement in their courses [3–5]. Notably, Wester et al. reported a significant decrease in emotional engagement of science, technology, engineering, and math (STEM) students during the pandemic, as quantified by students' attitudes toward science courses in general, not just a particular course [5]. Higher education instructors had the unenviable task of creating engaging online learning experiences under severe time constraints while delicately balancing accommodations for students' unique challenges with a responsibility for teaching core content effectively and holding students to high standards.

Even before the pandemic-driven switch to online teaching, numerous studies had shown that active learning techniques such as a flipped classroom can increase student engagement [6–8]. The origins of the flipped classroom approach are widely attributed to two Colorado high school teachers, Jonathan Bergmann and Aaron Sams, who, in the mid-2000s, provided voice-over annotations of their PowerPoint slides to accommodate students who missed classes [9]. Over the years, numerous researchers have studied the effects of the flipped classroom on student learning; see Al-Samarraie et al. [10] for an excellent review of flipped class experiences across different higher education disciplines. The foundations of the flipped classroom stem from a student-centered learning approach, in contrast to passive participation in traditional lecture settings [11–14]. Broadly, the flipped classroom approach places responsibility on students to first learn the material before class via prerecorded videos or reading assignments; class time is then typically dedicated to collaborative problem-based learning and discussion among students to practice and retain their learned knowledge. Bishop and Verleger narrow the definition further, arguing that a flipped classroom approach should contain two parts: (1) "interactive group learning activities inside the classroom" and (2) "direct computer-based individual instruction outside the classroom" [12]. They argue that preclass activities should still be teacher-centered, excluding more passive assignments such as assigned readings. This concept of computer-based instruction lends itself naturally to online teaching, which raises the following questions: (1) Is a flipped classroom method a natural fit for online teaching? (2) Can we effectively facilitate collaborative learning among students in an online setting?
In this paper, I attempt to answer these questions via my experience of teaching ChE 3111: Fluid Mechanics online at the University of Minnesota—Duluth in Fall 2020 using the flipped classroom method of instruction. Although this class is taught with more traditional Chemical Engineering applications in mind, the same concepts, such as the Bernoulli and Navier–Stokes equations and pump and pipe flow design, also apply to Biomedical Engineering analyses of drug delivery systems and to modeling physiological fluid flow in the body.
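As a brief illustration of how these course concepts carry over between domains, consider the standard textbook statement of the Bernoulli equation for steady, incompressible, inviscid flow along a streamline (shown here for context, not reproduced from the course materials):

```latex
% Bernoulli equation between two points (1 and 2) on a streamline,
% for steady, incompressible, inviscid flow:
% p: pressure, \rho: density, v: speed, g: gravity, z: elevation
\frac{p_1}{\rho} + \frac{v_1^2}{2} + g z_1 = \frac{p_2}{\rho} + \frac{v_2^2}{2} + g z_2
```

The same energy balance governs flow in a chemical process pipe network or, with blood or an infusate as the working fluid, flow in physiological and drug delivery systems.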

## Methodology

This class of 44 students met synchronously for 50 min every Monday, Wednesday, and Friday. Students were assigned lecture videos and/or worked examples before class. These prerecorded videos were filmed in a professional studio setting on the University of Minnesota Twin Cities campus in Summer 2020, with the assistance of the video production team at Academic Technology Support Services and following strict COVID-19 safety protocols. In total, 36 videos varying between 8 and 36 min in length were filmed. A handout of partially filled notes accompanied each video; students were required to complete the handout and submit it online before the start of class, with these completed handouts constituting 30% of the homework grade. This handout served two purposes: (1) to keep students accountable for watching the videos before class and (2) to provide students with a complete set of notes for each lecture. Similar to the in-person flipped classroom format for another engineering class [15], the 50 min of class time was roughly divided into the following sequence:

• 5–10 min: The instructor leads a brief recap of the slides used in the video(s).

• 15 min: Students work individually on practice problems from the week's homework set.

• 15–20 min: Smaller groups of five to six students meet in breakout rooms. Breakout groups were randomly assigned at the start of the semester. The groups remained unchanged throughout the semester, with the rationale of building rapport and comfort between the same students to facilitate discussion. An undergraduate teaching assistant (TA) was also present in every class to rotate through the breakout rooms and help answer questions.

• 5–10 min: The student groups rejoin the class to compare their work to a brief explanation of the practice problem's solution.

With the first and last segments of the class resembling a traditional lecture involving direct instruction from faculty, it may be argued that this format is, strictly speaking, not an entirely flipped classroom but more of a hybrid model of instruction. Nevertheless, feedback from an informal midterm survey indicated favorable responses from students toward this format. The brief recap at the start of class allowed the instructor to emphasize essential concepts and field questions about the lecture video material. At the start of the semester, only 5 min was allocated for the final segment, in which the instructor briefly explained the solution. However, this was extended to 10 min after overwhelming feedback from the midterm survey, conducted after the first exam, revealed that students wanted more detailed explanations of the solutions to the practice problems. Equipping students with a better understanding of how to approach the practice problem helps them complete the homework problems outside of class.

### Data Comparison.

Data from Fall 2020 (online, flipped class, n = 43 students) are compared with prior semesters that employed in-person traditional lecture modes in Fall 2017 (n = 42), Fall 2018 (n = 48), and Fall 2019 (n = 25). Two different measures are compared across semesters:

1. Exam scores (two midterm exams + one cumulative final exam) are used to measure student learning. Each of these exams covered the same material across all four semesters.

2. Student rating of teaching (SRT) scores compare student satisfaction and perception of learning for the class. These student evaluations contain 14 questions in total, as shown in Table 1; scores are based on a Likert-type scale ranging from 1 (very strongly disagree) to 6 (very strongly agree). Note that several questions were changed or modified when the modality of instruction changed to online for Fall 2020. Specifically, the overall average scores and the individual score for question A1, "The instructor used appropriate and effective instructional methods," are compared across semesters; the latter serves as a more direct gauge of students' perception of this flipped class. These scores are also compared with the department averages to measure relative changes in student sentiment between this class and the department as a whole. In addition, common feedback themes are extracted from written responses to two questions: (1) "What aspects of this course were particularly effective?" and (2) "What could the instructor do to improve his/her teaching?"
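The department-normalized comparison described above amounts to a simple ratio of course average to department average for the same semester. A minimal sketch, using hypothetical placeholder scores rather than the actual SRT data:

```python
# Normalize course-level SRT averages by the department average for the same
# semester, so relative changes in sentiment can be compared across semesters.
# All scores below are illustrative placeholders, not the actual SRT data.

def srt_ratio(course_avg, dept_avg):
    """Course SRT average relative to the department average (Likert 1-6)."""
    return course_avg / dept_avg

# (course average, department average) per semester -- hypothetical values
scores = {"Fall 2019": (5.2, 5.0), "Fall 2020": (4.6, 4.4)}
ratios = {sem: round(srt_ratio(c, d), 2) for sem, (c, d) in scores.items()}
print(ratios)
```

A ratio near 1.0 in both semesters would indicate that the course tracked the department-wide shift rather than diverging from it.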

Table 1

Questions from the end-of-semester SRT evaluations. Some questions were modified or changed entirely because of the change to a fully online format for Fall 2020

| # | Questions for Fall 2017–2019 | Questions for Fall 2020 (if different) |
| --- | --- | --- |
| | **A. Delivery of instruction** | |
| 1 | The instructor used appropriate and effective instructional methods | |
| 2 | The instructor's teaching style motivated me to learn | The format and page design were easy to use |
| 3 | The instructor used the class time well | Instructions were clear, and help was available if I encountered problems |
| | **B. Articulating expectations and assessing learning** | |
| 4 | The instructor clearly articulated expectations for this course | |
| 5 | The instructor provided a regular and helpful assessment of my progress in this class | |
| 6 | The course assignments, exams, and projects were a good measure of my learning | |
| | **C. Creating an environment that supports learning** | |
| 7 | The instructor created an open, respectful environment that supported my learning | |
| 8 | The instructor was available at designated times outside of class | The instructor was available (for example, office hours, or email) to assist if needed |
| 9 | I feel comfortable asking questions in class and/or for help outside of class | The instructor included a way to exchange ideas with other students |
| | **D. Administrative issues** | |
| 10 | The instructor was organized | The course was well organized |
| 11 | The instructor graded my work in a timely way | |
| 12 | The text(s) and/or other required materials were a necessary part of the course | |
| | **E. Other** | |
| 13 | I would recommend this instructor to a fellow student | |
| 14 | Overall, I learned a lot in this course | |

## Results

Figure 1 shows the comparison of exam scores across the four semesters. Student's t-tests were conducted in Microsoft Excel, and the Bonferroni correction was applied when multiple comparisons were performed. Notably, the average score for midterm exam 1 in Fall 2020 was lower than in previous semesters with traditional in-person lectures. Although an informal survey of prior student cohorts (data not shown) indicated that most students had experienced an in-person flipped class in at least one of their lower-division classes, this class was the first online flipped class for most students. With Fall 2020 being the first fully online semester for most students, coupled with typical semester-to-semester variability in exam difficulty and subtle differences between student cohorts, it is impossible to pinpoint any specific reason for the lower scores. Within Fall 2020, scores on midterm exam 2 were statistically higher than on midterm exam 1 (p < 0.05 using Student's t-tests). A similar improvement was observed in Fall 2017, when midterm exam 1 scores were likewise lower. The fact that average exam scores recovered in subsequent exams to levels statistically indistinguishable from previous semesters suggests some adaptation by the students, whether to the online format, to the flipped method of instruction, or to some combination of both.
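The multiple-comparison safeguard mentioned above can be sketched as follows. The p-values are hypothetical placeholders, not the study's actual test results, and the code shows the standard Bonferroni rule rather than the author's spreadsheet implementation:

```python
# Bonferroni correction for m simultaneous hypothesis tests: each raw
# p-value is compared against alpha/m instead of alpha, which controls the
# family-wise error rate. The p-values below are hypothetical placeholders.

def bonferroni_significant(p_values, alpha=0.05):
    """Return, per test, whether it is significant after correction."""
    m = len(p_values)
    threshold = alpha / m  # corrected per-test significance level
    return [p < threshold for p in p_values]

# e.g., comparing one exam against three prior semesters (m = 3 comparisons)
raw_p = [0.004, 0.030, 0.250]  # hypothetical p-values from Student's t-tests
print(bonferroni_significant(raw_p))  # -> [True, False, False]
```

Note that with m = 3, a raw p-value of 0.030 no longer clears the corrected threshold of 0.05/3 ≈ 0.0167, which is exactly the conservatism the correction is designed to provide.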

Fig. 1

The bar charts in Fig. 2 compare the overall SRT scores across the four semesters. An overall decrease in SRT scores in Fall 2020 was observed for both the ChE 3111 course and the department average (Fig. 2(a)). Although the department average in Fall 2017 was the lowest across the four semesters, it should be noted that there was significant faculty turnover (two out of nine faculty) between Fall 2017 and Fall 2018, rendering this comparison less useful. Except for Fall 2017, which had a higher SRT ratio due mainly to the lower department average that semester, similar score ratios were observed from Fall 2018 through Fall 2020 (Fig. 2(b)). Looking specifically at question A1 in the SRT, "The instructor used appropriate and effective instructional methods," the trends closely mirror those for the overall SRT scores (Fig. 3). This observation is not surprising, since many studies have shown that such student evaluations reflect student–consumer satisfaction more than student learning [16,17]. Instead of "reading the actual rating items, [students] locate a column on the form to reflect their general level of enjoyment in the course" [17]. Taken together, however, these trends suggest that students were less satisfied with the quality of teaching throughout the department in the primarily online environment of Fall 2020, while employing the flipped method of instruction for ChE 3111 neither improved nor worsened student sentiment toward online learning.

Fig. 2

Fig. 3

Table 2 highlights common written feedback in response to two open-ended questions. Notably, despite the overall drop in SRT scores, a significant number of students enjoyed the flipped/hybrid method of instruction. Several students tied their feedback directly to their online learning, indicating that the flipped classroom "worked well with the online aspect" and "was an effective approach to distance learning." Another student responded that "despite the troubles around online learning, I found this class with the flipped classroom format particularly effective." The next most common positive feedback (prerecorded lecture videos, homework problems in class, breakout room discussions) revolved around particular aspects of a flipped classroom. In particular, many students enjoyed the prerecorded lectures "because I could access them at any time" and "being able to watch lecture videos whenever I want."

Table 2

Summary of common responses to two open-ended questions on the SRT

| What aspects of this course were particularly effective? | Number of mentions (34 total responses) |
| --- | --- |
| Flipped class setup | 14 |
| Having prerecorded video lectures | 9 |
| Working on homework problems during class | 7 |
| Breakout rooms/discussion with other students on homework problems | 8 |

| What could the instructor do to improve his/her teaching? | Number of mentions (25 total responses) |
| --- | --- |
| More time on exams | 4 |
| Breakout room issues (do not like the setup, not useful overall, group members not talking, etc.) | 4 |
| More resources (more video examples, practice tests, etc.) | 3 |
| Explain solutions more/slower | 3 |

Interestingly, the use of breakout rooms for discussion drew both frequent positive and frequent negative feedback. Several students enjoyed "working in groups to discuss the homework questions" and being able to "bounce back ideas of(f) one another." One student in particular noted the social interaction that breakout rooms provided: "Especially because being online, it was hard to interact with students, so breakout groups really helped." However, it was obvious that group dynamics and student personalities played a critical role in whether breakout rooms were successful. One student bluntly wrote "no breakout rooms"; other students felt that "some other students in the class are left unfulfilled by working with other students… it's the issue of the students who won't collaborate," and that while "working as a group could have been effective, I didn't gain much help from my group." Such group dynamics are not unique to this online breakout room setting and warrant further study into how best to accommodate the wide variety of student personalities and preferences, as discussed below.

It should be noted that these anonymous SRT data present an aggregate of all the students who responded to the survey; as such, a deeper analysis of any differences between different groups of students is not possible. For example, Chiquito et al. [18] found that flipping the classroom improved female students' grades over male students, suggesting that the flipped method of instruction benefited female students more, which should presumably correlate with more positive SRT scores. However, without demographic data from the SRT, it is impossible to perform a similar comparison to ascertain whether this gender difference is also observed in an online flipped class setting.

## Discussion

Overall, implementing the flipped method of instruction in an online environment for this class was well-received, even though the decrease in SRT scores compared to previous semesters suggests that flipped online learning is still less preferable than an in-person, traditional lecture format. Also, exam scores suggest that some adjustment was necessary before students could demonstrate learning at the same level as previous semesters. However, it is unclear whether this is an adjustment to online learning, the flipped instruction method, or some combination, including other factors. Taken together, the flipped class method of instruction lends itself well to online teaching, with the prerecorded videos an effective way for students to learn the material before class. Facilitating collaborative learning and discussion among students in an online setting, however, showed mixed results, although several qualitative observations could be made from this online flipped instruction experience:

(1) Requesting students to turn on their cameras during class promotes student engagement. However, it should be emphasized that turning on cameras was a request and not a requirement, and I often made clear that students could choose not to turn on their cameras, e.g., for privacy reasons or a poor internet connection. Finders and Muñoz contend that such a requirement may be culturally insensitive, intrusive, and anxiety-inducing for some students [19]. However, I have anecdotally observed that students using their cameras tend to be much more engaged with one another and with the instructor/TA in breakout rooms. In contrast, rooms with no student cameras turned on tend to be completely silent, even when I join the room and attempt to ask questions. Although Finders and Muñoz suggest that instructors should "trust students to be in charge of their bodies, spaces and learning" [19], turning on their cameras creates accountability for students to stay engaged with the material. In the latter half of the semester, whenever I reminded students of the request to turn their cameras on, I also challenged them to be honest with themselves should they choose not to do so.

(2) Breakout rooms are not ideal, but are there better online alternatives? As previously mentioned, student engagement in breakout rooms is highly dependent on student personalities and preferences. Although I chose to keep students in the same breakout groups in the hope that they would build rapport with one another as the semester progressed, this strategy backfired with a few groups in which most students did not want to engage in discussion from the start. For example, midsemester, a student provided feedback that she had given up trying to engage her group: at the start of the semester, she would always turn on her camera and attempt to make conversation by, e.g., asking how everyone else was doing, but the lack of response was very disheartening for her. In an in-person setting, students instinctively sit close to peers they already know, which naturally leads to more interaction during group discussions. In addition, students do not have the option to "hide" from and not respond to their peers in the physical classroom, which forces even the most reclusive students to engage in some limited discussion. Some potential options to consider for future online classes:

1. Rather than random assignments, allow students to choose their groups. One potential drawback of this strategy is the exclusion of certain groups of students, e.g., minority students such as students of color, as students tend to choose to interact with people who are most “like” themselves.

2. Switch up breakout groups periodically throughout the semester. Although this strategy would render any breakout group issues temporary, it may also discourage discussion as students have to adjust to new faces and personalities.

3. Survey the students whether they want to be in a group that engages in discussion and assign the groups accordingly, essentially giving students the option of working alone during the breakout discussion segment. This strategy was attempted mid-semester for a different class in Spring 2021 with some degree of success. Although no data were collected for this class, anecdotally I observed that breakout rooms composed of students who opted for discussions were generally lively. On the other hand, individual exam scores for several students who opted to work alone during the breakout discussion segments of class notably suffered, possibly due to removing any accountability to stay engaged during class.

(3) The online platform is excellent when screen sharing among students is helpful. One obvious advantage of online classes is the ability for students to share computer screens during breakout discussions. Rather than huddling together or peering over each other's shoulders as in a physical classroom, students can easily see another student's work and help troubleshoot problems. This is especially useful for practice problems that require software, e.g., organizing data or plotting using Excel. Unfortunately, such problems were rare for this class; however, this could be a significant advantage for classes that involve, e.g., heavy programming or specialized software. In some cases, students took photos of their written work and screen-shared them with the group, which significantly promoted discussion, as students could then see one another's thought process and approach to the question. This latter strategy could be pursued further by encouraging students to share their work more often.

(4) Would more collaborative forms of student performance evaluation work better in an online setting? Although exams are traditionally considered the primary tool for evaluating student learning, some studies have reported improved student performance with alternatives such as problem-based learning [20,21] or collaborative exams [22,23]. Such methods would encourage (if not force) interaction among students and hence are worth exploring in future semesters, regardless of whether instruction is in person or online. However, it is unclear whether such approaches can be implemented successfully in an online setting, where low engagement and cheating are already common problems.

As the nation cautiously looks toward a resumption of normal in-person activities in Fall 2021, several lessons learned from the past year of online teaching could be adapted to the physical classroom and used to create better online learning experiences. First, while an overwhelming number of students strongly prefer in-person instruction over online learning, several students have benefited from the flexibility afforded by online learning. A potential option would be to offer an online section of the same class to accommodate students for whom traveling to campus is inconvenient or who have conflicts with work or personal schedules. Second, holding more online office hours may encourage more students to seek help, since they can access instructor/TA help from home. However, online office hours for this class were poorly attended compared to in-person office hours in previous semesters, so it is unclear whether students will utilize this convenience. Finally, the availability of recorded lectures from the past year could allow for better resource sharing between faculty, not just within the same institution but also collaboratively with other institutions. For faculty who adopt a traditional lecture style for in-person teaching, these videos could serve as a supplemental resource for students who did not understand certain concepts well the first time during lecture. Ultimately, regardless of what individual instructors choose to do with the online resources they have developed over the past year, this pandemic has shifted the academic landscape significantly and hastened the growth, popularity, and availability of online learning. Hence, now armed with knowledge and experience of online teaching, and without the unexpected stress of a sudden pandemic-driven pivot, instructors would be prudent to continue developing effective strategies for online teaching to adapt to this changing landscape.

## Acknowledgment

The creation of the videos for this class was funded by a Teaching Innovation Grant awarded by TeachingSupport@UMN at the University of Minnesota. The author thanks Andrew Matthews from the Academic Technology Support Services for his help in producing, editing, and uploading the lecture and example videos.

## References

1. Wang, X., Hegde, S., Son, C., Keller, B., Smith, A., and Sasangohar, F., 2020, "Investigating Mental Health of U.S. College Students During the COVID-19 Pandemic: Cross-Sectional Survey Study," J. Med. Internet Res., 22(9), p. e22817. 10.2196/22817

2. Cao, W., Fang, Z., Hou, G., Han, M., Xu, X., Dong, J., and Zheng, J., 2020, "The Psychological Impact of the COVID-19 Epidemic on College Students in China," Psychiatry Res., 287, p. 112934. 10.1016/j.psychres.2020.112934

3. Petillion, R. J., and McNeil, W. S., 2020, "Student Experiences of Emergency Remote Teaching: Impacts of Instructor Practice on Student Learning, Engagement, and Well-Being," J. Chem. Educ., 97(9), pp. 2486–2493. 10.1021/acs.jchemed.0c00733

4. Perets, E. A., Chabeda, D., Gong, A. Z., Huang, X., Fung, T. S., Ng, K. Y., Bathgate, M., and Yan, E. C. Y., 2020, "Impact of the Emergency Transition to Remote Teaching on Student Engagement in a Non-STEM Undergraduate Chemistry Course in the Time of COVID-19," J. Chem. Educ., 97(9), pp. 2439–2447. 10.1021/acs.jchemed.0c00879

5. Wester, E. R., Walsh, L. L., Arango-Caro, S., and Callis-Duehl, K. L., 2021, "Student Engagement Declines in STEM Undergraduates During COVID-19–Driven Remote Learning," J. Microbiol. Biol. Educ., 22(1), p. 50. 10.1128/jmbe.v22i1.2385

6. Seery, M. K., 2015, "ConfChem Conference on Flipped Classroom: Student Engagement With Flipped Chemistry Lectures," J. Chem. Educ., 92(9), pp. 1566–1567. 10.1021/ed500919u

7. Steen-Utheim, A. T., and Foldnes, N., 2018, "A Qualitative Investigation of Student Engagement in a Flipped Classroom," Teach. High. Educ., 23(3), pp. 307–324. 10.1080/13562517.2017.1379481

8. Gilboy, M. B., Heinerichs, S., and Pazzaglia, G., 2015, "Enhancing Student Engagement Using the Flipped Classroom," J. Nutr. Educ. Behav., 47(1), pp. 109–114. 10.1016/j.jneb.2014.08.008

9. Bates, J., Almekdash, E. H., and Gilchrest-Dunnam, M. J., 2017, "The Flipped Classroom: A Brief, Brief History," The Flipped College Classroom: Conceptualized and Re-Conceptualized, L. Santos Green, J. R. Banas, and R. A. Perkins, eds., Springer International Publishing, Cham, Switzerland, pp. 3–10. 10.1007/978-3-319-41855-1_1

10. Al-Samarraie, H., Shamsuddin, A., and Alzahrani, A. I., 2020, "A Flipped Classroom Model in Higher Education: A Review of the Evidence Across Disciplines," Educ. Technol. Res. Dev., 68(3), pp. 1017–1051. 10.1007/s11423-019-09718-8

11. Mingorance Estrada, Á. C., Granda Vera, J., Rojas Ruiz, G., and Alemany Arrebola, I., 2019, "Flipped Classroom to Improve University Student Centered Learning and Academic Performance," Soc. Sci., 8(11), p. 315. 10.3390/socsci8110315

12. Bishop, J. L., and Verleger, M., 2013, "The Flipped Classroom: A Survey of the Research," ASEE Annual Conference and Exposition, Atlanta, GA, June 23–26, Paper No. 6219. 10.18260/1-2--22585

13. Demetry, C., 2010, "Work in Progress—An Innovation Merging 'Classroom Flip' and Team-Based Learning," IEEE Frontiers in Education Conference (FIE), Washington, DC, Oct. 27–30, pp. T1E-1–T1E-2. 10.1109/FIE.2010.5673617

14. Lage, M. J., Platt, G. J., and Treglia, M., 2000, "Inverting the Classroom: A Gateway to Creating an Inclusive Learning Environment," J. Econ. Educ., 31(1), pp. 30–43. 10.1080/00220480009596759

15. Lai, V. K., 2020, "Flipping the Classroom for a Material and Energy Balances Course: Effect on Student Learning Versus Student Perception and Sentiment," Chem. Eng. Educ., 54(3), pp. 160–170. https://journals.flvc.org/cee/article/view/115591

16. Blackmore, J., 2009, "Academic Pedagogies, Quality Logics and Performative Universities: Evaluating Teaching and What Students Want," Stud. High. Educ., 34(8), pp. 857–872. 10.1080/03075070902898664

17. Titus, J. J., 2008, "Student Ratings in a Consumerist Academy: Leveraging Pedagogical Control and Authority," Sociol. Perspect., 51(2), pp. 397–422. 10.1525/sop.2008.51.2.397

18. Chiquito, M., Castedo, R., Santos, A. P., López, L. M., and Alarcón, C., 2020, "Flipped Classroom in Engineering: The Influence of Gender," Comput. Appl. Eng. Educ., 28(1), pp. 80–89. 10.1002/cae.22176

19. Finders, M., and Muñoz, J., 2021, "Why It's Wrong to Require Students to Keep Their Cameras on in Online Classes (Opinion)," Inside Higher Ed, Washington, DC, accessed May 18, 2021, https://www.insidehighered.com/advice/2021/03/03/why-its-wrong-require-students-keep-their-cameras-online-classes-opinion

20. Larson, J., Farnsworth, S. K., Folkestad, L. S., Tirkolaei, H., Glazewski, K. K., and Savenye, W., 2018, "Using Problem-Based Learning to Enable Application of Foundation Engineering Knowledge in a Real-World Problem," IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE), IEEE, Piscataway, NJ, Dec. 4–7, pp. 500–506. 10.1109/TALE.2018.8615329

21. Yadav, A., Subedi, D., Lundeberg, M. A., and Bunting, C. F., 2011, "Problem-Based Learning: Influence on Students' Learning in an Electrical Engineering Course," J. Eng. Educ., 100(2), pp. 253–280. 10.1002/j.2168-9830.2011.tb00013.x

22. Eaton, T. T., 2009, "Engaging Students and Evaluating Learning Progress Using Collaborative Exams in Introductory Courses," J. Geosci. Educ., 57(2), pp. 113–120. 10.5408/1.3544241

23. Gilley, B. H., and Clarkston, B., 2014, "Collaborative Testing: Evidence of Learning in a Controlled in-Class Study of Undergraduate Students," J. Coll. Sci. Teach., 43(3), pp. 83–91. 10.2505/4/jcst14_043_03_83