Advances in Social Sciences Research Journal – Vol. 10, No. 6
Publication Date: June 25, 2023
DOI:10.14738/assrj.106.14847.
Donavant, B. W. (2023). Fostering Student-Faculty Engagement and Increasing Learning in the Ongoing Quest for Online Quality.
Advances in Social Sciences Research Journal, 10(6). 201-213.
Fostering Student-Faculty Engagement and Increasing Learning
in the Ongoing Quest for Online Quality
Brian W. Donavant
ORCID: 0000-0003-4936-5185
Department of Behavioral Sciences,
University of Tennessee at Martin, United States
ABSTRACT
As online delivery of higher education programs continues to increase, empirical research assessing the efficacy of online pedagogy remains scarce. On the premise that a focus on learning is more important than the touted accolades of access and convenience, this experimental study examined how best to leverage specific
components of technology to provide engaging and meaningful faculty-student
interaction that actually increases learning within online courses. The results
indicate that the use of digital presentations narrated and provided by the
instructor of record rather than course material incorporated from static
presentations or other external sources brought about a statistically significant
increase in learning improvement compared to the use of more traditional online
tools. Amid the clamor for academic quality and accountability within higher
education programs, the results and context of this study provide direction for
future discussions and research as well as immediate implications for online
education praxis.
Keywords: Online education, Distance learning, Online learning, Online delivery, Online
pedagogy
INTRODUCTION
Across the higher education landscape and as new and robust technology propels us through a
dramatic shift in educational practice, providers and practitioners tout online education (OE)
as a highly effective delivery method, accompanied by the usual accolades of convenience and
accessibility for the learner. Although myriad research addresses the general feasibility of OE
and the use of technology within formal educational settings, notoriously little examines how
best to leverage specific components of technology to provide the greatest educational benefit:
engaging and meaningful faculty-student interaction that actually increases learning [6; 18; 30;
43]. Rather, the overwhelming majority of existing literature and research compares online
delivery to traditional face-to-face instruction, resulting in a no-significant-difference stalemate
that does little to inform or enhance meaningful online education [18; 43; 47; 48]. If, as most
research has indicated, there is no significant advantage to the effectiveness of the respective
delivery methods, but there is also no distinct disadvantage, then either approach would be
appropriate given particular circumstances. However, just as in traditional face-to-face settings,
educational practitioners bear responsibility for identifying meaningful ways to maximize
learning within the given context – in this case, the online environment.
Against this backdrop, the allure and number of undergraduate programs have skyrocketed
since 1971, and more than one in four programs now are delivered fully online [44]. The
increase is particularly dramatic within the social sciences and humanities, with these fields
now representing seven of the top 10 most awarded degrees [49]. For example, general
humanities have jumped from 11th to 1st with over 175,000 degrees awarded annually; always
popular, psychology has maintained its relatively high ranking but increased 308% in degrees
awarded; and criminal justice, with more than 62,000 degrees awarded annually, has catapulted from 29th to 9th among the most awarded baccalaureate degrees in the United States.
Beyond the continuing influx of traditional students into college and university programs, and fueled by various state completion agendas and incentives, significant numbers of working professionals nationwide have completed some college but no degree [10; 18; 19]; these stopped-out adults are returning by the tens of thousands to complete degrees, swelling the ever-expanding number of students who access these offerings online to accommodate work and lifestyle schedules [4]. But, with the continuing popularity of virtual programs and the explosive
growth of online delivery, does the quality of these offerings satisfy sound academic practice,
those endeavors that combine appropriate curricular and faculty standards with convenient
access in order to actually increase learning? With 39 states currently in the Complete College
America (CCA) alliance focusing on the completion agenda and promoting significant increases
in the overall number of students completing degrees [14; 15], professional associations and
regional accrediting agencies such as the Association of American Colleges and Universities [2]
and Southern Association of Colleges and Schools Commission on Colleges [46] continue to
emphasize strategic attention to program quality standards, with AAC&U [1, p. 1] noting that
“the quality shortfall is just as urgent as the attainment shortfall.”
Creative and engaging online courses utilize not only the vast technological resources available
to supplement this continually evolving method of educational delivery, but also the expertise
of the facilitators. Proponents herald the use of OE as one means of meeting learners “where
they live” and providing them with rich material that attaches meaning to their daily lives and
the learning experience. But are the current technological enhancements and plethora of videos
and other media that are finding increased use within the online environment and on myriad
digital devices simply gimmicks to lure participants to an expanding educational cash cow or
legitimate tools that can help online learners derive the greatest benefit from their educational
experiences? Rather than focusing on the online delivery method as a whole, this experimental
pilot study examined the effectiveness of various deliveries of educational material within the
online environment, and whether those efforts impacted learning improvement and
contributed to academic measures of program quality.
THE CURRENT LANDSCAPE
The general academic goal of many (most) educational institutions offering online programs, and of college or university faculty delivering online courses, seems to be to make the online class as good as its traditional counterpart. A few researchers even tout positive assessment
results as evidence that OE is more effective in some instances, especially within the so-called
“soft” sciences such as the humanities, liberal arts, and education [44; 47]. But the question of
whether OE can be just as effective as its face-to-face counterpart has long been answered, and
the more pertinent question becomes how best to leverage the myriad technological options
and instructional techniques available to achieve the greatest learning improvement within the
expanding online environment.
When discussing the effectiveness of educational methodologies, one would be hard-pressed
not to acknowledge the ability to relate various examples of practical application through the
use of dialogue and discussion, i.e., verbally, and tempered through the lens of faculty
perspective and expertise, as a primary strength of the face-to-face classroom. Whether
remotely or in person, students can glean only so much from books, slides, and other materials
projected onto a screen; yet, in response to the widely proclaimed lack of personal interaction
that impedes online learner engagement, online delivery often relies upon the use of these static
presentations, asynchronous typed discussion boards, and uploaded video clips from various
external sources in a dismal attempt to simulate the interaction of the face-to-face classroom
[8; 13; 20; 30; 34]. Finding engaging ways to deliver anecdotal yet valid and well-informed
insight, those real-world examples best provided by subject-matter expert facilitators, offers
the greatest opportunity to bring the material to life and make meaning of the experience. To
do otherwise minimizes the role of higher education faculty to obsolescence, and learners might
just as well enroll in a self-directed correspondence course.
One readily available and popular alternative for faculty aspiring to increase the interest,
engagement, and satisfaction of their online students is to incorporate materials from the
growing array of open educational resources (OER) currently available through social media
and other providers [14], which typically encompass no-cost online learning content and other materials not restricted by copyright license and available for reuse or redistribution [26]. Many faculty rate the increasingly high quality of OER, as
well as items readily available through YouTube, iTunes, and other outlets, as superior to
printed textbooks and other traditional materials [4]. Several studies suggest that their use
contributes to increased student performance [22; 23; 37], while others find no significant
difference in final grades whether using OER or traditional resources [16; 29]. Of course, the
use of supplemental materials to provide currency to course topics and inform theoretical
perspectives is a common and appropriate practice within academia, and faculty unequivocally
should maintain full freedom in their pursuit of academic duties. However, these assessments
and recommendations fail to consider how the overuse of OER – and the concurrently minimal direct faculty input into the teaching endeavor, a situation exacerbated by the remoteness of the online environment – may prevent a critical mass of program direction and oversight from forming,
or how the minimization of faculty engagement may be viewed by various accrediting bodies
and professional associations. SACSCOC [46, p. 44] notes that “[q]ualified, effective faculty
members are essential to carry out the mission of the institution and to ensure the quality and
integrity of its academic programs” and identifies the instructor of record as “the person
qualified to teach the course and who has overall responsibility... for the achievement of
student learning outcomes” (p. 45). While encouraging the incorporation of various media and
OER into online classes in order to reflect the most current perspectives regarding criminal
justice issues, Bernat and Frailing [8, p. 345] offer valuable perspective and advocate an
appropriate balance between “technology and social presence” in both course materials and
instructional delivery, noting faculty members’ responsibilities for making these
determinations.
Arguably, most instructors find that their greatest strength in the traditional classroom is the
ability to relate various examples of practical application through the use of dialogue and
discussion, i.e., verbally; yet many faculty struggle to find ways to incorporate this component
into their online courses. In too many instances, excellent classroom facilitators attempt to
bundle their material and “stick it online,” often in the form of the same static PowerPoint
presentations or other pre-recorded media they use in face-to-face settings [8; 20]. This leads
to frustration for both the facilitator who does not understand why students are not “getting it”
and the learner who is forced to self-educate. Students often come away from these experiences
exasperated that they had to enroll in a course when they could have learned just as much by
simply buying the book and reading it on their own.
The virtual classroom requires innovative approaches that help learners engage in the
experience. Online education suffers from a lack of imagination and willingness by facilitators
to evaluate current practice and employ new approaches to delivering educational material
[30; 44; 47]. A recent review of one well-known publisher's instruction manual for developing online courses noted that nearly all communication in the online classroom is written, referring not only to the correspondence among participants and facilitators but also to the delivery of educational material. Although probably accurate, this notion is self-limiting, and faculty
must leverage technology to identify and implement new mechanisms for bridging the online
student-faculty communication gap, always with an emphasis on convenience and ease of
accessibility for the learner but focusing on genuine academic gain.
To date, the predominant consideration among studies examining the quality of OE revolves
around satisfaction with the student-student and student-instructor interaction [3; 8; 9; 18; 19;
25; 32; 38]. In national and regional studies of online professional development, Donavant [18;
19] found that student interaction with the facilitator is paramount to participants’ recognition
of OE as a legitimate educational delivery. Many studies of OE in higher education have
identified gender, race, and age [27; 30; 41; 42], or the participant’s level of formal education
[12; 19; 43] as determinant factors in the satisfaction and potential success of online students.
Others simply tout the popularity of modern technology with students in today’s increasingly
digital world or simple convenience as the driving forces behind increased participation in
online classes and a justification for exploring continually evolving issues within this growing
field [7; 11; 40].
Even in the dismally few instances in which educators have utilized pre- and post-test strategies to measure academic improvement, an exhaustive review of the literature reveals virtually no empirical evidence regarding which educational practices maximize success in the online
environment. Educational effectiveness generally is determined by learner achievement and
results in the acquisition and development of new knowledge and skills [17; 31; 35; 39].
Recognition of any educational methodology as appropriate is contingent upon the
demonstration that participants actually learned, and online practitioners must find ways to
maximize the benefits of emerging technology. Learning success is affected by a myriad of
circumstances and demographic considerations including educational level; time spent availing
oneself of educational material; the use of various educational methodologies; familiarity with
and previous exposure to the educational methodology employed; and gender, race, and age of
the learner [21; 30; 43]. Understanding the make-up and capacity of the student base remains
a critical component of any evaluation of effective education, especially within the rapidly
expanding online arena and as the field continues to address the continuing criticism of low-quality academic programs accused of attracting below-average students [44].
THE PRESENT STUDY
Rather than focusing on the delivery method of OE as a whole, this study focused on the
integration of narrated digital presentations into online courses and their efficacy for online
pedagogy. The study utilized an experimental design to examine the potential benefits of
enhanced instructional delivery features such as facilitator-narrated digital presentations
within specific online undergraduate course iterations to effect higher levels of learning
improvement. The research results offer practical insight for those educators who may be
considering ways to enhance online courses, online components of hybrid classes, or online
materials in support of face-to-face courses, and how these tools best fit within a
comprehensive approach to online education.
This study examined the differences in the levels of academic engagement and learning
improvement of students who participated in OE with enhanced features, defined as digital
presentations accompanied by audible narration, compared to those who participated with
traditional features, defined as digital presentations comprised of slides with photographs or
videos, and narrative text, but with no audible narration. The study also considered whether
differences in academic engagement were related to learning improvement, as well as whether
the demographic factors of gender, race, age of the learner, number of years of formal education
received, and previous exposure to OE related to the levels of academic engagement and
learning improvement of students who participated in OE with enhanced features compared to
those who participated with traditional features. For purposes of this study, online education
was defined as instructional material transmitted and delivered via a personal computer to
learners at locations remote from that of the instructor(s). Within this context, instruction may
include postings, discussion boards, online materials, synchronous or asynchronous chat, and
other methods, and allows self-paced, interactive, and individualized learning. Academic
engagement was defined as the level of meaningful discussion, determined by evaluating the
length of time and level of detail included in prompted responses posted to asynchronous
online audio discussion boards, and learning improvement was defined as an increase in the
academic knowledge of students measured by comparison of pre- and post-test scores.
METHODOLOGY
The relevance of any study examining the efficacy of a particular educational delivery method
is logically contingent upon the data indicating that learning actually occurred, and the pre-test/post-test design of this study provided this validation. Although not part of the original research design, the study was replicated because the numbers of participants in the control and experimental groups were relatively low; replication bolstered the reliability of the results, with the initial and subsequent iterations designated as Cohorts 1 and 2, respectively. The additional iteration also increased the total number of participants, enhancing the validity of the results.
Cohort 1 consisted of 24 university students who enrolled in an online social science course
and were randomly placed into one of two sections of the course, with 13 students placed in
the control group and 11 students placed in the experimental group (initial placement included
12 and 11 students in the control and experimental groups, respectively; however, one student
enrolled late and was placed in the control group). Cohort 2 also consisted of 24 university
students who enrolled in a subsequent iteration of the same online course, but with 12 students
placed in both the control and experimental groups.
Students in both groups of both cohorts were informed that research was being conducted to
evaluate best educational practices within the online environment and that their participation
in the study was voluntary. No incentives were offered for participation in the study, and
students were informed that their participation was not linked to course performance, grades,
or status with the university. All students agreed to participate in the study, and, once each
participant had completed the pre-test, access was granted to all course and instructional
materials. No additional references to the ongoing research study were made during the
remainder of the 15-week course iterations.
To ensure that the respective course sections educated all participants effectively, both the
control and experimental groups in both cohorts were taught by the same instructor and
received the same educational content, supported by the same course textbook and methods of
academic assessment over the same period of time. All groups were provided the same video
clips, provided the same audio recordings of classroom lectures, asked the same discussion
prompts and required to respond to audio discussion boards, required to submit the same
writing assignments, and administered the same course examinations. Because the online
delivery platform utilized by the university allowed for the combination of multiple course
sections into one online “class,” students in the control and experimental groups were unaware
that students had been separated into different course sections. All students were able to
collaborate with each other in discussion board topics, regardless of whether they were
members of the control or experimental groups.
While both the control and experimental groups received the same educational content
material, the digital presentations incorporated into the course during the ninth week of each of the two 15-week semesters differed with respect to their delivery format. The presentations provided to
the control groups consisted of static, un-narrated slide presentations, while those provided to
the experimental groups consisted of the same visual slide presentation, but also incorporated
narration by the instructor. McLeish [33] points out that learner interest typically begins to
decline after about 10 minutes, reaches a low point at about 40 minutes, and increases
somewhat during the last 10 minutes. Accordingly, none of the narrated presentations
exceeded 20 minutes, compared to the hour-long classroom recordings provided to both
groups. The narrated digital presentations contained essentially the same audio information as
the pre-recorded classroom lectures provided to both groups, but in a condensed format; hence,
audible instruction and coverage of course topics were not withheld from either group.
After incorporating the narrated presentations into the courses, the researcher monitored the
amount of time students spent accessing other course materials. The researcher also assessed
students’ levels of academic engagement, i.e., the level of meaningful discussion as determined
by evaluating the level of detail included in prompted responses to discussion topics provided
by the instructor. Both the control and experimental groups participated in audio discussion boards; that is, both the discussion topics presented by the instructor and students' responses
were in the form of audio recordings rather than typed text. Students’ levels of academic
engagement were assessed by determining how many seconds of meaningful discussion they
posted to the discussion boards. Students’ comments and questions that were not germane to
the respective topics were not included in the assessment. Finally, at the end of both 15-week
course iterations, all participants were given a post-test related to the course material. The
comprehensive post-tests comprised the weighted final examination and one component of the
courses’ grading criteria, and were included in the calculation of students’ final course grades;
other factors included in the courses’ grading rubrics included three additional examinations,
participation in discussion board postings, and a research paper.
FINDINGS
The majority of online students in this study exhibited significant learning improvement from
their participation in the respective courses, but the learning improvement of students
participating with enhanced narrated presentations was statistically significantly higher than
students participating with only traditional static presentations of the course materials. The
following comprehensive assessment of results includes descriptive data of the participants
and results of the statistical analyses, followed by a summary and discussion of the major
findings.
Descriptive Data
Cohort 1 of this study consisted of 24 criminal justice students enrolled in an online criminal
investigations course. Participants included 13 males (54%) and 11 females (46%), and ages
ranged from 20 to 64 years (M = 33.3, SD = 10.06). Seventeen (71%) of the participants were
white, six (25%) were black, and one (4%) was Hispanic. The majority of students were seniors
(13, 54%), with six juniors (25%), three sophomores (13%), and two freshmen (8%).
Cohort 2 also consisted of 24 criminal justice students enrolled in a subsequent iteration of the
same online course, with similar demographic distributions. Participants also included 13
males (54%) and 11 females (46%), and ages also ranged from 20 to 64 years (M = 31, SD =
9.67). Nineteen (79%) of the participants were white, four (17%) were black, and one (4%)
was Hispanic. The majority of students were seniors (12, 50%), with six juniors (25%), three
sophomores (12.5%), and three freshmen (12.5%).
Statistical Analyses
Paired samples t tests for Cohort 1 indicated that students in both the control and experimental
groups demonstrated statistically significant improvements in learning based upon pre- and
post-test scores. The control group post-test scores (M = 78.38, SD = 9.81) were significantly
higher than the pre-test scores (M = 61.75, SD = 8.31), t(12) = -5.42, p < .001; and, the
experimental group post-test scores (M = 83.55, SD = 7.98) were significantly higher than the
pre-test scores (M = 66.55, SD = 9.76), t(10) = -6.55, p < .001. Cohort 2's control group post-test scores (M = 44.5, SD = 11.48) declined from their pre-test scores (M = 61, SD = 2.59), though not significantly, t(11) = 1.38, p = .19; however, the experimental group post-test scores (M = 84.33, SD = 2.33) were significantly higher than the pre-test scores (M = 67.0, SD = 2.72), t(11) = -7.25, p < .001.
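The paper does not name the software used for these analyses; for readers wishing to reproduce this style of comparison, the following is a minimal sketch of a paired-samples t test in Python with SciPy. The variable names and score arrays are illustrative assumptions, not the study's data.

```python
# Minimal sketch of a paired-samples t test on pre-/post-test scores.
# The score arrays below are illustrative placeholders, not the study's data.
from scipy import stats

pre_scores  = [62, 58, 71, 65, 60, 68, 55, 64, 70, 59, 66]   # hypothetical pre-test scores
post_scores = [80, 77, 88, 84, 79, 86, 75, 83, 90, 78, 85]   # hypothetical post-test scores

# ttest_rel tests whether the mean of the paired differences is zero
t_stat, p_value = stats.ttest_rel(pre_scores, post_scores)
print(f"t({len(pre_scores) - 1}) = {t_stat:.2f}, p = {p_value:.3f}")
```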
One-way analyses of covariance (ANCOVA) indicated that both cohorts experienced statistically
significant differences in learning improvement between the control and experimental groups
as demonstrated by the differences between post-test scores, controlling for pre-test scores.
Preliminary evaluations of the homogeneity-of-slopes assumption indicated that the
relationships between the covariates and the dependent variables did not differ significantly as
a function of the independent variables, F (1, 20) = .02, p = .90 (Cohort 1) and F (1, 20) = .29, p
= .59 (Cohort 2). The Cohort 1 ANCOVA revealed that the post-test scores of the experimental
group (M = 83.55, SD = 7.98) were statistically significantly higher than the post-test scores of
the control group (M = 78.38, SD = 9.81), indicating a statistically significant difference in the
effectiveness of OE with narrated digital presentation compared to OE without these features,
F (1, 21) = 6.66, p < .02. The Cohort 2 ANCOVA revealed that the post-test scores of the experimental group (M = 84.33, SD = 2.33) were statistically significantly higher than the post-test scores of the control group (M = 67.0, SD = 2.72), indicating a statistically significant difference in the effectiveness of OE with narrated digital presentation compared to OE without these features, F (1, 21) = 9.53, p = .006. Both cohorts' relationships between the post-test scores and the delivery of narrated digital presentations were strong, as assessed by partial η², with these presentations accounting for 24% of the variance in Cohort 1 post-test scores and 31% of the variance in Cohort 2 post-test scores, holding constant the respective pre-test scores.
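A comparable ANCOVA, including the homogeneity-of-slopes check and a partial η² estimate for the group effect, can be sketched with statsmodels as below. The small DataFrame, variable names, and group coding are assumptions for illustration only; the study's actual records are not reproduced here.

```python
# Sketch of a one-way ANCOVA: post-test scores by group, controlling for pre-test scores.
# The DataFrame is fabricated for illustration; it is not the study's data.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "group": ["control"] * 6 + ["experimental"] * 6,
    "pre":   [60, 62, 58, 65, 61, 63, 64, 66, 67, 68, 65, 70],
    "post":  [75, 78, 74, 80, 76, 79, 82, 85, 84, 86, 83, 88],
})

# Homogeneity-of-slopes check: the group x pre-test interaction should be non-significant
slopes = smf.ols("post ~ C(group) * pre", data=df).fit()
print(sm.stats.anova_lm(slopes, typ=3).loc["C(group):pre"])

# ANCOVA model without the interaction term
model = smf.ols("post ~ C(group) + pre", data=df).fit()
aov = sm.stats.anova_lm(model, typ=3)

# Partial eta squared for the group effect: SS_group / (SS_group + SS_error)
ss_group, ss_error = aov.loc["C(group)", "sum_sq"], aov.loc["Residual", "sum_sq"]
print("partial eta^2 =", ss_group / (ss_group + ss_error))
```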
Independent samples t tests were conducted to determine whether there were statistically
significant differences between the overall lengths of time spent accessing educational
materials, or between the levels of academic engagement as determined by evaluating the level
of detail included in prompted responses, by students in the control and experimental groups.
Differences in the overall length of time spent by the Cohort 1 control group (M = 40.54, SD =
27.15) compared to the experimental group (M = 57.50, SD = 39.27) were not significant, t(22)
= -1.25, p = .23, nor was the time spent by Cohort 2 control group (M = 39.08, SD = 8.03)
compared to the experimental group (M = 57, SD = 11.32), t(22) = -1.29, p = .21. Differences in
the level of engagement of the Cohort 1 control group (M = 162.69, SD = 164.23) compared to
the experimental group (M = 234.91, SD = 203.23) were not significant, t(22) = -1.97, p = .34,
nor was engagement of the Cohort 2 control group (M = 147.83, SD = 46.42) compared to the
experimental group (M = 215.58, SD = 50.85), t(22) = -.99, p = .33.
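The comparisons of time spent on course materials and of engagement durations described above correspond to standard independent-samples t tests; a minimal SciPy sketch follows, again using placeholder values rather than the study's measurements.

```python
# Sketch of an independent-samples t test comparing two groups' engagement durations.
# Values are placeholders; equal variances are assumed to mirror the classic two-sample test.
from scipy import stats

control      = [140, 155, 160, 150, 145, 148]   # hypothetical seconds of meaningful discussion
experimental = [210, 220, 205, 225, 215, 218]

t_stat, p_value = stats.ttest_ind(control, experimental, equal_var=True)
print(f"t({len(control) + len(experimental) - 2}) = {t_stat:.2f}, p = {p_value:.3f}")
```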
Pearson product-moment correlations were computed to determine whether statistically
significant relationships existed between learning improvement and the individual
independent variables of gender, race, age of the learner, number of years of formal education
received, lengths of time of academic engagement, and levels of academic engagement. Using
the Bonferroni approach, a p value of less than .001 was required for significance. Learning
improvement was not found to be significantly related to any of the individual independent
variables for either cohort.
Pearson product-moment correlations also were computed to determine whether statistically
significant relationships existed between course content mastery, as measured by post-test
score, and final course grades. Post-test scores of both cohorts were found to be significantly
related to final grades, r(22) = .74, p < .001 (Cohort 1) and r(22) = .79, p < .001 (Cohort 2).
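The correlational analyses can be sketched in the same way: a Pearson correlation for each predictor, with a Bonferroni-adjusted significance threshold applied across the family of tests. The DataFrame columns, the predictor list, and the alpha computation below are illustrative assumptions, not the study's variables or its specific threshold.

```python
# Sketch of Pearson correlations with a Bonferroni-adjusted significance threshold.
# The columns are hypothetical analogues of the study's predictors, not its data.
import pandas as pd
from scipy.stats import pearsonr

df = pd.DataFrame({
    "improvement": [18, 15, 20, 12, 17, 22, 14, 19, 16, 21],
    "age":         [24, 31, 22, 45, 28, 26, 38, 23, 30, 27],
    "years_educ":  [14, 16, 13, 15, 14, 16, 15, 13, 14, 16],
    "engagement":  [160, 210, 180, 140, 200, 230, 150, 220, 170, 240],
})

predictors = ["age", "years_educ", "engagement"]
alpha = 0.05 / len(predictors)   # Bonferroni adjustment across the family of tests

for col in predictors:
    r, p = pearsonr(df["improvement"], df[col])
    flag = "significant" if p < alpha else "not significant"
    print(f"improvement ~ {col}: r = {r:.2f}, p = {p:.3f} ({flag})")
```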
DISCUSSION AND IMPLICATIONS
First and foremost, the study demonstrated that learning and academic success in online
criminal justice courses can occur under a variety of circumstances. While somewhat limited
by the relatively low number of participants, the two-cohort design combined with similar
results in the subsequent iteration of this study to bolster the reliability and validity of the
findings.
This study primarily examined the efficacy of using narrated digital presentations in the
delivery of educational material within the online criminal justice environment to bring about
higher levels of learner engagement and, ultimately, learning improvement. Some students in
the control and experimental groups of both cohorts demonstrated a statistically significant
learning improvement based upon pre- and post-test scores. However, some students within
both control groups ‘disengaged’ from the course, even though they remained enrolled in the
course and participated – albeit minimally – throughout the semester. These students failed to complete all course assignments, and some actually scored lower on the post-test than the pre-test. Interestingly, this stood in contrast to the students within the experimental groups, all but one of whom (a single student in Cohort 1) remained engaged in the course, completed all assignments, and achieved higher scores on the post-test.
Just as academic program quality has long been a concern across higher education, many
scholars express concern that institutions may be ignoring academic quality standards in favor
of exploiting the growing online delivery trend and its expanding tuition revenues in order to
support other disciplines with less robust or sagging enrollments [24; 36; 44; 45]. Critical to
the integrity concerns that often accompany programmatic or institutional considerations for
venturing into or expanding online offerings, the results of this study indicate that the use of
narrated presentations, provided by the instructor of record rather than incorporated from
external sources, can enhance learning and, therefore, academic success. These resources
brought about a statistically significant increase in learning improvement compared to the use
of static presentations, all without statistically significant increases in the time spent accessing
the educational materials provided for the course. Ultimately, initial credence is granted to the
argument that more is not necessarily better, and that quality of instruction and educational
tools has a greater impact on learners’ success than quantity. Recognition of this factor may well
provide educators with the means to successfully embrace the paradigm shift that is underway
within education as a result of advances in technology and the continuing shift toward online
delivery while continuing to uphold academic program integrity standards.
Creative and engaging education utilizes not only the expertise of the facilitator, but also the
vast resources available to supplement evolving methods of educational delivery. The use of
narrated digital presentations is one means of meeting learners “where they live” and providing
them with rich material that will bring meaning to their learning experience. As more and more
learners embrace digital media and digital media presentations, they have expectations, skills,
and tools to use them with ease. This reality provides an enormous and exciting opportunity
for faculty to use the same tools to create enriching educational experiences, especially within
the online environment where facilitators often struggle to present information in rich and
meaningful ways. The use of narrated digital presentations is not a panacea for the perceived lack of personal interaction with the facilitator that so often is cited as detrimental to attracting and retaining online learners, nor is it requisite to creating quality online courses.
But it is one tool within a comprehensive approach to online education that can help to improve
the effectiveness of the experience.
Both an operational definition of engagement and the current literature on the effectiveness of
specific practices within the online environment to provide meaningful education are woefully
lacking [28; 43; 47]. While numerous previous studies examined the general efficacy of online
delivery within higher education and some explored its feasibility for professional
[9]. Boaz, M., Elliott, B., Foshee, D., Hardy, D., Jarmon, C., & Olcott, D., Jr. (1999). Teaching at a distance: A
handbook for instructors. New York: League for Innovation in the Community College and Archipelago, a
Division of Harbrace.
[10]. Bruns, D. L., & Bruns, J. W. (2015). Assessing the worth of the college degree on self-perceived police
performance. Journal of Criminal Justice Education, 26, 121-146. doi: 10.1080/10511253.2014.930161
[11]. Canada Police Sector Council, Canadian Police Knowledge Network. (2010). The state of e-learning in
Canadian policing: Elements of effective e-learning for police.
http://www.policecouncil.ca/reports/PSC%20State%20of%20E-Learning%20in%20Canadian%20Policing_Final.pdf
[12]. Chan, D. C., & Auster, E. (2003). Factors contributing to the professional development of reference
librarians. Library & Information Science Research, 25, 265-287.
[13]. Clark-Ibanez, M., & Scott, L. (2008). Learning to teach online. Teaching Sociology, 36, 34-41. doi:
10.1177/0092055X0803600105
[14]. Colvard, N. B., Watson, C. E., & Park, H. (2018). The impact of open educational resources on various
student success metrics. International Journal of Teaching and Learning in Higher Education, 30, 262-276.
http://www.isetl.org/ijtlhe/
[15]. Complete College America. (2018). About. https://www.completecollege.org/about/
[16]. Croteau, E. (2017). Measures of student success with textbook transformations: The Affordable Learning
Georgia Initiative. Open Praxis, 9, 93-108.
[17]. Daffron, S. R., & Caffarella, R. S. (2021). Planning programs for adult learners: A practical guide (4th ed.).
Hoboken, NJ: Jossey-Bass.
[18]. Donavant, B. W. (2009a). The NEW modern practice of adult education: Online instruction in a continuing
professional education setting. Adult Education Quarterly, 59, 227-245. doi:10.1177/0741713609331546
[19]. Donavant, B. W. (2009b). To internet or not? Assessing the efficacy of online police training. American
Journal of Criminal Justice, 34, 224-237. doi: 10.1007/s12103-009-9061-7
[20]. Donavant, B. W. (2011). Narrated digital presentations: An educator’s journey and strategies for
integrating and enhancing education. In K. King & T. Cox (Eds.), The Professor’s Guide to Taming
Technology: Leveraging Digital Media, Web 2.0 and More for Learning. Charlotte, NC: Information Age.
[21]. Donavant, B. W., Daniel, B. V., & MacKewn, A. S. (2013). (Dis)connected in today’s college classroom? What
faculty say and do about mixed-age classes. Journal of Continuing Higher Education, 61, 132-142. doi:
10.1080/07377363.2013.836811
[22]. Feldstein, A., Martin, M., Hudson, A., Warren, K., Hilton III, J., & Wiley, D. (2012). Open textbook and
increased student access and outcomes. European Journal of Open, Distance, and E-learning, 15(2).
http://www.eurodl.org/?p=archives&year=2012&halfyear=2&artictle&article=533
[23]. Fischer, L., Hilton III, J., Robinson, T. J., & Wiley, D. (2015). A multi-institutional study of open textbook
adoption on the learning outcomes of post-secondary students. Journal of Computing in Higher Education,
27(3), 159-172.
[24]. Flanagan, T. J. (2000). Liberal education and the criminal justice major. Journal of Criminal Justice
Education, 11, 1-13.
[25]. Hereford, L. (2000). NEA poll dives into distance learning. Community College Week, 12(24), 30.
[26]. Hilton III, J. L., Fischer, L., Wiley, D., & Williams, L. (2016). Maintaining momentum toward graduation:
OER and the course throughput rate. International Review of Research in Open and Distributed Learning,
17(6), 18-27. http://www.irrodl.org/index.php/irrodl/article/view/2686/3967
[27]. Holley, C. A. (2002). Student satisfaction with and success in on-line and on-site English composition
classes. Unpublished doctoral dissertation, University of Southern Mississippi, Hattiesburg.
[28]. Jankowski, N. A., & Marshall, D. W. (2017). Degrees that matter: Moving higher education to a learning
systems paradigm. Sterling, VA: Stylus.
[29]. Lovett, M., Meyer, O., & Thille, C. (2008). The Open Learning Initiative: Measuring the effectiveness of the
OLI statistics course in accelerating student learning. Journal of Interactive Media in Education, 14, 1-16.
doi: 10.5334/2008-14
[30]. MacKewn, A. S., DePriest, T. L., & Donavant, B. W. (2022). Metacognitive Knowledge, Regulation, and Study
Habits. Psychology, 13(12). http://doi.org/10.4236/psych.2022.1312112
[31]. Mager, R. F. (1997). Preparing instructional objectives (3rd ed.). Atlanta, GA: CEP Press.
[32]. McConnell, S., & Schoenfeld-Tachner, R. (2004). Transferring your passion for teaching to the online
environment: A five step instructional development model. Fort Collins, CO: Colorado State University,
College of Veterinary Medicine and Biomedical Sciences.
[33]. McLeish, J. (1976). The lecture method. In N. L. Gage (Ed.), The Psychology of Teaching Methods (pp. 252-301). Chicago: University of Chicago Press.
[34]. Miner-Romanoff, K. (2014). Student perceptions of juvenile offender accounts in criminal justice
education. American Journal of Criminal Justice, 39, 611-629. doi.org/10.1007/s12103-013-9223-5
[35]. Nadler, L., & Nadler, Z. (1994). Designing training programs: The Critical Events Model. Houston, TX: Gulf.
[36]. Oliver, W. M. (2013). The History of the Academy of Criminal Justice Sciences (ACJS): Celebrating 50 years,
1963-2013.
https://cdn.ymaws.com/www.acjs.org/resource/resmgr/Historian/ACJS50thAnniversaryHistoryBo.pdf
[37]. Pawlyshyn, N., Braddlee, D., Casper, L., & Miller, H. (2013). Adopting OER: A case study of cross-institutional collaboration and innovation. EDUCAUSE Review. https://er.educause.edu/articles/2013/11/adopting-oer-a-case-study-of-crossinsitutional-collaboration-and-innovation
[38]. Picciano, A. G. (2001). Distance learning: Making connections across virtual space and time. Upper Saddle
River, NJ: Prentice-Hall.
[39]. Rachal, J. R. (2002). Andragogy’s detectives: A critique of the present and a proposal for the future. Adult
Education Quarterly, 52, 210-227.
[40]. Reif, L. R. (2013, October 7). Online learning will make college cheaper. It will also make it better. Time,
54-55.
[41]. Roach, R. (2002). Staying connected. Black Issues in Higher Education, 19, 22-26.
[42]. Sakurai, J. M. (2002). Traditional vs. online degrees. E-Learning, 3, 28-32.
[43]. Sitren, A. H., & Smith, H. P. (2017). Teaching criminal justice online: Current status and important
considerations. Journal of Criminal Justice Education, 28, 352-367. doi: 10.1080/10511253.2016.1254267
[44]. Sloan III, J. J. (2018). The state of criminal justice educational programs in the United States: Bachelors'
degrees, curriculum standards, and the ongoing quest for quality. Journal of Criminal Justice Education.
doi: 10.1080/10511253.2018.1457701
[45]. Southerland, M. D., Merlo, A. V., Robinson, L., Benekos, P. J., & Albanese, J. S. (2007). Ensuring quality in
criminal justice education: Academic standards and the reemergence of accreditation. Journal of Criminal
Justice Education, 18, 87-105.
[46]. Southern Association of Colleges and Schools Commission on Colleges. (2018). Resource manual for the
principles of accreditation: Foundations for quality enhancement (3rd ed.). Decatur, GA: Author.
[47]. Stack, S. (2013). Does discussion promote learning outcomes? Analysis of an online criminology class:
Research note. Journal of Criminal Justice Education, 24, 374-385. doi: 10.1080/10511253.2012.758752
[48]. Stack, S. (2015). The impact of exam environments on student test scores in online courses. Journal of Criminal Justice Education, 26, 273-282. doi: 10.1080/10511253.2015.1012173
[49]. United States Department of Education. (2018). Digest of education statistics – 2018.
https://nces.ed.gov/programs/digest/2018menu_tables.asp