NSF-Funded HSI Pilot Project: Improving Online STEM Education for Undergraduate Students in HSI Contexts
(An NSF-funded, best-practices project comparing four disciplines in five theme areas)
October 2022 to October 2024
Principal Investigators:
Dr. Anna Ni: overall project lead
Dr. Miranda McIntyre: lead methodologist
Dr. Pamela Medina: lead in DEI issues (theme)
Dr. Yunfei Hou: lead in online lab strategies (theme)
Senior Staff:
Dr. Jing Zhang: lead in rehearsal strategies (theme); project website/listserv
Dr. Yu Liu: lead in online testing strategies (theme); project website/listserv
Dr. Monty Van Wart: lead in lecture strategies (theme); project facilitator
Dr. Golge Seferoglu: project evaluator
Other Personnel (Disciplinary Leads):
Mathematics:
Dr. Youngsu Kim
Dr. Hani Aldirawi
Psychology:
Dr. Donna Garcia
Information Decision Sciences:
Dr. Jesus Canelon
Dr. Lewis Njualem
Computer Science and Engineering:
Dr. Yunfei Hou
PROJECT SUMMARY
(Note: At NSF's request, the approved project moved from eight to four disciplines and from four to five theme areas relative to the original proposal.)
Overview:
Improving Online STEM Education for Undergraduate Students in HSI Contexts. HSI Program goals include (a) engaged student learning in accord with Hispanic students’ needs and preferences, (b) higher learning achievement, (c) improved retention, and (d) higher graduation rates. This proposal seeks to achieve all of these goals through better basic understanding of online STEM education in HSI contexts, better understanding of the applied challenges and best practices of online STEM education in HSI contexts, and better dissemination of best practices.
A pilot project is proposed to improve both the teaching of students and the training of instructors, and to identify the best dissemination strategies to aid HSIs in their quest to continuously improve their education. Specifically, the pilot project will survey the specific challenges and best practices in four disciplines at a single HSI across five theme areas. Special focus will be given to lecture attributes, student rehearsal strategies, lab issues, and testing practices, which have been particularly problematic in the STEM context. The pilot project will identify a series of strategies to disseminate initial findings. The Track 2 project will take the framework established in the pilot project at one HSI institution and replicate it at four HSI community colleges, four HSI teaching universities in the CSU system, and four HSI research institutions. This will provide a deep pool of comparative practices in each of the disciplines in each of the theme areas.
Intellectual Merit:
While much is known about online education in general and about ad hoc online successes in STEM, much research remains to be done to understand the underlying challenges of increasing student learning achievement, engagement, and retention in STEM and HSI settings. Best practices for overcoming these challenges and effectively taking advantage of the strengths of online education have yet to be determined in HSI contexts. While STEM disciplines have many high- and mid-level teaching similarities with non-STEM disciplines, they have numerous substantial mid- and micro-level differences as well. For example, when trainers from centralized units advise all faculty to keep their recorded lectures very short, STEM instructors tend to throw their hands up in despair because they know students want relatively full lectures explaining the highly detailed knowledge they are expected to master.
Broader Impacts:
Improving online education in STEM has been a long-time challenge because of weak understanding of the special needs of STEM curricula and mismatched training strategies often used by centralized training staff. This has led to poor adaptation of online teaching strategies to STEM education needs. The problem has swelled during the COVID-19 pandemic, during which student and faculty concerns have continued unabated (Hammerness et al., 2022). Because HSI students also have different needs and resources than the general population of students, clearly matching their needs and constraints can have an enormous impact on ensuring online teaching effectiveness. For example, HSI students are more likely to have dense living arrangements that create serious distractions, requiring support for a variety of instructional venues so that affected students can learn effectively.
Project description
Background for and Goals of the Proposal:
As expressed in the summary statement, HSI Program goals include (a) engaged student learning relative to Hispanic students’ needs and preferences, (b) higher learning achievement, (c) improved retention, and (d) higher graduation rates. This proposal seeks to improve all of these goals through better basic understanding of online STEM education in HSI contexts, better understanding of the applied challenges and best practices in dealing with online STEM education in HSI contexts, and better dissemination of best practices. It also aligns with CSUSB’s mission and strategic plan: one of the core objectives in the CSUSB Strategic Plan 2015-2022 is to “support student success by creating new quality online courses each academic year to facilitate the Graduation Initiative 2025.”
Providing optimally designed online learning practices has been problematic in online education in general for a number of reasons.
First, when online education is poorly understood or when there are unrealistic expectations (frequently the case in higher education), a series of institutional support problems occurs (Kushnir & Berry, 2014). Online teaching requires new skills and teaching strategies, and without robust, high-quality training, faculty are likely to underutilize their teaching talents.
A second set of issues has to do with faculty concerns and motivation (Mansbach and Austin, 2018). Because online teaching requires new skills and online courses take enormous upfront investments of time to “build,” faculty are concerned about sufficient, timely training and the opportunity to devote adequate energy in advance of teaching a new course in the face of other obligations. Students routinely evaluate online instructors more critically (Otter et al., 2013), so there are promotion and tenure concerns as well.
A third concern regards the overall quality of instruction (Bernard et al., 2004; Marks, Sibley, and Arbaugh, 2005; Tallent-Runnels et al., 2006). Will students get enough aural/visual input when they move to an online mode? While “richer” presentations in online classes include prerecorded podcasts or videos and/or “live” videoconference sessions, they are generally shorter, and sometimes dramatically shorter, than in face-to-face classrooms, leading to concerns about insufficient media variety (Arbaugh, 2014).
A fourth challenge involves “presence,” both social and cognitive, which is critical to establish in online education. Social presence (student-to-student learning) is important in STEM learning, but it is more about mastery and practice of protocols than in other settings, where it may be more about discussions of social concepts, philosophical perspectives, and personal experiences. STEM students enjoy quality rehearsal activities with fellow students but are highly critical of poorly designed rehearsal, which they call “busywork” (Kehrwald, 2008; Wei, Chen, & Kinshuk, 2012). Cognitive presence (both intellectual stimulation and professional enthusiasm) is easily “washed out” in a two-dimensional virtual learning environment. Because of the increased importance of lectures in STEM environments, instructors must be aware of, and able to find, strategies to compensate for a less-connected space.
A fifth consideration involves student readiness (Sieber, 2005). Instructors must be prepared for the online learning experience, but so, too, must students. For example, without the perception of a structured lecture schedule (even though weekly activities may actually be just as structured), inexperienced students may be lured into unrealistic notions of what they can accomplish or will do in virtual contexts (Rooij & Zirkle, 2016).
Hispanic and other URM students have additional needs and preferences that are critical to integrate whenever possible in any institutional environment, but especially when HSIs are supporting students who may face heightened cultural, linguistic, economic, and other challenges (Hammerness et al., 2022). While cultural generalizations should always be made with caution, first-generation Hispanic students tend to prefer friendly rather than competitive environments, more highly structured teaching styles, and more class interaction. HSI or not, weaker students need more monitoring, more structured rehearsal, and more external and customized encouragement. Online learning environments can certainly suit these needs and preferences, but the transition from face-to-face (F2F) to online instruction calls for clear best-practice conventions that instructors know and adopt.
When these factors are not adequately recognized and addressed, student retention will flag (a chronic problem in weaker online teaching programs) and graduation rates will dip. Yet the opportunities are equally important and great (Young, 2006). First, online learning provides education at a distance. Traversing distance costs students and faculty time and travel expense and is sometimes a complete bar to attendance (Nguyen, 2015; Song, Singleton, Hill, & Koh, 2004). Second, online learning greatly enhances convenience; this benefit is most pronounced in asynchronous modes, in which students can access learning modules to fit their schedules (Song, Singleton, Hill, & Koh, 2004). However, convenience is also enhanced in synchronous online modes in that students and faculty can participate from anywhere in the world (Boling et al., 2012). Third, the need for brick-and-mortar space is reduced when teaching online, even with “richer” real-time interactions (Traynor-Nilsen, 2017). Fourth, online learning enhances digital skills, such as student presentations, which are becoming increasingly important in the current digital age (Hernandez-Lara & Serradell-Lopez, 2018). There is also evidence that training in online teaching improves face-to-face teaching and allows the integration of online resources in face-to-face settings (Kearns, 2016). Additionally, well-designed online courses can provide opportunities for enhanced rehearsal and individual feedback on a weekly basis (Maycock, 2018; McGivney-Burelle, 2013).
While much research has been done regarding online contexts, the research about STEM in online contexts has been modest and, for the most part, narrow, as noted in the only comprehensive review of the field in some years (Kefalis & Drigas, 2019). Systematic research regarding Hispanic students in online STEM settings has been lacking. This is the problem this proposal seeks to address.
Intellectual Merit:
Although the field of online education has advanced significantly over the last 25 years, through experience, research, and improved technology, much has yet to be done. STEM disciplines are confronted with numerous heightened challenges that have been poorly addressed or altogether ignored. On top of this, the literature has suggested time and time again the challenges faced by students who are culturally marginalized, have external challenges such as work and family obligations, lack traditions of college-going family members, etc. However, current studies have been heterogeneous and contradictory, have numerous research gaps, and lack a robust methodology. It is true that most STEM disciplines have high- and mid-level teaching similarities with non-STEM disciplines, but they also have numerous substantial mid- and micro-level differences.
The following five areas—overall impact of online, design issues, effects of the pandemic, labs, and disciplinary focus—provide some sense of the state-of-the-art in STEM contexts.
Numerous studies have found that when there is rough equivalence in teaching in traditional versus virtual formats, learning achievement is roughly equivalent (Wladis, Hachey, & Conway, 2014; Biel & Brame, 2016; Beltz et al., 2016). For example, Darrah et al. (2014) found that physics labs could be as effective, Faulconer et al. (2018) found that online chemistry students outperformed traditional students, and Somenarain, Akkaraju, & Gharbaran (2010) found no statistical differences in achievement or satisfaction in a wide-scale study of online and face-to-face biology students. Some studies point to lower student satisfaction even when achievement is roughly equivalent, and note that non-design issues (e.g., environmental noise and interruptions) frequently add stress to students’ experience (Perets et al., 2020; Petillion & McNeil, 2020; Sedaghatjou et al., 2021; Rennar-Potacco & Orellana, 2018). For example, Flowers, Raynor, & White (2013) point to a common trend that students have a slight bias toward traditional teaching when flexibility and convenience are not factored in. While some studies raise concerns about effectiveness in advanced classes (Viegas et al., 2018), others found online instruction to be effective for collaborative problem solving (Lin et al., 2014) and higher-order thinking skills (Widyaningsih et al., 2020). While one study raised concerns about weaker students essentially getting lost in the virtual teaching environment (Francis, Wormington, & Hulleman, 2019), Viegas et al. (2018) found that the multiple learning modes available in online classes could be especially helpful for students with greater learning needs.
Much of the heterogeneity and murkiness of the literature has to do with good design in online teaching. Good design and implementation may or may not create the desired equivalence between traditional and virtual formats, but it is universally accepted that good design (including potentially using hybrid formats rather than fully online formats, and integrating online tools in traditional classes) will significantly improve the learning experience for online and traditional students (August et al., 2015; Chen, Bastedo, & Howard, 2018; Ehrlich, McKenney, & Elkbuli, 2020; Iglesias-Pradas et al., 2021; Pagoto et al., 2021; Rennar-Potacco & Orellana, 2018; Freeman et al., 2007). For example, West et al. (2021) found that a STEM faculty institute (initially supported by an NSF grant) created a high-performing teaching environment that took the teaching challenges of the pandemic in stride. Numerous studies look at the specific and strategic use of online tools such as discussion groups (Lin et al., 2014; Tibi, 2018), specific virtual programs (Clevenger et al., 2019; Darras et al., 2021; Silva et al., 2013; Viegas et al., 2018), and online homework and rehearsal strategies (Cheng et al., 2004; Eichler & Peeples, 2013). Despite the design opportunities, numerous faculty continue to have substantial concerns (Code, Ralph, & Forde, 2020; Dumont et al., 2021).
Labs, an enormously important aspect of the STEM learning paradigm, have long been recognized as particularly challenging in the online context. Some studies have looked at the potential and specific successes of virtual lab experiences (August et al., 2015; Darrah et al., 2014; Viegas et al., 2018; Hou et al., 2021a, 2021b). As previously mentioned, Beltz et al. (2016) found that well-designed labs in biology with modest hybrid elements outperformed traditional labs. Yet other researchers have focused on the challenges of virtual labs and the negative experiences of students and faculty (Brancaccio-Taras et al., 2021; Sansom et al., 2020).
The pandemic has brought a great deal of confusion to the online learning context because the reality many instructors faced in emergency remote situations has been conflated with the potential when good planning, strong design, and extensive training are in place. Programs and contexts with limited planning, little design experience, and weak generic training tended to suffer the most (Blizak et al., 2020; Brancaccio-Taras et al., 2021; Code, Ralph, & Forde, 2020; Mnguni & Mokiwa, 2020; Pagoto et al., 2021; Perets et al., 2020; Sansom et al., 2020). On the other hand, programs with a stronger online teaching culture did significantly better in adapting to a fully online environment (Ehrlich, McKenney, & Elkbuli, 2020; Iglesias-Pradas et al., 2021; Miltiadous, Callahan, & Schultz, 2020; West et al., 2021).
There is some coverage across the various science, technology, and engineering disciplines. Mathematics seems to be significantly under-represented in terms of virtual teaching in higher education, especially in the U.S. context. In terms of disciplinary focus, over a dozen studies have looked at online issues with a mixed STEM sampling or perspective. However, none have done so with a robust comparative focus.
In sum, research on STEM education in higher education has lagged other areas (Hammerness et al., 2022), and this is compounded by the need to target research on Hispanics and other underrepresented minorities. Much of the research is contradictory and has not been well integrated. In addition, comparisons across the disciplines have not been attempted to date. STEM disciplines have been slow to build online evidence-based teaching/learning communities and special venues despite the extraordinary changes in the field and the emergency measures forced by the pandemic.
Our Capacity:
The proposers are a subgroup of the CSUSB Online Learning Research Team (OLRT). OLRT is a well-established, all-university group that meets as a core team monthly, divides into subgroups for individual projects, and has had a successful record of achievement starting before the COVID pandemic. The overall team’s recent research agenda has included a literature review (Van Wart et al., 2019), online teaching challenges (Ni et al., 2020), online student perceptions of quality (Kordrostami & Seitz, 2021; Van Wart et al., 2020a; Van Wart et al., 2020b; Zhang et al., 2020), faculty competencies (Van Wart et al., 2019), perspectives related to online instruction from an accrediting standpoint (Van Wart et al., 2021), the special challenges of COVID and online labs (Hou et al., 2021a, 2021b), and faculty adoption (Dumont et al., 2021).
Ongoing research includes projects on both student and faculty adoption trends, for which data have been collected and are being prepared for publication. The team has already collaborated with two other institutions (the University of Mississippi and the University of North Florida), where data collection has produced collaborative results; they are not included in this project because they are not HSIs. This proposal is intended to promote university collaboration and, in particular, to begin building a research network of Hispanic-Serving Institutions interested in tackling the challenges that face their students in online STEM settings.
General Statement of Work:
Disciplinary focus.
The project will survey the specific challenges and best practices in four disciplines at a single HSI. Those disciplines are mathematics, psychology, information decision sciences, and computer science and engineering. Five research focus areas will be investigated in depth: lecture attributes, student rehearsal strategies, lab issues (such as recreating a reasonable facsimile of laboratory experiences online), testing challenges (which have been particularly problematic in the STEM context), and DEI-related issues. Both faculty and students in each discipline will be surveyed in the first year with a wide variety of question items, both short-answer (Likert-style or other) and open-ended, regarding perceptions of best and poorest practices in various contexts. Then, to explore the disciplinary and cross-disciplinary issues raised in the data analysis, the investigators will conduct faculty interviews and student focus groups in year two of this multi-method study. Year two investigations will allow the investigators to study situational, instructor, and learner variations, ultimately resulting in taxonomic comparisons and detailed recommendations.
Examples of the types of issues that will be investigated in the research areas are illustrated as question items below.
Lecture items addressed to students include questions about optimal lecture time for online synchronous and asynchronous lectures, ideal and acceptable blends of lecture modes (F2F, synchronous, and asynchronous), factors leading to optimization of synchronous formats, factors leading to optimization of asynchronous formats, concrete examples of excellence, examples of motivating students in an online lecture context, assessments of faculty use of online lecture technology, learning system platform issues, etc. Lecture items for faculty will include questions about their perceptions of the optimal lecture time for online synchronous and asynchronous lectures, ideal and acceptable blends of lecture modes (F2F, synchronous, and asynchronous), factors leading to optimization of synchronous formats, factors leading to optimization of asynchronous formats, challenges and successes in providing stimulating lectures, challenges and successes in keeping lecture preparation manageable, quality of and concerns about online lecture technology, learning system platform issues, etc. These surveys have been drafted and are ready for IRB submission. They will then be distributed with the assistance of the disciplinary lead faculty in the four disciplines.
Student rehearsal items (outside of labs) address student and faculty perceptions of the utility and cost-benefit of automated question pools for study, the use of online homework, the use of quizzes, the helpfulness of assigning points to rehearsal work, and positive and negative examples of rehearsal practices in different disciplines.
Items about labs include the degree to which fully online labs are feasible in each discipline under study, the ability to turn portions of labs into online demonstrations, the role of instructor experience, the use of virtual and augmented reality, and positive and negative examples.
Items related to testing include the use or non-use of fully online testing models as well as hybrid testing models. In the use of online testing modes, we are interested in test structures (types of questions), differences in test structures of online and face-to-face classes, use of test time windows, use of lock-down software, use of video monitoring software, and perceived strengths and weaknesses of the various approaches. We will also inquire about the degree to which classes are hybrid, with lecture and rehearsal activities being primarily online, and labs and tests being face-to-face.
In year two, the project directors will conduct qualitative follow-up in conjunction with the faculty disciplinary leads in the four disciplines. Five faculty interviews per discipline will be conducted in year two. In addition, at least three student focus groups will be held in each discipline, either in a class setting or with a specially recruited student group.
Management Plan:
The team has already built the questionnaires for faculty and students and identified prospective faculty to act as disciplinary leads. The team will file for IRB approval in the spring in anticipation of possible grant funding. The application should qualify for routine expedited review, since the human-subject confidentiality and ethical issues are modest.
Fall 2022. Receipt of the grant will trigger the initial coordination activities. These include re-confirmation by the PIs of the disciplinary leads and two briefings to ensure the disciplinary leads understand the theoretical background of the project and their specific roles in assisting with the collection of data. The research and publication opportunities for disciplinary leads will be highlighted in these meetings. The PIs will also plan the online learning website. The pilot of the student and faculty survey instruments will be conducted, led by Dr. McIntyre. The evaluator will provide a brief review of the pilot.
Spring 2023. Data will be collected from the remaining disciplines and collected again from the pilot disciplines to provide methodological consistency and verify the pilot results. The primary goal is to ensure that the dataset is as large as possible. The goal is a total student sample of at least 1,200 responses across disciplines, with at least 100 responses per discipline (some of the disciplines are relatively small). We have found that a strong student response is best achieved by targeting specific gateway courses that represent different parts of the curriculum.
The goal of the faculty surveys is a 50% response rate from all regular faculty and a 25% response rate from lecturers. Since there are approximately 139 tenure-track and tenured faculty in the targeted disciplines and approximately 50 full-time lecturers, our goal in the faculty surveys is 90 respondents. The outreach will be supported by the inclusion of disciplinary leads in each department, as well as the provision of specialized online teaching roundtable discussions in each of the disciplines, which will be carefully customized for them in the second semester. The roundtables will be facilitated and coordinated by the Faculty Research Fellow to ensure full participation of the relevant department chairs and college deans. Since the University is under tremendous pressure from the Chancellor and the systemwide Board of Trustees to enhance graduation rates, this initiative is especially strategic for deans in improving their metrics, which are carefully monitored at the system level. It should be noted that the Faculty Research Fellow conducts dozens of seminars each year with very high participation rates (estimated at over 600 one-time faculty attendees in the 2021-22 academic year), due to strong outreach and the support of the President, Provost, and Deans, who have all participated multiple times, as well as most chairs in the University.
Summer 2023. The survey data is analyzed overall and by discipline. Faculty interview and student focus group protocols are established based on the empirical data. The evaluator reviews the formative findings and writes a brief report. Conference presentations start at this time. A university webpage is established featuring online learning research.
Fall 2023. Five faculty interviews and three student focus groups are conducted in each of the four disciplines. The qualitative data will be aggregated, integrated with the empirical results, and compared across disciplines.
Spring 2024. At least three papers will be drafted at the beginning of the spring term (an overall descriptive paper, one or more discipline-specific papers, and an impact paper focusing on Hispanic students) based on the multi-method approach used (empirical and qualitative). Later in the spring there will be conference presentations, and papers will be submitted for publication as they are ready.
Summer through September 2024. During the final portion of the grant, the conference presentations and article submissions continue. A university-wide, half-day mini-conference is held at the beginning of the summer term. Potential Track 2 collaborators from other universities are asked to participate in anticipation of the next funding request. The final program evaluation is submitted. The final reporting documents are submitted to NSF. Request for Track 2 funding submitted.
Communication Plan:
The project will identify a series of strategies to inform institutional partners of the project as it is undertaken and executed, and to disseminate findings from the pilot study.
As a part of the ongoing DEI-oriented research forum led by the University Faculty Research Fellow, a university-wide forum specifically related to this project will be held as soon after funding as is feasible. The Faculty Research Fellow currently facilitates over 30 faculty seminars a year, directly presenting approximately one-third of them himself. Such seminars are well attended (approximately 25 faculty participants on average, as well as substantial participation by chairs, directors, and deans). We will ensure the presence of all three deans involved in the project, as well as all relevant department chairs, to secure their active support.
An institution-wide mini-conference is planned at the end of the pilot. With the assistance of the Faculty Research Fellow, who conducts dozens of workshops for the university every year and has well-established relationships with the president, provost, deans, and chairs, such a conference will be widely attended because of the institutional impact these events are known to have.
The Online Learning Research Team, from which this project originates, has an extensive record of publication regarding online teaching and learning in peer-reviewed journals. For example, one piece entitled “Integrating Students’ Perceptions about Online Learning: A Hierarchy of Factors,” published in the International Journal of Educational Technology in Higher Education in 2020, has already been cited 36 times in Google Scholar. Because of the interdisciplinary nature of the study, and the amount of data that will be collected, it is a goal to publish three papers from this research, which is typical of the group’s past achievement record. One paper will focus on the overall comparative results of the project, one (or more) will focus on the specific disciplines and their challenges in online teaching/learning, and one will compare the URM and non-URM student perceptions and needs.
Relatedly, the team has made numerous conference presentations at a variety of conferences. Because of the different disciplines involved, it is a goal of the project to have at least a half dozen different conference presentations across the targeted disciplines. These conference presentations will aid the team in preparing for broader collaborations in submitting its Track 2 application.
The OLRT will set up a research website to share the datasets and research findings, as well as host the past studies and future plans of the group. A listserv will be created that can function as a newsletter to interested parties inside and outside of the institution. A particular emphasis will be collaboration with the other 23 CSUs, three local community colleges (San Bernardino Valley, Chaffey, and Riverside) and one research institution (University of California Riverside).
The ultimate goal of the communication plan (in Track 2) is to produce a book on best practices in the STEM disciplines that highlights the fundamental issues of virtual education while remaining less bound to specific, quickly dated software.
Commitment and sustainability:
The OLRT has received multiple funding infusions from the Office of the Faculty Research Fellow, the JHB College of Business and Public Administration (seed funds) and the Provost’s Office. The project has the strong support of three deans. Additionally, the Chancellor’s Office (where research findings have been presented in extended keynote sessions) has indicated a strong interest in taking the project findings “to scale.”
Evaluation Plan:
The evaluator for this project is an educational expert who is outside the OLRT and targeted colleges, but inside CSUSB. She is a well-known scholar and has utilized numerous research methodologies herself. Her job will be to critique the ongoing work of the pilot project in the following ways.
First, she will independently review the beta test data from the student and faculty surveys in a brief written report to the team. Two disciplines will be used as pilot tests, and she will provide her impressions of weaknesses or missed opportunities in the linguistic, presentational, or functional aspects of the data collection plan before it is fully rolled out.
At the end of the first year, she will review the progress and quality of results of the team as it moves to the interview and focus group phase. Again, she will provide a brief formative report to the team.
Finally, at the end of the project she will perform a full summative assessment regarding the quality and generalizability of the results, and practical suggestions for scaling up the project for a Track 2 research application.
Institutional Impacts:
We anticipate that the project will directly affect all faculty in all the targeted disciplines because we expect course adjustments in most of the curricula. As noted, we anticipate a direct effect on 189 regular full-time faculty, including full-time lecturers. Because of the faculty seminars, we expect direct involvement of 90 faculty members, including all chairs.
We project that all students in the four disciplines will be affected by the end of the project because of curricular improvements. The disciplines had a combined student population of 4,736 majors in Fall 2021. We have set a goal of approximately 25%, or 1,200 students, as our level of student involvement.
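As a rough arithmetic check on this target (assuming the 25% goal is applied to the combined Fall 2021 major count cited above):

$$0.25 \times 4{,}736 \approx 1{,}184 \approx 1{,}200 \text{ students}$$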
Broader Impacts
Improvements in online education in STEM have been hindered by a weak understanding of the special needs of STEM curricula and by mismatched training strategies often used by centralized training staff, leading to poor adaptation of online teaching strategies to STEM education needs. Awareness of the need to correct these weaknesses surged during the COVID-19 pandemic, during which student and faculty concerns have continued unabated. While some STEM educators have found ingenious solutions to provide high-quality education, other faculty in various disciplines continue to struggle. HSI students have different needs and resources than the general population of students. Matching their needs and constraints can have an enormous impact on ensuring greater online teaching effectiveness.
The pilot project would certainly have institutional impacts that could be quickly disseminated and replicated in the CSU system, given the OLRT’s high visibility in the CSU Chancellor’s Office. The Faculty Research Fellow has twice been the sole presenter at all-CSU meetings (once for vice presidents of technology, and once for all-university system representatives regarding online education). This is borne out by the OLRT’s success in partnering with other institutions even without funding support.
The proposing team’s successful track record of publication and presentation bodes well for this research to have international impact, given the scattered literature and enormous need for organized, empirically-based, comparative research.
References:
Arbaugh, J. (2014). System, scholar or students? Which most influences online MBA course effectiveness? Journal of Computer Assisted Learning, 30(4), 349-362.
August, S. E., Hammers, M. L., Murphy, D. B., Neyer, A., Gueye, P., & Thames, R. Q. (2015). Virtual engineering sciences learning lab: Giving STEM education a second life. IEEE Transactions on Learning Technologies, 9(1), 18-30.
Beltz, D., Desharnais, R., Narguizian, P., & Son, J. (2016). Comparing physical, virtual, and hybrid flipped labs for general education biology. Online Learning, 20(3), 228-243. (NSF supported).
Bernard, R. M., et al. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379-439.
Biel, R., & Brame, C. J. (2016). Traditional versus online biology courses: connecting course design and student learning in an online setting. Journal of microbiology & biology education, 17(3), 417-422.
Blizak, D., Blizak, S., Bouchenak, O., & Yahiaoui, K. (2020). Students’ perceptions regarding the abrupt transition to online learning during the COVID-19 pandemic: case of faculty of chemistry and hydrocarbons at the University of Boumerdes—Algeria. Journal of Chemical Education, 97(9), 2466-2471.
Boling, E., Hough, M., Krinsky, H., Saleem, H., & Stevens, M. (2012). Cutting the distance in distance education: Perspectives on what promotes positive online learning experiences. Internet and Higher Education.
Brancaccio-Taras, L., Mawn, M. V., Premo, J., & Ramachandran, R. (2021). Teaching in a Time of Crisis: Editorial Perspectives on Adjusting STEM Education to the “New Normal” during the COVID-19 Pandemic.
Chen, B., Bastedo, K., & Howard, W. (2018). Exploring Design Elements for Online STEM Courses: Active Learning, Engagement & Assessment Design. Online Learning, 22(2), 59-75.
Cheng, K. K., Thacker, B. A., Cardenas, R. L., & Crouch, C. (2004). Using an online homework system enhances students’ learning of physics concepts in an introductory physics course. American Journal of Physics, 72(11), 1447-1453.
Code, J., Ralph, R., & Forde, K. (2020). Pandemic designs for the future: perspectives of technology education teachers during COVID-19. Information and Learning Sciences.
Clevenger, C. M., Abdallah, M., Wu, W., & Barrows, M. (2019). Assessing an online tool to promote sustainability competencies in construction engineering education. Journal of Professional Issues in Engineering Education and Practice, 145(1), 04018014.
Dumont, Van Wart, Ni, Beck, & Pei (2021). Effect of the COVID pandemic on faculty adoption of online teaching: Reduced resistance but strong persistent concerns. Cogent Education, 8(1).
Darrah, M., Humbert, R., Finstein, J., Simon, M., & Hopkins, J. (2014). Are virtual labs as effective as hands-on labs for undergraduate physics? A comparative study at two major universities. Journal of science education and technology, 23(6), 803-814.
Darras, K. E., Spouge, R. J., de Bruin, A. B., Sedlic, A., Hague, C., & Forster, B. B. (2021). Undergraduate radiology education during the COVID-19 pandemic: a review of teaching and learning strategies. Canadian Association of Radiologists Journal, 72(2), 194-200.
Ehrlich, H., McKenney, M., & Elkbuli, A. (2020). We asked the experts: virtual learning in surgical education during the COVID-19 pandemic—shaping the future of surgical education and training. World Journal of Surgery, 1.
Eichler, J. F., & Peeples, J. (2013). Online homework put to the test: A report on the impact of two online learning systems on student performance in general chemistry. Journal of Chemical Education, 90(9), 1137-1143.
Freeman, S., O’Connor, E., Parks, J.W., Cunningham, M., Hurley, D., Haak, D., Dirks, C., and Wenderoth, M.P. (2007). Prescribed active learning increases performance in introductory biology. CBE—Life Sciences Education, 6(2): 132-139.
Faulconer, E. K., Griffith, J. C., Wood, B. L., Acharyya, S., & Roberts, D. L. (2018). A comparison of online and traditional chemistry lecture and lab. Chemistry Education Research and Practice, 19(1), 392-397.
Flowers, L. O., Raynor Jr, J. E., & White, E. N. (2013). Investigation of academic self-concept of undergraduates in STEM courses. Journal of Studies in Social Sciences, 5(1).
Francis, M. K., Wormington, S. V., & Hulleman, C. (2019). The costs of online learning: Examining differences in motivation and academic outcomes in online and face-to-face community college developmental mathematics courses. Frontiers in psychology, 10, 2054.
Hammerness, K., MacPherson, A., Gupta, P., Chaffee, P., Anderson, K., Lagodich, L., & Abouelkheir, M. (2022). Missed opportunities in online learning. Inside Higher Ed, March 2022. https://www.insidehighered.com/views/2022/03/14/stem-students-struggled-online-learning-opinion
Hernandez-Lara, A. B., & Serradell-Lopez, E. (2018). Student interactions in online discussion forums: Their perception on learning with business simulation games. Behaviour & Information Technology, 37(4), 419-429.
Hou, Y., Muheidat, F., Ghasemkhani, A., Sun, Q., Qiao, H., McIntyre, M., Van Wart M. (2021a). The Adaptation of Online Project-based Learning in Computer Engineering Education Settings, ICL2021. pp. 1166-1173.
Hou, Y., Muheidat, F., Usher, T., Prado, W., Guo, X., Van Wart, M., (2021b). Evaluation of the COVID-19 Shock on STEM Laboratory Courses. IEEE EDUCON 2021 (Best Paper Award).
Iglesias-Pradas, S., Hernández-García, Á., Chaparro-Peláez, J., & Prieto, J. L. (2021). Emergency remote teaching and students’ academic performance in higher education during the COVID-19 pandemic: A case study. Computers in Human Behavior, 119, 106713.
Kearns, L. R. (2016). The experience of teaching online and its impact on faculty innovation across delivery methods. Internet and Higher Education, 31, 71-78.
Kehrwald, B. (2008). Understanding social presence in text-based online learning environments. Distance Education, 29(1), 89-106.
Kefalis, C., & Drigas, A. (2019). Web based and online applications in STEM education. International Journal of Engineering Pedagogy, 9(4), 76-85.
Kordrostami and Seitz (in queue). Faculty online competence and student affective engagement in online learning. Marketing Education Review.
Kushnir, L. P., & Berry, K. C. (2014). Inside, outside, upside down: New directions in online teaching and learning. International Conference e-Learning (pp. 133-140). Lisbon, Portugal: International Association for Development of the Information Society.
Lin, P. C., Hou, H. T., Wu, S. Y., & Chang, K. E. (2014). Exploring college students' cognitive processing patterns during a collaborative problem-solving teaching activity integrating Facebook discussion and simulation tools. The Internet and Higher Education, 22, 51-56.
Mansbach, J., and Austin, A.E. 2018. Nuanced perspectives about online teaching: Mid-career senior faculty voices reflecting on academic work in the digital age. Innovative Higher Education, February 2018 (early view): 1-16
Marks, R.B., Sibley, S.D., Arbaugh, J.B. (2005). A structural equation model of predictors for effective online learning. Journal of Management Education, 29(4): 531-563.
Martin, F., Wang, C., and Sadaf, A. (2018). Student perception of facilitation strategies that enhance instructor presence, connectedness, engagement and learning in online courses. Internet and Higher Education, 37: 52-65.
Maycock, K.W. (2018), Chalk and talk versus flipped learning: A case study. Journal of Computer Assisted Learning, 35(1): 121-126.
McGivney-Burelle, J. (2013). Flipping Calculus. PRIMUS Problems, Resources, and Issues in Mathematics Undergraduate Studies, 23(5), 477-486.
Miltiadous, A., Callahan, D. L., & Schultz, M. (2020). Exploring engagement as a predictor of success in the transition to online learning in first year chemistry. Journal of Chemical Education, 97(9), 2494-2501.
Mnguni, L., & Mokiwa, H. (2020). The integration of online teaching and learning in STEM education as a response to the COVID-19 pandemic. Journal of Baltic Science Education, 19(6A).
Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. MERLOT Journal of Online Learning and Teaching, 11(2), 309-319.
Ni, A.N. (2013). Comparing the effectiveness of classroom and online learning: Teaching research methods. Journal of Public Affairs Education, 19(2): 199-215.
Ni, A., Van Wart, M., Medina P., Collins,K., Silvers, E., & Pei, H. (2020) A profile of MPA students’ perceptions of online learning: What MPA students value in online education and what they think would improve online learning experiences, Journal of Public Affairs Education, Published online: 22 October, 2020. DOI: 10.1080/15236803.2020.1820288
Otter, R. R., Seipel, S., Graef, T., Alexander, B., Boraiko, C., Gray, J., Petersen, K., and Sadler, K. (2013). Comparing student and faculty perceptions of online and traditional courses. Internet and Higher Education, 19, 27-35.
Pagoto, S., Lewis, K. A., Groshon, L., Palmer, L., Waring, M. E., Workman, D., ... & Brown, N. P. (2021). STEM undergraduates’ perspectives of instructor and university responses to the COVID-19 pandemic in Spring 2020. PloS one, 16(8), e0256213.
Perets, E. A., Chabeda, D., Gong, A. Z., Huang, X., Fung, T. S., Ng, K. Y., ... & Yan, E. C. (2020). Impact of the emergency transition to remote teaching on student engagement in a non-STEM undergraduate chemistry course in the time of COVID-19. Journal of Chemical Education, 97(9), 2439-2447.
Petillion, R. J., & McNeil, W. S. (2020). Student experiences of emergency remote teaching: Impacts of instructor practice on student learning, engagement, and well-being. Journal of Chemical Education, 97(9), 2486-2493.
Rennar-Potacco, D., & Orellana, A. (2018). Academically supporting STEM students from a distance through videoconferencing: Lessons learned. American Journal of Distance Education, 32(2), 131-149.
Rooij, S. W., & Zirkle, K. (2016). Balancing pedagogy, student readiness and accessibility: A case study in collaborative online course development. Internet and Higher Education, 28, 1-7.
Sansom, R. L. (2020). Pressure from the Pandemic: Pedagogical Dissatisfaction Reveals Faculty Beliefs. Journal of Chemical Education, 97(9), 2378-2382.
Sedaghatjou, M., Hughes, J., Liu, M., Ferrara, F., Howard, J., & Mammana, M. F. (2021). Teaching STEM online at the tertiary level during the COVID-19 pandemic. International Journal of Mathematical Education in Science and Technology, 1-17.
Sieber, J.E. (2005). Misconceptions and realities about teaching online. Science and Engineering Ethics, 11: 329-340.
Silva, J. B. D., Simão, J. P. S., Cristiano, M. A. S., Nicolete, P. C., Heck, C., & Coelho, K. S. (2013). A DC electric panel remote lab. International Journal of Online Engineering, 12(4), 30-32.
Somenarain, L., Akkaraju, S., & Gharbaran, R. (2010). Student perceptions and learning outcomes in asynchronous and synchronous online learning environments in a biology course. MERLOT Journal of Online Learning and Teaching, 6(2), 353-356.
Son, J. Y. (2016). Comparing physical, virtual, and hybrid flipped labs for general education biology. Online Learning, 20(3), 228-243.
Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education, 7(1), 59-70.
Tallent-Runnels, M.K., Thomas, J.A., Lan, W.Y., Cooper, S., Ahern, T., Shaw, S.M., and Liu, X. (2006). Teaching courses online: A review of the research. Review of Educational Research, 76(1): 93-135.
Tibi, M. H. (2018). Computer science students' attitudes towards the use of structured and unstructured discussion forums in fully online courses. Online Learning, 22(1), 93-106.
Traynor-Nilsen, P. (2017). Increasing student engagement in an online setting. Journal of Higher Education, 17, 54-60.
Van Wart, M., Ni, A., Hamilton, H. and Drudy, S. (2021, in queue). Professional Accrediting Body Roles in Online Instructional Program Quality: The Case of the NASPAA (special project focusing on accreditation issues and online quality) Quality in Higher Education.
Van Wart, M., Ni, A., Medina, P., Canelon, J., Liu, Y., Kordrostami, M., Zhang, J. (2020a). Integrating students’ perspectives about online learning: A hierarchy of factors. International Journal of Educational Technology in Higher Education. 17:53, 1-22. https://doi.org/10.1186/s41239-020-00229-8.
Van Wart, M., Ni, A., Ready, D., Shayo, C., and Court, J. (2020b). Factors leading to online learner satisfaction. Business Educational Innovation Journal, 12(1): 15-24.
Van Wart, M., Ni, A., Rose, L., McWeeney, T., and Worrell, R. A. (2019). Literature review and model of online teaching effectiveness integrating concerns for learning achievement, student satisfaction, faculty satisfaction, and institutional results. Pan-Pacific Journal of Business Research, 10(1), 1-22.
Viegas, C., Pavani, A., Lima, N., Marques, A., Pozzo, I., Dobboletta, E., ... & Alves, G. (2018). Impact of a remote lab on teaching practices and student learning. Computers & Education, 126, 201-216.
Wei, C., Chen, N., and Kinshuk. (2012). A model for social presence in online classrooms. Educational Technology Research and Development, 60(3), 529-545.
West, R. E., Sansom, R., Nielson, J., Wright, G., Turley, R. S., Jensen, J., & Johnson, M. (2021). Ideas for supporting student-centered stem learning through remote labs: a response. Educational Technology Research and Development, 69(1), 263-268.
Widyaningsih, S. W., Yusuf, I., Prasetyo, Z. K., & Istiyono, E. (2020). Online interactive multimedia oriented to HOTS through e-learning on physics material about electrical circuit. JPI (Jurnal Pendidikan Indonesia), 9(1), 1-14.
Wladis, C., Hachey, A. C., & Conway, K. (2014). An investigation of course-level factors as predictors of online STEM course outcomes. Computers & Education, 77, 145-150.
Young, S. (2006). Student views of effective online teaching in higher education. American Journal of Distance Education, 20(2), 65-77.
Zhang, J., Addae, H., Flaherty, P., Boyraz, M., Schriehans, C., Habich, M., Phillips, A., Johnson, A., & Bakeman, M. (2020). Management students' perceptions of online teaching quality. e-Journal of Business Education & Scholarship of Teaching, 14(2), 33-52.