ORIGINAL RESEARCH article

Front. Educ., 03 November 2022

Sec. Digital Education

Volume 7 - 2022 | https://doi.org/10.3389/feduc.2022.916721

Student performance in online learning higher education: A preliminary research

  • Department of Management, Binus Online Learning, Bina Nusantara University, Jakarta, Indonesia

Abstract

Student performance is a central focus of online learning because it can determine the success of students and help higher education institutions earn good ratings and public trust. This study comprehensively explores the factors that can affect the impact of student performance in online learning. An empirical model of the impact of student performance was developed from the literature review and previous research. The reliability and validity of the empirical model were evaluated through linguist reviews and tested statistically with construct reliability coefficients and confirmatory factor analysis (CFA). Overall, the results of this study show that the structural model with second-order measurements produces a good fit, while the structural model with first-order measurements shows a poor fit.

Introduction

Apart from the COVID-19 disaster, the Indonesian government has, since 2021, promoted a distance education system as the forerunner of e-learning. This system brings a new dimension to the learning process and new challenges in adopting and innovating learning. The success of a learning system depends heavily on a mix of conditions, including the learning environment, teaching methods, resources, and learning expectations.

Many developed countries have integrated e-learning systems into higher education, but Indonesia, as a developing country, has not yet adopted this technology effectively. Several previous studies have acknowledged the severe challenges that hinder the integration of quality e-learning in universities, particularly in developing countries (Al-Adwan et al., 2021; Basir et al., 2021). Therefore, both the benefits of e-learning as a mode of education that improves the teaching-learning process and the barriers and challenges to adopting e-learning technology must be considered.

The characteristics of students who take online education differ from those of students in traditional learning systems. Broad reach, a high level of flexibility, and easy access are the reasons online education is chosen by students who are just starting a career in companies and by professionals seeking to advance their careers. For this reason, universities are promoting the efficiency of online education to meet the needs of students (Paul and Pradhan, 2019), resulting in a big boom in the education technology (EdTech) segment (Semeshkina, 2021).

The use of technology in online learning can improve the quality of learning and help students complete assignments quickly and gain insight and skills. In this study, the impact of online learning performance means that online learning abilities can affect student performance in saving resources, productivity, competence, and knowledge (Aldholay et al., 2018). Universities need to pay attention to factors that can improve online learning performance by increasing student satisfaction and involvement.

Previous researchers have investigated the determinants of student performance in the context of online learning (Zimmerman and Nimon, 2017; Aldholay et al., 2018; Büchele, 2021). The scope of their studies is limited to student factors in learning and college infrastructure as education service providers. This study extends those results by examining input factors [course quality (Jaggars and Xu, 2016; Debattista, 2018), instructor (Daouk et al., 2016), student, institutional (Hadullo et al., 2018), and technology (Kissi et al., 2018; Almaiah and Alismaiel, 2019; Sheppard and Vibert, 2019; Ameri et al., 2020; Yadegaridehkordi et al., 2020)], output [overall quality (Aldholay et al., 2018; Hadullo et al., 2018; Almaiah and Alismaiel, 2019; Thongsri et al., 2019)], and outcome [engagement (Büchele, 2021), satisfaction and performance (Aldholay et al., 2018)] during the online learning process. Based on the experience and knowledge of the researchers, these factors can improve the performance of online learning students. This study aims to propose a conceptual framework for the impact of student performance in online learning (Figure 1). Specifically, this paper therefore serves as a preliminary study in the development of the proposed model.

FIGURE 1

Literature review

Online learning in higher education

Online learning has become an appropriate and attractive solution for students who pursue their education while undergoing busy activities (Seaman et al., 2018). Therefore, universities are constantly looking for ways to improve the quality of online courses to increase student satisfaction, enrollment, and retention (Legon and Garrett, 2017).

A unique feature of online learning is that students and lecturers are physically far apart and require a medium for delivering course material (Wilde and Hsu, 2019). The interaction of students and lecturers is mediated by technology, and the design of virtual learning environments significantly impacts learning outcomes (Bower, 2019; Gonzalez et al., 2020). Online learning has been studied for decades, and the effectiveness of online teaching results from instructional design and planning (Hodges et al., 2020). The COVID-19 pandemic forced students worldwide to shift from offline to online learning environments. Students and teachers have limited information-processing capacities, and a combination of learning modalities may result in cognitive overload that impairs the ability to learn new information effectively (Patricia Aguilera-Hermida, 2020). In addition, a lack of confidence in the new technology used for learning, or the absence of cognitive engagement and social connections, has a less than optimal impact on student learning outcomes (Bower, 2019).

Technology, if used effectively, can provide opportunities for students and teachers to collaborate (Bower, 2019; Gonzalez et al., 2020). The success of the transition from offline to online learning is strongly influenced by the intention to use and the usefulness of technology (Yakubu and Dasuki, 2018; Kemp et al., 2019), so the effectiveness of online learning is highly dependent on the level of student acceptance (Tarhini et al., 2015). Therefore, it is essential to analyze the factors related to online learning to achieve student learning outcomes.

Course quality

As online learning continues to mature and evolve in higher education, faculty and support staff (instructional designers, developers, and technologists) need guidance on how best to design and deliver practical online courses. Course quality standards are a valuable component of the instructional design process: they guide course writers, identify needed improvements in courses and programs, and create consistency in faculty expectations and the student experience (Scharf, 2017).

In general, quality is an essential factor in online learning for providing a helpful learning experience for students (Barczyk et al., 2017), and course quality supports university learning performance. Quality Matters™ (QM) is an international organization that fosters collaboration between institutions and creates a shared understanding of online course quality (Ralston-Berg, 2014). This study measures course quality through three dimensions: course design, course content support, and course assessment (Hadullo et al., 2018). These three dimensions are the determinants in assessing the quality of learning.

Instructor factor

The quality of the instructor in delivering the material is an input for achieving learning performance (Ikhsan et al., 2019). To facilitate an active learning process, instructors should use strategies that increase participation in learning. While the responsibility for learning lies with the learner, the instructor plays an essential role in enhancing learning and engagement in the online environment (Arghode et al., 2018). As essential actors in the classroom, instructors must have psychological similarities with students so that they can help academically struggling students by changing perceptions of external barriers and stereotypes (Sullivan et al., 2021).

Several studies have demonstrated the influence of instructor interactivity in the classroom on online teaching, including active learning (Muir et al., 2019), instructor presence (Roque-Hernández et al., 2021), discussion and assessment techniques (Chakraborty et al., 2021; McAvoy et al., 2022), and feedback (Kim and Kim, 2021). These areas are often studied in relation to the needs and value of instructor interaction. In this study, the importance of instructor interactivity in online learning is related to online discussion forum activities and instructor interaction.

Student factor

The characteristics of students who take online learning education differ from those who study conventionally (face to face). Several essential factors drive student success in online learning: understanding computers and the internet, personal desire, motivation from instructors, and reasonable access to online learning systems (Hadullo et al., 2018; Bashir et al., 2021; Glassman et al., 2021; Rahman et al., 2021). Social cognitive theory explains self-efficacy as the capacity for self-regulation (Bandura, 2010). According to this theory, people can develop self-efficacy by observing others model the achievement of goals and by having had various successful attempts at achieving challenging goals in the past (Duchatelet and Donche, 2019). People with high levels of self-efficacy tend to be confident in their ability to succeed in challenging tasks, drawing both on their own attempts and on observing others achieve goals.

Institutional factor

Institutional theory has been used to explore organizational behavior toward technology acceptance, as it explains how institutions adapt to institutional change (Rohde and Hielscher, 2021). Currently, most higher education institutions have migrated from traditional to online learning systems, thereby changing traditional learning environments such as the physical presence of teachers, classrooms, and exams (Bokolo et al., 2020). Developing technologies have improved education through online learning, teleconferencing, computer-assisted learning, web-based distance learning, and other technologies (Bailey et al., 2022; Fauzi, 2022). Online learning systems provide more flexibility and improve teaching and learning processes, offering more opportunities for reflection and feedback (Archambault et al., 2022). Online learning also offers interactive teaching, easy access, and cost-effectiveness (Sweta, 2021).

E-learning technology

E-learning technology in this study is defined as the learning media used by universities in conducting online learning. The task-technology fit (TTF) model has been used to assess how technology generates performance, evaluate the effect of use, and assess the fit between task requirements and technological competence (Wu and Chen, 2017). The TTF model suggests that users accept a technology because it is appropriate to the task and improves learning performance (Kissi et al., 2018). Technology acceptance is determined by the individual's understanding of and attitude toward technology, but the compatibility between task and technology must also be considered (Zhou et al., 2010). When a student decides to use a technology such as an LMS, that decision very likely reflects a match between the task and the technology.

Overall quality

Developments and challenges in information systems inspire researchers and practitioners to improve the quality and functionality of a new system to take advantage of its growth potential (Aldholay et al., 2018). Overall quality is understood as a new construct that includes system quality, information quality, and service quality (Ho et al., 2010; Isaac et al., 2017d). System quality is defined as the extent to which users believe that the system is easy to use, easy to learn, easy to connect, and fun to use (Jiménez-Bucarey et al., 2021). Information quality is understood as the extent to which system users think that online learning information is up-to-date, accurate, relevant, comprehensive, and organized (Raija et al., 2010). Service quality is referred to through various attributes, such as tangible, reliability, responsiveness, assurance, functionality, interactivity, and empathy (Preaux et al., 2022).

Student engagement

Student engagement in online learning occurs when students use online learning platforms to learn, and it includes behavioral, cognitive, and emotional engagement (Hu et al., 2016). Student engagement in online learning is due not only to the behavioral performance of reading teaching materials, asking questions, participating in interactive activities, and completing homework but, more importantly, to cognitive performance (Lee et al., 2015). In this study, cognitive behavior covers all mental activities that enable students to relate, assess, and consider an event to gain knowledge afterward; it is closely related to a person's intelligence and skill level, for example, when someone is studying, building an idea, or solving a problem.

Student behavioral engagement is essential in online learning but is difficult to define clearly, and it does not fully reflect student effort, so students' perception, regulation, and emotional support in the learning process must also be considered (ChanMin et al., 2015). Students must fully engage in online learning, including the quantity and quality of engagement, communication with others, conscious learning, guidance and assistance from others, and self-management and self-control.

Student satisfaction

Perceived satisfaction is not limited to marketing concepts but can also be used in the context of online learning (Caruana et al., 2016). User satisfaction is one of the leading indicators when assessing success in adopting a new system (Montesdioca and Maçada, 2015; DeLone and McLean, 2016). User satisfaction also refers to perceiving a system as applicable and wanting to reuse it. In the context of online learning, student satisfaction is defined as the extent to which students who use online learning are satisfied with their decision to use it and how well it meets their expectations (Roca et al., 2006). Students who are satisfied while studying with an online learning system will strive to achieve good academic scores.

Student performance impact

In the context of education, performance is the result of the efforts of students and lecturers in the learning process and students’ interest in learning (Mensink and King, 2020). The essence of education is student academic achievement; therefore, student achievement is considered the success of the entire education system. Student academic achievement determines the success and failure of academic institutions (Narad and Abdullah, 2016).

It is crucial to explore problems with online learning systems in higher education to improve the student experience in learning. Therefore, the university’s ability to design effective online learning will impact university performance and student performance. The failure of online learning design and technology can frustrate students and lead to negative perceptions of students (Gopal et al., 2021).

With rapidly changing technology and the introduction of many new systems, researchers focus on the results of using systems in terms of performance improvement to evaluate and measure system success (Montesdioca and Maçada, 2015; Isaac et al., 2017a,b,c,d). Performance impact is defined as the extent to which the use of the system improves the quality of work by helping to complete tasks quickly, enabling control over work, improving job performance, eliminating errors, and increasing work effectiveness (Isaac et al., 2017d; Aldholay et al., 2018). In this study, performance impact is interpreted as an outcome of the use of technology in online learning.

Materials and methods

Participants

As a preliminary study, this research involved 206 students at a private university in Jakarta, Indonesia, that implements an online learning system. At the university, only five study programs fully implement the online learning system; therefore, we took all five study programs as the unit of analysis. Participants came from the Management, Accounting, Information Systems, Computer Science, and Industrial Engineering Departments. Stratified sampling was used, and each study program contributed about 41–42 responses. Researchers sent questionnaires to each head of department to distribute to students online. All questionnaires were returned within 1 week. Each participant received a souvenir for taking 15–20 min to answer all the questions in the questionnaire.

A total of 206 responses were collected, from 96 female students and 110 male students. 108 students access the LMS between 1 and 3 h a day, 75 students access the LMS less than 1 h per day, and 23 students access the LMS more than 3 h per day. Most students are private employees (n = 127), 10 are entrepreneurs, and the rest (n = 69) are full-time students (Table 1).

TABLE 1

Demographic               Frequency   Percent
Gender
  Male                    96          46.6%
  Female                  110         53.4%
Students access the LMS
  Less than 1 h per day   75          36.4%
  1–3 h per day           108         52.4%
  More than 3 h per day   23          11.2%
Job
  Private employee        127         61.7%
  Entrepreneur            10          4.9%
  Student                 69          33.5%

Demographics of participants (n = 206).

Questionnaire development

The questionnaire used in this study draws on several prior studies in the same research context, i.e., online learning; details can be seen in Table 2. The first draft of the questionnaire, containing 80 statement items, was tested virtually on ten students and one Indonesian language expert to ensure that each statement was easy to understand. All items were answered on a 5-point Likert scale, from (1) strongly disagree to (5) strongly agree.

TABLE 2

Construct / Dimension (Source) — Code: Item

Course quality (Hadullo et al., 2018)
Course design
CD1: Our course is provided with information about the duration, list of books, and availability of the instructor
CD2: Our course has an attractive and consistent layout that improves quality
CD3: Our course has relevant, accurate, complete content aligned to objectives
CD4: Our course has well-sequenced content neatly arranged in headings and subheadings
Course content support
CCS1: The online learning system uses attractive multimedia features
CCS2: The online learning system uses discussion and chat forums
CCS3: The online learning system uses video and animation
Course assessment
CA1: The online learning system applies online quizzes and personal assessment tests
CA2: The online learning system uses a clear course assessment method
CA3: The online learning system promises timely feedback from lecturers

Student factor
Student characteristics (Hadullo et al., 2018)
SC1: I have computer and internet experience
SC2: I am self-motivated to use e-learning
SC3: Our instructors motivate us in e-learning
SC4: We have learner-to-learner interactions in our courses
Self-efficacy (Aldholay et al., 2018)
SE1: I feel confident finding information by using a search engine
SE2: I feel confident sending and receiving e-mail messages in the online learning
SE3: I feel confident downloading and uploading files in the online learning

E-learning tech
Task-technology fit (Kissi et al., 2018)
TTF1: I think that using the LMS would be well suited for the way I like to learn tasks
TTF2: The LMS would be a good standard to support the way to learn
TTF3: I think that using the LMS would be a good way to learn tasks
LMS usage (Almaiah and Alismaiel, 2019; Ameri et al., 2020)
LMS1: I regularly use the LMS
LMS2: I have a pleasant experience when using the LMS
LMS3: I use the LMS currently
LMS4: I spend a lot of time using the LMS

Overall quality (Aldholay et al., 2018)
System quality
SYQ1: I find the online learning easy to use
SYQ2: I find the online learning flexible to interact with
SYQ3: My interaction with the online learning is clear and understandable
Information quality
IQ1: Online learning provides up-to-date knowledge
IQ2: Online learning provides accurate knowledge
IQ3: Online learning provides relevant knowledge
IQ4: Online learning provides comprehensive knowledge
IQ5: Online learning provides organized knowledge
Service quality
SQ1: I can use the online learning services anytime, anywhere I want
SQ2: Online learning offers multimedia (audio, video, and text) types of course content
SQ3: Online learning enables interactive communication
SQ4: Registration is convenient and a call center is available

Student engagement (Büchele, 2021)
Skill
SK1: I put forth effort in the tutorials
SK2: I take good notes in the tutorials
SK3: I look over the tutorial notes on a regular basis to make sure I understand the material
SK4: I listen carefully in the tutorials
SK5: I make sure to prepare for the tutorials on a regular basis
SK6: I am well organized in the tutorials
SK7: I do all the exercises
Participation
PAR1: I raise my hand often in the video conference
PAR2: I participate actively in small-group discussions
PAR3: I help my fellow students if necessary
PAR4: I have fun in the video conference
PAR5: I ask questions if I don't understand something

Institutional factors (Hadullo et al., 2018)
IF1: Colleges have internet infrastructure with high internet speed
IF2: Colleges have good online learning policies
IF3: Colleges have an institutional culture that supports online learning
IF4: Colleges are serious about building a good online learning system

Instructor characteristics (Daouk et al., 2016)
IC1: Our instructors are very enthusiastic in teaching
IC2: Our instructors use non-lecture learning activities such as small-group discussions and student-led activities
IC3: Our instructor invites class discussion
IC4: Our instructors combine simulations and real-world cases
IC5: Our instructors use teaching tools and materials, such as videos
IC6: Our instructors are aware of the learning needs of individual students
IC7: Our instructor explains the concept clearly
IC8: Our instructors connect concepts to student experiences
IC9: Our instructors prefer active, collaborative, and cooperative learning over passive learning
IC10: Our instructors actively encourage students to ask questions
IC11: Our instructors often ask questions to monitor student understanding
IC12: When necessary, our instructors ask probing questions
IC13: Our instructors keep students' attention
IC14: Our instructors assess students through observing their oral performance, such as discussions, presentations, and group work
IC15: Our instructors use humor appropriately to reinforce retention and interest

Satisfaction (Aldholay et al., 2018)
SAT1: My decision to use the online learning was a wise one
SAT2: The online learning has met my expectations
SAT3: Overall, I am satisfied with the online learning

Performance impact (Aldholay et al., 2018)
PI1: Online learning helps me to accomplish my tasks more quickly
PI2: Online learning makes it easier to complete my tasks
PI3: Online learning saves me money
PI4: Online learning improves my learning performance
PI5: Online learning enhances my academic effectiveness
PI6: Online learning helps me review and eliminate errors in my work tasks
PI7: Online learning helps me to realize my future targets
PI8: Online learning helps me acquire new knowledge
PI9: Online learning helps me acquire new skills
PI10: Online learning helps me to come up with innovative ideas

Measurement of constructs.

Validation process

The questionnaire item validation procedure started with a pre-test and a linguist review. The empirical data collected were then analyzed with Confirmatory Factor Analysis (CFA) and Construct Reliability (CR) as conditions for construct validity and internal consistency.

CFA is a statistical tool for confirming the structure of a construct from a set of manifest variables, that is, for testing whether a latent variable is measured by the indicators hypothesized to build it. Confirmatory analysis is therefore suitable for testing a theory about a variable and the manifest indicators that build it; the variables are assumed to be measured only by these indicators (Hair et al., 2019). The CFA results show whether the multiple items in the questionnaire measure the constructs as hypothesized by the underlying theoretical framework, and CFA produces empirical evidence of the validity of scores for the instrument based on the established theoretical framework (George and Mallery, 2019). Construct reliability (CR) measures the internal consistency of the indicators of a variable, showing the degree to which they form the variable. The construct reliability test is accepted if the value is > 0.70 (Hair et al., 2019).
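As an illustration of the CR criterion, composite reliability can be computed directly from standardized loadings. The sketch below is illustrative only (not part of the original analysis pipeline) and uses the three Satisfaction loadings reported in Table 4.

```python
def construct_reliability(loadings):
    """Composite (construct) reliability from standardized loadings:
    CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where each indicator's error variance is 1 - loading^2."""
    total = sum(loadings)
    error = sum(1 - l ** 2 for l in loadings)
    return total ** 2 / (total ** 2 + error)

# Standardized loadings for SAT1-SAT3 as reported in Table 4
cr_sat = construct_reliability([0.831, 0.900, 0.891])
print(round(cr_sat, 3))  # 0.907, matching the CR reported for Satisfaction
```

A CR above the 0.70 threshold, as here, indicates acceptable internal consistency.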

In the Structural Equation Modeling (SEM) framework, both covariance- and variance-based, a questionnaire item is valid if the loading factor value is at least 0.5 for covariance-based analysis (Hair et al., 2019) and 0.7 for variance-based analysis (Hair Joseph et al., 2019). In addition, the average variance extracted (AVE) value should be more than 0.5 (Hair et al., 2019). In CFA, several goodness-of-fit indices, such as the Chi-square (X2), Normed Chi-Square (NCS) (X2/df), Root Mean Square Error of Approximation (RMSEA), and Comparative Fit Index (CFI), were calculated to assess the fit of the model framework under research (Kline, 2015).
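The AVE criterion above can likewise be checked from standardized loadings; this sketch is illustrative, again using the Satisfaction loadings from Table 4.

```python
def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings; values above 0.5
    mean the construct explains most of its indicators' variance."""
    return sum(l ** 2 for l in loadings) / len(loadings)

# Standardized loadings for SAT1-SAT3 as reported in Table 4
ave_sat = average_variance_extracted([0.831, 0.900, 0.891])
print(round(ave_sat, 3))  # 0.765, matching the AVE reported for Satisfaction
```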

Chi-square, NCS, and RMSEA statistics, as absolute fit indices, can be used to indicate the quality of the theoretical model being tested (Kline, 2015; Hair et al., 2019). The X2-test shows the difference between the observed and expected covariance matrices; the smaller the X2-value, the better the model fit (Gatignon, 2010). The X2-test should be insignificant for models with an acceptable fit. However, the statistical significance of the X2-test is very sensitive to sample size (Kline, 2015; Hair et al., 2019), so the NCS should also be considered. The NCS equals chi-square divided by degrees of freedom (X2/df); a smaller NCS value indicates a better model fit, and an NCS value equal to or less than 5 supports a good model fit (West et al., 2012; Hair et al., 2019). Another fit index is the RMSEA, which quantifies the difference between the population covariance matrix and the theoretical model; an RMSEA value smaller than 0.08 indicates a better model and marks the limit of acceptable model fit (Gatignon, 2010; West et al., 2012; Hair et al., 2019). The CFI was also used to assess model fit in this study; a CFI value greater than 0.90 indicates an acceptable model fit (Kline, 2015; Hair et al., 2019). The alpha (α) level in this study was set at 0.05 for the goodness-of-fit chi-square test (X2).

Results

Descriptive analyses

Based on the descriptive analysis (Tables 3, 4), the mean values of the overall quality items range from 3.893 (SYQ3) to 4.417 (SQ1), showing that students gave positive responses to all overall quality items. The mean values of the student engagement items range from 3.694 (PAR1) to 4.432 (SK7), which means that students gave positive responses to all student engagement items. In the course quality construct, the mean values range from 3.966 (CA3) to 4.306 (CD4); that is, all students responded positively to the course quality items. The mean values of the student factor range from 3.981 (SC3) to 4.388 (SE2), and for e-learning technology from 3.743 (AU4) to 4.218 (AU3); students gave positive responses to both. Finally, on the constructs of institutional factors, instructor characteristics, satisfaction, and performance impact, students on average responded positively to all statement items, with mean values in the range of 3.932–4.291. It can be concluded that students gave positive responses to all the constructs measured in this study.

TABLE 3

Variable            Dimension                SLF 2nd   Item   Mean    SLF 1st   AVE     CR
Overall quality     System quality           0.899     SYQ1   4.282   0.771     0.608   0.822
                                                       SYQ2   4.180   0.704
                                                       SYQ3   3.893   0.857
                    Information quality      0.994     IQ1    4.063   0.750     0.672   0.911
                                                       IQ2    4.024   0.825
                                                       IQ3    4.121   0.869
                                                       IQ4    4.029   0.832
                                                       IQ5    4.078   0.818
                    Service quality          0.693     SQ1    4.417   0.587     0.652   0.875
                                                       SQ2    4.345   0.555
                                                       SQ3    3.927   0.986
                                                       SQ4    3.908   0.991
Student engagement  Skill                    0.936     SK1    4.286   0.634     0.551   0.895
                                                       SK2    3.806   0.767
                                                       SK3    3.927   0.796
                                                       SK4    4.150   0.727
                                                       SK5    4.063   0.795
                                                       SK6    4.029   0.810
                                                       SK7    4.432   0.647
                    Participation            0.889     PAR1   3.694   0.751     0.614   0.888
                                                       PAR2   4.233   0.841
                                                       PAR3   4.117   0.781
                                                       PAR4   3.767   0.770
                                                       PAR5   4.228   0.771
Course quality      Course design            0.966     CD1    3.995   0.711     0.598   0.855
                                                       CD2    4.083   0.847
                                                       CD3    4.034   0.796
                                                       CD4    4.306   0.731
                    Course content support   0.882     CCS1   4.282   0.813     0.658   0.852
                                                       CCS2   4.383   0.854
                                                       CCS3   4.282   0.764
                    Course assessment        0.904     CA1    4.257   0.839     0.626   0.833
                                                       CA2    4.160   0.809
                                                       CA3    3.966   0.720
Student factor      Student characteristics  0.716     SC1    4.209   0.863     0.623   0.868
                                                       SC2    4.097   0.801
                                                       SC3    3.981   0.704
                                                       SC4    4.063   0.781
                    Self-efficacy            0.843     SE1    4.160   0.761     0.536   0.775
                                                       SE2    4.388   0.799
                                                       SE3    4.097   0.626
E-learning tech     Task-technology fit      0.868     TTF1   4.044   0.923     0.807   0.926
                                                       TTF2   4.078   0.914
                                                       TTF3   4.019   0.857
                    LMS usage                0.924     AU1    3.966   0.803     0.622   0.868
                                                       AU2    4.024   0.841
                                                       AU3    4.218   0.777
                                                       AU4    3.743   0.730

Confirmatory factor analysis and internal consistency—second order.

TABLE 4

Construct                   Item    Mean    SLF 1st   AVE     CR
Institutional factors       IF1     4.199   0.768     0.589   0.851
                            IF2     4.228   0.767
                            IF3     4.286   0.734
                            IF4     4.112   0.799
Instructor characteristics  IC1     3.932   0.734     0.561   0.950
                            IC2     4.058   0.724
                            IC3     4.248   0.757
                            IC4     4.233   0.775
                            IC5     4.092   0.783
                            IC6     4.112   0.830
                            IC7     4.155   0.799
                            IC8     4.233   0.842
                            IC9     4.209   0.728
                            IC10    4.087   0.737
                            IC11    3.951   0.636
                            IC12    3.966   0.741
                            IC13    4.058   0.777
                            IC14    3.961   0.728
                            IC15    4.068   0.609
Satisfaction                SAT1    4.223   0.831     0.765   0.907
                            SAT2    4.073   0.900
                            SAT3    4.184   0.891
Performance impact          PI1     3.995   0.775     0.642   0.946
                            PI2     4.078   0.772
                            PI3     4.141   0.547
                            PI4     4.005   0.863
                            PI5     4.053   0.874
                            PI6     3.951   0.818
                            PI7     4.180   0.881
                            PI8     4.291   0.840
                            PI9     4.214   0.788
                            PI10    4.117   0.800

Confirmatory factor analysis and internal consistency—first order.

Normality test

Hair et al. (2019) stressed the importance of testing data normality in multivariate analysis: if the data are not normally distributed, the validity and reliability of the results can be affected. In this study, we used the one-sample Kolmogorov-Smirnov test with the Monte Carlo method (Metropolis and Ulam, 1949). The resulting Monte Carlo significance value is 0.120 > 0.05, so the data used in testing validity and reliability are normally distributed.
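A rough equivalent of this Monte Carlo KS test can be sketched with SciPy by simulating the null distribution of the KS statistic when the normal parameters are estimated from the data (a Lilliefors-style correction). The data below are synthetic placeholders, since the study's raw scores are not reproduced here; the function and its defaults are illustrative assumptions, not the study's actual SPSS procedure.

```python
import numpy as np
from scipy import stats

def ks_normality_monte_carlo(x, n_sim=2000, seed=0):
    """One-sample KS test against a fitted normal, with a Monte Carlo
    p-value that accounts for estimating the mean and std from the data."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    d_obs = stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1))).statistic
    n = len(x)
    d_sim = np.empty(n_sim)
    for i in range(n_sim):
        s = rng.standard_normal(n)  # draw a null sample of the same size
        d_sim[i] = stats.kstest(s, "norm", args=(s.mean(), s.std(ddof=1))).statistic
    # p-value: fraction of simulated statistics at least as extreme as observed
    return d_obs, float((d_sim >= d_obs).mean())

# Synthetic, normally distributed placeholder scores for n = 206
rng = np.random.default_rng(1)
d, p = ks_normality_monte_carlo(rng.normal(4.1, 0.5, 206))
print(d, p)
```

A Monte Carlo p-value above 0.05, as the study reports (0.120), would not reject normality.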

The result of internal consistency and confirmatory factor analysis

Tables 3, 4 present the overall results of the validity and reliability tests, analyzed with CFA, AVE, and CR. A construct can be unidimensional or multidimensional, which affects how its validity and reliability are tested: a unidimensional construct is tested with first-order CFA, while a multidimensional construct is tested with second-order CFA. In this study, the constructs of course quality, student factor, e-learning tech, overall quality, and student engagement are multidimensional, so they were measured with a second-order procedure, while the constructs of institutional factors, instructor characteristics, satisfaction, and performance impact are unidimensional and were measured with a first-order procedure.

Reliability testing for all constructs in the theoretical model, both second order and first order, resulted in CR values above 0.7, meaning that every dimension and indicator of each measured construct reflects its primary construct well; in other words, the questionnaire has a high level of internal consistency. Likewise, for validity testing, all indicators and dimensions of the primary constructs produced standardized loading factor and AVE values above 0.5, meaning that each dimension and indicator reflects its primary construct. In conclusion, the questionnaire used in this study has a high level of validity and reliability. Examination of the correlations between the various factors shows that the factors are highly correlated, and the standardized loading factor (SLF) coefficients between the tested factors and items show that no loading falls below the acceptable limit.

Discriminant validity

Discriminant validity means that two conceptually distinct constructs are also empirically distinguishable; a combined set of indicators drawn from different constructs is not expected to be unidimensional. The discriminant validity test in this study used the Fornell-Larcker criterion, which postulates that a latent variable shares more variance with its own indicators than with any other latent variable. Interpreted statistically, the AVE of each latent variable must be greater than its highest squared correlation (r²) with any other latent variable; equivalently, the square root of the AVE must exceed every inter-construct correlation (Henseler et al., 2015). Table 5 shows that the square root of the AVE for each variable is greater than its correlations with the other variables, so discriminant validity is satisfied.

TABLE 5

     SyQ    IQ     SQ     SK     Par    CD     CCS    CA     SCI    SE     TTF    AU     IF     IC     SAT    PI
SyQ  0.780
IQ   0.733  0.820
SQ   0.639  0.663  0.807
SK   0.591  0.673  0.636  0.742
Par  0.558  0.616  0.602  0.740  0.784
CD   0.718  0.807  0.685  0.690  0.629  0.773
CCS  0.593  0.692  0.653  0.633  0.577  0.712  0.811
CA   0.635  0.697  0.606  0.646  0.584  0.727  0.716  0.791
SCI  0.498  0.568  0.546  0.524  0.550  0.572  0.596  0.500  0.789
SE   0.547  0.589  0.571  0.674  0.601  0.620  0.620  0.540  0.496  0.732
TTF  0.611  0.665  0.582  0.611  0.585  0.649  0.540  0.536  0.622  0.561  0.898
AU   0.596  0.623  0.582  0.695  0.648  0.640  0.523  0.545  0.516  0.630  0.696  0.789
IF   0.619  0.674  0.624  0.654  0.648  0.732  0.656  0.631  0.629  0.707  0.656  0.643  0.923
IC   0.606  0.690  0.619  0.645  0.644  0.719  0.644  0.635  0.603  0.654  0.663  0.651  0.839  0.975
SAT  0.667  0.670  0.613  0.675  0.622  0.658  0.576  0.641  0.543  0.618  0.762  0.672  0.738  0.703  0.952
PI   0.667  0.751  0.621  0.710  0.639  0.695  0.633  0.611  0.614  0.694  0.780  0.715  0.765  0.777  0.814  0.973

Discriminant validity (Fornell-Larcker).

Diagonal values (bold in the original) are the square roots of the AVE; off-diagonal values are inter-construct correlations.
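The Fornell-Larcker check in Table 5 can be automated: compare each construct's square root of AVE against every correlation it has with other constructs. A minimal sketch with a hypothetical three-construct example (not the study's full 16-construct matrix):

```python
import numpy as np

def fornell_larcker_ok(corr, ave):
    """Return True if sqrt(AVE) of every construct exceeds all of its
    correlations with the other constructs (Fornell-Larcker criterion)."""
    root_ave = np.sqrt(np.asarray(ave))
    corr = np.asarray(corr)
    n = len(ave)
    for i in range(n):
        for j in range(n):
            if i != j and root_ave[i] <= corr[i, j]:
                return False
    return True

# Hypothetical correlation matrix and AVE values for three constructs.
corr = np.array([[1.00, 0.55, 0.48],
                 [0.55, 1.00, 0.60],
                 [0.48, 0.60, 1.00]])
ave = [0.61, 0.67, 0.64]  # sqrt(AVE): 0.781, 0.819, 0.800

print(fornell_larcker_ok(corr, ave))  # every sqrt(AVE) beats every correlation
```

If any off-diagonal correlation exceeded a construct's square root of AVE, discriminant validity would fail for that pair.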

This study also measures the goodness of fit of the theoretical model using the chi-square, NCS, RMSEA, and CFI statistics. Since chi-square is too sensitive to sample size (Hair et al., 2019), the ratio of chi-square to degrees of freedom (χ²/df, the normed chi-square or NCS) was applied; an NCS value below 3 indicates acceptable model fit (Hair et al., 2019). The results are presented in Table 6 and Figure 2. The first model, in which overall quality, course quality, student factor, student engagement, and e-learning technology are measured as second-order constructs, produced a significant chi-square (χ² = 2288.52, p = 0.000 < 0.05), which by itself would indicate a poor model. However, the NCS value indicates model fit (NCS = 2.167 < 3), and the RMSEA value (0.075 < 0.08) indicates that the theoretical model fits the population covariance matrix. Because the RMSEA and NCS values meet the goodness-of-fit requirements, they support a reasonable fit between the theoretical model and the data. The CFI is also used to judge whether the model fits the data; it compares the model under test with an independence model in which all latent variables are uncorrelated. Here the CFI of 0.978 is greater than 0.90, so it can be concluded that the model fits.

TABLE 6

                         Chi-square  NCS (χ²/df)  RMSEA  CFI    df     p
Model 1 (second order)   2288.52     2.167        0.075  0.978  1,058  0.000
Model 2 (first order)    1375.33     2.784        0.099  0.969  458    0.000

Goodness of fit model.
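The fit decisions drawn from Table 6 follow mechanically from the cutoffs the paper cites. A small sketch applying those cutoffs to the reported values:

```python
def assess_fit(ncs, rmsea, cfi):
    """Apply the cutoffs used in the paper:
    NCS (chi-square/df) < 3, RMSEA < 0.08, CFI > 0.90."""
    return {"NCS": ncs < 3, "RMSEA": rmsea < 0.08, "CFI": cfi > 0.90}

# Model 1 (second order), values from Table 6: all three criteria are met.
model1 = assess_fit(ncs=2.167, rmsea=0.075, cfi=0.978)
# Model 2 (first order): RMSEA = 0.099 exceeds the 0.08 cutoff.
model2 = assess_fit(ncs=2.784, rmsea=0.099, cfi=0.969)
print(model1)
print(model2)
```

This reproduces the paper's conclusion: the second-order model passes every criterion, while the first-order model fails only on RMSEA (though some authors accept RMSEA < 0.10 as a poor but tolerable fit).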

FIGURE 2

The second model, measured as first order, comprises institutional factors, instructor characteristics, satisfaction, and performance impact (Figure 3). This first-order model shows a poor fit to the data (χ² = 1375.33; χ²/df = 2.784; CFI = 0.969; RMSEA = 0.099). Because the RMSEA of 0.099 exceeds 0.08, the measurement model does not meet that fit criterion, although other researchers consider RMSEA < 0.10 a still acceptable, if poor, fit (Singh et al., 2020). In addition, the NCS of 2.784 < 3 and the CFI of 0.969 > 0.90 meet the goodness-of-fit requirements.

FIGURE 3

Discussion and implication

This study aims to propose a conceptual framework for measuring the impact of student performance in online learning on a sample of students from various study programs. Because the learning models in social and technical study programs differ, measuring the two groups of samples separately is necessary. The study offers an instrument for online learning that measures student perceptions of course quality, student engagement, e-learning technology, overall quality, student factors, institutional factors, instructor characteristics, and satisfaction as they impact student performance. The results show that measuring course quality, student engagement, e-learning technology, overall quality, and student factors as second-order constructs produces good validity and reliability values supported by model fit. The measurements of instructor characteristics, institutional factors, satisfaction, and performance impact also produced good validity and reliability values, but the model fit was not satisfactory. In particular, the LISREL program suggested refining the model by correlating the error covariances of the instructor characteristics and performance impact factors, because the RMSEA value did not fit. However, because this study reports initial findings, no such correlated-error treatment was carried out (Hulland et al., 2018; Hair et al., 2019).

Generally, this study shows that most students respond positively to all constructs. Students who attend lectures through online learning view all exogenous constructs as essential to improving their performance. Our results align with previous literature, which explains that the performance of students in online learning programs is still less than optimal (Kim et al., 2021) and that some students are unwilling to continue their studies (Xavier and Meneses, 2021). Students are unfamiliar with online learning systems and are used to traditional pedagogical styles (Maheshwari, 2021). In addition, internet access in Indonesia is still low compared with developed countries because the infrastructure is not yet well developed.

Conclusion, limitation, and future research

This study concludes that the model for measuring student learning performance in universities that implement online learning systems is acceptable. The results contribute to universities and educators improving student learning outcomes: the model can guide them in achieving national education goals in practice or help them improve the current system. From the educator's view, it helps in planning teaching materials that are effective and follow students' needs. For universities, it provides input for improving the online learning system currently in place so that it can produce graduates who can compete in the world of work.

This study has several limitations, such as the limited sample size, which affects the model's fit. In addition, it does not distinguish the validity and reliability results between social and engineering study programs. Further research could therefore use a larger sample to provide more comprehensive results and distinguish the validity and reliability of responses from students in social and engineering programs, since their courses differ. In addition, to test hypotheses on the proposed conceptual model, students' performance levels in social and engineering programs could be compared using the multigroup analysis method.

Statements

Data availability statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Author contributions

HP, RI, and YY contributed to the conception and design of the study, wrote the first draft of the manuscript, conceptual framework, and literature review. RI organized the database and performed the statistical analysis. All authors contributed to the manuscript revision, read, and approved the submitted version.

Funding

This work was supported by “Model Evaluasi Pembelajaran Daring Dalam Menilai Kualitas Sistem Pendidikan Tinggi Di Indonesia” (contract number: 3481/LL3/KR/2021 and contract date: 12 July 2021).

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

1. Al-Adwan, A. S., Albelbisi, N. A., Hujran, O., Al-Rahmi, W. M., and Alkhalifah, A. (2021). Developing a holistic success model for sustainable e-learning: A structural equation modeling approach. Sustainability 13:9453. doi: 10.3390/su13169453

2. Aldholay, A., Isaac, O., Abdullah, Z., Abdulsalam, R., and Al-Shibami, A. H. (2018). An extension of Delone and McLean IS success model with self-efficacy. Int. J. Inform. Learn. Technol. 35, 285–304. doi: 10.1108/IJILT-11-2017-0116

3. Almaiah, M. A., and Alismaiel, O. A. (2019). Examination of factors influencing the use of mobile learning system: An empirical study. Educ. Inform. Technol. 24, 885–909. doi: 10.1007/s10639-018-9810-7

4. Ameri, A., Khajouei, R., Ameri, A., and Jahani, Y. (2020). Acceptance of a mobile-based educational application (LabSafety) by pharmacy students: An application of the UTAUT2 model. Educ. Inform. Technol. 25, 419–435. doi: 10.1007/s10639-019-09965-5

5. Archambault, L., Leary, H., and Rice, K. (2022). Pillars of online pedagogy: A framework for teaching in online learning environments. Educ. Psychol. 57, 178–191. doi: 10.1080/00461520.2022.2051513

6. Arghode, V., Brieger, E., and Wang, J. (2018). Engaging instructional design and instructor role in online learning environment. Eur. J. Train. Dev. 42, 366–380. doi: 10.1108/EJTD-12-2017-0110

7. Bailey, D. R., Almusharraf, N., and Almusharraf, A. (2022). Video conferencing in the e-learning context: explaining learning outcome with the technology acceptance model. Educ. Inform. Technol. 27, 7679–7698. doi: 10.1007/s10639-022-10949-1

8. Bandura, A. (2010). "Self-efficacy," in The Corsini Encyclopedia of Psychology, eds I. B. Weiner and W. E. Craighead (Hoboken, NJ: John Wiley & Sons, Inc), 1–3.

9. Barczyk, C. C., Hixon, E., Buckenmeyer, J., and Ralston-Berg, P. (2017). The effect of age and employment on students' perceptions of online course quality. Am. J. Distance Educ. 31, 173–184. doi: 10.1080/08923647.2017.1316151

10. Bashir, A., Bashir, S., Rana, K., Lambert, P., and Vernallis, A. (2021). Post-COVID-19 adaptations; the shifts towards online learning, hybrid course delivery and the implications for biosciences courses in the higher education setting. Front. Educ. 6:711619. doi: 10.3389/feduc.2021.711619

11. Basir, M., Ali, S., and Gulliver, S. R. (2021). Validating learner-based e-learning barriers: developing an instrument to aid e-learning implementation management and leadership. Int. J. Educ. Manag. 35, 1277–1296. doi: 10.1108/IJEM-12-2020-0563

12. Bokolo, A., Kamaludin, A., Romli, A., Mat Raffei, A. F., Eh Phon, D. N., Abdullah, A., et al. (2020). A managerial perspective on institutions' administration readiness to diffuse blended learning in higher education: Concept and evidence. J. Res. Technol. Educ. 52, 37–64. doi: 10.1080/15391523.2019.1675203

13. Bower, M. (2019). Technology-mediated learning theory. Br. J. Educ. Technol. 50, 1035–1048. doi: 10.1111/bjet.12771

14. Büchele, S. (2021). Evaluating the link between attendance and performance in higher education: the role of classroom engagement dimensions. Assess. Eval. High. Educ. 46, 132–150. doi: 10.1080/02602938.2020.1754330

15. Caruana, A., La Rocca, A., and Snehota, I. (2016). Learner satisfaction in marketing simulation games: Antecedents and influencers. J. Market. Educ. 38, 107–118. doi: 10.1177/0273475316652442

16. Chakraborty, P., Mittal, P., Gupta, M. S., Yadav, S., and Arora, A. (2021). Opinion of students on online education during the COVID-19 pandemic. Hum. Behav. Emerg. Technol. 3, 357–365. doi: 10.1002/hbe2.240

17. Chan Min, K., Seung Won, P., Joe, C., and Hyewon, L. (2015). From motivation to engagement: The role of effort regulation of virtual high school students in mathematics courses. J. Educ. Technol. Soc. 18, 261–272.

18. Daouk, Z., Bahous, R., and Bacha, N. N. (2016). Perceptions on the effectiveness of active learning strategies. J. Appl. Res. High. Educ. 8, 360–375. doi: 10.1108/JARHE-05-2015-0037

19. Debattista, M. (2018). A comprehensive rubric for instructional design in e-learning. Int. J. Inform. Learn. Technol. 35, 93–104. doi: 10.1108/IJILT-09-2017-0092

20. DeLone, W. H., and McLean, E. R. (2016). Information systems success measurement. Found. Trends Inform. Syst. 2, 1–116. doi: 10.1561/2900000005

21. Duchatelet, D., and Donche, V. (2019). Fostering self-efficacy and self-regulation in higher education: a matter of autonomy support or academic motivation? High. Educ. Res. Dev. 38, 733–747. doi: 10.1080/07294360.2019.1581143

22. Fauzi, M. A. (2022). E-learning in higher education institutions during COVID-19 pandemic: current and future trends through bibliometric analysis. Heliyon 8:e09433. doi: 10.1016/j.heliyon.2022.e09433

23. Gatignon, H. (2010). "Confirmatory factor analysis," in Statistical Analysis of Management Data, ed. H. Gatignon (New York, NY: Springer New York), 59–122. doi: 10.1007/978-1-4419-1270-1_4

24. George, D., and Mallery, P. (2019). IBM SPSS Statistics 25 Step by Step: A Simple Guide and Reference. London: Routledge. doi: 10.4324/9780429056765

25. Glassman, M., Kuznetcova, I., Peri, J., and Kim, Y. (2021). Cohesion, collaboration and the struggle of creating online learning communities: Development and validation of an online collective efficacy scale. Comput. Educ. Open 2:100031. doi: 10.1016/j.caeo.2021.100031

26. Gonzalez, T., de la Rubia, M. A., Hincz, K. P., Comas-Lopez, M., Subirats, L., Fort, S., et al. (2020). Influence of COVID-19 confinement on students' performance in higher education. PLoS One 15:e0239490. doi: 10.1371/journal.pone.0239490

27. Gopal, R., Singh, V., and Aggarwal, A. (2021). Impact of online classes on the satisfaction and performance of students during the pandemic period of COVID 19. Educ. Inform. Technol. 26, 6923–6947. doi: 10.1007/s10639-021-10523-1

28. Hadullo, K., Oboko, R., and Omwenga, E. (2018). Status of e-learning quality in Kenya: Case of Jomo Kenyatta University of Agriculture and Technology postgraduate students. Int. Rev. Res. Open Distribut. Learn. 19:24. doi: 10.19173/irrodl.v19i1.3322

29. Hair, J., William, C. B., Barry, J. B., and Rolph, E. A. (2019). Multivariate Data Analysis. United Kingdom: Cengage Learning EMEA.

30. Hair, J. F., Risher, J. J., Sarstedt, M., and Ringle, C. M. (2019). When to use and how to report the results of PLS-SEM. Eur. Bus. Rev. 31, 2–24. doi: 10.1108/EBR-11-2018-0203

31. Henseler, J., Ringle, C. M., and Sarstedt, M. (2015). A new criterion for assessing discriminant validity in variance-based structural equation modeling. J. Acad. Market. Sci. 43, 115–135. doi: 10.1007/s11747-014-0403-8

32. Ho, L. A., Kuo, T. H., and Lin, B. (2010). Influence of online learning skills in cyberspace. Int. Res. 20, 55–71. doi: 10.1108/10662241011020833

33. Hodges, C. B., Moore, S., Lockee, B. B., Trust, T., and Bond, M. A. (2020). The Difference Between Emergency Remote Teaching and Online Learning [Online]. Boulder, CO: EDUCAUSE.

34. Hu, M., Li, H., Deng, W., and Guan, H. (2016). "Student engagement: One of the necessary conditions for online learning," in Proceedings of the 2016 International Conference on Educational Innovation through Technology (EITT), Tainan, 122–126. doi: 10.1109/EITT.2016.31

35. Hulland, J., Baumgartner, H., and Smith, K. M. (2018). Marketing survey research best practices: evidence and recommendations from a review of JAMS articles. J. Acad. Market. Sci. 46, 92–108. doi: 10.1007/s11747-017-0532-y

36. Ikhsan, R. B., Saraswati, L. A., Muchardie, B. G., Vional, and Susilo, A. (2019). "The determinants of students' perceived learning outcomes and satisfaction in BINUS online learning," in Proceedings of the 2019 5th International Conference on New Media Studies (CONMEDIA), Bali, 68–73. doi: 10.1109/CONMEDIA46929.2019.8981813

37. Isaac, O., Abdullah, Z., Ramayah, T., Mutahar, A., and Alrajawy, I. (2017a). Towards a better understanding of internet technology usage by Yemeni employees in the public sector: an extension of the task-technology fit (TTF) model. Res. J. Appl. Sci. 12, 205–223.

38. Isaac, O., Abdullah, Z., Ramayah, T., and Mutahar, A. M. (2017b). Examining the relationship between overall quality, user satisfaction and internet usage: An integrated individual, technological, organizational and social perspective. Asian J. Inform. Technol. 16, 100–124.

39. Isaac, O., Abdullah, Z., Ramayah, T., and Mutahar, A. M. (2017c). Internet usage within government institutions in Yemen: An extended technology acceptance model (TAM) with internet self-efficacy and performance impact. Sci. Int. 29, 737–747.

40. Isaac, O., Abdullah, Z., Ramayah, T., and Mutahar, A. M. (2017d). Internet usage, user satisfaction, task-technology fit, and performance impact among public sector employees in Yemen. Int. J. Inform. Learn. Technol. 34, 210–241. doi: 10.1108/IJILT-11-2016-0051

41. Jaggars, S. S., and Xu, D. (2016). How do online course design features influence student performance? Comput. Educ. 95, 270–284. doi: 10.1016/j.compedu.2016.01.014

42. Jiménez-Bucarey, C., Acevedo-Duque, Á., Müller-Pérez, S., Aguilar-Gallardo, L., Mora-Moscoso, M., and Vargas, E. C. (2021). Student's satisfaction of the quality of online learning in higher education: An empirical study. Sustainability 13:11960. doi: 10.3390/su132111960

43. Kemp, A., Palmer, E., and Strelan, P. (2019). A taxonomy of factors affecting attitudes towards educational technologies for use with technology acceptance models. Br. J. Educ. Technol. 50, 2394–2413. doi: 10.1111/bjet.12833

44. Kim, D., Jung, E., Yoon, M., Chang, Y., Park, S., Kim, D., et al. (2021). Exploring the structural relationships between course design factors, learner commitment, self-directed learning, and intentions for further learning in a self-paced MOOC. Comput. Educ. 166:104171. doi: 10.1016/j.compedu.2021.104171

45. Kim, S., and Kim, D.-J. (2021). Structural relationship of key factors for student satisfaction and achievement in asynchronous online learning. Sustainability 13:6734. doi: 10.3390/su13126734

46. Kissi, P. S., Nat, M., and Armah, R. B. (2018). The effects of learning–family conflict, perceived control over time and task-fit technology factors on urban–rural high school students' acceptance of video-based instruction in flipped learning approach. Educ. Technol. Res. Dev. 66, 1547–1569. doi: 10.1007/s11423-018-9623-9

47. Kline, R. B. (2015). Principles and Practice of Structural Equation Modeling, Fourth Edn. New York, NY: Guilford Publications.

48. Lee, E., Pate, J. A., and Cozart, D. (2015). Autonomy support for online students. TechTrends 59, 54–61. doi: 10.1007/s11528-015-0871-9

49. Legon, R., and Garrett, R. (2017). The Changing Landscape of Online Education, Quality Matters & Eduventures Survey of Chief Online Officers, 2017. Available online at: https://www.qualitymatters.org/sites/default/files/research-docs-pdfs/CHLOE-First-Survey-Report.pdf (accessed March 20, 2022).

50. Maheshwari, G. (2021). Factors influencing entrepreneurial intentions the most for university students in Vietnam: educational support, personality traits or TPB components? Educ. Train. 63, 1138–1153. doi: 10.1108/ET-02-2021-0074

51. McAvoy, P., Hunt, T., Culbertson, M. J., McCleary, K. S., DeMeuse, R. J., and Hess, D. E. (2022). Measuring student discussion engagement in the college classroom: a scale validation study. Stud. High. Educ. 47, 1761–1775. doi: 10.1080/03075079.2021.1960302

52. Mensink, P. J., and King, K. (2020). Student access of online feedback is modified by the availability of assessment marks, gender and academic performance. Br. J. Educ. Technol. 51, 10–22. doi: 10.1111/bjet.12752

53. Metropolis, N., and Ulam, S. (1949). The Monte Carlo method. J. Am. Stat. Assoc. 44, 335–341. doi: 10.1080/01621459.1949.10483310

54. Montesdioca, G. P. Z., and Maçada, A. C. G. (2015). Measuring user satisfaction with information security practices. Comput. Secur. 48, 267–280. doi: 10.1016/j.cose.2014.10.015

55. Muir, T., Milthorpe, N., Stone, C., Dyment, J., Freeman, E., and Hopwood, B. (2019). Chronicling engagement: students' experience of online learning over time. Distance Educ. 40, 262–277. doi: 10.1080/01587919.2019.1600367

56. Narad, A., and Abdullah, B. (2016). Academic performance of senior secondary school students: Influence of parental encouragement and school environment. Rupkatha J. Interdiscipl. Stud. Human. 8, 12–19. doi: 10.21659/rupkatha.v8n2.02

57. Patricia Aguilera-Hermida, A. (2020). College students' use and acceptance of emergency online learning due to COVID-19. Int. J. Educ. Res. Open 1:100011. doi: 10.1016/j.ijedro.2020.100011

58. Paul, R., and Pradhan, S. (2019). Achieving student satisfaction and student loyalty in higher education: A focus on service value dimensions. Serv. Market. Q. 40, 245–268. doi: 10.1080/15332969.2019.1630177

59. Preaux, J., Casadesús, M., and Bernardo, M. (2022). A conceptual model to evaluate service quality of direct-to-consumer telemedicine consultation from patient perspective. Telemed. e-Health [Online ahead of print]. doi: 10.1089/tmj.2022.0089

60. Rahman, M. H. A., Uddin, M. S., and Dey, A. (2021). Investigating the mediating role of online learning motivation in the COVID-19 pandemic situation in Bangladesh. J. Comput. Assist. Learn. 37, 1513–1527. doi: 10.1111/jcal.12535

61. Raija, H., Heli, T., and Elisa, L. (2010). DeLone & McLean IS success model in evaluating knowledge transfer in a virtual learning environment. Int. J. Inform. Syst. Soc. Change 1, 36–48. doi: 10.4018/jissc.2010040103

62. Ralston-Berg, P. (2014). Surveying student perspectives of quality: Value of QM rubric items. Int. Learn. J. 3:10. doi: 10.18278/il.3.1.9

63. Roca, J. C., Chiu, C.-M., and Martínez, F. J. (2006). Understanding e-learning continuance intention: An extension of the technology acceptance model. Int. J. Hum. Comput. Stud. 64, 683–696. doi: 10.1016/j.ijhcs.2006.01.003

64. Rohde, F., and Hielscher, S. (2021). Smart grids and institutional change: Emerging contestations between organisations over smart energy transitions. Energy Res. Soc. Sci. 74:101974. doi: 10.1016/j.erss.2021.101974

65. Roque-Hernández, R. V., Díaz-Roldán, J. L., López-Mendoza, A., and Salazar-Hernández, R. (2021). Instructor presence, interactive tools, student engagement, and satisfaction in online education during the COVID-19 Mexican lockdown. Interact. Learn. Environ. doi: 10.1080/10494820.2021.1912112

66. Scharf, P. (2017). The Importance of Course Quality Standards in Online Education [Online]. Hoboken, NJ: Wiley University Services.

67. Seaman, J. E., Allen, I. E., and Seaman, J. (2018). Grade Increase: Tracking Distance Education in the United States. Boston, MA: The Babson Survey Research Group.

68. Semeshkina, M. (2021). Five Major Trends in Online Education to Watch Out for in 2021 [Online]. Forbes. Available online at: https://www.forbes.com/sites/forbesbusinesscouncil/2021/02/02/five-major-trends-in-online-education-to-watch-out-for-in-2021/?sh=2678d06621eb (accessed 25 September, 2021).

69. Sheppard, M., and Vibert, C. (2019). Re-examining the relationship between ease of use and usefulness for the net generation. Educ. Inform. Technol. 24, 3205–3218. doi: 10.1007/s10639-019-09916-0

70. Singh, S., Javdani, S., Berezin, M. N., and Sichel, C. E. (2020). Factor structure of the critical consciousness scale in juvenile legal system-involved boys and girls. J. Commun. Psychol. 48, 1660–1676. doi: 10.1002/jcop.22362

71. Sullivan, T. J., Voigt, M. K., Apkarian, N., Martinez, A. E., and Hagman, J. E. (2021). "Role with it: Examining the impact of instructor role models in introductory mathematics courses on student experiences," in Proceedings of the 2021 ASEE Virtual Annual Conference Content Access, Washington, DC.

72. Sweta, S. (2021). "Educational data mining in e-learning system," in Modern Approach to Educational Data Mining and Its Applications, ed. S. Sweta (Singapore: Springer Singapore), 1–12. doi: 10.1007/978-981-33-4681-9_1

73. Tarhini, A., Hone, K., and Liu, X. (2015). A cross-cultural examination of the impact of social, organisational and individual factors on educational technology acceptance between British and Lebanese university students. Br. J. Educ. Technol. 46, 739–755. doi: 10.1111/bjet.12169

74. Thongsri, N., Shen, L., and Bao, Y. (2019). Investigating factors affecting learner's perception toward online learning: evidence from ClassStart application in Thailand. Behav. Inform. Technol. 38, 1243–1258. doi: 10.1080/0144929X.2019.1581259

75. West, S. G., Taylor, A. B., and Wu, W. (2012). "Model fit and model selection in structural equation modeling," in Handbook of Structural Equation Modeling, ed. R. H. Hoyle (New York, NY: Guilford Press), 209–231.

76. Wilde, N., and Hsu, A. (2019). The influence of general self-efficacy on the interpretation of vicarious experience information within online learning. Int. J. Educ. Technol. High. Educ. 16:26. doi: 10.1186/s41239-019-0158-x

77. Wu, B., and Chen, X. (2017). Continuance intention to use MOOCs: Integrating the technology acceptance model (TAM) and task technology fit (TTF) model. Comput. Hum. Behav. 67, 221–232. doi: 10.1016/j.chb.2016.10.028

78. Xavier, M., and Meneses, J. (2021). The tensions between student dropout and flexibility in learning design: The voices of professors in open online higher education. Int. Rev. Res. Open Distrib. Learn. 22, 72–88. doi: 10.19173/irrodl.v23i1.5652

79. Yadegaridehkordi, E., Nilashi, M., Shuib, L., and Samad, S. (2020). A behavioral intention model for SaaS-based collaboration services in higher education. Educ. Inform. Technol. 25, 791–816. doi: 10.1007/s10639-019-09993-1

80. Yakubu, M. N., and Dasuki, S. I. (2018). Factors affecting the adoption of e-learning technologies among higher education students in Nigeria: A structural equation modelling approach. Inform. Dev. 35, 492–502. doi: 10.1177/0266666918765907

81. Zhou, T., Lu, Y., and Wang, B. (2010). Integrating TTF and UTAUT to explain mobile banking user adoption. Comput. Hum. Behav. 26, 760–767. doi: 10.1016/j.chb.2010.01.013

82. Zimmerman, T. D., and Nimon, K. (2017). The online student connectedness survey: Evidence of initial construct validity. Int. Rev. Res. Open Distrib. Learn. 18:22. doi: 10.19173/irrodl.v18i3.2484

Keywords

overall quality, course quality, e-learning technology, online learning, student engagement, student and institutional factors, instructor characteristics, student performance

Citation

Prabowo H, Ikhsan RB and Yuniarty Y (2022) Student performance in online learning higher education: A preliminary research. Front. Educ. 7:916721. doi: 10.3389/feduc.2022.916721

Received

09 May 2022

Accepted

22 September 2022

Published

03 November 2022

Volume

7 - 2022

Edited by

Shashidhar Venkatesh Murthy, James Cook University, Australia

Reviewed by

Betty Kusumanigrum, Universitas Sarjanawiyata Tamansiswa, Indonesia; Enjy Abouzeid, Suez Canal University, Egypt

Copyright

*Correspondence: Ridho Bramulya Ikhsan,

This article was submitted to Digital Education, a section of the journal Frontiers in Education
