Abstract
Evidence-based practices grounded in the learning sciences provide an opportunity for improved learning experiences in traditional in-person as well as hybrid and online environments. We advocate specifically that large-scale, online learning experiences, such as Massive Open Online Courses (MOOCs), benefit from applications of the learning sciences. To that end, we describe how and why we use specific learning science practices in a biochemistry MOOC, with the intention of contributing to the discussion about the quality of online learning experiences and lowering the barrier for other practitioners seeking a framework for implementing evidence-based course design. We believe that the application of the learning sciences makes online learning experiences more rigorous and effective, and practitioners should optimize the use of these strategies through clever tests in specific contexts.
Introduction
The quality of learning experiences across the 13,500 Massive Open Online Courses (MOOCs) offered since 2012 (Shah, 2019) is heterogeneous. There are calls to assess the design of MOOCs, establish criteria for development, and standardize quality assurance measures (Alario-Hoyos et al., 2014a,b; Kopp and Lackner, 2014; Yousef et al., 2014; Doherty et al., 2015; Lee et al., 2016). Best practices for teaching and learning rooted in the learning sciences advance scientific teaching methods through the adoption of active learning (Handelsman et al., 2004; Freeman et al., 2014) and promote efficient and effective ways to study (Miyatsu et al., 2018). Digital learning designers translate many of these best practices to online learning environments. Our aim is to document this translation from the perspective of MOOC developers by presenting a framework for applying evidence-based strategies in a biochemistry MOOC and advocating for testing the effectiveness of instructional practices in specific online contexts.
As practitioners concerned with maximizing learning gains in an understudied, yet widely used non-traditional learning environment (Veletsianos and Shepherdson, 2016), we are agnostic to adhering to strategies based in one conceptual framework of educational research over another. Digital learning designers should have all possible tools at their disposal for creating engaging and effective learning experiences. To this end, we take a systematic, eclectic, instructional design approach (Yanchar and Gabbitas, 2011; Honebein and Sink, 2012) to learning engineering, where we draw upon practices rooted in behaviorism, cognitivism, social learning, and constructivism to address the challenges of online learning at scale.
Here we document practices supported by the learning sciences that we readily implement in an online learning environment, to contribute to conversations in the community of practice about evidence-based approaches to MOOC design. We hope these conversations lower the barrier for other developers of online and hybrid experiences to apply these methods. We found the following practices both theoretically robust and practical to apply in our course design. We outline the evidence-based practices applied to course assets organized by function (pre-instruction, multimedia content, formative assessment, supporting resources, and summative assessment), although many of the practices overlap in their usage. This article is an exposition, intended as a model for others, on why and how to use the learning sciences in a specific context while highlighting a few strategies; it is neither an exhaustive review of these practices nor a formal assessment of their efficacy.
Course Overview
The biochemistry MOOC has a hierarchical structure composed mainly of eight topic units (like chapters in a book), each broken down into one or two learning sequences (analogous to sections of each chapter). The learning sequences contain a series of course assets (Figure 1, colorful inner nodes) informed by evidence-based principles (Figure 1, gray outer nodes). These course assets are designed to prime the learner for the material (pre-instructional strategies; Figure 1, green nodes), convey foundational information (multimedia content; Figure 1, yellow node), build understanding (formative assessments; Figure 1, blue nodes), and support learning (supporting resources; Figure 1, purple nodes). In addition to topic units and other supplemental material, the course ends with a comprehensive exam (summative assessment; Figure 1, orange node).
Figure 1
Evidence-Based Practices Employed
Pre-Instruction
To prime our MOOC learners for the introduction of new material in each learning sequence, we employ two pre-instructional techniques: learning objectives and advance organizers (Figure 1, green nodes). At the beginning of each learning sequence, we outline one or two big-picture goals as well as several measurable objectives the learner should meet by engaging with the sequence material. By outlining the goals and objectives that we used to develop and align the assessments during course development, we clearly communicate our expectations for the learner and promote transparency in the course (Hartley and Davies, 1976). We know from a survey of our hybrid learning experiences that students find these goals and objectives useful for studying, even when not explicitly directed to use them in this way. We also use a type of advance organizer (Weisberg, 1970; Hartley and Davies, 1976; Meng and Patty, 1991; Bui and McDaniel, 2015) in the form of image handouts. By presenting key images ahead of instruction, we give learners an opportunity to preview the material, scaffold their note-taking, and make deeper connections during the video segments. These handouts contain a series of images that appear in the sequences, each coupled with an open-ended, yet focused question (Wylie and Chi, 2014) designed to make connections between what each image represents and the learning sequence topic (Supplementary Figure 1). We do not formally assess the coupled questions, but rather use them to prompt self-explanation. Learners answer the questions in their own words, revealing their own thought process and understanding, which promotes metacognitive processes (Chi et al., 1994; Aleven and Koedinger, 2002; Ainsworth and Th Loizou, 2003).
Multimedia Content
To connect learners' prior knowledge to more advanced concepts in biochemistry, we offer a series of videos and images, which we designed to maximize engagement and minimize extraneous cognitive load (Sweller, 1994, 2005; Sweller et al., 2019) (Figure 1, yellow node). We filmed in a live classroom, where the professor discusses the material unscripted while writing on a chalkboard. Students may prefer the professor writing on a board over presenting slides because the method is more interactive, the pace of the presentation is more accessible, and the information is streamlined (Armour et al., 2016). There is also evidence that unscripted lectures may be more engaging to learners than scripted deliveries (Thornton et al., 2017). This method of multimedia content delivery meets the principles of multimedia learning that promote generative processing (personalization, voice, and embodiment) as well as some that manage essential processing (modality and segmenting) (Clark and Mayer, 2016; Mayer, 2017). Often embedded within the videos are brief animations of dynamic, process-oriented concepts and complex structures, overlaid on the active discussion in class (Supplementary Video 1). These animations are consistent in style and color to familiarize learners with the representation of specific concepts (pre-training) (Mayer et al., 2002). We also took care to adhere to the remaining multimedia learning principles (coherence, signaling, redundancy, and contiguity) while maintaining scientific accuracy, to promote a deeper understanding of the core concepts represented and maintain a manageable cognitive load for learners (Sweller, 2005; Clark and Mayer, 2016; Mayer, 2017).
Formative Assessments
We include over 600 formative assessment questions to guide learners through the process of constructing a foundational understanding of biochemistry and transferring that understanding to novel contexts (Figure 1, blue nodes). All assessments are graded automatically, including multiple choice, checkbox, numerical input, and text input questions. To gauge understanding of the new material introduced in each video segment, we intersperse "test yourself" questions (TYs) between videos. We include concrete examples in TYs to help learners generalize abstract ideas and transfer knowledge to different contexts (Paivio et al., 1994; Weinstein et al., 2018). These concrete examples can also serve the dual purpose of personally connecting the learner to the material. Our examples often have relevance to current research, medical applications, or everyday life, which can help promote intrinsic motivation to learn (Ambrose et al., 2010). One such example is a TY in the pH and buffers unit, where we ask learners to calculate the pH of a homemade buttermilk substitute given the volume of lemon juice and milk, the amount of citric acid in lemon juice, and the dissociation constant of citric acid. This question goes beyond a simple calculation and engages the learner in how biochemistry applies to daily living.
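To illustrate the kind of reasoning such a TY targets, a buffer pH calculation can be sketched with the Henderson-Hasselbalch equation. The snippet below is our own illustrative sketch with placeholder values; it is not the course's actual question or its numbers.

```python
import math

# Minimal sketch of a buffer pH calculation using the
# Henderson-Hasselbalch equation: pH = pKa + log10([base]/[acid]).
# All values below are illustrative placeholders, not the course's numbers.

def buffer_ph(pka: float, base_conc: float, acid_conc: float) -> float:
    """pH of a weak acid/conjugate base buffer (concentrations in mol/L)."""
    return pka + math.log10(base_conc / acid_conc)

# With equal acid and conjugate base concentrations, the pH equals the pKa.
print(buffer_ph(pka=4.76, base_conc=0.10, acid_conc=0.10))  # prints 4.76
```

A question like the buttermilk TY layers real-world quantities (volumes, dissociation constants) on top of this core relationship, so the learner must set up the ratio of conjugate base to acid before the formula applies.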
Engaging learners is a challenge for most formative assessments, so we constructed problem sets (PSs) at the end of every unit to generate interest and prompt learners to think deeply about what they learned and apply their knowledge to new contexts. We take inspiration from narrative-centered learning in gaming (Mott et al., 1999) to leverage the appeal of storytelling in education. The PS questions are connected by an ongoing story about taking biochemistry and engaging in research as an undergraduate, in which we cast the learners as characters in the narrative. The assessments that compose the PSs reference objectives covered throughout the associated learning sequence(s), and as such represent both spaced retrieval and interleaving practices (Karpicke and Roediger, 2007; Roediger and Butler, 2011; Birnbaum et al., 2013). Retrieval practices such as these, and the resulting testing effect, are useful techniques for studying and retaining information on longer timescales (Roediger and Karpicke, 2006; Rowland, 2014). The PS questions also contain scaffolding that breaks complex questions into component parts, offering opportunities for the learners to adapt their approaches based on the immediate feedback given (Reiser, 2004). One example of scaffolding in the course is a series of questions about a mathematical description of cooperative binding. First, we ask learners to identify the correct mathematical transformations of their raw data to set up the calculation. The learners then select the correct mathematical expression to fit the data and perform calculations to generate a model from the transformed data and the chosen expression. Finally, learners must interpret the output of the model by explaining the biochemical logic of their results. Learners have less confusion, indicated by fewer discussion forum posts about challenging assessments, when we break assessments into stepwise processes.
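The transform-fit-interpret sequence of such a scaffolded exercise can be sketched in code. The snippet below is our illustrative reconstruction with synthetic data and an assumed Hill-equation model; it is not the course's actual problem set.

```python
import numpy as np

# Synthetic, illustrative data: ligand concentration [L] and fractional
# saturation Y generated from an assumed Hill equation (coefficient n = 2,
# binding constant K = 2). These stand in for the exercise's "raw data".
L = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
n_true, K = 2.0, 2.0
Y = L**n_true / (K**n_true + L**n_true)

# Step 1: transform the raw data. The Hill plot coordinates,
# log10(Y / (1 - Y)) versus log10([L]), linearize cooperative binding.
x = np.log10(L)
y = np.log10(Y / (1 - Y))

# Step 2: fit the transformed data with a line (least squares);
# the slope of the Hill plot is the Hill coefficient n.
slope, intercept = np.polyfit(x, y, 1)

# Step 3: interpret the model output. A slope greater than 1 indicates
# positive cooperativity between binding sites.
print(round(slope, 2))  # prints 2.0
```

Each commented step corresponds to one rung of the scaffold: the learner first chooses the transformation, then the fitting expression, and only then explains what the recovered coefficient means biochemically.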
To give learners real-time feedback on whether they answered a formative assessment correctly, and to provide an opportunity to reassess and adapt their strategy before answering again, every formative assessment in the MOOC is graded immediately for correctness. Moreover, since previous work demonstrates the benefit of formative feedback (Bangert-Drowns and Morgan, 1991; Moreno, 2004; Nicol and Macfarlane-Dick, 2006; Hattie and Timperley, 2007; Shute, 2008), we offer detailed and specific feedback that clarifies attributes of the question target, the larger topic at hand, or the relationships between correct and incorrect responses (Shute, 2008). In addition to learning more about the correct answer, or why other options are incorrect, learners see the process for arriving at the correct solution through a worked example for some question types. Worked examples guide the learner step-by-step to the solution from an expert's point of view, with the intention of facilitating the transfer of knowledge and skills to similar questions (Atkinson et al., 2000).
Supporting Resources
We include a number of resources for our learners that are meant to supplement their learning, although they are not required and do not count toward advancing progress (Figure 1, purple nodes). There is a separate section of the MOOC for study resources, which includes text, graphics, videos, and links that detail specific concepts, skills, and techniques that the learners should have as prerequisite knowledge. These study resources are linked explicitly in the sections of the course in which their reference could be useful. For example, in the enzyme catalysis learning sequence, we link relevant TYs to the organic chemistry study resources for a refresher on functional groups and reaction terminology. By embedding references to material that can function as a review, we attempt to activate learners' prior knowledge, which is a foundational step in facilitating lasting learning (Ambrose et al., 2010).
We also include optional molecular viewer activities. We use a molecular visualization software that structural biochemists employ in their research to help learners view and manipulate protein structural models from deposited data. Thus, these molecular viewer activities are an example of authentic learning, where learners apply their newly learned biochemistry skills in a context that is shared with professional scientists (Herrington et al., 2004; Herrington and Kervin, 2007; Lombardi, 2007; Oliver et al., 2007). The connection of educational materials to real-life applications is related to expert-level thinking (Semsar et al., 2011), and may help adult learners see themselves as belonging in scientific fields (Rossiter, 2007). Although these activities exploring protein structure are consequential for learning, the assessments associated with them do not influence the learners' grades. Low-stakes tasks such as these facilitate trying something intellectually risky and relieve some pressure to perform on assessments, which may also help intrinsic motivation and self-esteem (Nicol and Macfarlane-Dick, 2006). In this context, the optional molecular viewer assignments reduce the pressure to perform while engaging with a potentially unfamiliar tool.
At the bottom of each page in the course (excluding summative assessments), we include the opportunity to engage in the discussion forum. Discussion forum posts offer a way for learners to personally engage with each other and the instructors over the course material. Learners are encouraged to introduce themselves and their motivations for taking the MOOC, which we hope connects the material to personal values they hold. By articulating the value that the MOOC offers for them, learners may feel more motivated to sustain participation (Canning et al., 2018). The discussion forum is also a place where learners can have informal conversations about the material, ask for help from staff or peers, and answer each other's questions and comments. Engaging with other learners and staff, and helping each other build an understanding of course material, can contribute to academic achievement and satisfaction (Jung et al., 2002; Kellogg et al., 2014).
Summative Assessment
The final, summative assessment of the biochemistry MOOC is a competency exam (CE) (Figure 1, orange node). We designed the CE to test the majority of learning objectives introduced in the course by requiring learners to engage in scientific thinking, synthesize concepts, and transfer knowledge to new contexts. We use the hierarchy of cognitive processing outlined by Bloom's taxonomy (Krathwohl, 2002; Krathwohl and Anderson, 2009) to guide the creation of all assessments. According to the taxonomy, many CE questions require application, analysis, and evaluation, similar to PS questions. Both CE and PS questions are more cognitively demanding to answer than TYs, which rely more heavily on recall.
Discussion
We used a systematic, eclectic, instructional design approach to drive our design decisions in a biochemistry MOOC. This approach largely follows from our unique perspectives as biology PhDs with pedagogical experience. Our roles in course development involve learning engineering (Wilcox et al., 2016), which lies at the intersection of subject matter expertise, educational technology, instructional and graphic design, educational research, data analytics, teaching, and project management. Although familiar with the learning science literature, we are free from the constraints of specializing in a specific area of educational research set by tenure track or grants. This freedom allows us to maintain a practitioner's point of view, draw inspiration from different frameworks of how learning works, and collectively deploy an amalgamation of strategies that enhance the design of our learning experiences.
As scientists, we value evidence-based practices and the empirical approach the learning sciences offer. As developers of digital learning experiences, we understand that there are orchestration constraints in MOOC development that present challenges to implementing the best practices for teaching and learning outlined by the literature. There is a great need to test educational design decisions in context and assess the relevant variables in successful implementation (Moir, 2018). These evaluations of design should be both formal, as randomized controlled trials, the standard for testing the effects of interventions in education (US Department of Education; Institute of Education Sciences; National Center for Education Evaluation and Regional Assistance, 2003), and informal, through the iterative revisions necessary to keep MOOCs current and rigorous. Testing in specific contexts is further needed because what best promotes engagement and learning differs across institutions and courses. For example, the relationship between video length and learner engagement is a popular topic because of the heavy reliance on multimedia in many MOOCs. In one study, researchers recommend that videos be shorter than 6 min to maximize engagement (Guo et al., 2014); however, our own research indicates that video length is not a significant determinant of engagement in another one of our biology MOOCs (Thornton et al., 2017). This contrast exemplifies the need to test different strategies, collect learner data, and make evidence-based design decisions informed by implementation research relevant to each course.
Application of the learning sciences to hybrid course and MOOC design provides a strong foundation of evidence-based practices that one can optimize for different online learning experiences. Making design decisions grounded in scientific evidence is a crucial first step. Equally important is the dissemination of these decision-making processes and subsequent evaluation of their implementation. By documenting the process of incorporating and testing applications of the learning sciences, we collectively can contribute to enriching the community of practice for digital learning designers while providing a more robust experience for learners. We see this perspective as a step forward to incite conversations and actions around evidence-based design efforts in online educational environments.
Statements
Author contributions
DG and MW contributed to the conception and scope of the perspective. DG wrote the initial draft of the manuscript, and both authors contributed to manuscript revision, read and approved the submitted version.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Supplementary material
The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2020.500481/full#supplementary-material
Supplementary Figure 1 | An example of an image handout from the protein structure learning sequence highlighting essential images complete with focused, open-ended concept questions.
Supplementary Video 1 | A clip of a video segment detailing the steps in calcium ion transportation, where we cut back and forth between the professor explaining each step to a live classroom and an animation of this dynamic process added in post-production.
References
Ainsworth, S., and Th Loizou, A. (2003). The effects of self-explaining when learning with text or diagrams. Cogn. Sci. 27, 669–681. doi: 10.1207/s15516709cog2704_5
Alario-Hoyos, C., Pérez-Sanagustín, M., Cormier, D., and Delgado-Kloos, C. (2014a). Proposal for a conceptual framework for educators to describe and design MOOCs. J. Univers. Comput. Sci. 20, 6–23. doi: 10.3217/jucs-020-01-0006
Alario-Hoyos, C., Pérez-Sanagustín, M., Kloos, C. D., and Muñoz-Merino, P. J. (2014b). Recommendations for the design and deployment of MOOCs: insights about the MOOC digital education of the future deployed in MiríadaX, in Proceedings of the Second International Conference on Technological Ecosystems for Enhancing Multiculturality - TEEM'14 (Salamanca: ACM Press), 403–408. doi: 10.1145/2669711.2669931
Aleven, V. A. W. M. M., and Koedinger, K. R. (2002). An effective metacognitive strategy: learning by doing and explaining with a computer-based cognitive tutor. Cogn. Sci. 26, 147–179. doi: 10.1207/s15516709cog2602_1
Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., and Norman, M. K. (2010). How Learning Works: Seven Research-Based Principles for Smart Teaching. Hoboken, NJ: John Wiley and Sons.
Armour, C., Schneid, S. D., and Brandl, K. (2016). Writing on the board as students' preferred teaching modality in a physiology course. Adv. Physiol. Educ. 40, 229–233. doi: 10.1152/advan.00130.2015
Atkinson, R. K., Derry, S. J., Renkl, A., and Wortham, D. (2000). Learning from examples: instructional principles from the worked examples research. Rev. Educ. Res. 70, 181–214. doi: 10.3102/00346543070002181
Bangert-Drowns, R. L., and Morgan, M. (1991). The instructional effect of feedback in test-like events. Rev. Educ. Res. 61, 213–238. doi: 10.3102/00346543061002213
Birnbaum, M. S., Kornell, N., Bjork, E. L., and Bjork, R. A. (2013). Why interleaving enhances inductive learning: the roles of discrimination and retrieval. Mem. Cognit. 41, 392–402. doi: 10.3758/s13421-012-0272-7
Bui, D. C., and McDaniel, M. A. (2015). Enhancing learning during lecture note-taking using outlines and illustrative diagrams. J. Appl. Res. Mem. Cogn. 4, 129–135. doi: 10.1016/j.jarmac.2015.03.002
Canning, E. A., Harackiewicz, J. M., Priniski, S. J., Hecht, C. A., Tibbetts, Y., and Hyde, J. S. (2018). Improving performance and retention in introductory biology with a utility-value intervention. J. Educ. Psychol. 110, 834–849. doi: 10.1037/edu0000244
Chi, M. T. H., Leeuw, N. D., Chiu, M.-H., and Lavancher, C. (1994). Eliciting self-explanations improves understanding. Cogn. Sci. 18, 439–477. doi: 10.1207/s15516709cog1803_3
Clark, R. C., and Mayer, R. E. (2016). e-Learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia Learning. Hoboken, NJ: John Wiley and Sons. doi: 10.1002/9781119239086
Doherty, I., Harbutt, D., and Sharma, N. (2015). Designing and developing a MOOC. Med. Sci. Educ. 25, 177–181. doi: 10.1007/s40670-015-0123-9
Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., et al. (2014). Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. U.S.A. 111, 8410–8415. doi: 10.1073/pnas.1319030111
Guo, P. J., Kim, J., and Rubin, R. (2014). How video production affects student engagement: an empirical study of MOOC videos, in Proceedings of ACM Conference on Learning at Scale (L@S) (Atlanta, GA), 4–5. doi: 10.1145/2556325.2566239
Handelsman, J., Ebert-May, D., Beichner, R., Bruns, P., Chang, A., DeHaan, R., et al. (2004). Education: scientific teaching. Science 304, 521–522. doi: 10.1126/science.1096022
Hartley, J., and Davies, I. K. (1976). Preinstructional strategies: the role of pretests, behavioral objectives, overviews and advance organizers. Rev. Educ. Res. 46:28. doi: 10.3102/00346543046002239
Hattie, J., and Timperley, H. (2007). The power of feedback. Rev. Educ. Res. 77, 81–112. doi: 10.3102/003465430298487
Herrington, J., and Kervin, L. (2007). Authentic learning supported by technology: ten suggestions and cases of integration in classrooms. Educ. Media Int. 44, 219–236. doi: 10.1080/09523980701491666
Herrington, J., Reeves, T. C., Oliver, R., and Woo, Y. (2004). Designing authentic activities in web-based courses. J. Comput. High. Educ. 16, 3–29. doi: 10.1007/BF02960280
Honebein, P. C., and Sink, D. L. (2012). The practice of eclectic instructional design. Perform. Improv. 51, 26–31. doi: 10.1002/pfi.21312
Jung, I., Choi, S., Lim, C., and Leem, J. (2002). Effects of different types of interaction on learning achievement, satisfaction and participation in Web-based instruction. Innov. Educ. Teach. Int. 39, 153–162. doi: 10.1080/14703290252934603
Karpicke, J. D., and Roediger, H. L. (2007). Expanding retrieval practice promotes short-term retention, but equally spaced retrieval enhances long-term retention. J. Exp. Psychol. Learn. Mem. Cogn. 33, 704–719. doi: 10.1037/0278-7393.33.4.704
Kellogg, S., Booth, S., and Oliver, K. (2014). A social network perspective on peer supported learning in MOOCs for educators. Int. Rev. Res. Open Distrib. Learn. 15:5. doi: 10.19173/irrodl.v15i5.1852
Kopp, M., and Lackner, E. (2014). Do MOOCs need a special instructional design? in Proc. EDULEARN14 Conf. (Barcelona), 11.
Krathwohl, D. R. (2002). A revision of Bloom's taxonomy: an overview. Theory Pract. 41, 212–218. doi: 10.1207/s15430421tip4104_2
Krathwohl, D. R., and Anderson, L. W. (2009). A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. Harlow: Longman.
Lee, G., Keum, S., Kim, M., and Choi, Y. (2016). A study on the development of a MOOC design model. Educ. Technol. Int. 17, 1–37. Available online at: http://kset.or.kr/eti_ojs/index.php/instruction/article/view/69
Lombardi, M. M. (2007). Authentic learning for the 21st century: an overview. Educ. Learn. Initiat. 1, 1–12. Available online at: https://library.educause.edu/resources/2007/1/authentic-learning-for-the-21st-century-an-overview
Mayer, R. E. (2017). Using multimedia for e-learning: multimedia for e-learning. J. Comput. Assist. Learn. 33, 403–423. doi: 10.1111/jcal.12197
Mayer, R. E., Mathias, A., and Wetzell, K. (2002). Fostering understanding of multimedia messages through pre-training: evidence for a two-stage theory of mental model construction. J. Exp. Psychol. Appl. 8, 147–154. doi: 10.1037/1076-898X.8.3.147
Meng, K., and Patty, D. (1991). Field dependence and contextual organizers. J. Educ. Res. 84, 183–189. doi: 10.1080/00220671.1991.10886013
Miyatsu, T., Nguyen, K., and McDaniel, M. A. (2018). Five popular study strategies: their pitfalls and optimal implementations. Perspect. Psychol. Sci. 13, 390–407. doi: 10.1177/1745691617710510
Moir, T. (2018). Why is implementation science important for intervention design and evaluation within educational settings? Front. Educ. 3:61. doi: 10.3389/feduc.2018.00061
Moreno, R. (2004). Decreasing cognitive load for novice students: effects of explanatory versus corrective feedback in discovery-based multimedia. Instr. Sci. 32, 99–113. doi: 10.1023/B:TRUC.0000021811.66966.1d
Mott, B. W., Callaway, C. B., Zettlemoyer, L. S., Lee, S. Y., and Lester, J. C. (1999). Towards narrative-centered learning environments, in Proc. 1999 AAAI Fall Symp. Narrat. Intell. (Menlo Park), 78–82.
Nicol, D. J., and Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Stud. High. Educ. 31, 199–218. doi: 10.1080/03075070600572090
Oliver, R., Herrington, A., Herrington, J., and Reeves, T. C. (2007). Representing authentic learning designs supporting the development of online communities of learners. J. Learn. Des. 2, 1–21. doi: 10.5204/jld.v2i2.36
Paivio, A., Walsh, M., and Bons, T. (1994). Concreteness effects on memory: when and why? J. Exp. Psychol. Learn. Mem. Cogn. 20, 1196–1204. doi: 10.1037/0278-7393.20.5.1196
Reiser, B. J. (2004). Scaffolding complex learning: the mechanisms of structuring and problematizing student work. J. Learn. Sci. 13, 273–304. doi: 10.1207/s15327809jls1303_2
Roediger, H. L., and Butler, A. C. (2011). The critical role of retrieval practice in long-term retention. Trends Cogn. Sci. 15, 20–27. doi: 10.1016/j.tics.2010.09.003
Roediger, H. L., and Karpicke, J. D. (2006). The power of testing memory: basic research and implications for educational practice. Perspect. Psychol. Sci. 1, 181–210. doi: 10.1111/j.1745-6916.2006.00012.x
Rossiter, M. (2007). Possible selves: an adult education perspective. New Dir. Adult Contin. Educ. 114, 5–15. doi: 10.1002/ace.252
Rowland, C. A. (2014). The effect of testing versus restudy on retention: a meta-analytic review of the testing effect. Psychol. Bull. 140, 1432–1463. doi: 10.1037/a0037559
Semsar, K., Knight, J. K., Birol, G., and Smith, M. K. (2011). The Colorado Learning Attitudes about Science Survey (CLASS) for use in biology. CBE Life Sci. Educ. 10, 268–278. doi: 10.1187/cbe.10-10-0133
Shah, D. (2019). Class Central's top 100 MOOCs of all time (2019 edition). Class Central. Available online at: https://www.classcentral.com/report/top-moocs-2019-edition/ (accessed September 16, 2019).
Shute, V. J. (2008). Focus on formative feedback. Rev. Educ. Res. 78, 153–189. doi: 10.3102/0034654307313795
Sweller, J. (1994). Cognitive load theory, learning difficulty, and instructional design. Learn. Instr. 4, 295–312. doi: 10.1016/0959-4752(94)90003-5
Sweller, J. (2005). Implications of cognitive load theory for multimedia learning, in The Cambridge Handbook of Multimedia Learning, ed R. E. Mayer (New York, NY: Cambridge University Press), 19–30. doi: 10.1017/CBO9780511816819.003
Sweller, J., van Merriënboer, J. J. G., and Paas, F. (2019). Cognitive architecture and instructional design: 20 years later. Educ. Psychol. Rev. 31, 261–292. doi: 10.1007/s10648-019-09465-5
Thornton, S., Riley, C., and Wiltrout, M. E. (2017). Criteria for video engagement in a biology MOOC, in Proceedings of the Fourth (2017) ACM Conference on Learning @ Scale - L@S'17 (Cambridge, MA: ACM Press), 291–294. doi: 10.1145/3051457.3054007
US Department of Education; Institute of Education Sciences; National Center for Education Evaluation and Regional Assistance (2003). Identifying and Implementing Educational Practices Supported by Rigorous Evidence: A User Friendly Guide. Washington, DC: Coalition for Evidence-Based Policy.
Veletsianos, G., and Shepherdson, P. (2016). A systematic analysis and synthesis of the empirical MOOC literature published in 2013–2015. Int. Rev. Res. Open Distrib. Learn. 17:2. doi: 10.19173/irrodl.v17i2.2448
Weinstein, Y., Madan, C. R., and Sumeracki, M. A. (2018). Teaching the science of learning. Cogn. Res. Princ. Implic. 3:2. doi: 10.1186/s41235-017-0087-y
Weisberg, J. S. (1970). The use of visual advance organizers for learning earth science concepts. J. Res. Sci. Teach. 7, 161–165. doi: 10.1002/tea.3660070212
Wilcox, K. E., Sarma, S., and Lippel, P. H. (2016). Online Education: A Catalyst for Higher Education Reforms. Cambridge, MA: Massachusetts Institute of Technology.
Wylie, R., and Chi, M. T. H. (2014). The self-explanation principle in multimedia learning, in The Cambridge Handbook of Multimedia Learning, ed R. E. Mayer (New York, NY: Cambridge University Press), 413–432. doi: 10.1017/CBO9781139547369.021
Yanchar, S. C., and Gabbitas, B. W. (2011). Between eclecticism and orthodoxy in instructional design. Educ. Technol. Res. Dev. 59, 383–398. doi: 10.1007/s11423-010-9180-3
Yousef, A. M. F., Chatti, M. A., Schroeder, U., and Wosnitza, M. (2014). What drives a successful MOOC? An empirical examination of criteria to assure design quality of MOOCs, in 2014 IEEE 14th International Conference on Advanced Learning Technologies (Athens: IEEE), 44–48. doi: 10.1109/ICALT.2014.23
Keywords
learning sciences, online education, instructional design, digital learning, learning engineering, MOOC
Citation
Gordon DG and Wiltrout ME (2021) A Framework for Applying the Learning Sciences to MOOC Design. Front. Educ. 5:500481. doi: 10.3389/feduc.2020.500481
Received
25 September 2019
Accepted
17 December 2020
Published
18 January 2021
Volume
5 - 2020
Edited by
Leman Figen Gul, Istanbul Technical University, Turkey
Reviewed by
Adamantios Koumpis, University of Passau, Germany; Clifford A. Shaffer, Virginia Tech, United States
Copyright
© 2021 Gordon and Wiltrout.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Darcy G. Gordon dggordon@mit.edu; Mary Ellen Wiltrout mew27@mit.edu
This article was submitted to Digital Education, a section of the journal Frontiers in Education
Disclaimer
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.