
REVIEW article

Front. Educ., 11 October 2019
Sec. Digital Education
Volume 4 - 2019 | https://doi.org/10.3389/feduc.2019.00113

Innovative Pedagogies of the Future: An Evidence-Based Selection

  • Institute of Educational Technology, The Open University, Milton Keynes, United Kingdom

There is a widespread notion that educational systems should empower learners with skills and competences to cope with a constantly changing landscape. Reference is often made to skills such as critical thinking, problem solving, collaborative skills, innovation, digital literacy, and adaptability. What is debatable is how best to achieve the development of those skills, in particular which teaching and learning approaches are suitable for facilitating or enabling complex skills development. In this paper, we build on our previous work of exploring new forms of pedagogy for an interactive world, as documented in our Innovating Pedagogy report series. We present a set of innovative pedagogical approaches that have the potential to guide teaching and transform learning. An integrated framework has been developed to select pedagogies for inclusion in this paper, consisting of the following five dimensions: (a) relevance to effective educational theories, (b) research evidence about the effectiveness of the proposed pedagogies, (c) relation to the development of twenty-first century skills, (d) innovative aspects of pedagogy, and (e) level of adoption in educational practice. The selected pedagogies, namely formative analytics, teachback, place-based learning, learning with drones, learning with robots, and citizen inquiry, are either linked to specific technological developments or have emerged from an advanced understanding of the science of learning. Each one is presented in terms of the five dimensions of the framework.

Introduction

In its vision for the future of education in 2030, the Organization for Economic Co-operation and Development (OECD, 2018) views essential learner qualities as the acquisition of skills to embrace complex challenges and the development of the person as a whole, valuing common prosperity, sustainability and wellbeing. Wellbeing is perceived as “inclusive growth” related to equitable access to “quality of life, including health, civic engagement, social connections, education, security, life satisfaction and the environment” (p. 4). To achieve this vision, a varied set of skills and competences is needed, that would allow learners to act as “change agents” who can achieve positive impact on their surroundings by developing empathy and anticipating the consequences of their actions.

Several frameworks have been produced over the years detailing specific skills and competences for the citizens of the future (e.g., Trilling and Fadel, 2009; OECD, 2015, 2018; Council of the European Union, 2018). These frameworks refer to skills such as critical thinking, problem solving, team work, communication and negotiation skills; and competences related to literacy, multilingualism, STEM, digital, personal, social, and “learning to learn” competences, citizenship, entrepreneurship, and cultural awareness (Trilling and Fadel, 2009; Council of the European Union, 2018). In a similar line of thinking, in the OECD Learning Framework 2030 (OECD, 2018) cognitive, health and socio-emotional foundations are stressed, including literacy, numeracy, digital literacy and data numeracy, physical and mental health, morals, and ethics.

The question we are asked to answer is whether this vision for the future of education, and the development of the skills needed to cope with an ever-changing society, has been met, or can be met. The short answer is not yet. For example, the Programme for International Student Assessment (PISA) has been ranking educational systems every 3 years, based on 15-year-old students' performance on tests of reading, mathematics, and science in more than 90 countries. In the latest published report (OECD, 2015), Japan, Estonia, Finland, and Canada are the four highest performing OECD countries in science. This means that students from these countries on average can "creatively and autonomously apply their knowledge and skills to a wide variety of situations, including unfamiliar ones" (OECD, 2016a, p.2). Yet about 20% of students across participating countries perform below the baseline level of proficiency in science and in reading (OECD, 2016b). Those most at risk are socio-economically disadvantaged students, who are almost three times more likely than their peers not to meet the given baselines. These outcomes are quite alarming; they stress the need for evidence-based, effective, and innovative teaching and learning approaches that can result in not only improved learning outcomes but also greater student wellbeing. Overall, an increasing focus on memorization and testing has been observed in education, including the early years, which leaves no space for active exploration and playful learning (Mitchell, 2018) and threatens the wellbeing and socioemotional growth of learners. There is an increasing evidence base showing that although teachers would like to implement more active, innovative forms of education to meet the diverse learning needs of their students, a myriad of constraints often lead them to resort to more traditional, conservative approaches to teaching and learning (Ebert-May et al., 2011; Herodotou et al., 2019).

In this paper, we propose that the distance between educational vision and current teaching practice can be bridged through the adoption and use of appropriate pedagogy that has been tested and proven to contribute to the development of the person as a whole. Evidence of impact becomes a central component of teaching practice; knowing what works, and for whom, in terms of learning and development can provide guidelines to teaching practitioners as to how to modify or update their teaching in order to achieve desirable learning outcomes. Educational institutions may have already adopted innovations in educational technology equipment (such as mobile devices), yet this change has not necessarily been accompanied by respective changes in the practice of teaching and learning. Enduring transformations can be brought about through "pedagogy," that is, improvements in "the theory and practice of teaching, learning, and assessment" and not the mere introduction of technology into classrooms (Sharples, 2019). PISA analysis of the impact of Information and Communication Technology (ICT) on reading, mathematics, and science in countries that have invested heavily in educational technology showed mixed effects and "no appreciable improvements" (OECD, 2015, p.3).

The aim of this study is to review and present a set of innovative, evidence-based pedagogical approaches that have the potential to guide teaching practitioners and transform learning processes and outcomes. The selected pedagogies draw from the successful Innovating Pedagogy report series (https://iet.open.ac.uk/innovating-pedagogy), produced by The Open University UK (OU) in collaboration with other centers of research in teaching and learning, which explores innovative forms of teaching, learning, and assessment. Since 2012, the OU has produced seven Innovating Pedagogy reports with SRI International (USA), the National Institute of Education (Singapore), Learning In a NetworKed Society (Israel), and the Center for the Science of Learning & Technology (Norway). For each report, teams of researchers shared ideas, proposed innovations, read research papers and blogs, and commented on each other's draft contributions in an organic manner (Sharples et al., 2012, 2013, 2014, 2015, 2016; Ferguson et al., 2017, 2019). Starting from an initial list of potentially promising educational innovations that may already be in currency but have not yet reached a critical mass of influence on education, each list was critically and collaboratively examined and reduced to 9–11 main topics identified as having the potential to provoke major shifts in educational practice.

After seven years of gathering a total of 70 innovative pedagogies, seven academics from the OU, authors of the various Innovating Pedagogy reports, critically reflected in this paper on which of these approaches have the strongest evidence and/or potential to transform learning processes and outcomes to meet the future educational skills and competences described by the OECD and others. Based upon five criteria and extensive discussions, we selected six approaches that we believe have the most evidence and/or potential for future education:

• Formative analytics,

• Teachback,

• Place-based learning,

• Learning with robots,

• Learning with drones,

• Citizen inquiry.

Formative analytics is defined as “supporting the learner to reflect on what is learned, what can be improved, which goals can be achieved, and how to move forward” (Sharples et al., 2016, p.32). Teachback is a means for two or more people to demonstrate that they are progressing toward a shared understanding of a complex topic. Place-based learning derives learning opportunities from local community settings, which help students connect abstract concepts from the classroom and textbooks with practical challenges encountered in their own localities. Learning with robots could help teachers to free up time on simple, repetitive tasks, and provide scaffolding to learners. Learning with drones is being used to support fieldwork by enhancing students' capability to explore outdoor physical environments. Finally, citizen inquiry describes ways that members of the public can learn by initiating or joining shared inquiry-led scientific investigations.

Devising a Framework for Selection: The Role of Evidence

Building on previous work (Puttick and Ludlow, 2012; Ferguson and Clow, 2017; Herodotou et al., 2017a; John and McNeal, 2017; Batty et al., 2019), we propose an integrated framework for how to select pedagogies. The framework resulted from ongoing discussions amongst the seven authors of this paper as to how educational practitioners should identify and use certain ways of teaching and learning, while avoiding others. The five components of the model are presented below:

Relevance to effective educational theories: the first criterion refers to whether the proposed pedagogy relates to specific educational theories that have been shown to be effective in terms of improving learning.

Research evidence about the effectiveness of the proposed pedagogies: the second criterion refers to actual studies testing the proposed pedagogy and their outcomes.

Relation to the development of twenty-first century skills: the third criterion refers to whether the pedagogy can contribute to the development of the twenty-first century skills or the education vision of 2030 (as described in the introduction section).

Innovative aspects of pedagogy: the fourth criterion details what is innovative or new in relation to the proposed pedagogy.

Level of adoption in educational practice: the last criterion brings in evidence about the current level of adoption in education, in an effort to identify gaps in our knowledge and propose future directions of research.

A major component of the proposed framework is effectiveness, or the generation of evidence of impact. The definition of what constitutes evidence varies (Ferguson and Clow, 2017; Batty et al., 2019), and this often relates to the quality or strength of evidence presented. The Strength of Evidence pyramid by John and McNeal (2017) (see Figure 1) categorizes different types of evidence based on their strength, ranging from expert opinions as the least strong type of evidence to meta-analysis or synthesis as the strongest or most reliable form of evidence. While the bottom of the pyramid refers to "practitioners' wisdom about teaching and learning," the next two levels refer to peer-reviewed and published primary sources of evidence, both qualitative and quantitative. These are mostly case studies, based either on the example of a single institution or on a cross-institutional analysis involving multiple courses or institutions. The top two levels involve careful consideration of existing sources of evidence and their inclusion in a synthesis or meta-analysis. For example, variations of this pyramid in medical studies place Randomized Controlled Trials (RCTs) at the second level from the top, indicating the value of this approach for gaining less biased, high-quality evidence.

Figure 1. The strength of evidence pyramid (John and McNeal, 2017).

Another approach, proposed by the innovation foundation Nesta, presents evidence on a scale of 1 to 5, indicating the level of confidence in the impact of an intervention (Puttick and Ludlow, 2012). Level 1 studies describe logically, coherently, and convincingly what has been done and why it matters, while level 5 studies produce manuals ensuring consistent replication of a study. The evidence becomes stronger when studies prove causality (e.g., through experimental approaches) and can be replicated successfully. While these frameworks are useful for assessing the quality or strength of evidence, they do not make any reference to how the purpose of a study can define which type of evidence to collect. Different types of evidence could effectively address different purposes; depending on the objective of a given study, a different type of evidence could be used (Batty et al., 2019). For example, the UK government-funded research work by the Education Endowment Foundation (EEF) uses RCTs, rather than for example expert opinions, because the purpose of its studies is to capture the impact of certain interventions nationally across schools in the UK.

Education, as opposed to other disciplines such as medicine and agriculture, has been less concerned with evaluating different pedagogical approaches and determining their impact on learning outcomes. The argument often made is that evaluating learning processes, especially through experimental methodologies, is difficult due to variability in teaching conditions across classrooms and between different practitioners, which may inhibit comparisons and valid conclusions. In particular, RCTs have been sparse and are often criticized as not explaining any impact (or absence of impact) on learning, a limitation that could be overcome by combining RCT outcomes with qualitative methodologies (Herodotou et al., 2017a). Mixed-methods evaluations could identify how faithfully an intervention is applied in different learning contexts or, for example, the degree to which teachers have engaged with it. An alternative approach is Design-Based Research (DBR); this is a form of action-based research where a problem in the educational process is identified, solutions informed by existing literature are proposed, and iterative cycles of testing and refinement take place to identify what works in practice and improve the solution. DBR often results in guidelines or theory development (e.g., Anderson and Shattuck, 2012).

An evidence-based mindset in education has recently been popularized through the EEF. Its Teaching and Learning Toolkit provides an overview of existing evidence about certain approaches to improving teaching and learning, summarized in terms of impact on attainment, cost, and the supporting strength of evidence. Amongst the most effective teaching approaches are the provision of feedback, development of metacognition and self-regulation, homework for secondary students, and mastery learning (https://educationendowmentfoundation.org.uk). Similarly, the National Center for Education Evaluation and Regional Assistance (NCEE) in the US conducts large-scale evaluations of education programs with funds from the government. Amongst the interventions with the highest effectiveness ratings are phonological awareness training, reading recovery, and dialogic reading (https://ies.ed.gov/ncee/).

The importance of evidence generation is also evident in the explicit focus of Higher Education institutions on understanding and increasing educational effectiveness as a means to: tackle inequalities and promote educational justice (see Durham University Evidence Center for Education; DECE), provide high quality education for independent and lifelong learners (Learning and Teaching strategy, Imperial College London), develop criticality and deepen learning (London Center for Leadership in Learning, UCL Institute of Education), and improve student retention and performance in online and distance settings [Institute of Educational Technology (IET), OU].

The generation of evidence can help identify or debunk possible myths in education and distinguish between practitioners' beliefs about what works in their practice and research evidence emerging from systematically assessing a specific teaching approach. A characteristic example is the "Learning Styles" myth and the assumption that teachers should identify and accommodate each learner's preferred way of learning, such as visual, auditory, or kinesthetic. While there is no consistent evidence that considering learning styles can improve learning outcomes (e.g., Rohrer and Pashler, 2010; Kirschner and van Merriënboer, 2013; Newton and Miah, 2017), many teachers believe in learning styles and make efforts to organize their teaching around them (Newton and Miah, 2017). In the same study, one third of participants stated that they would continue to use learning styles in their practice despite being presented with negative evidence. This suggests that we are still in the early days of transforming the practice of education and, in particular, of developing a shared evidence-based mindset across researchers and practitioners.

In order to critically review the 70 innovative pedagogies from the seven Innovating Pedagogy reports, over a period of 2 months the seven authors critically evaluated academic and gray literature published after the respective reports were launched. In line with the five criteria defined above, each author recorded in a shared Google sheet the evidence available for promising approaches. From the initial list of 70, a short-list of 10 approaches was pre-selected. These were further narrowed down to the final six approaches identified for this study, based upon the emerging evidence of impact available as well as the potential opportunity for future educational innovation. The emerging evidence and impact of the six approaches were peer-reviewed by the authoring team after contributions had been anonymized, and the lead author assigned the final categorizations.

In the next section, we present each of the proposed pedagogies in relation to how they meet the framework criteria, in an effort to understand what we know about their effectiveness, what evidence exists showcasing impact on learning, how each pedagogy accommodates the vision of twenty-first century skills development, what is innovative about it, and its current level of adoption in educational practice.

Selected Pedagogies

Formative Analytics

Relevance to Effective Educational Theories

As indicated by the Innovating Pedagogy 2016 report (Sharples et al., 2016, p.32), “formative analytics are focused on supporting the learner to reflect on what is learned, what can be improved, which goals can be achieved, and how to move forward.” In contrast to most analytics approaches that focus on analytics of learning, formative analytics aims to support analytics for learning, for a learner to reach his or her goals through “smart” analytics, such as visualizations of potential learning paths or personalized feedback. For example, these formative analytics might help learners to effectively self-regulate their learning. Zimmerman (2000) defined self-regulation as “self-generated thoughts, feelings and actions that are planned and cyclically adapted to the attainment of personal learning goals.” Students have a range of choices and options when they are learning in blended or online environments as to when, what, how, and with whom to study, with minimal guidance from teachers. Therefore, “appropriate” Self-Regulated Learning (SRL) strategies are needed for achieving individual learning goals (Hadwin et al., 2011; Trevors et al., 2016).

With the arrival of fine-grained log-data and the emergence of learning analytics there are potentially more, and perhaps new, opportunities to map how to support students with different SRL (Winne, 2017). With trace data on students' affect (e.g., emotional expression in text, self-reported dispositions), behavior (e.g., engagement, time on task, clicks), and cognition (e.g., how to work through a task, mastery of task, problem-solving techniques), researchers and teachers are able to potentially test and critically examine pedagogical theories like SRL theories on a micro as well as macro-level (Panadero et al., 2016; D'Mello et al., 2017).
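To make this concrete, the sketch below shows how simple behavioral trace data might be turned into a formative, learner-facing prompt rather than a grade. It is a minimal illustration only: the field names, activity types, and thresholds are invented for this example and do not correspond to the instruments or systems used in the studies cited here.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

# Hypothetical clickstream record: field names are illustrative only.
@dataclass
class LogEvent:
    student_id: str
    timestamp: datetime
    activity: str          # e.g., "quiz_attempt", "forum_post", "video_view"
    minutes_on_task: float

def formative_message(events: List[LogEvent], weekly_goal_minutes: float = 120) -> str:
    """Turn raw behavioral traces into a reflective prompt for the learner.

    The point of formative analytics is feedback *for* learning: the message
    invites the learner to reflect and adjust, rather than reporting a score.
    """
    total = sum(e.minutes_on_task for e in events)
    activities = {e.activity for e in events}
    if total >= weekly_goal_minutes and len(activities) >= 3:
        return ("You met your study-time goal this week across several activity types. "
                "Which strategy worked best, and is it worth repeating next week?")
    if total >= weekly_goal_minutes:
        return ("You met your study-time goal, but mostly with one type of activity. "
                "Would mixing in practice quizzes or discussion help you check your understanding?")
    return (f"You logged {total:.0f} of your {weekly_goal_minutes:.0f} planned minutes. "
            "What got in the way, and what is one concrete change for next week?")

# Example: two logged events for one learner in the current week.
events = [LogEvent("s1", datetime(2019, 10, 7, 19, 30), "video_view", 45.0),
          LogEvent("s1", datetime(2019, 10, 9, 20, 0), "quiz_attempt", 30.0)]
print(formative_message(events))
```

The design choice here is that the output is a question addressed to the learner, mirroring the shift from analytics of learning to analytics for learning described above.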

Research Evidence About the Effectiveness of the Proposed Pedagogies

There is an emerging body of literature that uses formative analytics to support SRL and to understand how students set goals and solve computer-based tasks (Azevedo et al., 2013; Winne, 2017). For example, using the software tool nStudy, Winne (2017) recently showed that trace data from students, in the form of notes, bookmarks, or quotes, can be used to understand the cycles of self-regulation. In a study of 285 students learning French in a business context, Gelan et al. (2018) used log-file data to show that engaged and self-regulated students outperformed students who were "behind" in their study. In an introductory mathematics course with 922 students, Tempelaar et al. (2015) showed that self-reported learning dispositions, combined with log-data of actual engagement in mathematics tasks, provide effective formative analytics feedback to students. Recently, Fincham et al. (2018) found that formative analytics could actively encourage 1,138 engineering learners to critically reflect upon one of their eight adopted learning strategies and, where needed, adjust it.

Relation to the Development of Twenty-First Century Skills

Beyond providing markers for formative feedback on cognitive skills (e.g., mastery of mathematics, critical thinking), formative analytics tools have also been used for twenty-first century affective (e.g., anxiety, self-efficacy) and behavioral (e.g., group work) skills. For example, a group widget developed by Scheffel et al. (2017) showed that group members were more aware of their online peers and their contributions. Similarly, providing automatic computer-based assessment feedback on mastery of mathematics exercises, together with different options to work out the next task, allowed students with math anxiety to develop more self-efficacy over time when they actively engaged with formative analytics (Tempelaar et al., 2018). Although implementing automated formative analytics is relatively easier with structured cognitive tasks (e.g., multiple choice questions, calculations), there is an emerging body of research that focuses on using more complex and unstructured data, such as text and emotion data (Azevedo et al., 2013; Panadero et al., 2016; Trevors et al., 2016), which can effectively provide formative analytics beyond cognition.

Innovative Aspects of Pedagogy

By using fine-grained data and reporting it directly back to students in the form of feedback or dashboards, educational practice is substantially influenced and, subsequently, innovated. In particular, instead of waiting for feedback from a teacher at the end of an assessment task, students can receive formative analytics on demand (when they want to), or ask for the formative analytics that link to their own self-regulation strategies. This is a radical departure from more traditional pedagogies that either place the teacher at the center or expect students to be fully responsible for their SRL.

Level of Adoption in Educational Practice

Beyond the widespread practice of formative analytics in computer-based assessment (Scherer et al., 2017), there is an emerging field of practice whereby institutions provide analytics dashboards directly to students. For example, in a recent review of the use of learning analytics dashboards, Bodily et al. (2018) conclude that many dashboards use principles and conceptualizations of SRL, which could be used to support teachers and students, assuming they have the capability to use these tools. However, substantial challenges remain as to how to effectively provide these formative analytics to teachers (Herodotou et al., 2019) and students (Scherer et al., 2017; Tempelaar et al., 2018), and how to make sure that the positive SRL strategies students already possess are encouraged, and not hampered, by overly prescriptive and simplistic formative analytics solutions.

Teachback

Relevance to Effective Educational Theories

The method of Teachback, and the name, were originally devised by the educational technologist Gordon Pask (1976), as a means for two or more people to demonstrate that they are progressing toward a shared understanding of a complex topic. It starts with an expert, teacher, or more knowledgeable student explaining their knowledge of a topic to another person who has less understanding. Next, the less knowledgeable student attempts to teach back what they have learned to the more knowledgeable person. If that is successful, the one with more knowledge might then explain the topic in more detail. If the less knowledgeable person has difficulty in teaching back, the person with more expertise tries to explain in a clearer or different way. The less knowledgeable person teaches it again until they both agree.

A classroom teachback session could consist of pairs of students taking turns to teach back to each other a series of topics set by the teacher. For example, a science class might be learning the topic of “eclipses.” The teacher splits the class into pairs and asks one student in each pair to explain to the other what they know about “eclipse of the sun.” Next, the class receives instruction about eclipses from the teacher, or a video explanation. Then, the second student in the pair teaches back what they have just learned. The first student asks questions to clarify such as, “What do you mean by that?” If either student is unsure, or the two disagree, then they can ask the teacher. The students may also jointly write a short explanation, or draw a diagram of the eclipse, to explain what they have learned.

The method is based on the educational theory of "radical constructivism" (e.g., von Glasersfeld, 1987), which sees knowledge as an adaptive process, allowing people to cope in the world of experience by building consensus through mutually understood language. It is a cybernetic theory, not a cognitive one, in which structured conversation and feedback among individuals create a system that "comes to know" by creating areas of mutual understanding.

Research Evidence About the Effectiveness of the Proposed Pedagogies

Some doctors and healthcare professionals have adopted teachback in their conversations with patients to make sure they understand instructions on how to take medication and manage their care. In a study by Negarandeh et al. (2013) with 43 diabetic patients, a nurse conducted one 20-min teachback session for each patient, each week over 3 weeks. A control group (N = 40) spent similar times with the nurse, but received standard consultations. The nurse asked questions such as “When you get home, your partner will ask you what the nurse said. What will you tell them?” Six weeks after the last session, those patients who learned through teachback knew significantly more about how to care for their diabetes than the control group patients. Indeed, a systematic review study of 12 published articles covering teachback for patients showed positive outcomes on a variety of measures, though not all were statistically significant (Ha Dinh et al., 2016).

Relation to the Development of Twenty-First Century Skills

Teachback has strong relevance in a world of social and conversational media, with "fake news" competing for attention alongside verified facts and robust knowledge. How can a student "come to know" a new topic, especially one that is controversial? Teachback can be a means to develop the skills of questioning knowledge, seeking understanding, and striving for agreement.

Innovative Aspects of Pedagogy

The conversational partner in Teachback could be an online tutor or fellow student, or an Artificial Intelligence (AI) system that provides a “teachable agent”. With a teachable agent, the student attempts to teach a recently-learned topic to the computer and can see a dynamic map of the concepts that the computer agent has “learned” (www.teachableagents.org/). The computer could then attempt to teachback the knowledge. Alternatively, AI techniques can enhance human teachback by offering support and resources for a productive conversation, for example to search for information or clarify the meaning of a term.

Rudman (2002) demonstrated a computer-based variation on teachback. In this study, one person learned the topic of herbal remedies from a book and became the teacher. A second person then attempted to learn about the same topic by holding a phone conversation with the more-knowledgeable teacher. The phone conversation between the two people was continually monitored by an AI program that detected keywords in the spoken dialogue. Whenever the AI program recognized a keyword or phrase in the conversation (such as the name of a medicinal herb, or its properties), it displayed useful information on the screen of the learner, but not the teacher. Giving helpful feedback to the learner balanced the conversation, so that both could hold a more constructive discussion.
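A minimal sketch of this kind of conversational support is given below, assuming a pre-built glossary of domain terms; the keywords and notes are invented for illustration and do not reproduce Rudman's system. Whenever a glossary term is detected in the transcribed dialogue, a short note is pushed to the learner's screen only.

```python
# Hypothetical glossary: term -> short note shown to the learner (not the teacher).
GLOSSARY = {
    "chamomile": "Chamomile: traditionally used as a mild sedative; brewed as a tea.",
    "echinacea": "Echinacea: often taken to ease cold symptoms; evidence is mixed.",
}

def support_learner(transcribed_utterance: str, glossary: dict = GLOSSARY) -> list:
    """Return notes to display on the learner's screen whenever a glossary
    keyword is detected in the spoken dialogue."""
    heard = transcribed_utterance.lower()
    return [note for term, note in glossary.items() if term in heard]

# Example: the teacher mentions a herb; the learner sees the supporting note.
print(support_learner("So chamomile is the one my grandmother used for sleep"))
```

The asymmetry is deliberate: only the less knowledgeable partner receives the prompts, which helps balance the conversation in the way the study describes.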

Level of Adoption in Educational Practice

The method has seen some adoption into medical practice (https://bit.ly/2Xr9qY5). It has also been tested at small scale for science education (Gutierrez, 2003). Reciprocal teaching has been adopted in some schools for teaching of reading comprehension (Oczuks, 2003).

Place-Based Learning

Place-based learning derives learning opportunities from local community settings. These help students to connect abstract concepts from the classroom and textbooks with practical challenges encountered in their own localities. "Place" can refer to learning about physical localities, but also the social and cultural layers embedded within neighborhoods; and engaging with communities and environments as well as observing them. It can be applied as much to arts and humanities focused learning as to science-based learning. Place-based learning can encompass service learning, where students and teachers solve local community problems and, in doing so, acquire a range of skills (Sobel, 2004). Mobile and networked technologies have opened up new possibilities for constructing and sharing knowledge, and for reaching out to different stakeholders. Learning can take place while mobile, enabling communication across students and teachers, and beyond the field site. The physical and social aspects of the environment can be enhanced or augmented by digital layers to enable a richer experience, and greater access to resources and expertise.

Relevance to Effective Educational Theories

Place-based learning draws upon experiential models of learning (e.g., Kolb, 1984), where active engagement with a situation and the resulting experiences are reflected upon to help conceptualize learning, which in turn may trigger further exploration or experimentation. It may be structured as problem-based learning. Unplanned or unintentional learning outcomes may occur as a result of these engagements, so place-based learning also draws on incidental learning (e.g., Kerka, 2000). Place-based learning asserts that a more "authentic" and meaningful learning experience can happen in relevant environments, aligning with situated cognition, which holds that knowledge is situated within physical, social, and cultural contexts (Brown et al., 1989). Learning episodes are often encountered with and through other people, a form of socio-cultural learning (e.g., Vygotsky, 1978). Networked technologies can enhance the experiences that may be possible and the connections that might be made, an idea recently articulated as connectivism (e.g., Siemens, 2005; Ito et al., 2013).

Research Evidence About the Effectiveness of the Proposed Pedagogies

Place-based learning draws on a range of pedagogies, and in part derives its authority from research into their efficacy (e.g., experiential learning, situated learning, problem-based learning). For example, in a study of 400 US high school students, Ernst and Monroe (2004) found that environment-based teaching significantly improved both students' critical thinking skills and their disposition toward critical thinking. Research has shown that learning is very effective if carried out in "contexts familiar to students' everyday lives" (Bretz, 2001, p.1112). In another study, Linnemanstons and Jordan (2017) found that educators perceived students to display greater engagement and understanding of concepts when learning through experiential approaches in a specific place. Semken and Freeman (2008) trialed a method to test whether "sense of place" could be measured as a learning outcome when students are taught through place-based science activities. Using a set of psychometric surveys tested on a cohort of 31 students, they "observed significant gains in student place attachment and place meaning" (p.1042). In an analysis of 23 studies exploring indigenous education in Canada, Madden (2015) showed that place-based education can play an effective role in decolonizing curriculum, fostering understandings of shared histories between indigenous and non-indigenous learners in Canada. Context-aware systems that are triggered by place can provide location-relevant learning resources (Kukulska-Hulme et al., 2015), enhancing the ecology of tools available for place-based learning. However, prompts to action from digital devices might also be seen as culturally inappropriate in informal, community-based learning, where educational activities and their deployment need to be considered with sensitivity (Gaved and Peasgood, 2017).

Relation to the Development of Twenty-First Century Skills

Critical thinking and problem solving are central to this experiential approach to learning. Being contextually based, place-based learning requires creativity and innovation from participants to manage and respond to often unexpected circumstances, with unanticipated learning opportunities and outcomes likely to arise. As an often social form of learning, communication and collaboration are key skills developed, along with a need to show sensitivity to local circumstances. The ability to manage social and cross-cultural interaction will be central to a range of subject areas taught through place-based learning, such as language learning or human geography. Increasingly, place-based learning is enhanced or augmented by mobile and networked technologies, so digital literacy skills need to be acquired to take full advantage of the tools now available.

Innovative Aspects of Pedagogy

Place-based learning re-associates learning with local contexts, at a time when educators are under pressure to fit into national curricula and a globalized world. It seeks to re-establish students' sense of place, and recognizes the opportunities of learning in and from local community settings, using neighborhoods as the specific context for experiential and problem-based learning. It can provide a mechanism for decolonizing curriculum, recognizing that specific spaces can be understood to have different meanings for different groups of people, and allowing diverse voices to be represented. Digital and networked technologies extend the potential for group and individual learning, reaching out and sharing knowledge with a wider range of stakeholders, enabling flexibility in learning, and a greater scale of interactions. Networked tools enable access to global resources and learning beyond the internet, with smartphones and tablets (increasingly owned by the learners themselves) as well as other digital tools linked together for gathering, analyzing, and reflecting on data and interactions. Context- and location-aware technologies can trigger learning resources on personal devices, and augment physical spaces: augmented reality tools can dynamically overlay data layers and context-sensitive virtual information (Klopfer and Squire, 2008; Wu et al., 2013).
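As a rough illustration of how such location-based triggering might work, the sketch below computes the haversine distance between a learner's position and nearby points of interest and surfaces a resource when the learner is within a trigger radius. The place names, coordinates, and radius are invented, and a real system would also need to handle GPS error, consent, and the cultural sensitivities noted earlier.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical points of interest for a neighborhood field trip.
POINTS_OF_INTEREST = [
    {"name": "Old mill", "lat": 52.041, "lon": -0.757,
     "resource": "Audio history of the mill and its workers"},
    {"name": "River meadow", "lat": 52.039, "lon": -0.752,
     "resource": "Species identification guide for wetland plants"},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def resources_near(lat, lon, radius_m=50):
    """Return resources whose point of interest lies within the trigger radius."""
    return [p["resource"] for p in POINTS_OF_INTEREST
            if haversine_m(lat, lon, p["lat"], p["lon"]) <= radius_m]

print(resources_near(52.0411, -0.7571))  # learner standing near the old mill
```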

Level of Adoption in Educational Practice

Place-based learning could be said to pre-date formal classroom-based learning, in the traditional sense of work-based learning (e.g., apprenticeships) or informal learning (e.g., informal language learning). Aspects of place-based learning have a long heritage, such as environmental education and learning through overcoming neighborhood challenges, with the focus on taking account of learning opportunities "beyond the schoolhouse gate" (Theobald and Curtiss, 2000). Place-based learning aligns with current pedagogical interests in education that is "multidisciplinary, experiential, and aligned with cultural and ecological sustainability" (Webber and Miller, 2016, p.1067).

Learning With Robots

Relevance to Effective Educational Theories

Learning through interaction and then reflecting upon the outcomes of these interactions prompted Papert (1980) to develop the Logo turtles. It can be argued that these turtles were among the first robots to be used in schools whose theoretical premises were grounded within a constructivist approach to learning. Constructivism translates into a pedagogy where students actively engage in experimental endeavors, often based within real-world problem-solving undertakings. This was how the first turtles were used to help children understand basic mathematical concepts. Logo turtles have morphed into wheeled robots in current Japanese classrooms, where 11- and 12-year-olds learn how to program them and then compete in teams to create the code needed to guide their robots safely through an obstacle course. This latter approach encourages children to "Think and Learn Together with Information and Communication Technology," as discussed by Dawes and Wegerif (2004). Vygotsky's theoretical influence is then foregrounded in this particular pedagogical context, where his sociocultural theory recognizes and emphasizes the role of language within any social interaction in prompting cognitive development.
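A minimal, headless sketch of this kind of activity is shown below: pupils write a sequence of commands, run it against a grid with obstacles, and revise their program until the robot gets through. The grid, obstacle positions, and two-command language are invented for illustration; classroom robotics kits use their own block-based or textual languages.

```python
# Hypothetical grid world: the robot starts at (0, 0) facing east ("E");
# obstacle positions and the command sequences are invented for illustration.
OBSTACLES = {(2, 0), (2, 1)}
TURN_LEFT = {"E": "N", "N": "W", "W": "S", "S": "E"}
MOVES = {"E": (1, 0), "N": (0, 1), "W": (-1, 0), "S": (0, -1)}

def run_program(commands, start=(0, 0), facing="E"):
    """Execute FORWARD/LEFT commands, stopping if the robot would hit an obstacle."""
    x, y = start
    for cmd in commands:
        if cmd == "LEFT":
            facing = TURN_LEFT[facing]
        elif cmd == "FORWARD":
            dx, dy = MOVES[facing]
            if (x + dx, y + dy) in OBSTACLES:
                return (x, y, facing, "blocked")   # pupils revise their code here
            x, y = x + dx, y + dy
    return (x, y, facing, "ok")

# A first attempt drives straight into the obstacle; the revised program steers around it.
print(run_program(["FORWARD", "FORWARD"]))                                   # blocked
print(run_program(["FORWARD", "LEFT", "FORWARD", "FORWARD",
                   "LEFT", "LEFT", "LEFT", "FORWARD", "FORWARD"]))           # ok
```

The iterate-and-reflect loop (run, observe the failure, revise the plan) is the constructivist element the paragraph above describes; the teamwork and talk around that loop bring in the Vygotskian dimension.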

Research Evidence About the Effectiveness of the Proposed Pedagogies

The early work of Papert has been well documented, but more recently Benitti (2012) reviewed the literature on the use of robotics in schools. The conclusions of this systematic review, which took into account the purpose of each study, the type of robot used, and the demographics of the children who took part, suggested that the use of robots in classrooms can enhance learning. This was found particularly for practical teaching in STEM subjects, although some studies did not reveal improvements in learning. Further work by Ospennikova et al. (2015) showed how this technology can be applied to teaching physics in Russian secondary schools and supports the use of learning with robots in STEM subjects. Social robots for early language learning have been explored by Kanero et al. (2018); this has proved to be positive for storytelling skills (Westlund and Breazeal, 2015). Kim et al. (2013) illustrated that social robots can help young children with ASD produce more speech utterances. However, none of the above studies shows that robots are more effective than human teachers, and this pedagogy is ripe for further research.

Relation to the Development of Twenty-First Century Skills

Teaching a robot to undertake a task through specific instructions mimics the way human teachers behave with pupils when they impart a rule set or heuristics, using a variety of rhetorical techniques in reaction to the learner's latest attempt at completing a given task. This modus operandi has been well documented by Jerome Bruner and colleagues and has been termed "scaffolding" (Wood et al., 1979). This example illustrates a growing recognition of the expanding communicative and expressive potential found through working with robots and encouraging teamwork and collaboration.

Innovative Aspects of Pedagogy

The robot can undertake a number of roles, with different levels of involvement in the learning task. Some of the examples mentioned above demonstrate the robot taking on a more passive role (Mubin et al., 2013), for instance when it is used to teach programming by being moved along a physical route with many obstacles. Robots can also act as peers and learn together with the student, or act as a teacher themselves. The "interactive cat" (iCat) developed by Philips Research is an example of a robotic teacher helping language learning. It has a mechanically rendered cat face and can express emotion. This is an important feature with respect to social supportiveness, a key attribute of human tutors. Research showed that social supportive behavior exhibited by the robot tutor had a positive effect on students' learning. The supportive behaviors exhibited by the iCat tutor were non-verbal, such as smiling, attention building, empathy, and communicativeness.

Level of Adoption in Educational Practice

Interest in learning with robots in the classroom and beyond is growing, but the need to purchase expensive equipment that requires technical support can prevent adoption. There are also ethical issues to be addressed, since "conversations" with embodied robots that support both learning and new forms of assessment must sustain equity within an ethical framework; as yet, such a framework has not been agreed within the AI community.

Learning With Drones

Relevance to Effective Educational Theories

Outdoor fieldwork is a long-standing student-centered pedagogy across a range of disciplines, which is increasingly supported by information technology (Thomas and Munge, 2015, 2017). Within this tradition, drone-based learning, a recent innovation, is being used to support fieldwork by enhancing students' capability to explore outdoor physical environments. When students engage in outdoor learning experiences, reflect on those experiences, conceptualize their learning and experiment with new actions, they are engaging in experiential learning (Kolb, 1984). The combination of human senses with the multimedia capabilities of a drone (image and video capture) means that the learning experience can be rich and multimodal. Another key aspect is that learning takes place through research, scientific data collection and analysis; drones are typically used to assist with data collection from different perspectives and in places that can be difficult to access. In the sphere of informal and leisure learning in places such as nature reserves and cultural heritage sites (Staiff, 2016), drone-based exploration is based on discovery and is a way to make the visitor experience more attractive.
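As an illustration of the kind of structured data collection involved, the sketch below generates a simple "lawnmower" grid of image-capture waypoints over a field site. The coordinates, spacing, and altitude are invented; real flight plans are produced with dedicated mission-planning software and must comply with local aviation rules and site permissions.

```python
def survey_waypoints(lat0, lon0, rows, cols, spacing_deg=0.0002, altitude_m=40):
    """Generate a simple 'lawnmower' grid of capture points over a field site.

    All parameters are illustrative; this is a teaching sketch of the idea,
    not a flight-ready mission plan.
    """
    waypoints = []
    for r in range(rows):
        cells = range(cols) if r % 2 == 0 else reversed(range(cols))
        for c in cells:  # alternate direction each row to minimise travel
            waypoints.append({
                "lat": lat0 + r * spacing_deg,
                "lon": lon0 + c * spacing_deg,
                "alt_m": altitude_m,
                "action": "capture_image",
            })
    return waypoints

plan = survey_waypoints(52.040, -0.755, rows=3, cols=4)
print(len(plan), "capture points, first:", plan[0])
```

Planning such a grid, flying it, and then analyzing the captured images is one way the experiential cycle of acting, reflecting, and re-planning described above can be made explicit for students.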

Research Evidence About the Effectiveness of the Proposed Pedagogies

There is not yet much research evidence on drone-based learning, but there are some case studies, teachers' accounts based on observations of their students, and pedagogically-informed suggestions for how drones may be applied to educational problems and the development of students' knowledge and practical skills. For example, a case study conducted in Malaysia with postgraduate students taking a MOOC (Zakaria et al., 2018) was concerned with students working on a video creation task using drones, in the context of problem-based learning about local issues. The data analysis showed how active the students had been during a task which involved video shooting and editing/production. In the US, it was reported that a teacher introduced drones to a class of elementary students with autism in order to enhance their engagement and according to the teacher the results were “encouraging” since the students stayed on task better and were more involved with learning (Joch, 2018). In the context of education in Australia, Sattar et al. (2017) give suggestions for using drones to develop many kinds of skills, competences and understanding in various disciplines, also emphasizing the learners' active engagement.

Relation to the Development of Twenty-First Century Skills

Sattar et al. (2017) argue that using drone technology will prepare and equip students with the technical skills and expertise which will be in demand in future, enhance their problem-solving skills and help them cope with future technical and professional requirements; students can be challenged to develop skills in problem-solving, analysis, creativity and critical thinking. Other ideas put forward in the literature suggest that drone-based learning can stimulate curiosity to see things that are hidden from view, give experience in learning through research and analyzing data, and it can help with visual literacies including collecting visual data and interpreting visual clues. Another observation is that drone-based learning can raise issues of privacy and ethics, stimulating discussion of how such technologies should be used responsibly when learning outside the classroom.

Innovative Aspects of Pedagogy

Drones enable learners to undertake previously impossible actions on field trips, such as looking inside inaccessible places or inspecting a landscape from several different perspectives. There is opportunity for rich exploration of physical objects and spaces. Drone-based learning can be a way to integrate skills and literacies, particularly orientation and motor skills with digital literacy. It is also a new way to integrate studies with real world experiences, showing students how professionals including land surveyors, news reporters, police officers and many others use drones in their work. Furthermore, it has been proposed as an assistive technology, enabling learners who are not mobile to gain remote access to sites they would not be able to visit (Mangina et al., 2016).

Level of Adoption in Educational Practice

Accounts of adoption into educational practice suggest that early adopters with an interest in technology have been the first to experiment with drones. There are more accounts of adoption in community settings, professional practice settings and informal learning than in formal education at present. For example, Hodgson et al. (2018) describe how ecologists use drones to monitor wildlife populations and changes in vegetation. Drones can be used to capture images of an area from different angles, enabling communities to collect evidence of environmental problems such as pollution and deforestation. They are used after earthquakes and hurricanes, to assess the damage caused by these disasters, to locate victims, to help deliver aid, and to enhance understanding of assistance needs (Sandvik and Lohne, 2014). They also enable remote monitoring of illegal trade without having to confront criminals.

Citizen Inquiry

Citizen science, the active participation of members of the public in scientific research, is an increasingly popular activity with the potential to support growth and development in science learning. This is due to its potential to educate the public, including young people, to support the development of skills needed for the workplace, and to contribute findings to real scientific research. An experience that allows people to become familiar with the work of scientists and learn to do their own science has clear potential for learning. Citizen science activities can take place online, on platforms such as Zooniverse, which hosts some of the largest internet-based citizen science projects, or nQuire (nQuire.org.uk), which scaffolds a wide range of inquiries; they can also take place offline in a local area (e.g., a bioblitz). In addition, mobile and networked technologies have opened up new possibilities for these investigations (see e.g., Curtis, 2018).

Relevance to Effective Educational Theories

Most current citizen science initiatives engage the general public in some way. For example, members of the public may act as volunteers, often non-expert individuals, in projects generated by scientists, such as species recognition and counting. In these types of collaboration, the public contributes to data collection and analysis tasks such as observation and measurement. The key theory that underpins this work is inquiry learning. "Inquiry-based learning is a powerful generalized method for coming to understand the natural and social world through a process of guided investigation" (Sharples et al., 2013, p.38). It has been described as a powerful way to promote learning by encouraging learners to use higher-order thinking skills during the conduct of inquiries and to make connections with their world knowledge.

Inquiry learning is a pedagogy with a long pedigree. First proposed by Dewey as learning through experience, it came to the fore in the discovery learning movement of the 1960s. Indeed, the term citizen inquiry has been coined, which "fuses the creative knowledge building of inquiry learning with the mass collaborative participation exemplified by citizen science, changing the consumer relationship that most people have with research to one of active engagement" (Sharples et al., 2013, p.38).

Researchers using this citizen inquiry paradigm have described how it “shifts the emphasis of scientific inquiry from scientists to the general public, by having non-professionals (of any age and level of experience) determine their own research agenda and devise their own science investigations underpinned by a model of scientific inquiry. It makes extensive use of web 2.0 and mobile technologies to facilitate massive participation of the public of any age, including youngsters, in collective, online inquiry-based activities” (Herodotou et al., 2017b). This shift offers more opportunities for learning in these settings.

Research Evidence About the Effectiveness of the Proposed Pedagogies

Research has shown that learning can be developed in citizen science projects. Herodotou et al. (2018), citing a review by Bonney et al. (2009), found that systematic involvement in citizen science projects produces learning outcomes in a number of ways, including increasing the accuracy and degree of self-correction of observations. A number of studies have examined the learning which takes place during the use of iSpot (see Scanlon et al., 2014; Silvertown et al., 2015). Preliminary results showed that novice users can reach a fairly sophisticated understanding of identification over time (Scanlon et al., 2014). Also, Aristeidou et al. (2017, p. 252) examined citizen science activities on nQuire, and reported that some participants perceived learning as a reason for feeling satisfied with their engagement, with comments such as "insight into some topics" and "new information."

Through an online survey, Edwards et al. (2017) reported that citizen science participants in the UK Wetland Bird Survey and the Nest Record Scheme had learned along various dimensions. This was found to be related in part to their prior levels of education. Overall, there is a growing number of studies investigating the relationship between citizen science and learning, with some positive indications that projects can be designed to encourage learning (further studies on learning from citizen science are discussed by Ballard et al., 2017, and Boakes et al., 2016).
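As a toy illustration of how crowd agreement can support the self-correction of observations mentioned above, the sketch below tallies identifications proposed by different participants for a single observation and reports the current consensus. This is not the mechanism used by iSpot or nQuire; the species names, votes, and threshold are invented.

```python
from collections import Counter

# Hypothetical identifications proposed by participants for one observation.
identifications = [
    {"user": "amy",   "species": "Seven-spot ladybird"},
    {"user": "ben",   "species": "Harlequin ladybird"},
    {"user": "chris", "species": "Seven-spot ladybird"},
    {"user": "dana",  "species": "Seven-spot ladybird"},
]

def current_consensus(ids, min_agreements=3):
    """Return the leading identification and whether it has enough agreement
    to be treated as a working consensus (later contributions may revise it)."""
    counts = Counter(i["species"] for i in ids)
    species, votes = counts.most_common(1)[0]
    return {"species": species, "votes": votes, "confirmed": votes >= min_agreements}

print(current_consensus(identifications))
# {'species': 'Seven-spot ladybird', 'votes': 3, 'confirmed': True}
```

Seeing their identification confirmed or overturned by others is one concrete way novice participants receive feedback and gradually refine their understanding of identification.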

Relation to the Development of Twenty-First Century Skills

The skills required by citizens in the twenty-first century are among those that can be developed through citizen science projects. Citizens "need the skills and knowledge to solve problems, evaluate evidence, and make sense of complex information from various sources" (Ferguson et al., 2017, p.12). As noted by OECD (2015), a significant skill students need to develop is learning to "think like a scientist." This is perceived as an essential skill across professions, not only science-related ones. In particular, STEM education and jobs are no longer viewed as options for the few or for the "gifted." "Engagement with STEM can develop critical thinking, teamwork skills, and civic engagement. It can also help people cope with the demands of daily life. Enabling learners to experience how science is made can enhance their content knowledge in science, develop scientific skills and contribute to their personal growth. It can also increase their understanding of what it means to be a scientist" (Ferguson et al., 2017, p.12).

Innovative Aspects of Pedagogy

One of the innovations of this approach is that it enables potentially any citizen to engage with and understand scientific activities that are often locked behind the walls of experimental laboratories. Thinking scientifically should not be restricted to scientists; it should be a competency that citizens develop in order to engage critically with and reflect on their surroundings. Such skills enable critical engagement with public debates, such as those around fake news, and more active citizenship. Technologically, the development of these skills can be supported by platforms such as nQuire, the vision of which is to scaffold the process of scientific research and facilitate the development of relevant skills amongst citizens.

Level of Adoption in Educational Practice

Citizen science activities are mainly found in informal learning settings, with rather limited adoption in formal education. "For example, the Natural History Museum in London offers citizen science projects that anyone can join as an enjoyable way to interact with nature. Earthworm Watch is one such project that runs every spring and autumn in the UK. It is an outdoor activity that asks people to measure soil properties and record earthworms in their garden or in a local green space. Access to museums such as the Natural History Museum is free of charge allowing all people, no matter what their background, to interact with such activities and meet others with similar interests." (Ferguson et al., 2017, p.13) At the moment, adoption depends on individual educators rather than on policy. Two Open University examples are the incorporation of the iSpot platform into a range of courses, from short courses such as Neighborhood Nature to MOOCs such as An Introduction to Ecology on the FutureLearn platform. In recent years there have been more accounts of citizen science projects within school settings (see e.g., Doyle et al., 2018; Saunders et al., 2018; Schuttler et al., 2018).

Discussion

In this paper, we discussed six innovative approaches to teaching and learning that originated from seven Innovating Pedagogy reports (Sharples et al., 2012, 2013, 2014, 2015, 2016; Ferguson et al., 2017, 2019), drafted between 2012 and 2019 by leading academics in Educational Technology at the OU and institutions in the US, Singapore, Israel, and Norway. Based upon an extensive peer-review by seven OU authors, evidence and impact of six promising innovative approaches were gathered, namely formative analytics, teachback, place-based learning, learning with robots, learning with drones, and citizen inquiry. For these six approaches there is strong or emerging evidence that they can effectively contribute to the development of skills and competences such as critical thinking, problem-solving, digital literacy, thinking like a scientist, group work, and affective development.

The maturity of each pedagogy in terms of evidence generation varies, with some pedagogies, such as learning with drones, being less mature and others, such as formative analytics, being more advanced. In Table 1, we used the evidence classifications in Figures 1, 2 to provide our own assessment of the overall quality of evidence for each pedagogy (strength of evidence, and level of confidence on a 1–5 scale based on Nesta's standards of evidence shown in Figure 2), as a means to identify gaps in current knowledge and direct future research efforts.

Table 1. Future directions of selected pedagogies.

Figure 2. Standards of evidence by Nesta.

The proposed pedagogies have great potential in terms of reducing the distance between aspirations or vision for the future of education and current educational practice. This is evident in their relevance to effective educational theories including experiential learning, inquiry learning, discovery learning, and self-regulated learning, all of which are interactive and engaging ways of learning. Also, the review of existing evidence showcases their potential to support learning processes and desirable learning outcomes in both the cognitive and emotional domains. Yet this list of pedagogies is not exhaustive; additional pedagogies that could potentially meet the selection criteria, and which can be found in the Innovating Pedagogy report series, include, for example, playful learning, which emphasizes the need for play, exploration, and learning through failure; virtual studios, which stress learning flexibility through arts and design; and dynamic assessment, during which assessors support learners in identifying and overcoming learning difficulties.

Conclusions

In this paper, we presented six approaches to teaching and learning and stressed the importance of evidence in transforming educational practice. We devised and applied an integrated framework for selection that could be used by both researchers and educators (teachers, pre-service teachers, educational policy makers, etc.) as a tool for reflecting on and assessing specific pedagogical approaches, either currently in practice or intended for future use in education. Our framework goes beyond existing frameworks that focus primarily on the development of skills and competences for the future, by situating such development within the context of effective educational theories, evidence from research studies, the innovative aspects of the pedagogy, and its adoption in educational practice. We made the case that learning is a science and that testing learning interventions and teaching approaches before applying them to practice should be a requirement for improving learning outcomes and meeting the expectations of an ever-changing society. We hope this work will spark further dialogue between researchers and practitioners and signal the necessity for evidence-based professional development that informs and enhances teaching practice.

Author Contributions

CH: introduction, discussion, conclusion sections, revision of manuscript. MS: teachback. MG: place-based learning. BR: formative analytics. ES: citizen inquiry. AK-H: learning with drones. DW: learning with robots.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

Anderson, T., and Shattuck, J. (2012). Design-based research: a decade of progress in education research? Educ. Res. 41, 16–25. doi: 10.3102/0013189X11428813

Aristeidou, M., Scanlon, E., and Sharples, M. (2017). “Design processes of a citizen inquiry community,” in Citizen Inquiry: Synthesising Science and Inquiry Learning, eds C. Herodotou, M. Sharples, and E. Scanlon (Abingdon: Routledge), 210–229. doi: 10.4324/9781315458618-12

Azevedo, R., Harley, J., Trevors, G., Duffy, M., Feyzi-Behnagh, R., Bouchet, F., et al. (2013). “Using trace data to examine the complex roles of cognitive, metacognitive, and emotional self-regulatory processes during learning with multi-agent systems,” in International Handbook of Metacognition and Learning Technologies, eds R. Azevedo and V. Aleven (New York, NY: Springer New York), 427–449. doi: 10.1007/978-1-4419-5546-3_28

Ballard, H. L., Dixon, C. G. H., and Harris, E. M. (2017). Youth-focused citizen science: examining the role of environmental science learning and agency for conservation. Biol. Conserv. 208, 65–75. doi: 10.1016/j.biocon.2016.05.024

Batty, R., Wong, A., Florescu, A., and Sharples, M. (2019). Driving EdTech Futures: Testbed Models for Better Evidence. London: Nesta.

Benitti, F. B. V. (2012). Exploring the educational potential of robotics in schools: a systematic review. Comput. Educ. 58, 978–988. doi: 10.1016/j.compedu.2011.10.006

Boakes, E. H., Gliozzo, G., Seymour, V., Harvey, M., Smith, C., Roy, D. B., et al. (2016). Patterns of contribution to citizen science biodiversity projects increase understanding of volunteers' recording behaviour. Sci. Rep. 6:33051. doi: 10.1038/srep33051

Bodily, R., Kay, J., Aleven, V., Jivet, I., Davis, D., Xhakaj, F., et al. (2018). “Open learner models and learning analytics dashboards: a systematic review,” in Proceedings of the 8th International Conference on Learning Analytics and Knowledge (Sydney, NSW: ACM), 41–50.

Bonney, R., Cooper, C. B., Dickinson, J., Kelling, S., Phillips, T., Rosenberg, K. V., et al. (2009). Citizen science: a developing tool for expanding science knowledge and scientific literacy. Bioscience 59, 977–984. doi: 10.1525/bio.2009.59.11.9

Bretz, S. L. (2001). Novak's theory of education: human constructivism and meaningful learning. J. Chem. Educ. 78:1107. doi: 10.1021/ed078p1107.6

Brown, J. S., Collins, A., and Duguid, P. (1989). Situated cognition and the culture of learning. Educ. Res. 18, 32–42. doi: 10.3102/0013189X018001032

Council of the European Union (2018). Council Recommendations of 22 May 2018 on Key Competences for Lifelong Learning. Brussel: Council of the European Union.

Curtis, V. (2018). “Online citizen science and the widening of academia: distributed engagement with research and knowledge production,” in Palgrave Studies in Alternative Education (Cham: Palgrave Macmillan). doi: 10.1007/978-3-319-77664-4

Dawes, L., and Wegerif, R. (2004). Thinking and Learning With ICT: Raising Achievement in Primary Classrooms. London: Routledge. doi: 10.4324/9780203506448

D'Mello, S., Dieterle, E., and Duckworth, A. (2017). Advanced, analytic, automated (AAA) measurement of engagement during learning. Educ. Psychol. 52, 104–123. doi: 10.1080/00461520.2017.1281747

Doyle, C., Li, Y., Luczak-Roesch, M., Anderson, D., Glasson, B., Boucher, M., et al. (2018). What is Online Citizen Science Anyway? An Educational Perspective. arXiv [Preprint]. arXiv:1805.00441.

Ebert-May, D., Derting, T. L., Hodder, J., Momsen, J. L., Long, T. M., and Jardeleza, S. E. (2011). What we say is not what we do: effective evaluation of faculty professional development programs. BioScience 61, 550–558. doi: 10.1525/bio.2011.61.7.9

Edwards, R., McDonnell, D., Simpson, I., and Wilson, A. (2017). “Educational backgrounds, project design and inquiry learning in citizen science,” in Citizen Inquiry: Synthesising Science and Inquiry Learning, eds C. Herodotou, M. Sharples, and E. Scanlon (Abingdon: Routledge), 195–209. doi: 10.4324/9781315458618-11

Ernst, J., and Monroe, M. (2004). The effects of environment-based education on students' critical thinking skills and disposition toward critical thinking. Environ. Educ. Res. 10, 507–522. doi: 10.1080/1350462042000291038

Ferguson, R., Barzilai, S., Ben-Zvi, D., Chinn, C. A., Herodotou, C., Hod, Y., et al. (2017). Innovating Pedagogy 2017: Open University Innovation Report 6. Milton Keynes: The Open University.

Ferguson, R., and Clow, D. (2017). “Where is the evidence? A call to action for learning analytics,” in Proceedings of the 6th Learning Analytics Knowledge Conference (Vancouver, BC: ACM), 56–65.

Ferguson, R., Coughlan, T., Egelandsdal, K., Gaved, M., Herodotou, C., Hillaire, G., et al. (2019). Innovating Pedagogy 2019: Open University Innovation Report 7. Milton Keynes: The Open University.

Fincham, O. E., Gasevic, D., Jovanovic, J. M., and Pardo, A. (2018). From study tactics to learning strategies: an analytical method for extracting interpretable representations. IEEE Trans. Lear. Technol. 12, 59–72. doi: 10.1109/TLT.2018.2823317

Gaved, M., and Peasgood, A. (2017). Fitting in versus learning: a challenge for migrants learning languages using smartphones. J. Interact. Media Educ. 2017:1. doi: 10.5334/jime.436

Gelan, A., Fastré, G., Verjans, M., Martin, N., Janssenswillen, G., Creemers, M., et al. (2018). Affordances and limitations of learning analytics for computer-assisted language learning: a case study of the VITAL project. Comp. Assist. Lang. Learn. 31, 294–319. doi: 10.1080/09588221.2017.1418382

Gutierrez, R. (2003). "Conversation theory and self-learning," in Science Education Research in the Knowledge-Based Society, eds D. Psillos, P. Kariotoglou, V. Tselfes, E. Hatzikraniotis, G. Fassoulopoulos, and M. Kallery (Dordrecht: Springer Netherlands), 43–49.

Ha Dinh, T. T., Bonner, A., Clark, R., Ramsbotham, J., and Hines, S. (2016). The effectiveness of the teach-back method on adherence and self-management in health education for people with chronic disease: a systematic review. JBI Database Syst. Rev. Implement. Rep. 14, 210–247. doi: 10.11124/jbisrir-2016-2296

Hadwin, A., Järvelä, S., and Miller, M. (2011). “Self-regulated, co-regulated, and socially shared regulation of learning,” in Handbook of Self-Regulation of Learning and Performance, eds B. Zimmerman and D. Schunk (New York, NY: Routledge), 65–84.

Herodotou, C., Aristeidou, M., Sharples, M., and Scanlon, E. (2018). Designing citizen science tools for learning: lessons learnt from the iterative development of nQuire. Res Pract. Technol. Enhanced Lear. 13:4. doi: 10.1186/s41039-018-0072-1

Herodotou, C., Heiser, S., and Rienties, B. (2017a). Implementing randomised control trials in open and distance learning: a feasibility study. Open Learn. 32, 147–162. doi: 10.1080/02680513.2017.1316188

Herodotou, C., Rienties, B., Verdin, B., and Boroowa, A. (2019). Predictive learning analytics ‘at scale’: guidelines to successful implementation in higher education. J. Learn. Anal. 6, 85–95. doi: 10.18608/jla.2019.61.5

Herodotou, C., Sharples, M., and Scanlon, E. (2017b). “Introducing citizen inquiry,” in Citizen Inquiry: Synthesising Science and Inquiry Learning, eds C. Herodotou, M. Sharples, E. Scanlon (Routledge).

Hodgson, J., Terauds, A., and Pin Koh, L. (2018). ‘Epic Duck Challenge’ Shows Drones Can Outdo People at Surveying Wildlife [Online]. The Conversation. Available online at: https://theconversation.com/epic-duck-challenge-shows-drones-can-outdo-people-at-surveying-wildlife-90018 (accessed May 23, 2019).

Ito, M., Gutiérrez, K., Livingstone, S., Penuel, B., Rhodes, J., Salen, K., et al. (2013). Connected Learning: An Agenda for Research and Design. Irvine, CA: Digital Media and Learning Research Hub.

Joch, A. (2018, March 27). With drones, students tackle complex topics. EdTech Magazine, Online article.

John, K. S., and McNeal, K. (2017). The Strength of Evidence Pyramid [Online]. National Association of Geoscience Teachers. Available online at: https://nagt.org/nagt/profdev/workshops/geoed_research/pyramid.html (accessed May 23, 2019).

Kanero, J., Geçkin, V., Oranç, C., Mamus, E., Küntay, A. C., and Göksun, T. (2018). Social robots for early language learning: current evidence and future directions. Child Dev. Perspect. 12, 146–151. doi: 10.1111/cdep.12277

Kerka, S. (2000). “Incidental learning,” in Trends and Issues Alert (Columbus, OH: Center on Education and Training for Employment, Ohio State University).

Kim, E. S., Berkovits, L. D., Bernier, E. P., Leyzberg, D., Shic, F., Paul, R., et al. (2013). Social robots as embedded reinforcers of social behavior in children with autism. J. Autism Dev. Disord. 43, 1038–1049. doi: 10.1007/s10803-012-1645-2

Kirschner, P. A., and van Merriënboer, J. J. G. (2013). Do learners really know best? urban legends in education. Educ. Psychol. 48, 169–183. doi: 10.1080/00461520.2013.804395

Klopfer, E., and Squire, K. (2008). Environmental detectives—the development of an augmented reality platform for environmental simulations. Educ. Technol. Res. Dev. 56, 203–228. doi: 10.1007/s11423-007-9037-6

Kolb, D. (1984). Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice Hall.

Kukulska-Hulme, A., Gaved, M., Paletta, L., Scanlon, E., Jones, A., and Brasher, A. (2015). Mobile incidental learning to support the inclusion of recent immigrants. Ubiquitous Learn. 7, 9–21. doi: 10.18848/1835-9795/CGP/v07i02/58070

Linnemanstons, K. A., and Jordan, C. M. (2017). Learning through place: evaluation of a professional development program for understanding the impact of place-based education and teacher continuing education needs. J. Sustain. Educ. 12, 1–25. Retrieved from: http://www.susted.com/wordpress/content/learning-through-place-evaluation-of-a-professional-development-program-for-understanding-the-impact-of-place-based-education-and-teacher-continuing-education-needs_2017_02/

Madden, B. (2015). Pedagogical pathways for Indigenous education with/in teacher education. Teach. Teach. Educ. 51, 1–15. doi: 10.1016/j.tate.2015.05.005

Mangina, E., O' Keeffe, E., Eyerman, J., and Goodman, L. (2016). “Drones for live streaming of visuals for people with limited mobility,” in 2016 22nd International Conference on Virtual System & Multimedia (VSMM), 1–6. doi: 10.1109/VSMM.2016.7863162

Mitchell, R. (2018). Experts Warn Play Time is ‘Disappearing’ as Emphasis is Placed on Performance and Tests [Online]. The West Australian. Available online at: http://bit.ly/2FTIVGh (accessed June 27, 2018).

Mubin, O., Stevens, C. J., Shahid, S., Al Mahmud, A., and Dong, J.-J. (2013). A review of the applicability of robots in education. J. Technol. Educ. Learn. 1, 209–216. doi: 10.2316/Journal.209.2013.1.209-0015

Negarandeh, R., Mahmoodi, H., Noktehdan, H., Heshmat, R., and Shakibazadeh, E. (2013). Teach back and pictorial image educational strategies on knowledge about diabetes and medication/dietary adherence among low health literate patients with type 2 diabetes. Prim. Care Diabetes 7, 111–118. doi: 10.1016/j.pcd.2012.11.001

Newton, P. M., and Miah, M. (2017). Evidence-based higher education – Is the learning styles ‘myth’ important? Front. Psychol. 8:444. doi: 10.3389/fpsyg.2017.00444

Oczuks, L. (2003). Reciprocal Teaching at Work: Strategies for Improving Reading Comprehension. Newark, DE: International Reading Association.

OECD (2015). Students, Computers and Learning: Making the Connection, PISA. Paris: OECD Publishing. doi: 10.1787/9789264239555-en

OECD (2016a). United Kingdom Country Note. Programme for International Student Assessment (PISA) – Results from PISA 2015. Paris: OECD Publishing.

OECD (2016b). PISA 2015 Results (Volume I): Excellence and Equity in Education. Paris: OECD Publishing.

OECD (2018). The Future of Education and Skills. Education 2030. Paris: OECD Publishing.

Ospennikova, E., Ershov, M., and Iljin, I. (2015). Educational robotics as an inovative educational technology. Proc. Soc. Behav. Sci. 214, 18–26. doi: 10.1016/j.sbspro.2015.11.588

Panadero, E., Klug, J., and Järvelä, S. (2016). Third wave of measurement in the self-regulated learning field: when measurement and intervention come hand in hand. Scand. J. Educ. Res. 60, 723–735. doi: 10.1080/00313831.2015.1066436

Papert, S. (1980). Mindstorms: Children, Computers and Powerful Ideas. New York, NY: Basic Books.

Pask, G. (1976). Conversation Theory, Applications in Education and Epistemology. Amsterdam: Elsevier.

Puttick, R., and Ludlow, J. (2012). Standards of Evidence for Impact Investing. London: Nesta.

Rohrer, D., and Pashler, H. (2010). Recent research on human learning challenges conventional instructional strategies. Educ. Res. 39, 406–412. doi: 10.3102/0013189X10374770

Rudman, P. (2002). Investigating domain information as dynamic support for the learner during spoken conversations (Unpublished Ph.D thesis). University of Birmingham, Birmingham.

Sandvik, K. B., and Lohne, K. (2014). The rise of the humanitarian drone: giving content to an emerging concept. Millennium 43, 145–164. doi: 10.1177/0305829814529470

Sattar, F., Tamatea, L., and Nawaz, M. (2017). Droning the pedagogy: future prospect of teaching and learning. Int. J. Educ. Pedagog. Sci. 11, 1622–1627.

Saunders, M. E., Roger, E., Geary, W. L., Meredith, F., Welbourne, D. J., Bako, A., et al. (2018). Citizen science in schools: engaging students in research on urban habitat for pollinators. Austral Ecol. 43, 635–642. doi: 10.1111/aec.12608

Scanlon, E., Woods, W., and Clow, D. (2014). Informal participation in science in the UK: identification, location and mobility with iSpot. J. Educ. Technol. Soc. 17, 58–71.

Scheffel, M., Drachsler, H., de Kraker, J., Kreijns, K., Slootmaker, A., and Specht, M. (2017). Widget, widget on the wall, am I performing well at all? IEEE Trans. Learn. Technol. 10, 42–52. doi: 10.1109/TLT.2016.2622268

Scherer, R., Greiff, S., and Kirschner, P. A. (2017). Editorial to the special issue: current innovations in computer-based assessments. Comput. Hum. Behav. 76, 604–606. doi: 10.1016/j.chb.2017.08.020

Schuttler, S. G., Sears, R. S., Orendain, I., Khot, R., Rubenstein, D., Rubenstein, N., et al. (2018). Citizen science in schools: students collect valuable mammal data for science, conservation, and community engagement. Bioscience 69, 69–79. doi: 10.1093/biosci/biy141

Semken, S., and Freeman, C. B. (2008). Sense of place in the practice and assessment of place-based science teaching. Sci. Educ. 92, 1042–1057. doi: 10.1002/sce.20279

Sharples, M. (2019). Practical Pedagogy: 40 Ways to Teach and Learn. London: Routledge.

Sharples, M., Adams, A., Alozie, N., Ferguson, R., FitzGerald, E., Gaved, M., et al. (2015). Innovating Pedagogy 2015. Milton Keynes: The Open University.

Sharples, M., Adams, A., Ferguson, R., Gaved, M., McAndrew, P., Rienties, B., et al. (2014). Innovating Pedagogy 2014. Milton Keynes: The Open University.

Sharples, M., de Roock, R., Ferguson, R., Gaved, M., Herodotou, C., Koh, E., et al. (2016). Innovating Pedagogy 2016: Open University Innovation Report 5. Milton Keynes: The Open University.

Sharples, M., McAndrew, P., Weller, M., Ferguson, R., FitzGerald, E., Hirst, T., et al. (2012). Innovating Pedagogy 2012. Milton Keynes: The Open University.

Sharples, M., McAndrew, P., Weller, M., Ferguson, R., FitzGerald, E., Hirst, T., et al. (2013). Innovating Pedagogy 2013. Milton Keynes: The Open University.

Siemens, G. (2005). Connectivism: a learning theory for the digital age. Int. J. Instr. Technol. Distance Learn 2, 3–10. Available online at: https://web.archive.org/web/20190612101622/http://www.itdl.org/Journal/Jan_05/article01.htm

Silvertown, J., Harvey, M., Greenwood, R., Dodd, M., Rosewell, J., Rebelo, T., et al. (2015). Crowdsourcing the identification of organisms: a case-study of iSpot. ZooKeys 480, 125–146. doi: 10.3897/zookeys.480.8803

Sobel, D. (2004). Place-Based Education: Connecting Classrooms and Communities. Great Barrington, MA: Orion Society.

Staiff, R. (2016). Re-imagining Heritage Interpretation: Enchanting the Past-Future. London: Routledge. doi: 10.4324/9781315604558

Tempelaar, D. T., Rienties, B., and Giesbers, B. (2015). In search for the most informative data for feedback generation: learning analytics in a data-rich context. Comput. Hum. Behav. 47, 157–167. doi: 10.1016/j.chb.2014.05.038

Tempelaar, D. T., Rienties, B., Mittelmeier, J., and Nguyen, Q. (2018). Student profiling in a dispositional learning analytics application using formative assessment. Comput. Hum. Behav. 78, 408–420. doi: 10.1016/j.chb.2017.08.010

Theobald, P., and Curtiss, J. (2000). Communities as curricula. Forum Appl. Res. Public Policy 15, 106–111.

Thomas, G., and Munge, B. (2015). “Best practice in outdoor environmental education fieldwork: pedagogies to improve student learning,” in Experiencing the Outdoors, eds M. Robertson, G. Heath, and R. Lawrence (Brill Sense), 165–176. doi: 10.1007/978-94-6209-944-9_14

Thomas, G., and Munge, B. (2017). Innovative outdoor fieldwork pedagogies in the higher education sector: optimising the use of technology. J. Outdoor Environ. Educ. 20, 7–13. doi: 10.1007/BF03400998

Trevors, G., Feyzi-Behnagh, R., Azevedo, R., and Bouchet, F. (2016). Self-regulated learning processes vary as a function of epistemic beliefs and contexts: mixed method evidence from eye tracking and concurrent and retrospective reports. Learn. Instr. 42, 31–46. doi: 10.1016/j.learninstruc.2015.11.003

Trilling, B., and Fadel, C. (2009). 21st Century Skills: Learning for Life in Our Times. San Francisco: John Wiley & Sons.

von Glaserfeld, E. (1987). “Einführung in den radikalen Konstruktivismus,” in Wissen, Sprache und Wirklichkeit. Wissenschaftstheorie Wissenschaft und Philosophie, Vol. 24 (Wiesbaden: Vieweg+Teubner Verlag).

Vygotsky, L. S. (1978). Mind in Society. Cambridge, MA: Harvard University Press.

Webber, G., and Miller, D. (2016). Progressive pedagogies and teacher education: a review of the literature. McGill J. Educ. 51, 1061–1079. doi: 10.7202/1039628ar

Westlund, J. K., and Breazeal, C. (2015). “The interplay of robot language level with children's language learning during storytelling,” in Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction Extended Abstracts (Portland, OR: ACM). doi: 10.1145/2701973.2701989

Winne, P. H. (2017). Leveraging big data to help each learner upgrade learning and accelerate learning science. Teach. Coll. Rec. 119, 1–24.

Wood, D., Bruner, J. S., and Ross, G. (1976). The role of tutoring in problem solving. J. Child Psychol. Psychiatry 17, 89–100. doi: 10.1111/j.1469-7610.1976.tb00381.x

Wu, H.-K., Lee, S. W.-Y., Chang, H.-Y., and Liang, J.-C. (2013). Current status, opportunities and challenges of augmented reality in education. Comp. Educ. 62, 41–49. doi: 10.1016/j.compedu.2012.10.024

Zakaria, N. Y. K., Zaini, H., Hamdan, F., and Norman, H. (2018). Mobile game-based learning for online assessment in collaborative learning. Int. J. Eng. Technol. 7, 80–85. doi: 10.14419/ijet.v7i4.21.21620.

Zimmerman, B. J. (2000). “Attaining self-regulation: a social cognitive perspective,” in Handbook of Self-Regulation, eds M. Boekaerts, P. R. Pintrich, and M. Zeidner (San Diego, CA: Elsevier), 13–39. doi: 10.1016/B978-012109890-2/50031-7

Keywords: evidence-based practice, educational innovation, pedagogy, teaching and learning, educational effectiveness, educational theories, 21st century skills

Citation: Herodotou C, Sharples M, Gaved M, Kukulska-Hulme A, Rienties B, Scanlon E and Whitelock D (2019) Innovative Pedagogies of the Future: An Evidence-Based Selection. Front. Educ. 4:113. doi: 10.3389/feduc.2019.00113

Received: 01 June 2019; Accepted: 30 September 2019;
Published: 11 October 2019.

Edited by:

Jon Mason, Charles Darwin University, Australia

Reviewed by:

Sue Erica Smith, Charles Darwin University, Australia
Eric C. K. Cheng, The Education University of Hong Kong, Hong Kong
Bea Staley, Charles Darwin University, Australia

Copyright © 2019 Herodotou, Sharples, Gaved, Kukulska-Hulme, Rienties, Scanlon and Whitelock. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Christothea Herodotou, christothea.herodotou@open.ac.uk
