Teaching practices are becoming more complex. This is partly due to the adoption of technologies in teaching and learning (Kirschner, 2015; Laurillard, 2012). In this paper, we use the term learning technologies (LTs) to refer to digital tools used in the classroom (Laurillard, 2013) and technology-enhanced learning (TEL) to refer to the learning activities employing such technologies (Goodyear & Retalis, 2010). Criticism has been raised concerning educational institutions’ slow uptake of technologies, on the one hand (e.g. Johnson, Smith, Willis, Levine, & Haywood, 2011), and concerning how the tools are used, on the other (Laurillard, 2012; Håkansson-Lindqvist, 2015).
The debate concerning how tools are used is not new, and it has often centred on teachers’ skills in integrating these technologies (e.g. Seifert & Sutton, 2009). Apart from the conclusion that teaching remains traditional even when LTs are implemented (Gudmundsdóttir, Dalaaker, Egeberg, Hatlevik, & Tømte, 2014), it has been pointed out that the strategies needed to engage students when using LTs may not be the same as those used in classrooms without technologies (Weitze, 2016).
Studies have reported that student engagement is strongly related to retention and grades for all students (Boekaerts, 2016; Finn, 1989; Fredricks, Blumenfeld, & Paris, 2004; Reeve, 2012; Wang & Eccles, 2013). This study adopts the view of Fredricks et al. (2004), who suggested that student engagement is a multi-layered construct with behavioural, cognitive and emotional dimensions. The behavioural dimension refers to the actions students undertake to learn, the cognitive dimension reflects the concentration and effort invested to master content, and the emotional dimension is associated with an acceptance of teacher instruction and feelings of interest. For example, engaged students are typically open to teacher instruction, concentrate on learning the subject, display persistence and direct their energy toward behaviour that supports learning (Fredricks et al., 2004). The more engaged the student, the more effort the student directs toward the learning activity (Bergdahl, Knutsson, & Fors, 2018). Conversely, disengaged students are more likely to withdraw, give up in the face of challenge, reject classroom rules (Wang, Fredricks, Ye, Hofkens, & Linn, 2017) and run a greater risk of dropping out of school (Finn, 1989).
Research has argued that while engagement and motivation are related (but different) constructs strongly related to learning outcomes, motivation alone is not sufficient for students to persist in education (e.g. Christenson & Reschly, 2012). Instead, engagement is vital for learning, as “engagement mediates the motivation-to-achievement relation” (Reeve, 2012, p. 163). Students can choose to engage with the learning material, and that engagement may give rise to motivation for learning. However, students engage in short- and long-term school assignments for different reasons, and while task-oriented instruction may fit all students, long-term goals are motivating for students who are already highly engaged (Bergdahl et al., 2018b). While motivation refers to psychological processes within the student, engagement is a phenomenon that manifests in the interaction between the student and the subject, i.e. content, peers, teachers and tools. Engagement shapes motivation (Reeve, 2012) and is itself shaped by its context (Fredricks et al., 2004).
As education has become increasingly affected by digitalisation, many studies have responded by exploring how different digital technologies may affect student engagement, for example clickers (Han & Finkelstein, 2013), blogs (Cakir, 2013) and virtual worlds (Pellas, 2014). These studies suggest that the use of technologies may engage students in both online and blended learning (e.g. Pellas, 2014; Cakir, 2013). While these studies have mainly focused on the tools brought into the classrooms, others place a more explicit emphasis on learning designs. When offered scaffolding, students can reach further than they could alone. The scaffolding can be provided by the teacher, peers or software, in the form of prompts, responses or modelling, or be embedded in the structure of the activity (Reiser & Tabak, 2016). However, having access to technologies also challenges students’ ability to self-regulate, e.g. their ability to abstain from the desire, or compulsion, to play games or update social media sites and instead prioritise learning. Hence, teachers’ pedagogical skills in engaging students when learning with technologies are critical for successful education.
Herrington and Reeves (2011) and Laurillard (2012) explored how LTs could be used to facilitate student engagement by making students active participants when learning with technologies. They share the view that collaborative learning, peer modelling and feedback, when designing for TEL, are imperative for successful education. LTs can be orchestrated to make learning more effective by being learner-centred. A learner-centred design has the goal of facilitating learners in becoming active participants in their learning process (Laurillard & Derntl, 2014). This underpins a design in which the technologies are arranged to facilitate multiple types of learning-focused interactions, such as student/student, student/content and student/teacher interactions (Goodyear & Dimitriadis, 2013; Grissom, McCauley, & Murphy, 2017). In her conversational framework, Laurillard (2013) suggested that peer modelling and feedback are core requirements for effective learning. She argued that the delivery of educational materials is as important as their development. As such, she detailed how a virtual learning environment (VLE) may offer possibilities for synchronous and asynchronous dialogues, inter-student exchange and multiple types of interactions (ibid.). Learner-centred interaction is viewed as key to effective learning, as “without a modelling environment, learners receive no help in deciding how good their actions are” (Laurillard, 2012, p. 208). Used in a dialogic way, LTs may facilitate ways in which students can rethink their conceptualisations with the help of peer modelling and feedback.
Research comparing student dialogue in forums with and without scaffolding has shown that students who used a simple interface without scaffolding did not engage in argumentation with their peers that was as deep, varied, coherent and extended, and that they used the forum only to exchange information (McAlister, Ravenscroft, & Scanlon, 2004). It has also been pointed out that, when students engage in forums, general social norms present challenges to them, both socially and cognitively (Andriessen & Baker, 2016). The aim of this study was to explore whether teachers and researchers could collaboratively design learning activities that facilitate student engagement. As such, the following research questions guided this study:
The paper is structured as follows: First, we offer an overview of each phase of the design process; we then present findings and discuss the intervention and our results. Finally, we share lessons learnt and point to implications for future design.
In this study, using a design-based research (DBR) methodology (Anderson & Shattuck, 2012), teachers and researchers collaboratively designed the intervention.
The intervention was implemented in an upper secondary school in Stockholm, Sweden. While completing a prior study, teachers were asked if they were interested in a design-and-technology collaboration. Two teachers (referred to as T1 and T2) agreed, and the intervention was developed in collaboration with them. T1 was a novice in IT, and T2 was a lead teacher in IT development at the school. It was decided that the intervention would be implemented in T1’s Swedish lessons in two 2nd-year classes. The intervention covered the design of four lessons; as these were implemented in two classes, the intervention stretched over eight lessons in total. The lessons were spread over 3 weeks, while the process of planning and evaluating spanned 4 months. T1 participated in all evaluations; T2 participated in the first and second evaluations. Although there was continuous dialogue between the authors, the first author guided the Future Workshop, intervention and evaluation with the teachers, conducted the observations and analysed the data.
During the design process, the first author and the two teachers met. To establish a shared foundation for developing the intervention, a see-saw technique inspired by Winters and Mor (2008) was employed. This meant that both the researcher and the teachers shared their understanding and experience. Even though all collaboration was characterised by ongoing mutual discussion, there was a clear turn-taking, allowing focus on one stakeholder at a time.
The researcher shared insights on engagement (a brief overview is offered below), before the Future Workshop was initiated. Then insights from a previous study were shared:
Factors to be aware of that might hinder engagement:
- Digital tools that are fit for one user at a time.
- Learning activities in which students’ learning process remained invisible.
Factors that may promote engagement:
- Learning activities that encompass a variety of interactions.
- Applications that are used to enable simultaneous dialogues from all students.
- The teacher acknowledging student contribution as it happens.
The purpose of a Future Workshop is to guide stakeholders through the process of systematically identifying problems, followed by playfully exploring and imagining solutions (Kensing & Madsen, 1991). Following the DBR tradition, the underlying aim is that an intervention arrives at suggested design principles and transfers ownership of the intervention to its stakeholders (Anderson & Shattuck, 2012). Inspired by Kensing and Madsen (1991), the workshop consisted of three phases: the critique phase, the fantasy phase and the implementation phase. During the critique phase, the teachers identified what they had noticed hindered their students from engaging in learning (see Table 1).
**Table 1.** Outcome of the critique phase – ‘What present matters teachers want to change’

| **Feedback** | **Limited duration of focus** |
| --- | --- |
| No direct feedback provided | Technologies distract students |
| | Students want to engage socially with peers |
| **Relevance** | **‘Getting started’ in the activity is hard** |
| Willingness to study only derives from a demand to obtain grades | Students are not committed to explore the content of the subject |
| Students have no long-term goals | Students struggle with concentration |
| Students do not find the task meaningful or understand how it relates to the curricula | Students display insufficient drive to engage and persist |
| Students do not think that the learning activity is fun | Students are afraid of failure, and this prevents them from trying |
| **General engagement and self-regulation** | |
| When it comes to engagement, there is a big difference between subjects and the extent to which students engage | The students need to be ‘reprogrammed’ to view technology not as entertainment, but as a tool for learning |
Table 1 shows the outcome of the critique phase, i.e. teachers’ perceptions of hindrances to student engagement, collated in five categories: feedback, limited duration of focus, relevance, getting started, and general engagement and self-regulation. In terms of self-regulation, the teachers brought up that the students often did not view the laptop as a tool for learning; as such, students were often prone to use the laptop for entertainment, and their ability to self-regulate needed to develop in order for them to direct their effort toward learning and persist in the face of challenges. The teachers also brought up how students who lack motivation need to find an activity fun, meaningful or linked to grades in order to be willing to engage. According to the teachers, these students are easily distracted by peers and tools, and often struggle with concentration instead of exploring the subject content. The teachers recognised that students did not get any feedback, or often had to wait for teachers to return with written feedback days later. The teachers wished that they could provide timely feedback to students, but also recognised that this was hard to realise under the current conditions.
During the fantasy phase, the teachers collaborated to express what tools they could agree to use and visualised potential gains from the intervention (see Table 2). We grouped the outcome from the critique phase into themes and set them aside. The teachers were then asked to clear their minds and brainstorm together on how technologies could be used. The vision in the outcome of the fantasy phase included a pragmatic use of tools, for example regarding differences of space (digitally gathering all students’ work and opening possibilities to reflect online, as an online space does not have the same limitations as paper) or social cost (an assessment program would allow individuals to receive private and automatic feedback on tested knowledge). After the teachers had expressed their ideas, we drew their ideas and suggested tools on a whiteboard. Four tools were suggested: a virtual learning environment, a learner assessment application (referred to as mini-test), a progress bar and badges. We then moved to collaboratively match the ideas with tools to start the design of a learning activity. The pedagogical solutions aimed to remove the problems identified in the critique phase. These were displayed in a problem–solution matrix (see Table 3).
**Table 2.** Fantasy phase – ‘What teachers dream of’: visions of what a technology-enhanced learning (TEL) intervention may bring

| | |
| --- | --- |
| Digitally assembled submission | Every student can answer individually |
| Possibility of showing knowledge not asked for | Feedback quick and easy |
| Possibility to reflect | Familiar routines and structures to provide |
| Failure not costly | Skill matrix |
| Goals to meet | Diagrams of progress |
| Virtual learning environment (VLE) | Goals to meet (progress bar) |
In Table 3, the teachers and researcher collaboratively sought to match the hindrances with suggested solutions. When completed, Table 3 would be the matrix we would follow in our design conversations when evaluating each design element.
**Table 3.** Problem–solution matrix – ‘How to find a possible/realistic solution’

| Identified hindrance to student engagement | Suggested solution | Thought to be addressed by |
| --- | --- | --- |
| Students struggle with concentration and easily become distracted by friends and mobile phones | Limit the time of the learning activity. Make the learning activity structured. The relevance of learning could be met if students could read peer postings | VLE |
| Students have no goal in their studies | Engagement of their peers is made visible. Having peers who engage can inspire others to engage | VLE – as above |
| Learning activity is not considered fun | See what peers contribute that can be fun/relevant/meaningful | VLE – as above |
| Relevance of knowledge: students do not agree that the knowledge is relevant, relates to curricula, goals or everyday life | Relevant terms are tested in a mini-test. Students are both contributors and readers of real-time postings in the VLE, which make the contributions relevant | Assessment application – as above; VLE – as above |
| Students are primarily motivated by grades | Seeing how peers and high-achieving students think and plan their studies can inspire students who do not yet set long-term goals | VLE – as above |
One of the core ideas of DBR is that teachers are invited to explore and develop learning collaboratively with researchers. In this study, the Future Workshop derived fully from the teachers’ perceptions of what supported and hindered engagement. The teachers collaboratively suggested IT solutions. When the teachers suggested that the intervention could be implemented during ‘argumentative speech’, the researcher worked with them to formulate how the systematic iterative cycles could fit smoothly into that plan (see Table 4). While collaboratively thinking about the practical organisation of the learning activity, one teacher focused more on solutions in terms of IT support, and the other focused more on the actual implementation and problems that might arise.
| | Content description | Learning technologies |
| --- | --- | --- |
| Lesson 1 | ‘Show prior knowledge in mini-test’. ‘Post your thesis statement in the VLE’ | Mini-test was launched to confirm students’ existing level of knowledge. VLE: multiple simultaneous dialogues prompting students’ participation, peer feedback and teacher monitoring |
| Lessons 2 and 3 | ‘Post three arguments reflecting ethos, pathos and logos in the VLE’ | VLE: multiple simultaneous dialogues prompting students’ participation, peer feedback and teacher monitoring |
| Lesson 4 | ‘Find and select relevant sources to back up your logos argument; post these in the VLE’ | VLE: multiple simultaneous dialogues prompting students’ participation, peer feedback and teacher monitoring |
The teachers agreed that T1 would implement the intervention. T1 was not used to working with technologies but looked forward to trying to use them while receiving support. To survey student knowledge prior to starting the course, we chose to conduct a mini-test using an assessment application, Socrative™. The online mini-tests enabled the students and teachers to access the results instantly; the students could not see each other’s results. We chose Google Classroom™ as the VLE, as this was already available at the school. It was not implemented throughout all classes, and not all functions were enabled. T1 pre-tested the technology before the lessons and later managed five out of eight lessons without support.
In addition to the compulsory laptop, the teacher was equipped with a tablet, on which the shared workspace could be accessed instantly. We designed the lesson content in each cycle to meet all the suggested solutions (see Table 4). During the first lesson, both Socrative™ and Google Classroom™ were used; during lessons 2–4, only Google Classroom™ was used. The lessons were divided into three segments, each focusing on a different aspect of argumentative speech. The idea was that the students would become familiar with the design of the learning activities, and as the lessons were similar, this would enable iterations of the design.
The learning activities matched the iterative cycles. Each lesson included an instruction to write thesis statements and arguments reflecting ethos, pathos or logos. After posting their contribution in the VLE, students gained access to peer contributions and could engage in giving (and receiving) feedback.
The object of the problem–solution matrix was to identify hindrances to student engagement. As such, each cycle was followed by an evaluation in which the teachers and researcher systematically analysed the intervention. Following the problem–solution matrix, we addressed each line and reflected on how the intervention had influenced student engagement (Table 3). While descriptions of problems are provided below as part of the design process, these evaluations were also transcribed and analysed. When the researcher and teachers identified that a hindrance to engagement had not been solved, we made appropriate adjustments. The identified problems were as follows:
First cycle/first iteration (after lesson 1, see Table 4):
Problem 1) A time-related problem: When too much time was offered, some students lost their concentration, and when too little time was offered, students became stressed. Moreover, the time needed to complete the tasks varied between the two classes. We agreed that it was good to keep the time limit, and extended the time allowed for the task by 2 minutes in the class that needed more time.
Second cycle/second iteration (after lesson 3):
Problem 1) A feedback problem: When students could choose which contributions to reply to, not all students received feedback. We agreed it was better to ensure that all students received feedback. While students were still free to give feedback to peers of choice, the teacher created feedback-pairs to ensure that each student received at least one comment.
Problem 2) Some students’ inability to finish on time: To address this problem, the teacher requested that the groups post all (individual) contributions together as a group. This way, the students needed to support each other to complete their task.
Problem 3) Teacher frustration: The students did not complete their tasks (speech) according to schedule. The teacher predicted that they would not be ready in time and rescheduled the student presentations.
After the third cycle (after lesson 4) the teachers and researcher met to evaluate the intervention.
The DBR process included a Future Workshop, which in itself is both a process and data. We gathered the input that the teachers contributed during the phases on post-it notes and video-recorded the session. The critique phase (Table 1), the fantasy phase (Table 2) and the problem–solution matrix (Table 3) constitute data collected during the Future Workshop. At the end of the intervention, a final evaluation was used to compare whether the targets had been met or not.
Field notes (referred to as observation) and cyclic evaluations (referred to as evaluation) were used to capture the intervention. To extract data from the intervention, the researcher took field notes during observations of T1’s instructions during implementation of the learning design. Teacher-researcher dialogues, which occurred during the time of the observation, were also included in the field notes. Three observations were done during the implementation, one per cycle. After each cycle, the teachers and researcher met to evaluate the intervention. The evaluations took 30–50 minutes. They were recorded and transcribed (118 minutes). To adapt to the teachers’ availability, the third evaluation was completed over the phone with one of the teachers.
This research study reports on teacher experiences of teaching with technologies and how, in collaboration with researchers, their design could facilitate engagement. The two teachers who agreed to participate signed informed consent forms prior to starting the initial workshop (Appendix A). To ensure anonymity, the teachers are referred to as T1 and T2, and where applicable, gendered pronouns are omitted throughout the paper.
To approach factors influencing student engagement, thematic analysis was used, following the six-phase approach to identifying codes and themes suggested by Braun and Clarke (2012). The analysis was conducted within a framework of critical realism/post-positivism (Willig, 1999). The critical realist/post-positivist standpoint holds that the researcher identifies the themes, and that this identification is subjective and coloured by previous experience, rather than a discovery of an existing truth. The dataset consisted of transcriptions of all evaluations and field notes from classroom observations. After data familiarisation, the data were analysed using a line-by-line technique (Braun & Clarke, 2006). Codes were written onto sticky notes, with reference to the line in the dataset. The dataset was analysed in at least two full cycles to ensure there was no code drifting, and both semantic (descriptive) and latent (conceptual and interpretive) codes were identified across the dataset (see Braun & Clarke, 2012). The analysis is better described as iterative, rather than following a 1-2-3 sequence, as the researcher looped between the dataset and the development of the themes as the codes were collated to shape candidate themes. After the candidate themes had been identified, they were refined, and a thematic map was drafted to visualise the relations between the themes and subthemes.
The aim of the DBR project was to work collaboratively with teachers to increase engagement. We began by informing the teachers about engagement theory and, in the critique phase, asked them whether they had identified any hindrances to student engagement. A particular outcome of the teacher–researcher collaboration was that the teachers expressed awareness of hindrances to engagement for their students. They described twelve unique hindrances to student engagement and commented that student engagement varies between subjects. The teachers discussed digital solutions and technologies that they wanted or would consider using. After collaboratively identifying the problems in the classroom, we spoke about the goal of the intervention: to engage students in learning activities. We collaboratively matched the LTs in the problem–solution matrix, with the aim of removing hindrances to student engagement and using LTs and instruction to facilitate engagement. Three iterations were done. Evaluations indicated that teacher T1 became more familiar with the (new) way of working and the tools in use. T1 adapted practices to explore the potentials the learning technologies offered. In the end, what facilitated engagement was the way the LTs were used, in conjunction with how the teaching practices emerged to bridge the analogue and digital learning environments.
When analysing observations of teacher instruction, and teachers’ evaluation of cyclic iterations, two important themes, associated with aspects of students’ engagement, were identified, as follows: 1) LTs and teaching practices, and 2) Teacher’s experience of LTs and interaction in the classroom.
The first lesson started with an introduction of the learner assessment application, Socrative™. The assessment application was used for a mini-test, which offered the teacher insight into students’ level of knowledge – ‘I instantly saw who had understood what from the previous teaching’ (T1) – and gave the students instant confirmation of what knowledge they had: ‘It becomes more relevant as they can see the result of their work immediately. It is often this which is hard to reach’ (T2).
The teacher provided scaffolding by repeatedly returning to the tablet and checking contributions, prompting students to act and providing timely feedback directed to selected individuals and groups. The LTs made new interactions between students and the teacher possible, and the teacher responded to these changes:
T1: ‘They are rather quiet and seem to focus on their task’. [Turning to the class]: ‘Two [students] still have not posted their source.’ The teacher uses the ICT [information and communication technology] tool to obtain an overview of the student participation and quality of their comments. Teacher feeds back the overview to the class. They are seen, acknowledged and respond both online and in the physical classroom as the teacher shouts out the names of students. […] 11.05, the class is still silent, focused on working intensively. (Observation)
That the teacher could interact with individuals, groups or the whole class, in this timely manner, enhanced the possibility for the students to perceive the support as relevant. That the scaffolding was instant made it efficient:
‘I could get a quick overview of what the students had handed in, and I could react super quickly and directly. That worked really well’ (T1).
The teacher expressed a sense of being in control of the learning process and seeing student contributions enabled interactions with those in need.
During all the cycles, the students completed the planned task before the lesson was over. The teacher then had to decide what to do during the remaining minutes:
“They [the students] feel: Well, there’s only ten minutes left, so I don’t have to start with anything. […] There is a gain in taking out a game, or something like that – but there is no gain in starting to look at the planned task” (T1).
The teacher would not extend the designed learning activity to adapt to the time left; instead, in each cycle, the students were allowed to leave early, even though the speech was not completed:
“I have to move the presentation date. Working this way has prevented [the] students from becoming ready. Either you focus on the process, or you focus on letting the students work” (Observation).
The teacher had expressed that the students would not be ready as planned and that focusing on both process and outcome was ‘simply too much’. This expression may reflect a change process, in which the affordances of technologies challenge traditional ways of teaching and learning. The considerations reflect uncertainty about what constitutes valid knowledge to assess, and how to assess a learning process itself (rather than the traditionally used representation of learning from a final exam), and may question deeply rooted views of teaching and learning, with potential consequences such as a lack of evidence, acceptance from students or colleagues, and how to follow progression. As such, ‘simply too much’ can be understood not only as the teacher being overwhelmed by overseeing both the process and the outcome, but as facing the complexity of teaching and learning as old ideas are successively replaced with newer ones.
During the Future Workshop, teachers identified hindrances to student engagement. One of these was a view that, when students enrolled at the school, they often brought with them a habit of using technology for entertainment rather than as a tool to support learning:
“Students need to be ‘re-educated’ to view laptops as a learning- rather than an entertainment-tool” (Table 3).
One of the teachers also expressed that, with constant access to the Internet, many students view the Internet as a saviour, stating that many students think the
“Internet should solve this for me – this is a very common and rather huge problem I feel. […] There is a group of students who are not even trying to find the answers themselves – they turn directly to the Internet” (T2).
The evaluation confirmed the problem that had been brought up by the teachers.
R: “We talked about this, that they [the students] have a predisposition to turn to the internet to find answers. You tried to bypass this by asking, ‘What are you feeling, how can you be perceived as trustworthy?’.”
T1: “Yes, that too was really good, some also dared to ask questions in the classroom and that is what you take with you to the black board: ‘How do you do this’, and students engaged in helping out to giving the answers – then that may help a student to get it: ‘Aha, is that the way to go about it?’.” (Evaluation)
As the aim was to address hindrances to engagement by implementing solutions that would facilitate engagement (and implicitly prevent the identified problems from re-surfacing), the tasks were designed so that their answers could not be found online. Instead, to complete the tasks, students had to work hands-on with conceptualising and re-conceptualising argumentation. To succeed, students directed their attention and action toward peer dialogues, content and feedback in the VLE.
When students routinely turned toward the Internet, the teacher could not know what each student was doing behind the screen. When the teacher stood at the front of the classroom with the extra tablet set to display the shared workspace, the students reacted:
“You could see that they got a lot more engaged because it [their contributions] got printed [on the tablet screen that the teacher watched] – as I saw directly when they wrote something” (T1).
The students became engaged by knowing that the teacher could see their work. As the postings and contributions were continuously accessible, the teacher said there was
“/…/a chance to walk around more and talk to the more insecure ones” (T1).
The students were triggered by the notion of being seen. On their laptops, they could also follow their peers’ contributions and comments. The teacher started to move around the classroom with the tablet in hand, carrying instant access to the students’ contributions.
When students posted their contributions together and accessed each other’s contributions, the teacher perceived that they directed their engagement toward their peers and used their peers as a source:
“It [the instruction] posed a problem for many. ‘How do I create an argument?’ A frustration. Moreover, they wanted to help each other, so their engagement was directed both toward their own and each other’s learning. I have not experienced that, in this way, prior to this intervention.” (T1)
The teacher stated that, when the students were frustrated, they turned to each other instead of giving up. Having insight into the learning process of their peers can be inspiring for students who want to compare themselves through peer modelling.
The VLE also enabled a parallel student–teacher interaction:
“And posting in the forum, the students showed courage to ask questions! If they had worked freely, I do not think they would have bothered asking. They would have just skipped that part” (T1, Evaluation).
The contributions students posted in the forum reflected their thoughts, questions and ideas, and thereby their learning process. This enabled the teacher to follow students’ learning in a way that is not possible in a traditional setting. The design thus enabled student–peer interactions (turning to peers for help), student–peer contributions (engaging in the learning of their peers) and student–teacher interactions.
In this study, the teachers and researchers developed an intervention intended to stimulate student engagement. As an overarching methodology, we adopted DBR. This was fruitful, as DBR emphasises stakeholder collaboration and views the teachers’ and researchers’ contributions as equally important. The teachers participated in the design, iterations and development of the lesson design, and they could draw on their experience in teaching the specific group of students in this learning situation, while the researcher supplied the underpinning theories, workshop structure and systematic iterations, providing both pragmatic and theoretic underpinnings (as suggested by Anderson & Shattuck, 2012).
The first research question addressed how teachers, supported by researchers, could design learning activities with LTs to facilitate student engagement. The teacher who implemented the intervention had no prior knowledge of the technologies that were put to use. Continuous support was therefore provided: T2, who often taught in the adjacent classroom, offered hands-on technical support, and the researcher sat in once during each cycle, observing the implementation of the learning design and functioning as a sounding board for T1.
According to Winters and Mor (2008), it can be hard to bring about the sought-after ‘disruptive effects’ on teaching and learning in TEL interventions. The intervention started from factors suggested to affect student engagement in learning (Bergdahl et al., 2018a): using learning technologies that enable all students to participate simultaneously, and following the students’ active engagement. During the intervention, the teacher confirmed that using a forum to post their work was highly engaging for students. This was also confirmed in a student evaluation of the intervention (Bergdahl et al., 2018b). The findings from this intervention suggest that students reacted positively to being seen by peers and teacher, and that they took an interest in the contributions and engagement of their peers. We agree with Laurillard (2012, 2013), Goodyear and Dimitriadis (2013) and Herrington and Reeves (2011), who all argue that effective use of LTs must adopt a student-centred design to promote peer-to-peer interaction, peer modelling and feedback.
However, designing to facilitate engagement was not just a matter of theoretical design; the teachers also needed the capacity to adjust their practices to match the LTs, that is, to provide the scaffolding needed to bridge the affordances of the LT and the needs of the learner. The technologies can become an extension of the teacher, as when the tablet was used to gain instant insight into the shared workspace. How effectively LTs are orchestrated depends on the opportunities the teacher identifies in them, the ability to scaffold learning both in the classroom and in the virtual space, and the ability to interact in a relevant way with the learner and the technologies. Our results suggest that the teacher adapted to the new situation and scaffolded learning by offering structure and providing timely feedback. Feedback was directed to the relevant recipients (selected students, groups or the whole class) depending on who was identified as needing input and support. The design was arranged so that the students could become familiar with the introduced way of working. We noticed that both the teachers and students adapted quickly, and the technologies became an interwoven mediator that supported learning and classroom interaction.
We found that, although teachers and researchers could design and implement disruptive activities aimed at facilitating student engagement, we did not observe that the teacher would do this without support. Andriessen and Baker (2016) noted that, for future systems enabling student dialogues, the learning goals, the role of the teacher and how the students are normally assessed should be made clear; in addition, it should be ensured that the technology is compatible with the classroom context. Our findings add that the role of the teacher, albeit preliminarily, changed as a result of the intervention; however, bringing about change calls for some provocation of the present roles, context and assessment. We also suggest that teachers’ uncertainty about how to manage technologies for learning is likely one of the hindrances. This aligns with previous research (Håkansson Lindqvist, 2015; Kirschner, 2015; Seifert & Sutton, 2009; Weitze Laerke, 2016), which has proposed an increased need to support teachers’ TEL expertise. We want to add that, in a setting as complex as education, we find it unrealistic to expect permanent shifts in teacher practices to happen by osmosis; rather, we conclude that they may develop over time and with the right support. It is not about learning to manage many different tools; rather, we suggest it is about learning to design for engagement regardless of tool.
Concerning the second research question – how student engagement was facilitated during the intervention – our results show that the interactions shifted from being student–content oriented to student–peer and student–teacher oriented. These new orientations extended how students could engage (e.g. to give and receive feedback), as also noted by Grissom et al. (2017). These findings are in line with studies by several authors (Goodyear & Dimitriadis, 2013; Herrington & Reeves, 2011; Laurillard, 2012), who put forward peer modelling, interaction and feedback as critical factors for successful education within TEL. When learning became visible, the lonely work of student–content interaction (a student working individually at a laptop and turning to the Internet) was challenged by other types of interaction. Similarly to other studies on student engagement and digital tools in the classroom (Cakir, 2013; Han & Finkelstein, 2013; Pellas, 2014), we found that the VLE gathered the voices of many students, leaving the traditional (one-dimensional) turn-taking and ‘one student at a time’ focus behind. Facilitating engagement can therefore be related to how all students are given opportunities to voice their learning.
Similar to Gudmundsdóttir et al. (2014), who concluded that practices remain traditional even when technologies are introduced, our observations illustrate that simply adding technologies will not make students interact. Instead, we noted that when interaction was not orchestrated to facilitate in-class engagement, some students turned to the Internet for answers. The teachers also brought up the fact that students viewing “the Internet as their educational saviour [was] a widespread problem.” However, when the learning activity was designed to enable multiple simultaneous dialogues, the students’ engagement was directed toward their own and their peers’ contributions. Although VLEs are not typically associated with student engagement, our results indicated that the tool can function similarly to technologies with more novelty or ‘wow’ factor, provided its use is underpinned by an aim of facilitating engagement through supporting interaction and participation.
Like Weitze Laerke (2016), we observed that the teacher did not have ready-developed strategies to engage students in TEL. Instead, the intervention was a process of discovery, and the teacher added the new experiences to expand on existing practices. This is in line with Weitze Laerke (2016), who found that strategies to engage students learning with technologies differ from those used in a traditional classroom. Not having developed strategies may mean that students are put to work without an underlying design or theories informing effective pedagogies in TEL. To influence engagement, the teacher must own, and feel confident in, a way of designing for learning that directs students toward a variety of learning-focused interactions, even when not supported. The study suggests that student engagement can be influenced in both direction and richness, and that, with the support of researchers, teachers can design learning for engagement using LTs.
A DBR study aims to derive design principles and to inform DBR theory by reporting on lessons learnt (Reeves, 2006). As such, we offer the following suggestions:
With the aim of designing to facilitate engagement, the intervention was informed by the teachers’ experience, engagement theory (Fredricks et al., 2004) and a previous study (Bergdahl et al., 2018a). Our results support the use of the influencing factors suggested therein and expand these into the following design principles to facilitate student engagement:
- make the learning process visible
- include teacher presence in the application that the students work in
- ensure that all students have an opportunity to use the LT
- offer several ways for students to interact
- direct learning toward dialogues with peers on content
- make room for scaffolding by structure, peer modelling and teacher prompts
During this intervention, we also found certain aspects to be critical when collaborating with teachers to design for engagement.
Designing for TEL means introducing new ways of interacting, broadening the ways a student can engage, and creating new possibilities for technologies to support learning. It is pertinent for teachers to explore the effects on teaching practices when orchestrating technologies to facilitate student engagement. However, teacher-perceived support and incentives for change must be in place, along with communicated goals and visions, to enable use of the wider potential of LTs. Moreover, educational institutions would probably benefit from formulating goals and visions that guide and inform teacher practices by defining what qualitative use of technologies is, and from providing didactic support to match the vision.
This intervention spanned eight lessons over three weeks. The relatively short duration limits the chances of transferring ownership of the intervention, which might have been enhanced in a longer study. A longer study could possibly also include student participation in the design process, thereby establishing involvement with those affected by the practices. As this intervention was implemented in only two classes, the scope for generalisation is limited. However, by analysing practices, we can identify structural and cultural hindrances, share the lessons learnt and identify design principles that may be adopted in other settings, or in future design projects that aim to design for engagement.
In this study, the teachers and researchers used DBR to design a TEL intervention aimed at increasing student engagement. During the process, we found that the teachers became more confident and more willing to try out technologies. We found that the DBR tools offered useful guidance, enabling practitioners and researchers to collaborate. The researchers found a need to supply teachers with support: when no support was given, the implemented design was not sustained. We suggest that embracing technology is a personal process each teacher must undertake, requiring not only IT support but varying types of professional support.
Observations and evaluations were analysed using thematic analysis to gain insight into how the intervention influenced student engagement and how teachers, with the support of researchers, could design activities with technologies to increase student engagement. The intervention influenced student engagement, as classroom interactions became accessible and shifted toward student–peer and student–teacher orientations. New ways to interact broadened students’ possibilities to engage and opened up additional didactic strategies. Thus, the results suggest that LTs can be used to invite students into multiple simultaneous learning-centred dialogues, increasing their active time compared with what traditional (analogue) classrooms can offer. Without guidance, implementing LTs in a way that engages students is challenging for teachers, and embracing their full potential is hard. The individual teacher’s critical reflection on practices and desire to embrace the potential of LTs is not enough for change. An implication of our findings is that there is a need to articulate goals and visions for high-quality use of LTs; otherwise, teachers will not have the guidance needed to advance, or evaluate, their professional development.
This research was part of the overall project ‘I Use IT,’ funded by the City of Stockholm. The project aims to study the effects of digitalisation in upper secondary schools in Stockholm.
The authors have no competing interests to declare.
Anderson, T., & Shattuck, J. (2012). Design-Based Research: A Decade of Progress in Education Research? Educational Researcher, 41(1), 16–25. DOI: https://doi.org/10.3102/0013189X11428813
Bergdahl, N., Fors, U., Hernwall, P., & Knutsson, O. (2018a). The use of learning technologies and student engagement in learning activities. Nordic Journal of Digital Literacy, 13(2), 113–130. DOI: https://doi.org/10.18261/ISSN.1891-943x-2018-02-04
Bergdahl, N., Knutsson, O., & Fors, U. (2018b). ‘So, You Think It’s Good’ – Reasons Students Engage When Learning with Technologies – a Student Perspective. In: 10th International Conference on Education and New Learning Technologies, 9556–9563. Palma, Spain: IATED Digital Library. DOI: https://doi.org/10.21125/edulearn.2018.2289
Boekaerts, M. (2016). Engagement as an inherent aspect of the learning process. Learning and Instruction, 43, 76–83. DOI: https://doi.org/10.1016/j.learninstruc.2016.02.001
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. DOI: https://doi.org/10.1191/1478088706qp063oa
Braun, V., & Clarke, V. (2012). Thematic Analysis. In: Lyons, E., & Coyle, A. (eds.), APA Handbook of Research Methods in Psychology, 2nd ed., 2, 57–71. Washington DC: American Psychology Association. DOI: https://doi.org/10.1037/13620-004
Cakir, H. (2013). Use of blogs in pre-service teacher education to improve student engagement. Computers & Education, 68, 244–252. DOI: https://doi.org/10.1016/j.compedu.2013.05.013
Christenson, S. L., & Reschly, A. L. (2012). Jingle, Jangle, and Conceptual Haziness: Evolution and Future Directions of the Engagement Construct. In: Christenson, S. L., Reschly, A. L., & Wylie, C. (eds.), Handbook of Research on Student Engagement, 3–19. Boston, MA: Springer US. DOI: https://doi.org/10.1007/978-1-4614-2018-7_1
Finn, J. D. (1989). Withdrawing from School. Review of Educational Research, 59(2), 117–142. DOI: https://doi.org/10.3102/00346543059002117
Fredricks, J., Blumenfeld, P., & Paris, A. (2004). School Engagement: Potential of the Concept, State of the Evidence. Review of Educational Research Spring, 74(1), 59–109. DOI: https://doi.org/10.3102/00346543074001059
Goodyear, P., & Dimitriadis, Y. (2013). In medias res: reframing design for learning. Research in Learning Technology, 21(1), 19909. DOI: https://doi.org/10.3402/rlt.v21i0.19909
Goodyear, P., & Retalis, S. (2010). Technology-Enhanced Learning. Rotterdam: Sense Publishers. DOI: https://doi.org/10.1080/14759390802383827
Grissom, S., McCauley, R., & Murphy, L. (2017). How Student Centered is the Computer Science Classroom? A Survey of College Faculty. ACM Transactions on Computing Education, 18(1), 1–27. DOI: https://doi.org/10.1145/3143200
Håkansson Lindqvist, M. (2015). Conditions for Technology Enhanced Learning and Educational Change: a case study of a 1:1 initiative. Doctoral Dissertation; Umeå University. Retrieved from: https://www.diva-portal.org/smash/get/diva2:859735/FULLTEXT01.pdf.
Håkansson Lindqvist, M. (2015). Gaining and Sustaining TEL in a 1:1 Laptop Initiative: Possibilities and Challenges for Teachers and Students. Computers in the Schools, 32(1), 35–62. DOI: https://doi.org/10.1080/07380569.2015.1004274
Han, J. H., & Finkelstein, A. (2013). Understanding the effects of professors’ pedagogical development with Clicker Assessment and Feedback technologies and the impact on students’ engagement and learning in higher education. Computers and Education, 65, 64–76. DOI: https://doi.org/10.1016/j.compedu.2013.02.002
Herrington, J., & Reeves, T. C. (2011). Using design principles to improve pedagogical practice and promote student engagement. ASCILITE 2011 – The Australasian Society for Computers in Learning in Tertiary Education, 594–601. Retrieved from: http://www.scopus.com/inward/record.url?eid=2-s2.0-84870769433&partnerID=40&md5=8175e94457e82b8e86cf7a73eb5c60e3.
Kensing, F., & Madsen, K. (1991). Generating Visions: Future Workshops and Metaphorical Design. In: Greenbaum, J., & Kyng, M. (eds.), Design at Work Cooperative Design of Computer Systems, 155–168. Hillsdale, New Jersey: Lawrence Erlbaum Associates, Publishers.
Kirschner, P. A. (2015). Do we need teachers as designers of technology enhanced learning? Instructional Science, 43(2), 309–322. DOI: https://doi.org/10.1007/s11251-015-9346-9
Laurillard, D. (2013). Rethinking university teaching: A conversational framework for the effective use of learning technologies, 2nd ed. London, UK: Routledge. DOI: https://doi.org/10.4324/9780203304846
Laurillard, D., & Derntl, M. (2014). Learner Centred Design-Overview. In: Mor, Y., Mellar, H., Warburton, S., & Winters, N. (eds.), Practical Design Patterns for Teaching and Learning with Technology, 13–16. Rotterdam: Sense Publishers. DOI: https://doi.org/10.1007/978-94-6209-530-4_2
McAlister, S., Ravenscroft, A., & Scanlon, E. (2004). Combining interaction and context design to support collaborative argumentation using a tool for synchronous CMC. Journal of Computer Assisted Learning, 20(3), 194–204. DOI: https://doi.org/10.1111/j.1365-2729.2004.00086.x
Pellas, N. (2014). The influence of computer self-efficacy, metacognitive self-regulation and self-esteem on student engagement in online learning programs: Evidence from the virtual world of Second Life. Computers in Human Behavior, 35, 157–170. DOI: https://doi.org/10.1016/j.chb.2014.02.048
Reeve, J. (2012). A Self-determination Theory Perspective on Student Engagement. In: Christenson, S., Reschly, A., & Wylie, C. (eds.), Handbook of Research on Student Engagement, 149–172. Boston, MA: Springer US. DOI: https://doi.org/10.1007/978-1-4614-2018-7_7
Seifert, K., & Sutton, R. (2009). Educational Psychology, 2nd ed. The Saylor Foundation. Retrieved from: https://www.saylor.org/site/wp-content/uploads/2012/06/Educational-Psychology.pdf.
Wang, M.-T., & Eccles, J. (2013). School context, achievement motivation, and academic engagement: A longitudinal study of school engagement using a multidimensional perspective. Learning and Instruction, 28, 12–23. DOI: https://doi.org/10.1016/j.learninstruc.2013.04.002
Wang, M.-T., Fredricks, J., Ye, F., Hofkens, T., & Linn, J. S. (2017). Conceptualization and Assessment of Adolescents’ Engagement and Disengagement in School. European Journal of Psychological Assessment, 1–15. DOI: https://doi.org/10.1027/1015-5759/a000431
Weitze Laerke, C. (2016). Learning Design Patterns for Hybrid Synchronous Video-Mediated Learning Environments. In: Nortvig, A.-M., Sørensen, B. H., Misfeldt, M., Ørngreen, R., Allsopp, B. B., Henningsen, B. S., & Hautopp, H. (eds.), Proceedings of the 5th International Conference on Designs for Learning, 1st ed., 236–252. Aalborg Universitetsforlag. DOI: https://doi.org/10.1007/978-3-662-48768-6
Willig, C. (1999). Beyond appearances: A critical realist approach to social constructionism. In: Nightingale, D., & Cromby, J. (eds.), Social constructionist psychology: A critical analysis of theory and practice, 37–51. Open University Press.
Winters, N., & Mor, Y. (2008). IDR: A participatory methodology for interdisciplinary design in technology enhanced learning. Computers and Education, 50(2), 579–600. DOI: https://doi.org/10.1016/j.compedu.2007.09.015