
Research

Assessment of Information and Communication Technology Competencies in Design-Based Learning Environments

Authors:

Hasan Çakır, Gazi University, TR
Harun Bahadır, Gazi University, TR
Aslıhan Tüfekci, Gazi University, TR

Abstract

Design-based learning (DBL) enables 21st-century skills to be gained through design-oriented production processes. Jonassen’s framework for “Designing Constructivist Learning Environments” (CLEs) is a suitable model for designing a DBL environment. In such learning environments, students must have a certain level of information and communication technology (ICT) competency in order to achieve the learning goals. The aim of the study is to identify the ICT competency areas that are likely to be used in each phase of CLEs and to develop a scale for assessing students’ use of these technologies. The study is important because it enables teachers to evaluate learners, and themselves, technologically before instruction starts in such an environment. A literature review, expert opinions, and a focus group interview were used to develop the scale. In order to determine the construct validity of the scale, principal component analysis was conducted for each technology competency area identified on the scale. As a result of the analysis, the Cronbach’s alpha values of the subscales of the measuring tool were found to be between 0.817 and 0.993. Cronbach’s alpha values above 0.80 indicate that the scale is adequately valid and reliable (Cronbach & Meehl, 1956). In line with these results, the “Design-based learning environments technological competencies scale” is considered to be a suitable tool for assessing the ICT competency of teachers and students before starting teaching in learning environments based on the constructivist approach.

How to Cite: Çakır, H., Bahadır, H., & Tüfekci, A. (2021). Assessment of Information and Communication Technology Competencies in Design-Based Learning Environments. Designs for Learning, 13(1), 55–70. DOI: http://doi.org/10.16993/dfl.160
Published on 25 Nov 2021. Accepted on 06 May 2021. Submitted on 28 Apr 2020.

Introduction

In today’s competitive work environment, where information and circumstances constantly evolve while time remains limited, individuals are expected to keep up with this pace. These expectations, in other words 21st-century abilities, include taking initiative, thinking critically, learning how to learn, working cooperatively, posing questions, and self-regulating (P21, 2019). In the constructivist approach, individuals are not passive recipients but play an active role in their learning; teachers are not transmitters but mainly guides; learners observe realities in the outer world while being involved in the construction of knowledge through social interaction. Reflecting educational systems designed with the constructivist approach in teaching processes is at least as important as enabling students to acquire these abilities. Jonassen’s (1999) “constructivist learning environment design framework” (CLEs) describes such a teaching system, one that encourages learners to solve problems and to produce. Through design-based learning, which is effective in helping learners acquire 21st-century abilities and whose popularity has been increasing in the past few years, learners improve their learning skills by developing an application, software, or product in a design-based manner. For design-based learning to be implemented effectively in a classroom environment, learners must have adequate technological competencies. In the literature, there are various scales that assess technological competency, such as those of ISTE (2016), the European Commission (2013), Europass (2015), ECDL (n.d.), and NETg (n.d.). However, these scales were developed with items that are not based on an educational framework and only assess competencies related to hardware, operating systems, and application software. This study aims to determine the competency areas that underpin the technological competencies of learners studying in design-based learning environments based on the constructivist teaching approach, and to develop a scale that assesses them.

Conceptual Framework

Design-based learning (DBL), which helps learners acquire 21st-century abilities such as problem-solving, collaborative work, active learning, and critical thinking (Chandrasekaran et al., 2013), is a technique in which learners use design methodology to produce creative and innovative practical solutions to problems (Nelson, 2004). Design-based learning environments, in which learners build their cognitive structures as a result of design processes (Kolodner et al., 1998), increase learners’ motivation as well as their interest and curiosity in the subject (Doppelt & Schunn, 2008; Gardner, 2012), while improving academic success by helping learners develop creative and critical thinking skills (Doppelt, 2009). DBL not only increases academic performance but also connects real-life problems and experiences with teaching programs (Lee & Breitenberg, 2010).

DBL is an approach that is effective in teaching and learning complex and difficult subjects (Apedoe et al., 2008). In their 8-week study using the DBL approach in biology education, Ellefson et al. (2008) found that DBL is an effective tool for learning complex subjects such as biology. In research conducted by Mehalik, Doppelt, & Schunn (2008) with 10 teachers and 587 eighth-grade students, in which learners designed an electrical alarm system over 4 weeks, the DBL approach was found to be effective for acquiring and retaining basic science concepts. In their study on transferring knowledge obtained via DBL to a design problem, Fortus, Krajcik, Dershimer, Marx, & Mamlok-Naaman (2005) observed that 149 students succeeded in learning the subject content and transferring it to a new design assignment. In a study by Kim et al. (2015), which investigated DBL activities in depth with an exploratory method and fifth-grade participants, education designed with the DBL approach was found to be interesting, enjoyable, and contextual. Furthermore, learners in DBL environments acquire creative thinking skills as a result of design activities (Davis, 1998; Kafai, 1995; Seitamaa-Hakkarainen, 2011).

DBL forms a basis for constructivist learning since it requires the production of a product and requires teachers to support learners in being active, working collaboratively, and interacting socially with one another in the process of learning (Kafai, 1995; Ke, 2014). The CLEs framework is an ideal one for designing DBL environments, given its expectations of teachers and students as well as its explanatory descriptions of the learning process. Learning is organized around a project or problem instead of subject content, thereby enabling learners to acquire critical thinking skills and self-learning management skills (Duffy & Cunningham, 1996).

In the design of DBL activities created on the basis of constructivist learning, information and communication technology is used both to support the process and to facilitate learning. In such learning environments, technology multiplies the communication channels between learners and allows collaborative learning to continue outside the classroom (Feyzi Behnagh & Yasrebi, 2020; Levy, 1997). Technology that enables the simulation of difficult and dangerous situations also allows time to be used efficiently. Thus, with technology, students can focus more on problem-solving or product development. For this reason, it is important to determine the technologies that will be needed in DBL environments and to establish the competencies students need in order to use them. In line with this need, the technologies that can be used in the stages of the CLEs framework were identified by reviewing the studies in the literature that address technology. The CLEs framework has five essential components, shown in Figure 1.

Figure 1 

Jonassen’s Framework for a Constructivist Learning Environment (Jonassen, 1999).

Problem/Project

The focus of CLEs comprises a question, problem, or project (Jonassen, 1999). Learning is organized around a project or problem instead of subject content and thereby enables learners to acquire critical thinking skills and self-learning management skills (Duffy & Cunningham, 1996).

Problem context

Problem context refers to the environment of a problem, or the environment in which the project is formed or comes to life, as well as the influential elements and stakeholders of this environment. Problem database websites, through which students and teachers can search for and filter problem descriptions, are appropriate for revealing the context of a problem. In this way, students have a summary view of solved problems, unsolved problems and their possible solutions, progress reports, and comments.

Problem representation

Problem representation is the presentation to students of the stakeholders, environment, and elements of the problem explained in the context, from various aspects. Podcasting, which can be used for observing these aspects, enables learners to learn fundamental concepts while allowing them to observe the relationship between these concepts and real life (Besser et al., 2021; Mitchell et al., 2021; Moryl, 2013). Concept maps, which are used as an environment for constructivist learning activities (Cañas et al., 2003), are a significant tool for revealing learners’ meta-cognitive skills (Brondfield et al., 2019; Marzano & Miranda, 2021; Novak, 1990; Shin & Jeong, 2021).

Manipulation space

Problem manipulation space is an environment where learners can test and see the results of the hypotheses they have developed for the solution of the problem, or of the phases they have developed for the project. The technologies to be used in this phase vary according to the content of the studied subject; they may include virtual-world simulation tools, data analysis software, or electronic spreadsheet software used as a manipulation medium.

Related cases

This element serves to strengthen learners’ experience in determining a solution strategy, which may be lacking for a newly encountered problem. Video search engines, which improve critical thinking skills and in-depth learning (Clifton & Mann, 2011; Palla & Sheikh, 2020; Ruggieri, 2020), are appropriate for determining which solution strategy is the most suitable for the problem. Video search engines also enable learners to engage with the learning environment while allowing them to see its elements in detail (Guo et al., 2014; Halpern et al., 2020).

Information Resources

Many technologies can be included in the learning environment as information resources, which are required to understand the problem. Learners can obtain information about the problem through technologies such as search engines, which improve individuals’ self-regulation skills while providing enriched search strategies with peer collaboration (Lazonder, 2001; Mracek, 2019); e-museums, which can be used for understanding difficult subjects and developing different perspectives (Çalık et al., 2016; Neill, 2008); online questionnaires, which can be used to collect data from individuals living in different geographies (Sue & Ritter, 2011); and RSS, a digital way of obtaining information about course-related and non-course-related subjects.
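As a simple illustration of how such a web feed can be consumed, the snippet below lists the newest entries of a feed. It is a minimal sketch assuming the Python feedparser package and a placeholder feed URL; any RSS or Atom feed of a course blog or news site could be substituted.

```python
import feedparser  # third-party package: pip install feedparser

# Placeholder feed URL (hypothetical); replace with a real course or news feed.
feed = feedparser.parse("https://example.com/course-news/rss.xml")

# Print the five most recent entries with their links.
for entry in feed.entries[:5]:
    print(entry.title, "-", entry.link)
```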

Cognitive (Knowledge-Construction) Tools

Cognitive tools enable learners to focus on the cognitive processes that are relevant to the solution of the problem. Language translation software, which provides quick and free translation of the foreign languages that are obstacles to learners’ academic success (Chung & Ahn, 2021; Lee & Briggs, 2021; Muzdalifah & Handayani, 2020; van Rensburg et al., 2012), is one of the technologies helping students focus on the problem. Office tools are essential technologies used by learners for writing reports on a problem as well as for creating and presenting graphics. Multimedia creation and editing software, required for editing the multimedia collected in the process of data collection, is sought by learners in the process of problem-solving. Digital calendar applications are necessary for learners to focus on collaborative solution processes throughout the research process.

Conversation and Collaboration Tools

Conversation and collaboration tools refer to the media that enable social interaction among learners as well as access to, storage of, and editing of the information collected through such interactions. It is believed that social media, which students use frequently in all areas of their everyday life, can serve as an effective tool for learning (Bozanta & Mardikyan, 2017). In their 8-week study with 40 graduate students, Zhang, Chen, de Pablos, Lytras, & Sun (2016) found that social media increases teamwork in collaborative learning environments. E-mail is a convenient communication medium for collaborative learning since it enables interaction between two learners or between learner and teacher (Warschauer, 1995). Online communication applications support learning and teaching processes by fostering collaborative work in learning environments (Cassany et al., 2019; Kapoor et al., 2019; Ngaleka & Uys, 2013; So, 2016). In their exploratory study, Bouhnik & Deshen (2014) found that, as a learning platform, WhatsApp has positive aspects such as access to educational materials, promoting sharing among learners, creating dialogue, easy communication with teachers, and the continuation of extracurricular learning. Cloud technologies are believed to increase the quality of education (Arroyo et al., 2020; Gurung et al., 2016) by supporting collaborative studying (Bakla, 2020; El Mhouti et al., 2016; Savelyeva et al., 2021) while maximizing the sharing of resources in a learning environment. In a study by Lin, Wen, Jou, & Wu (2014), which evaluated education delivered in cloud-based learning environments through tests, questionnaires, and interviews, it was concluded that such learning environments strengthen learners’ reflection skills while increasing their motivation. Google Docs provides a collaborative editing environment for learners: Ishtaiwa & Aburezeq (2015) concluded that Google Docs improves student-student, student-teacher, student-content, and student-interface interaction.

Wikis, web pages whose content can be created and edited by one person or by multiple people, are also an effective way to create information collaboratively (Duffy & Bruns, 2006; Kim & Kim, 2020; Li et al., 2021). In a meta-analysis of 25 studies examining the impact of wikis on learning outcomes, Trocky & Buckley (2016) found that learners’ writing and collaborative working skills increased. Forums, where dialogue occurs through discussions and messages, are effective tools for promoting communication and cooperation in learning environments (Ioannou et al., 2015). In research carried out by Shana (2009) with 54 distance-education students, the course success of the experimental group, which received instruction with a discussion forum, was better than that of the control group, whose instruction did not include one; moreover, the attitude of the first group towards distance education was more positive than that of the other group. Blogs provide collaborative learning environments that increase social interaction and cognitive engagement (Erdogdu & Eskimen, 2020; Gurer, 2020; Noel, 2015). In a study carried out by Amir, Ismail, & Hussin (2011) with 80 students taking an English language teaching course, analysis of questionnaire and blog entry records showed that blogs improve learners’ collaborative writing skills while promoting high autonomy in peer interaction. E-portfolios not only offer a collaborative learning environment for learners (Habeeb & Ebrahim, 2019; Lam, 2020) but also support their career development (Luchoomun et al., 2010). In a study conducted by Jimoyiannis & Tsiotakis (2016) with 24 undergraduate students, e-portfolios were found to increase motivation and engagement by supporting learners’ collaborative work.

By allowing mass communication, video conferencing enables cooperation among learners (Nilsen, 2011). With features such as user editing and the addition of multimedia, geographical information systems, which emerged as a result of the impact of Web 2.0 technologies on mapping, have been used collaboratively in learning environments in recent years. Google Maps provides a collaborative study environment that improves learners’ reading, writing, speaking, and listening skills through the audio, visuals, videos, and routes learners leave on an interactive map, as well as their edits to the map (Sokolik, 2011). 3-D virtual worlds, which promote learner-learner and learner-content interaction (Warburton et al., 2009), support constructivist learning environments (Dickey, 2003; Jarmon et al., 2009).

Social/Contextual Support

Social/Contextual support refers to developing adaptations of solutions by taking into account the environment in which the problem occurred. In order to solve problems that may arise, learners must be able to perform basic technical tasks such as connecting a projector, connecting to a network, and troubleshooting technological problems, while keeping their digital skills up to date.

With the use of Web 2.0 applications in learning environments, online learning brings all the risks of the internet into those environments (Chen & He, 2013). Learners participate actively in the learning and teaching process only in situations in which they feel safe. The following headings should be taken into consideration for the digital safety of learners and teachers in learning environments (Europass, 2015; European Commission, 2013):

  • Updating the system, using antivirus software, identity theft, password security, security settings, device security, danger, risk, and measures
  • Spam
  • Individual privacy
  • Cyberbullying
  • Health risks

Education reflects the dynamics and culture of a society (Giavrimis et al., 2009). In information societies, the norms regarded as significant by members of digital environments should also be observed by learners in digital learning environments. These norms can be listed as follows (Europass, 2015; European Commission, 2013):

  • Ethics
  • Online dignity
  • Personal rights
  • Cultural and intergenerational variety
  • Copyrights and licenses
  • Netiquette
  • Communication rules

Communities of practice are groups of people who share their interests, desires, and problems related to a specific subject and who deepen their knowledge and experience through the interactions that occur in these exchanges (Wenger et al., 2002). In such an environment, individuals have the opportunity to search for and share information, to build trust and interaction with each other, and to apply what they have learned in the community to real life (Snyder et al., 2003). Therefore, communities of practice are considered a technology under the title of social/contextual support in constructivist learning environments. Virtual communities of practice, the digital version of communities of practice, are an important tool for revealing the culture, context, and activities of the problem (Brown et al., 1989).

Learners’ technology competencies are a prerequisite for achieving the targeted learning outcomes through design-based learning. Teachers wishing to implement this method should check whether this prerequisite is met, which is critical to ensuring an effective and efficient learning process. At the same time, it enables teachers to evaluate learners, and themselves, on a technological basis before education starts in a design-based learning environment. A technology competencies scale for design-based learning environments that teachers can use widely, easily, and safely is therefore thought to fill an important gap in the field.

Methodology

In this study, the literature was reviewed in depth, and 36 technological competency areas relevant to the CLEs framework, together with 198 items questioning the applicability of these areas in the learning environment, were developed. The scale items were presented to one measurement and evaluation specialist and two information and communication technology specialists for their opinion; afterwards, technology competency areas and items that were not considered suitable for use in learning environments at the secondary level were removed from the scale, leaving 26 technology competency areas and 131 items under these areas. The draft scale items were evaluated for readability with a group of 8 students using the focus group interview method. After this interview, the names of frequently used websites and technology applications were added to the technology competency domains, and the name of one technology competency area was changed. The 131-item draft scale was used to collect data from 152 middle school students aged 9–13 in a large city in Turkey in the 2018–2019 academic year. In order to determine the construct validity of the scale, principal component analysis was conducted for each technology competency area in the scale. A one-factor structure emerged for 24 of the technology competency areas; in the remaining two areas, two-factor structures emerged, with total variance explained values of about 77% and 79%. An item with similar factor load values in two factors was removed from the scale. The scale was completed with 26 technology competency areas and 130 items under these areas.

Study Group

The study group consisted of 152 students studying at a middle school in a large city in Turkey. Of these students, 77 (50.7%) were male and 75 (49.3%) were female; 4 (2.6%) were 9 years old, 2 (1.3%) were 10 years old, 18 (11.8%) were 11 years old, 30 (19.7%) were 12 years old, and 98 (64.5%) were 13 years old. Table 1 shows the students’ daily computer and internet usage.

Table 1

Study Group Daily Computer and Internet Usage Situations.

                    Daily Computer Usage        Daily Internet Usage
                    Frequency   Percent (%)     Frequency   Percent (%)

Less than 1 hour    76          50              21          13.8
1 hour              25          16.4            35          23
2 hours             18          11.8            30          19.7
3 hours             9           5.9             16          10.5
More than 4 hours   24          15.8            50          32.9
Total               152         100             152         100

Development of the scale

The scale development process is based on Jonassen’s CLEs framework, which provides a general framework for constructivist learning environments. The literature was reviewed in depth to determine the technologies appropriate to the stages and sub-headings of the framework. Data collection tools that measure individuals’ 21st-century skills, such as digital competencies and digital citizenship, were also examined, and those applicable to learning environments were evaluated. As a result of the literature review, 36 technology competency areas likely to be used in design-based learning environments were identified, as represented in Table 2, and a question pool of 198 items questioning the use of these technologies in learning environments was created.

Table 2

Technological Competency Areas Achieved as a result of Literature Review.

CLEs Framework | Technology Competency Areas Related to the Framework | Related Literature

1.Problem/Project
a. Problem Context Video Display, Problem Database Websites (Guo et al., 2014), (Dershowitz & Treinen, 1998)
b. Problem Representation Podcasting, Concept Maps, Virtual Communities of Practice (Besser et al., 2021), (Mitchell et al., 2021), (Moryl, 2013), (Novak, 1990), (Brown et al., 1989), (Brondfield et al., 2019), (Marzano & Miranda, 2021), (Shin & Jeong, 2021)
c. Manipulation Space Simulation tools, Data Analysis Software, Electronic Spreadsheet Software (Jonassen, 1999)
2.Related Case Search Engines, Video Search Engines (Lazonder, 2001), (Mracek, 2019), (Clifton & Mann, 2011), (Palla & Sheikh, 2020), (Ruggieri, 2020), (Halpern et al., 2020),
3.Information Resources Digital Library Databases, Google Scholar, E-Museums, Online Questionnaires, RSS (Borgman et al., 2000), (Cothran, 2011), (Çalık et al., 2016), (Neill, 2008), (Sue & Ritter, 2011)
4.Cognitive (Knowledge-Construction) Tools Language Translation Software, Office Software, Pdf Editing Software, Audio and Video Editing Software, Digital Calendar Application, Digital Note-Taking Application (Chung & Ahn, 2021), (Lee & Briggs, 2021), (Muzdalifah & Handayani, 2020), (van Rensburg et al., 2012),
5.Conversation And Collaboration Tools Social Media, E-Mail, Online Communication Applications, Cloud Technologies, Google Doc, Wiki, Forums, Blogs, E-Portfolios, Video Conference Software, Reference Management Software, Geographical Information Systems, Learning Management Systems, 3-D Virtual Worlds (Bozanta & Mardikyan, 2017), (Zhang et al., 2016), (Warschauer, 1995), (Cassany et al., 2019), (Kapoor et al., 2019), (Ngaleka & Uys, 2013), (So, 2016),(Bouhnik & Deshen, 2014),(El Mhouti et al., 2016), (Arroyo et al., 2020), (Bakla, 2020), (Gurung et al., 2016), (Savelyeva et al., 2021), (Lin et al., 2014), (Ishtaiwa & Aburezeq, 2015), (Duffy & Bruns, 2006), (Kim & Kim, 2020), (Li et al., 2021) (Trocky & Buckley, 2016), (Ioannou et al., 2015), (Shana, 2009), (Erdogdu & Eskimen, 2020), (Gurer, 2020), (Noel, 2015), (Amir et al., 2011), (Habeeb & Ebrahim, 2019), (Lam, 2020), (Luchoomun et al., 2010), (Jimoyiannis & Tsiotakis, 2016), (Nilsen, 2011), (Basri & Patak, 2015), (Sokolik, 2011), (Lonn & Teasley, 2009), (Warburton et al., 2009), (Dickey, 2003)
6.Social/Contextual Support Social/Contextual Support (Chen & He, 2013), (Europass, 2015), (European Commission, 2013),

This pool of questions was evaluated by one measurement and evaluation specialist and two information and communication technology specialists. As a result of the expert review, it was decided to remove 10 technology competency areas and the 67 items under them, to move 2 technology competency areas (7 items) under a different framework title, and to change the name of 1 technological competency area. Details of these changes are given below:

  • The “Video Display” technology competency area was removed because it did not fit the problem context title.
  • The “Data Analysis Software” area was removed as the “Electronic Spreadsheet Software” area was considered sufficient.
  • The “Search Engines” area under the “Related Case framework” title was taken under the “Information Resources” framework title.
  • The “Digital Library Databases” and “Google Scholar” areas were omitted because the “Search Engines” area was considered sufficient to cover them.
  • The “PDF Editing Software”, “Digital Note-Taking Application”, “Reference Management Software” and “Learning Management Systems” areas were removed considering that they would be very specific for secondary school level learning environments.
  • The “Audio and Video Editing Software” area has been changed to the “Creating/Editing multimedia” area.
  • The “Google Docs” area has been removed due to the “Cloud Technologies” area being considered sufficient.
  • The 15 items under the “Social/Contextual Support” area that referred to IT technicians were considered unsuitable for secondary school level learning environments and were removed.
  • The items of the “Virtual Communities of Practice” area were moved under the “Social/Contextual Support” area because they were considered suitable for it.

As a result, the scale, which consists of 26 technology competency areas and 131 items under these areas, as presented in Table 3, was developed.

Table 3

Technological Competency Areas as a Result of Field Expert Review.

CLEs Framework | Technology Competency Areas Related to the Framework | Related Literature

1.Problem/Project
a. Problem Context Problem Database Websites (Dershowitz & Treinen, 1998)
b. Problem Representation Podcasting, Concept Maps (Besser et al., 2021), (Mitchell et al., 2021), (Moryl, 2013), (Novak, 1990), (Brown et al., 1989), (Brondfield et al., 2019), (Marzano & Miranda, 2021), (Shin & Jeong, 2021)
c. Manipulation Space Simulation Tools, Electronic Spreadsheet Software (Jonassen, 1999)
2.Related Case Video Search Engines (Clifton & Mann, 2011), (Palla & Sheikh, 2020), (Ruggieri, 2020), (Halpern et al., 2020)
3.Information Resources Search Engines, E-Museums, Online Questionnaires, RSS (Lazonder, 2001), (Mracek, 2019), (Çalık et al., 2016), (Neill, 2008), (Sue & Ritter, 2011)
4.Cognitive (Knowledge-Construction) Tools Language Translation Software, Office Software, Creating/Editing Multimedia, Digital Calendar Application (Chung & Ahn, 2021), (Lee & Briggs, 2021), (Muzdalifah & Handayani, 2020), (van Rensburg et al., 2012)
5.Conversation And Collaboration Tools Social Media, E-Mail, Online Communication Applications, Cloud Technologies, Wiki, Forums, Blogs, E-Portfolios, Video Conference Software, Geographical Information Systems, 3-D Virtual Worlds (Bozanta & Mardikyan, 2017), (Zhang et al., 2016), (Warschauer, 1995), (Cassany et al., 2019), (Kapoor et al., 2019), (Ngaleka & Uys, 2013),(So, 2016),(Bouhnik & Deshen, 2014),(El Mhouti et al., 2016), (Arroyo et al., 2020), (Bakla, 2020), (Gurung et al., 2016), (Savelyeva et al., 2021), (Lin et al., 2014), (Ishtaiwa & Aburezeq, 2015), (Duffy & Bruns, 2006), (Kim & Kim, 2020), (Li et al., 2021) (Trocky & Buckley, 2016), (Ioannou et al., 2015), (Shana, 2009), (Erdogdu & Eskimen, 2020), (Gurer, 2020), (Noel, 2015), (Amir et al., 2011), (Habeeb & Ebrahim, 2019), (Lam, 2020), (Luchoomun et al., 2010), (Jimoyiannis & Tsiotakis, 2016), (Nilsen, 2011), (Basri & Patak, 2015), (Sokolik, 2011), (Lonn & Teasley, 2009), (Warburton et al., 2009), (Dickey, 2003)
6.Social/Contextual Support Social/Contextual Support (Chen & He, 2013), (Europass, 2015), (European Commission, 2013), (Brown et al., 1989).

The scale items were evaluated with a group of 8 students using the focus group interview method. As a result of the focus group interview, it was found that the students had difficulty understanding the meaning of the technology competency domain names. For this reason, after consulting the experts, it was deemed appropriate to add frequently used website and technology application names to the technology competency domain names as examples. Examples of this arrangement are presented in Table 4. Some student opinions regarding this issue are as follows:

“I don’t know what cloud technologies are, but I use Google Drive.”

“I didn’t hear the geographical information systems, but I did an address search on my phone and went there.”

“I have difficulty understanding the technological competency domain name in some items.”

“It is good to give examples alongside common names.”

Table 4

Technological Competency Areas and Application Examples Related To These Areas.

Technological Competency Areas | Favorite Application Example

Search Engines Google, Yandex, etc.
Office Software Microsoft Word, Microsoft Excel, Microsoft PowerPoint, Microsoft Access, etc.
Social Media Facebook, Twitter, Instagram, etc.
E-Mail Google Gmail, Microsoft Outlook, etc.
Online Communication Applications WhatsApp, Google Hangout, etc.
Cloud Technologies Google Drive, Microsoft OneDrive, Dropbox, etc.

After the focus group interview, it was decided to express the “RSS” technology competency area with a clearer name, “Web Feeds (RSS)”, and the design-based learning technological competency areas were finalized as shown in Table 5.

Table 5

Technological Competency Areas Organized as a Result of the Focus Group Interview.

CLEs Framework | Technology Competency Areas Related to the Framework | Related Literature

1.Problem/Project
a. Problem Context Problem Database Websites (Dershowitz & Treinen, 1998)
b. Problem Representation Podcasting, Concept Maps (Besser et al., 2021), (Mitchell et al., 2021), (Moryl, 2013), (Novak, 1990), (Brown et al., 1989), (Brondfield et al., 2019), (Marzano & Miranda, 2021), (Shin & Jeong, 2021)
c. Manipulation Space Simulation Tools (Packet tracer, OrCad, Proteus etc.), Electronic Spreadsheet Software (Microsoft Excel, etc.) (Jonassen, 1999)
2.Related Case Video Search Engines (YouTube, etc.) (Clifton & Mann, 2011), (Palla & Sheikh, 2020), (Ruggieri, 2020), (Halpern et al., 2020)
3.Information Resources Search Engines (Google, Yandex, etc.), E-Museums, Online Questionnaires, Web Feeds (RSS) (Lazonder, 2001), (Mracek, 2019), (Çalık et al., 2016), (Neill, 2008), (Sue & Ritter, 2011)
4.Cognitive (Knowledge-Construction) Tools Language Translation Software (Google Translate, etc.), Office Software (Microsoft Word, Microsoft Excel, Microsoft PowerPoint, Microsoft Access etc.), Creating/Editing Multimedia, Digital Calendar Application (Chung & Ahn, 2021), (Lee & Briggs, 2021), (Muzdalifah & Handayani, 2020), (van Rensburg et al., 2012)
5.Conversation And Collaboration Tools Social Media (Facebook, Twitter, Instagram, etc.), E-Mail (Google Gmail, Microsoft Outlook, etc.), Online Communication Applications (WhatsApp, etc.), Cloud Technologies (Google Drive, Microsoft OneDrive, Dropbox), Wiki (Wikimedia, etc.), Forums, Blogs (WordPress, etc.), E-Portfolios, Video Conference Software (Adobe Connect, Skype, etc.), Geographical Information Systems (Google Maps, Yandex Maps, etc.), 3-D Virtual Worlds. (Bozanta & Mardikyan, 2017), (Zhang et al., 2016), (Warschauer, 1995), (Cassany et al., 2019), (Kapoor et al., 2019), (Ngaleka & Uys, 2013),(So, 2016),(Bouhnik & Deshen, 2014),(El Mhouti et al., 2016), (Arroyo et al., 2020), (Bakla, 2020), (Gurung et al., 2016), (Savelyeva et al., 2021), (Lin et al., 2014), (Ishtaiwa & Aburezeq, 2015), (Duffy & Bruns, 2006), (Kim & Kim, 2020), (Li et al., 2021) (Trocky & Buckley, 2016), (Ioannou et al., 2015), (Shana, 2009), (Erdogdu & Eskimen, 2020), (Gurer, 2020), (Noel, 2015), (Amir et al., 2011), (Habeeb & Ebrahim, 2019), (Lam, 2020), (Luchoomun et al., 2010), (Jimoyiannis & Tsiotakis, 2016), (Nilsen, 2011), (Basri & Patak, 2015), (Sokolik, 2011), (Lonn & Teasley, 2009), (Warburton et al., 2009), (Dickey, 2003)
6.Social/Contextual Support Social/Contextual Support (Chen & He, 2013), (Europass, 2015), (European Commission, 2013), (Brown et al., 1989).

Implementation of the Scale

The scale was administered online to 152 middle school students in a large city in Turkey, on 20 computers in the school’s IT classroom. The average response time of the 152 students was 15.96 minutes. The researcher was present in the IT classroom while the students responded to the scale.

Analysis of the Scale

A large number of variables, many of which are correlated, makes a data set difficult to evaluate. In such cases, principal component analysis (PCA), which interprets the variance-covariance structure of the variable set through linear combinations of the variables and removes the dependency structure, is a very effective method (Ersungur et al., 2007). PCA aims to express the important information in a data set with a new set of variables called principal components. Compared to other factor extraction methods, PCA best explains the variance used in interpreting the reliability of a test under various sample sizes and common variance conditions (Karaman et al., 2017). It is an ideal method for dimension reduction, especially on a large data set (Abdi & Williams, 2010). In this study, PCA was used to reduce the technology dimensions determined by the literature review, in line with the purpose of the study. Orthogonal and oblique rotation methods are used to provide a more interpretable factor structure in PCA. In this study, Kaiser’s varimax rotation (Kaiser, 1958), the most widely preferred orthogonal rotation method in the literature (Jackson, 2005; Kleinbaum et al., 1988), was used. Since the original variables tend to be associated with a single principal component after rotation, the varimax method is considered an easily interpretable rotation procedure (Abdi & Williams, 2010). During the principal component analysis, Kaiser-Meyer-Olkin (KMO) and Bartlett values were also calculated.
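For readers who wish to reproduce this suitability check, the sketch below computes the two statistics for one block of scale items. It is a minimal example, assuming the open-source Python package factor_analyzer and a hypothetical responses data frame with one column per item; the original analysis may have been carried out in a different statistics package.

```python
import pandas as pd
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

def sampling_adequacy(items: pd.DataFrame) -> dict:
    """Bartlett's test of sphericity and the KMO measure for the items
    of a single technology competency area (one column per scale item)."""
    chi_square, p_value = calculate_bartlett_sphericity(items)
    _, kmo_total = calculate_kmo(items)  # per-item KMO values are discarded here
    return {"KMO": kmo_total, "Bartlett chi-square": chi_square, "Bartlett p": p_value}

# Hypothetical usage for the four Problem Context items (items 1-4):
# print(sampling_adequacy(responses[["item1", "item2", "item3", "item4"]]))
```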

Findings

Kaiser-Meyer-Olkin (KMO) and Bartlett analyses were conducted to test whether the data obtained from the study group were suitable for principal component analysis. As a result of the analysis, KMO values in the technology competency areas ranged from 0.694 to 0.919, while all Bartlett significance values were p = .000. KMO and Bartlett test results are presented in Table 6. The closer the KMO value is to 1 within the 0.60–1 range, the more adequate the data obtained from the sample (Tabachnick & Fidell, 2013). The Bartlett value should be significant at the p < .05 level (Tavşancıl, 2002). Based on these results, the data set was judged suitable for principal component analysis.

Table 6

KMO and Bartlett Test Values.

CLEs Framework | Technology competency areas related to the framework | Item No | KMO | Bartlett (Chi-Square, df, Sig.)

PROBLEM/PROJECT (1–19) Problem Context 1–4 0.789 183.161 6 .000
Problem Representation 5–11 0.864 364.515 21 .000
Manipulation Space 12–19 0.886 497.761 28 .000
RELATED CASE (20–24) Video Search Engines 20–24 0.754 279.483 10 .000
INFORMATION RESOURCES (25–40) Search Engines 25–28 0.798 221.587 6 .000
E-Museums 29–31 0.741 264.863 3 .000
Online Questionnaires 32–36 0.856 356.270 10 .000
Web Feeds (RSS) 37–40 0.768 295.677 6 .000
COGNITIVE (KNOWLEDGE-CONSTRUCTION) TOOLS (41–59) Language Translation Software 41–43 0.704 201.205 3 .000
Office Software 44–47 0.823 331.317 6 .000
Creating/Editing Multimedia 48–56 0.848 919.431 36 .000
Digital Calendar Application 57–59 0.726 196.967 3 .000
CONVERSATION AND COLLABORATION TOOLS (60–121) Social Media 60–64 0.850 329.331 10 .000
E-Mail 65–70 0.842 533.527 15 .000
Online Communication Applications 71–75 0.899 509.148 10 .000
Cloud Technologies 76–84 0.919 1287.437 36 .000
Wiki 85–91 0.918 945.137 21 .000
Forums 92–97 0.904 763.727 15 .000
Blogs 98–103 0.912 939.931 15 .000
E-Portfolios 104–108 0.843 578.537 10 .000
Video Conference Software 109–114 0.900 906.020 15 .000
Geographical Information Systems 115–117 0.694 214.042 3 .000
3-D Virtual Worlds 118–121 0.818 388.586 6 .000
SOCIAL/CONTEXTUAL SUPPORT (122–131) Social/Contextual Support 122–131 0.917 1094.918 45 .000

Since the data were found suitable for principal component analysis (PCA), PCA was carried out to evaluate the factor structure. The analysis was performed separately for each title under the CLEs framework, using varimax rotation. Information about factor load values and the number of factors after rotation is presented in Table 7.
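Before turning to those results, a per-area extraction of the kind summarized in Table 7 can be sketched in a few lines of code. This is a minimal, illustrative example assuming the same factor_analyzer package and a hypothetical responses data frame; its “principal” extraction approximates the PCA solution reported here, so numerical details may differ slightly from those produced by the statistics package used in the study.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

def extract_components(items: pd.DataFrame) -> dict:
    """Principal-component extraction with varimax rotation for one
    technology competency area, mirroring the quantities in Table 7."""
    # Pass 1: unrotated solution to count components with eigenvalue > 1 (Kaiser criterion).
    probe = FactorAnalyzer(n_factors=1, rotation=None, method="principal")
    probe.fit(items)
    eigenvalues, _ = probe.get_eigenvalues()
    n_factors = max(int((eigenvalues > 1).sum()), 1)

    # Pass 2: retain those components and rotate (rotation is skipped for a single factor).
    fa = FactorAnalyzer(n_factors=n_factors,
                        rotation="varimax" if n_factors > 1 else None,
                        method="principal")
    fa.fit(items)
    _, _, cumulative = fa.get_factor_variance()

    return {
        "n_factors": n_factors,
        "communalities": fa.get_communalities(),   # extraction communalities
        "loadings": fa.loadings_,                  # rotated component matrix
        "total_variance_explained_pct": cumulative[-1] * 100,
    }

# Hypothetical usage for the Manipulation Space items (items 12-19):
# print(extract_components(responses[[f"item{i}" for i in range(12, 20)]]))
```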

Table 7

Communalities Extraction, Component Matrix, Total Variance Explained, and Cronbach’s Alpha Values.

CLEs Framework | Technology competency areas related to the framework | Items | Communalities Extraction | Component Matrix | Total Variance Explained (Cumulative, Number of factors) | Cronbach’s alpha

PROBLEM/PROJECT (1–19) Problem Context 1–4 >0.639 >0.799 67.980 1 0.942
Problem Representation 5–11 >0.456 >0.676 58.543 1
Manipulation Space 12–19 >0.620 >0.729 60.518 2
>0.760 75.499
RELATED CASE (20–24) Video Search Engines 20–24 >0.495 >0.703 59.271 1 0.817
INFORMATION RESOURCES
(25–40)
Search Engines 25–28 >0.574 >0.758 69.254 1 0.953
E-Museums 29–31 >0.831 >0.912 85.678 1
Online Questionnaires 32–36 >0.456 >0.675 70.824 1
Web Feeds (RSS) 37–40 >0.678 >0.823 75.513 1
COGNITIVE (KNOWLEDGE-CONSTRUCTION) TOOLS (41–59) Language Translation Software 41–43 >0.689 >0.830 77.611 1 0.953
Office Software 44–47 >0.692 >0.832 77.933 1
Creating/Editing Multimedia 48–56 >0.707 >0.783 65.162 2
>0.874 79.368
Digital Calendar Application 57–59 >0.743 >0.862 80.048 1
CONVERSATION AND COLLABORATION TOOLS (60–121) Social Media 60–64 >0.596 >0.772 70.022 1 0.989
E-Mail 65–70 >0.539 >0.734 68.178 1
Online Communication Applications 71–75 >0.695 >0.833 77.629 1
Cloud Technologies 76–84 >0.634 >0.797 80.405 1
Wiki 85–91 >0.784 >0.885 84.024 1
Forums 92–97 >0.788 >0.888 82.091 1
Blogs 98–103 >0.806 >0.898 85.968 1
E-Portfolios 104–108 >0.808 >0.899 85.344 1
Video Conference Software 109–114 >0.828 >0.910 87.367 1
Geographical Information Systems 115–117 >0.760 >0.872 80.632 1
3-D Virtual Worlds 118–121 >0.742 >0.862 82.110 1
SOCIAL/CONTEXTUAL SUPPORT (122–131) Social/Contextual Support 122–131 >0.586 >0.765 71.192 1 0.953

The Problem Context technology competency area under the Problem/Project framework title includes 4 items; the lowest factor load value of these items is 0.639, and the single factor that emerges explains 67.980% of the variance. The Problem Representation area contains 7 items; the lowest factor load value is 0.456, and the single factor explains 58.543% of the variance. The Manipulation Space area has 8 items, with a lowest factor load value of 0.620; one item in this area was removed because its load values on the two factors were very close to each other, and the total variance explanation rate of the two factors emerging in this area is 60.518%. The Cronbach’s alpha value of the 19 items under the Problem/Project framework title (items 1–19) was calculated as 0.942.

Video Search Engines is the only technology competency area under the Related Case framework title and contains 5 items. The lowest factor load value of these items is 0.495. The Cronbach’s alpha value of this factor dimension, which explains 59.271% of the total variance, is 0.817.

The Search Engines technology competency area under the Information Resources framework title includes 4 items; the lowest factor load value is 0.574, and the single factor that emerges explains 69.254% of the variance. The E-Museums area contains 3 items, with a lowest factor load value of 0.831, and its single factor explains 85.678% of the variance. The Online Questionnaires area has 5 items, with a lowest factor load value of 0.456, and its single factor explains 70.824% of the variance. The Web Feeds (RSS) area includes 4 items, with a lowest factor load value of 0.678, and its single factor explains 75.513% of the variance. The Cronbach’s alpha value of these 16 items under the Information Resources framework title (items 25–40) was calculated as 0.953.

The Language Translation Software technology competency area under the Cognitive (Knowledge-Construction) Tools framework title includes 3 items; the lowest factor load value is 0.689, and the single factor that emerges explains 77.611% of the variance. The Office Software area contains 4 items, with a lowest factor load value of 0.692, and its single factor explains 77.933% of the variance. The Creating/Editing Multimedia area has 9 items, with a lowest factor load value of 0.707; the total variance explanation rate of the two factors emerging in this area is 65.162%. The Digital Calendar Application area includes 3 items, with a lowest factor load value of 0.743, and its single factor explains 80.048% of the variance. The Cronbach’s alpha value of these 19 items under the Cognitive (Knowledge-Construction) Tools framework title (items 41–59) was calculated as 0.953.

The Social Media technology competency area under the Conversation and Collaboration Tools framework title includes 5 items; the lowest factor load value is 0.596, and the single factor that emerges explains 70.022% of the variance. The E-Mail area contains 6 items, with a lowest factor load value of 0.539, and its single factor explains 68.178% of the variance. The Online Communication Applications area includes 5 items, with a lowest factor load value of 0.695, and its single factor explains 77.629% of the variance. The Cloud Technologies area includes 9 items, with a lowest factor load value of 0.634, and its single factor explains 80.405% of the variance. The Wiki area contains 7 items, with a lowest factor load value of 0.784, and its single factor explains 84.024% of the variance. The Forums area includes 6 items, with a lowest factor load value of 0.788, and its single factor explains 82.091% of the variance. The Blogs area includes 6 items, with a lowest factor load value of 0.806, and its single factor explains 85.968% of the variance. The E-Portfolios area contains 5 items, with a lowest factor load value of 0.808, and its single factor explains 85.344% of the variance. The Video Conference Software area includes 6 items, with a lowest factor load value of 0.828, and its single factor explains 87.367% of the variance. The Geographical Information Systems area includes 3 items, with a lowest factor load value of 0.760, and its single factor explains 80.632% of the variance. The 3-D Virtual Worlds area contains 4 items, with a lowest factor load value of 0.742, and its single factor explains 82.110% of the variance. The Cronbach’s alpha value of these 62 items under the Conversation and Collaboration Tools framework title (items 60–121) was calculated as 0.953.

Social/Contextual Support is the only technology competency area under the Social/Contextual Support framework title and contains 10 items. The lowest factor load value of these items is 0.586. The Cronbach’s alpha value of this factor dimension, which explains 71.192% of the total variance, is 0.953.

As detailed above, the total variance explained values in the technology competency areas are between 58.543% and 87.367%. Total variance explained values above 50%, which are used in determining the factor structure, indicate that the factors are representative. Factor load values are between 0.456 and 0.831; factor load values above 0.40, which indicate the relationship between each item and the factor it belongs to, show that the items measure their factors. Cronbach’s alpha values for the CLEs framework titles, which can also be regarded as subscales of the measurement tool, were found to be between 0.817 and 0.993. Cronbach’s alpha values above 0.80 indicate that the scale is valid and reliable (Cronbach & Meehl, 1956). As a result of the principal component analysis carried out after the literature review, expert opinion, and focus group interview, the Design-Based Learning Environments Technological Competencies Scale (dble_TCS) is considered a suitable measurement tool for measuring students’ information and communication technology competencies in design-based learning environments.
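The Cronbach’s alpha values reported for the subscales follow the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The sketch below is a generic implementation of this formula for a hypothetical responses data frame with one column per scale item, not the exact routine used in the study.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of scale items (one column per item)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)          # variance of each item
    total_score_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_score_variance)

# Hypothetical usage for the Related Case subscale (items 20-24):
# print(cronbach_alpha(responses[[f"item{i}" for i in range(20, 25)]]))
```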

Discussion and Conclusion

In the constructivist approach, planning and implementing learning environments is a costly and time-consuming process compared to behavioral and cognitive learning environments. Students’ cognitive readiness and technological competency play an important role in the success of this process. The desired level of student technology competency provides an effective and efficient learning process while increasing student motivation. This study aimed to identify student technological competency areas in design-based learning environments by using Jonassen’s CLEs framework and to develop a scale that measures them. For this purpose, the literature was first reviewed, and 36 technological competency areas relevant to the framework, together with 198 items questioning the applicability of these areas in the learning environment, were developed. This question pool, with 5-point Likert-type ratings, was presented for the opinions of one measurement and evaluation specialist and two information and communication technology specialists. As a result of the specialist evaluation, 10 technology competency areas and the 67 items under them were excluded from the question pool, two technology competency areas (7 items) were moved under a different framework title, and the name of one technological competency area was changed. Following the expert opinion, 26 technology competency areas and 131 items under these areas were created. The scale was then discussed with 8 students in a focus group interview; after this interview, the names of frequently used websites and technology applications were added to the technology competency domains, and the name of one technology competency area was changed. The 131-item draft scale was applied to 152 secondary school students. The Kaiser-Meyer-Olkin (KMO) and Bartlett test results, calculated separately for each technology competency area under the CLEs framework titles, showed that the data set was suitable for principal component analysis.

In order to determine the construct validity of the scale, principal component analysis was conducted for each technology competency area on the scale. A one-factor structure emerged for 24 of the technology competency areas; in the remaining two areas, two-factor structures emerged, with total variance explained values of about 77% and 79%. An item with similar factor load values in two factors was removed from the scale, and the scale was completed with 26 technology competency areas and 130 items under these areas. When the factor load, explained total variance, and Cronbach’s alpha values of the CLEs framework titles, which can also be regarded as subscales of the scale, were examined, it was concluded that the scale is valid and reliable.

Scales with a response time of less than 30 minutes are considered applicable (Yücedağ, 1993). The average time taken by the 152 students to answer dble_TCS was 15.96 minutes. Indecision and lack of information on a scale lead to long response times (Heerwegh, 2003); response time is therefore taken as an indication of uncertainty and response errors in a survey (Yan & Tourangeau, 2008). Although dble_TCS has 130 items, the low average response time indicates that it is understandable to the target group and was answered decisively. In light of these evaluations, dble_TCS is a feasible scale for the target audience. dble_TCS, developed within the scope of this research, is thought to be an appropriate tool for teachers and students to evaluate their technological competency in learning environments based on the constructivist approach.

Suggestions for Future Research

The study included 152 participants to test the reliability of the developed scale; the study can be repeated with a larger sample of at least 300 participants. The participants were at the middle school level; the scale can also be tested at different education levels by repeating the study with high school and higher education students.

Limitations

Information and communication technologies are constantly evolving and changing. The study was carried out within the scope of currently available technologies, which is one of its limitations. Another limitation is that the scale and the study were designed and implemented within Jonassen’s (1999) framework of constructivist learning environment design principles. Although this framework is considered a comprehensive guide for designing constructivist learning, elements of other constructivist learning design guidelines, such as R2D2 (Willis, 1995), were not considered in this study.

Competing Interests

The authors have no competing interests to declare.

References

  1. Abdi, H., & Williams, L. J. (2010). Principal component analysis. WIREs Computational Statistics, 2(4), 433–459. DOI: https://doi.org/10.1002/wics.101 

  2. Amir, Z., Ismail, K., & Hussin, S. (2011). Blogs in Language Learning: Maximizing Students’ Collaborative Writing. Kongres Pengajaran Dan Pembelajaran UKM, 2010, 18, 537–543. DOI: https://doi.org/10.1016/j.sbspro.2011.05.079 

  3. Apedoe, X. S., Reynolds, B., Ellefson, M. R., & Schunn, C. D. (2008). Bringing engineering design into high school science classrooms: The heating/cooling unit. Journal of Science Education and Technology, 17(5), 454–465. DOI: https://doi.org/10.1007/s10956-008-9114-6 

  4. Arroyo, A. G. C., Matías, F. M. C., & Escobar, O. D. O. (2020). Collaborative Work Mediated by the Use of Google Drive as a Teaching Strategy Learning in Subjects of Research in Students of the Health Area. The FASEB Journal, 34(S1), 1–1. DOI: https://doi.org/10.1096/fasebj.2020.34.s1.03831 

  5. Bakla, A. (2020). A mixed-methods study of feedback modes in EFL writing. Language Learning & Technology, 24(1), 107–128. 

  6. Basri, M., & Patak, A. A. (2015). Exploring Indonesian students’ perception on Mendeley Reference Management Software in academic writing. 2015 2nd International Conference on Information Technology, Computer, and Electrical Engineering (ICITACEE), 8–13. DOI: https://doi.org/10.1109/ICITACEE.2015.7437761 

  7. Besser, E. D., Blackwell, L. E., & Saenz, M. (2021). Engaging Students Through Educational Podcasting: Three Stories of Implementation. Technology, Knowledge and Learning, 1–16. DOI: https://doi.org/10.1007/s10758-021-09503-8 

  8. Borgman, C. L., Gilliland-Swetland, A. J., Leazer, G. H., Mayer, R., Gwynn, D., Gazan, R., & Mautone, P. (2000). Evaluating Digital Libraries for Teaching and Learning in Undergraduate Education: A Case Study of the Alexandria Digital Earth ProtoType (ADEPT). Library Trends, 49(2), 228. 

  9. Bouhnik, D., & Deshen, M. (2014). WhatsApp Goes to School: Mobile Instant Messaging between Teachers and Students. Journal of Information Technology Education, 13, 217–231. a9h. DOI: https://doi.org/10.28945/2051 

  10. Bozanta, A., & Mardikyan, S. (2017). The Effects Of Socıal Medıa Use On Collaboratıve Learnıng: A Case Of Turkey. The Turkish Online Journal of Distance Education, 18(1), 96–110. edsdoj. DOI: https://doi.org/10.17718/tojde.285719 

  11. Brondfield, S., Seol, A., Hyland, K., Teherani, A., & Hsu, G. (2019). Integrating Concept Maps into a Medical Student Oncology Curriculum. Journal of Cancer Education, 1–7. DOI: https://doi.org/10.1007/s13187-019-01601-7 

  12. Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42. DOI: https://doi.org/10.3102/0013189X018001032 

  13. Çalık, A., Ulugergerli, E. U., Yaşar, C., & Taşpınar, K. (2016). Virtual Learning and Contribution from E-museum an Example for Earth Sciences. 16th International Multidisciplinary Scientific Geo Conference SGEM 2016, Conference Proceedings, Book5, 3, 1077–1084. 

  14. Cañas, A. J., Coffey, J. W., Carnot, M.-J., Feltovich, P., Hoffman, R. R., Feltovich, J., & Novak, J. D. (2003). A summary of literature pertaining to the use of concept mapping techniques and technologies for education and performance support. Report to the Chief of Naval Education and Training, Pensacola, Florida, IHMC. 

  15. Cassany, D., Villanueva, C. A., & Ferrer, M. S. (2019). Whatsapp alrededor del aula. Caracteres: Estudios Culturales y Críticos de La Esfera Digital, 8(2), 302–328. 

  16. Chandrasekaran, S., Stojcevski, A., Littlefair, G., & Joordens, M. (2013). Project-oriented design-based learning: Aligning students’ views with industry needs. International Journal of Engineering Education, 29(5), 1109–1118. 

  17. Chen, Y., & He, W. (2013). Security risks and protection in online learning: A survey. The International Review Of Research In Open And Distributed Learning, 14(5). DOI: https://doi.org/10.19173/irrodl.v14i5.1632 

  18. Chung, E. S., & Ahn, S. (2021). The effect of using machine translation on linguistic features in L2 writing across proficiency levels and text genres. Computer Assisted Language Learning, 1–26. DOI: https://doi.org/10.1080/09588221.2020.1871029 

  19. Clifton, A., & Mann, C. (2011). Can YouTube enhance student nurse learning? Nurse Education Today, 31(4), 311–313. DOI: https://doi.org/10.1016/j.nedt.2010.10.004 

  20. Cothran, T. (2011). Google Scholar acceptance and use among graduate students: A quantitative study. Library & Information Science Research, 33(4), 293–301. DOI: https://doi.org/10.1016/j.lisr.2011.02.001 

  21. Cronbach, L. J., & Meehl, P. E. (1956). Construct validity in psychological tests. Minnesota Studies in the Philosophy of Science, 1, 174–204. 

  22. Davis, M. (1998). Making a case for design-based learning. Arts Education Policy Review, 100(2), 7–15. DOI: https://doi.org/10.1080/10632919809599450 

  23. Dershowitz, N., & Treinen, R. (1998). An on-line problem database. International Conference on Rewriting Techniques and Applications, 332–342. DOI: https://doi.org/10.1007/BFb0052380 

  24. Dickey, M. D. (2003). Teaching in 3D: Pedagogical Affordances and Constraints of 3D Virtual Worlds for Synchronous Distance Learning. Distance Education, 24(1), 105–121. DOI: https://doi.org/10.1080/01587910303047 

  25. Doppelt, Y. (2009). Assessing creative thinking in design-based learning. International Journal of Technology and Design Education, 19(1), 55–65. DOI: https://doi.org/10.1007/s10798-006-9008-y 

  26. Doppelt, Y., & Schunn, C. D. (2008). Identifying students’ perceptions of the important classroom features affecting learning aspects of a design-based learning environment. Learning Environments Research, 11(3), 195–209. DOI: https://doi.org/10.1007/s10984-008-9047-2 

  27. Duffy, P. D., & Bruns, A. (2006). The Use of Blogs, Wikis and RSS in Education: A Conversation of Possibilities. Online Learning and Teaching Conference 2006, 31–38. http://eprints.qut.edu.au/5398/ 

  28. Duffy, T., & Cunningham, D. (1996). Constructivism: Implications for the design and delivery of instruction. In D. Jonassen (Ed.), Handbook of Research for Educational Communications and Technology (pp. 170–198). Simon and Schuster. 

  29. ECDL. (n.d.). ECDL Computer Skills Certification. European Computer Driving Licence Foundation. http://ecdl.org/ecdl-education 

  30. El Mhouti, A., Erradi, A. N. M., & Vasquèz, J. M. (2016). Cloud-based VCLE: A virtual collaborative learning environment based on a cloud computing architecture. Systems of Collaboration (SysCo), International Conference On, 1–6. DOI: https://doi.org/10.1109/SYSCO.2016.7831340 

  31. Ellefson, M. R., Brinker, R. A., Vernacchio, V. J., & Schunn, C. D. (2008). Design-based learning for biology: Genetic engineering experience improves understanding of gene expression. Biochemistry and Molecular Biology Education: A Bimonthly Publication of the International Union of Biochemistry and Molecular Biology, 36(4), 292–298. DOI: https://doi.org/10.1002/bmb.20203 

  32. Erdogdu, U. F., & Eskimen, A. D. (2020). Dünya Klasiklerini Blog Tasarımı Yoluyla Okuma Deneyimi -Bir Uygulama Çalışması. Selçuk Üniversitesi Edebiyat Fakültesi Dergisi, 44, 329–354. DOI: https://doi.org/10.21497/sefad.845426 

  33. Ersungur, Ş. M., Kızıltan, A., & Polat, Ö. (2007). Türkiye’de Bölgelerin Sosyo-Ekonomik Gelişmişlik Sıralaması: Temel Bileşenler Analizi. Atatürk Üniversitesi İktisadi ve İdari Bilimler Dergisi, 21(2), 55–66. 

  34. Europass. (2015). Digital competences—Self-assessment grid. European Union. https://europass.cedefop.europa.eu/sites/default/files/dc-en.pdf 

  35. European Commission. (2013). DIGCOMP: A Framework for Developing and Understanding Digital Competence in Europe. http://publications.jrc.ec.europa.eu/repository/bitstream/JRC83167/lb-na-26035-enn.pdf 

  36. Feyzi Behnagh, R., & Yasrebi, S. (2020). An examination of constructivist educational technologies: Key affordances and conditions. British Journal of Educational Technology, 51(6), 1907–1919. DOI: https://doi.org/10.1111/bjet.13036 

  37. Fortus, D., Krajcik, J., Dershimer, R. C., Marx, R. W., & Mamlok-Naaman, R. (2005). Design-based science and real-world problem-solving. International Journal of Science Education, 27(7), 855–879. DOI: https://doi.org/10.1080/09500690500038165 

  38. Gardner, G. E. (2012). Using Biomimicry to Engage Students in a Design-Based Learning Activity. The American Biology Teacher, 74(3), 182–184. DOI: https://doi.org/10.1525/abt.2012.74.3.10 

  39. Giavrimis, P., Papanis, E., & Roumeliotou, M. (2009). Issues of sociology of education. Sideris. 

  40. Guo, P. J., Kim, J., & Rubin, R. (2014). How Video Production Affects Student Engagement: An Empirical Study of MOOC Videos. Proceedings of the First ACM Conference on Learning @ Scale Conference, 41–50. DOI: https://doi.org/10.1145/2556325.2566239 

  41. Gurer, M. D. (2020). Sense of Community, Peer Feedback and Course Engagement as Predictors of Learning in Blog Environments. Turkish Online Journal of Distance Education, 21(4), 237–250. DOI: https://doi.org/10.17718/tojde.803415 

  42. Gurung, R. K., Alsadoon, A., Prasad, P. W. C., & Elchouemi, A. (2016). Impacts of Mobile Cloud Learning (MCL) on Blended Flexible Learning (BFL). 2016 International Conference on Information and Digital Technologies (IDT), 108–114. DOI: https://doi.org/10.1109/DT.2016.7557158 

  43. Habeeb, K. M., & Ebrahim, A. H. (2019). Impact of e-portfolios on teacher assessment and student performance on learning science concepts in kindergarten. Education and Information Technologies, 24(2), 1661–1679. DOI: https://doi.org/10.1007/s10639-018-9846-8 

  44. Halpern, D., Piña, M., & Ortega-Gunckel, C. (2020). School performance: New multimedia resources versus traditional notes//El rendimiento escolar: Nuevos recursos multimedia frente a los apuntes tradicionales. Comunicar, 28(64), 39–48. DOI: https://doi.org/10.3916/C64-2020-04 

  45. Heerwegh, D. (2003). Explaining response latencies and changing answers using client-side paradata from a web survey. Social Science Computer Review, 21(3), 360–373. DOI: https://doi.org/10.1177/0894439303253985 

  46. Ioannou, A., Brown, S. W., & Artino, A. R. (2015). Wikis and forums for collaborative problem-based activity: A systematic comparison of learners’ interactions. The Internet and Higher Education, 24, 35–45. DOI: https://doi.org/10.1016/j.iheduc.2014.09.001 

  47. Ishtaiwa, F. F., & Aburezeq, I. M. (2015). The impact of Google Docs on student collaboration: A UAE case study. Learning, Culture and Social Interaction, 7, 85–96. DOI: https://doi.org/10.1016/j.lcsi.2015.07.004 

  48. ISTE. (2016). ISTE Standards Students 2016. https://www.iste.org/standards/standards/for-students-2016 

  49. Jackson, J. E. (2005). Varimax Rotation. In P. Armitage & T. Colton (Eds.), Encyclopedia of Biostatistics. DOI: https://doi.org/10.1002/0470011815.b2a13091 

  50. Jarmon, L., Traphagan, T., Mayrath, M., & Trivedi, A. (2009). Virtual world teaching, experiential learning, and assessment: An interdisciplinary communication course in Second Life. Computers & Education, 53(1), 169–182. DOI: https://doi.org/10.1016/j.compedu.2009.01.010 

  51. Jimoyiannis, A., & Tsiotakis, P. (2016). Self-directed learning in e-portfolios: Analysing students’ performance and learning presence. EAI Endorsed Transactions on E-Learning, 3(10), 1–9. DOI: https://doi.org/10.4108/eai.11-4-2016.151154 

  52. Jonassen, D. H. (1999). Designing constructivist learning environments. Instructional Design Theories and Models: A New Paradigm of Instructional Theory, 2, 215–239. 

  53. Kafai, Y. (1995). Making Game Artifacts To Facilitate Rich and Meaningful Learning. Annual Meeting of the American Educational Research Association, San Francisco. 

  54. Kaiser, H. F. (1958). The varimax criterion for analytic rotation in factor analysis. Psychometrika, 23(3), 187–200. DOI: https://doi.org/10.1007/BF02289233 

  55. Kapoor, A., Tiwari, V., & Kapoor, A. (2019). Teaching undergraduates beyond the classroom: Use of WhatsApp. Indian Pediatrics, 56(11), 967–969. DOI: https://doi.org/10.1007/s13312-019-1664-6 

  56. Karaman, H., Burcu, A., & Aktan, D. Ç. (2017). Açımlayıcı faktör analizinde kullanılan faktör çıkartma yöntemlerinin karşılaştırılması. Gazi Üniversitesi Gazi Eğitim Fakültesi Dergisi, 37(3), 1173–1193. DOI: https://doi.org/10.17152/gefad.309356 

  57. Ke, F. (2014). An implementation of design-based learning through creating educational computer games: A case study on mathematics learning during design and computing. Computers & Education, 73, 26–39. DOI: https://doi.org/10.1016/j.compedu.2013.12.010 

  58. Kim, M. K., & Kim, S. M. (2020). Dynamic learner engagement in a wiki-enhanced writing course. Journal of Computing in Higher Education, 32(3), 582–606. DOI: https://doi.org/10.1007/s12528-019-09248-5 

  59. Kim, P., Suh, E., & Song, D. (2015). Development of a design-based learning curriculum through design-based research for a technology-enabled science classroom. Educational Technology Research and Development, 63(4), 575–602. DOI: https://doi.org/10.1007/s11423-015-9376-7 

  60. Kleinbaum, D. G., Kupper, L. L., Muller, K. E., & Nizam, A. (1988). Applied regression analysis and other multivariable methods (Vol. 601). Belmont, CA: Duxbury Press. 

  61. Kolodner, J. L., Crismond, D., Gray, J., Holbrook, J., & Puntambekar, S. (1998). Learning by design from theory to practice. Proceedings of the International Conference of the Learning Sciences, 98, 16–22. http://www.cc.gatech.edu/projects/lbd/htmlpubs/lbdtheorytoprac.html 

  62. Lam, R. (2020). E-Portfolios: What We Know, What We Don’t, and What We Need to Know. RELC Journal. DOI: https://doi.org/10.1177/0033688220974102 

  63. Lazonder, A. W. (2001). Minimalist Instruction for Learning to Search the World Wide Web. Education and Information Technologies, 6(3), 161–176. DOI: https://doi.org/10.1023/A:1012756223618 

  64. Lee, H.-K., & Breitenberg, M. (2010). Education in the New Millennium: The Case for Design-Based Learning. International Journal of Art & Design Education, 29(1), 54–60. DOI: https://doi.org/10.1111/j.1476-8070.2010.01631.x 

  65. Lee, S.-M., & Briggs, N. (2021). Effects of using machine translation to mediate the revision process of Korean university students’ academic writing. ReCALL, 33(1), 18–33. DOI: https://doi.org/10.1017/S0958344020000191 

  66. Levy, P. (1997). Collective Intelligence: Mankind’s Emerging World in Cyberspace. Cambridge, Mass.: Perseus Books. 

  67. Li, Y., Chen, K., Su, Y., & Yue, X. (2021). Do social regulation strategies predict learning engagement and learning outcomes? A study of English language learners in wiki-supported literature circles activities. Educational Technology Research and Development, 1–27. DOI: https://doi.org/10.1007/s11423-020-09934-7 

  68. Lin, Y.-T., Wen, M.-L., Jou, M., & Wu, D.-W. (2014). A cloud-based learning environment for developing student reflection abilities. Computers in Human Behavior, 32, 244–252. DOI: https://doi.org/10.1016/j.chb.2013.12.014 

  69. Lonn, S., & Teasley, S. D. (2009). Saving time or innovating practice: Investigating perceptions and uses of Learning Management Systems. Computers & Education, 53(3), 686–694. DOI: https://doi.org/10.1016/j.compedu.2009.04.008 

  70. Luchoomun, D., McLuckie, J., & van Wesel, M. (2010). Collaborative e-Learning: E-Portfolios for Assessment, Teaching and Learning. Electronic Journal of E-Learning, 8(1), 21–30. 

  71. Marzano, A., & Miranda, S. (2021). The DynaMap Remediation Approach (DMRA) in online learning environments. Computers & Education, 162, 104079. DOI: https://doi.org/10.1016/j.compedu.2020.104079 

  72. Mehalik, M. M., Doppelt, Y., & Schunn, C. D. (2008). Middle-school science through design-based learning versus scripted inquiry: Better overall science concept learning and equity gap reduction. Journal of Engineering Education, 97(1), 71–85. DOI: https://doi.org/10.1002/j.2168-9830.2008.tb00955.x 

  73. Mitchell, G., Scott, J., Carter, G., & Wilson, C. B. (2021). Evaluation of a delirium awareness podcast for undergraduate nursing students in Northern Ireland: A pre/post-test study. BMC Nursing, 20(1), 1–11. DOI: https://doi.org/10.1186/s12912-021-00543-0 

  74. Moryl, R. (2013). T-shirts, moonshine, and autopsies: Using podcasts to engage undergraduate microeconomics students. International Review of Economics Education, 13, 67–74. DOI: https://doi.org/10.1016/j.iree.2013.02.001 

  75. Mracek, D. (2019). The Google search engine: A blended-learning tool for student empowerment. 2019 International Symposium on Educational Technology (ISET), 224–229. DOI: https://doi.org/10.1109/ISET.2019.00054 

  76. Muzdalifah, I., & Handayani, S. (2020). Improving English Speaking Competence by Using Google Translate in Campus Environment. IOP Conference Series: Earth and Environmental Science, 469(1), 012039. DOI: https://doi.org/10.1088/1755-1315/469/1/012039 

  77. Neill, S. (2008). Assessment of the NEOTHEMI virtual museum project – An on-line survey. Computers & Education, 50(1), 410–420. DOI: https://doi.org/10.1016/j.compedu.2006.08.001 

  78. Nelson, D. (2004). Design based learning delivers required standards in all subjects, K–12. Journal of Interdisciplinary Studies, 17, 27–36. 

  79. NETg. (n.d.). NETg. https://kb.iu.edu/d/aidy 

  80. Ngaleka, A., & Uys, W. (2013). M-learning with whatsapp: A conversation analysis. International Conference on E-Learning, 282. 

  81. Nilsen, L. L. (2011). Collaboration and learning in medical teams by using video conference. Behaviour & Information Technology, 30(4), 507–515. DOI: https://doi.org/10.1080/0144929X.2011.577193 

  82. Noel, L. (2015). Using Blogs to Create a Constructivist Learning Environment. International Conference on New Horizons in Education, INTE 2014, 25-27 June 2014, Paris, France, 174, 617–621. DOI: https://doi.org/10.1016/j.sbspro.2015.01.591 

  83. Novak, J. D. (1990). Concept maps and Vee diagrams: Two metacognitive tools to facilitate meaningful learning. Instructional Science, 19(1), 29–52. DOI: https://doi.org/10.1007/BF00377984 

  84. P21. (2019). Framework for 21st Century Learning. Partnership for 21st Century Skills. http://static.battelleforkids.org/documents/p21/P21_Framework_Brief.pdf 

  85. Palla, I. A., & Sheikh, A. (2020). Impact of social media on the academic performance of college students in Kashmir. Information Discovery and Delivery. DOI: https://doi.org/10.1108/IDD-06-2020-0061 

  86. Ruggieri, C. (2020). Students’ use and perception of textbooks and online resources in introductory physics. Physical Review Physics Education Research, 16(2), 020123. DOI: https://doi.org/10.1103/PhysRevPhysEducRes.16.020123 

  87. Savelyeva, N., Kabanov, A., Uvarina, N., Alexey, S., Nevraeva, N., Bozhko, E., Kozhevnikov, M., & Lapchinskaia, I. (2021). Pedagogical features of the organization of competence-oriented work and interaction of students with the use of google-doc tools. Applied Linguistics Research Journal, 5(1), 49–53. DOI: https://doi.org/10.14744/alrj.2020.53824 

  88. Seitamaa-Hakkarainen, P. (2011). Design based learning in crafts education: Authentic problems and materialization of design thinking. Design Learning and Well-Being, 4, 3–14. 

  89. Shana, Z. (2009). Learning with Technology: Using Discussion Forums to Augment a Traditional-Style Class. Educational Technology & Society, 12(3), 214–228. 

  90. Shin, H. S., & Jeong, A. (2021). Modeling the relationship between students’ prior knowledge, causal reasoning processes, and quality of causal maps. Computers & Education, 163, 104113. DOI: https://doi.org/10.1016/j.compedu.2020.104113 

  91. Snyder, W. M., Wenger, E., & Briggs, X. (2003). Communities of Practice in Government: Leveraging Knowledge for Performance. Public Manager, 32(4), 17. 

  92. So, S. (2016). Whatslearn: The use of whatsapp for teaching and learning. Turkish Online Journal of Educational Technology, 2016(December Special Issue), 1359–1365. DOI: https://doi.org/10.1037/t58376-000 

  93. Sokolik, M. E. (2011). A Route to Communication: Google Maps. TESL-EJ, 15(3), 1. 

  94. Sue, V. M., & Ritter, L. A. (2011). Conducting online surveys. Sage Publications. DOI: https://doi.org/10.4135/9781506335186 

  95. Tabachnick, B. G., & Fidell, L. S. (2013). Using multivariate statistics (6th ed.). Pearson. 

  96. Tavşancıl, E. (2002). Tutumların Ölçülmesi ve SPSS ile Veri Analizi. Nobel Akademik Yayıncılık. 

  97. Trocky, N. M., & Buckley, K. M. (2016). Evaluating the Impact of Wikis on Student Learning Outcomes: An Integrative Review. Journal of Professional Nursing, 32(5), 364–376. DOI: https://doi.org/10.1016/j.profnurs.2016.01.007 

  98. van Rensburg, A., Snyman, C., & Lotz, S. (2012). Applying Google Translate in a higher education environment: Translation products assessed. Southern African Linguistics & Applied Language Studies, 30(4), 511–524. DOI: https://doi.org/10.2989/16073614.2012.750824 

  99. Warburton, S., García, M. P., & Russell, D. (2009). 3D design and collaboration in massively multi-user virtual environments (MUVEs). Cases on Collaboration in Virtual Learning Environments: Processes and Interactions, 27–41. DOI: https://doi.org/10.4018/978-1-60566-878-9.ch002 

  100. Warschauer, M. (1995). E-Mail for English Teaching: Bringing the Internet and Computer Learning Networks into the Language Classroom. ERIC. 

  101. Wenger, E., McDermott, R. A., & Snyder, W. (2002). Cultivating communities of practice: A guide to managing knowledge. Harvard Business Press. 

  102. Willis, J. (1995). A Recursive, Reflective Instructional Design Model Based on Constructivist-Interpretivist Theory. Educational Technology, 35(6), 5–23. 

  103. Yan, T., & Tourangeau, R. (2008). Fast times and easy questions: The effects of age, experience and question complexity on web survey response times. Applied Cognitive Psychology: The Official Journal of the Society for Applied Research in Memory and Cognition, 22(1), 51–68. DOI: https://doi.org/10.1002/acp.1331 

  104. Yücedağ, A. (1993). Anket Geliştirilmesi ve Uygulanması. Ankara Üniversitesi Eğitim Bilimleri Fakültesi Dergisi, 26(2). 

  105. Zhang, X., Chen, H., de Pablos, P. O., Lytras, M. D., & Sun, Y. (2016). Coordinated Implicitly? An Empirical Study on the Role of Social Media in Collaborative Learning. International Review of Research in Open & Distance Learning, 17(6), 121–144. DOI: https://doi.org/10.19173/irrodl.v17i6.2622 
