
Research

The Problem-identification Process Prior to the Initiation of a Networked Improvement Community

Author:

Julie Kallio

University of Wisconsin – Madison, US

Abstract

In this paper, I present a design case of the problem-identification process prior to the initiation of a Networked Improvement Community (NIC). A NIC is a type of research-practice partnership (RPP) that brings together researchers and practitioners to tackle complex problems of practice, and in doing so, proposes a social reorganization of the traditional education change processes. Central to initiating a NIC, and RPPs more broadly, is the identification of a common problem of practice, but this step often takes place before research on a partnership begins. To investigate how a problem of practice is identified, I use the case of PiPNIC, the Personalization in Practice – Networked Improvement Community, in which a team of university-based researchers used participatory design methods to identify a common problem of practice that would ultimately bring together educators from five schools to participate in the NIC. In the case, I show how the research team constructed a rich problem-solution space and identified a different problem of practice than the research team initially conceived. The problem-identification process, I therefore argue, should be included as a critical component of the NIC initiation framework, and I suggest the “problem-solution space” as a conceptual tool for the joint negotiation of problem identification. The case illuminates how NICs operationalize a social reorganization of research and development in education.

How to Cite: Kallio, J. (2022). The Problem-identification Process Prior to the Initiation of a Networked Improvement Community. Designs for Learning, 14(1), 58–71. DOI: http://doi.org/10.16993/dfl.186
Published on 30 May 2022
Accepted on 15 May 2022
Submitted on 20 Aug 2021

Problem Statement

One way to tackle the complex challenges of change in education is a partnership between researchers and practitioners. At the outset of such partnerships, someone often asks, “What problem should we start with?” This design case study investigates the process by which researchers identified a problem of practice in collaboration with educators to initiate the Personalization in Practice – Networked Improvement Community (PiPNIC), a NIC that brought together educators and researchers around the common problem of conferring. Conferring was a different problem than the researchers initially conceived for the partnership; therefore, the analysis of the case focuses on how researchers designed for educator participation in the problem-identification process. The focus on participation in problem identification is one aspect of the broader project of social reorganization of research and development proposed by Networked Improvement Communities. In the analysis, I show how the case of PiPNIC provides practical and theoretical insight into how and by whom problems are identified in the initiation of a NIC.

The case of PiPNIC begins 8 months prior to its initiation and provides a unique case in that the problem-identification process was documented as it unfolded, rather than recalled as is often the case with partnership research (Coburn & Penuel, 2016). To examine problem identification, I focus on what decisions were made and who made them, connecting these decisions with the iterative process that resulted in identifying a common problem of practice.

This paper begins with a review of the literature on NICs and problems of practice. I then describe the context for PiPNIC and the research methods. In the findings, I share the core activities to construct the problem-solution space and select a problem-solution pair, then discuss the implications of the case. Finally, I argue that problem identification should be included as a distinct component in the NIC initiation model.

Literature Review

Networked Improvement Communities (NICs) are an increasingly popular type of research-practice partnership in education (Coburn & Penuel, 2016). NICs bring researchers, practitioners, and other experts together to examine how a problem occurs in local contexts, identify root causes, and build robust data pathways to inform the iterative design of solutions (Bryk et al., 2015). Early examples of NICs, such as the Community College Pathways NIC, demonstrated remarkable success where others failed (Baron, 2017), prompting broad interest in using NICs to address other seemingly intractable problems of practice (LeMahieu et al., 2017). Explicating how problems are identified to initiate a NIC contributes to the design knowledge of how to initiate a NIC around a common problem of practice.

The problem-centered focus and participatory social structure of NICs represent a shift toward a social reorganization of research and development (R&D) activities (Bryk et al., 2011) that contrasts with the traditional linear model, where change is realized through a progression from basic research to application and then dissemination (Stokes, 2011). In the linear model, researchers are knowledge creators and practitioners are implementers, and the two do not interact in any long-term, collaborative manner, a separation often called the “research-practice gap” (Biesta, 2007).

In complex systems such as education, this gap has negative consequences. When researchers’ interests do not match practitioners’ needs, the uptake of solutions is unlikely (Cain, 2015). The exclusion of practitioners from R&D activities is one way that practitioners are marginalized from change processes, and this exclusion can rhetorically and materially deskill teaching as a profession, building resentment and resistance to externally created solutions (Apple, 1985). Some educational leaders reject research to demonstrate agency and status (Roegman & Woulfin, 2019). The problem-centered focus of the NIC model is therefore both a pragmatic attempt to solve complex problems and an ambitious social project to shift how and by whom systems change happens (Bryk & Gomez, 2007).

Case studies of NICs and other RPPs have been used to improve the NIC model by explicating how they are initiated (Russell et al., 2017) and organized (Bryk et al., 2015; LeMahieu et al., 2017). The NIC initiation model (Russell et al., 2017) was hypothesized to include five domains: developing a theory of practice improvement; building a measurement and analytics infrastructure; learning and using improvement research methods; leading, organizing, and operating the network; and fostering the emergence of culture, norms, and identity consistent with network aims. These five areas are diagrammed around a central problem of practice. The joint negotiation of a common problem of practice is identified as a critical task for research-practice partnership initiation (Donovan, 2011; Penuel et al., 2015), yet no published case has articulated how and by whom a NIC’s problem of practice was selected.

Theoretical Framework

As a theoretical framework, I describe the core concepts of problem-identification, beginning with how to conceptualize a problem. A problem exists when there is a gap between a current reality and a desired state. Problem identification is recognizing and articulating a gap, whereas problem solving is the set of actions to move to the desired state. Problem identification includes the activities described in a range of fields, including problem framing (Mintrop & Zumpe, 2016), formulation (Baer et al., 2013), and finding (Nickerson, Yen, & Mahoney, 2012). I locate problem analysis, usually done in NICs with improvement science tools, such as root cause analyses (Crow et al., 2019), as the first step of problem solving, which can happen once the initial problem is articulated. To be sure, problem identification and problem solving are related, with analyses revealing new insights, leading to new problems. Here, I focus on the initial problem identification that provides the first step into the complex system.

The process of problem identification itself has two parts: the construction of the problem-solution space and the selection of a problem. I use the terms problem-solution space and problem-solution pair, similar to the idea of need-solution pairs (von Hippel & von Krogh, 2016), as a way to capture the dynamic relationship between the two and the iterative process of probing the existing system for gaps, opportunities, and expertise. The construction of the problem-solution space is a set of activities that people do to define the relevant parameters of the context in which a problem or problems exist, as well as the ideas for solutions (Newell & Simon, 1972). In these activities, people from within and outside the system attempt to articulate the gaps between current and desired states, often dreaming about an ideal state and/or proposing potential ways to get there. These dreams and potential solutions can illuminate relevant features of the context or constraints of the system (DiSalvo et al., 2017), though there is a risk that partners will jump to enacting their solutions before adequately understanding the problem, sometimes called “solutionitis” (Kivel, 2015).

If the parameters of the problem and tasks to solve the problem appear straightforward, these are often called tame, well-structured, or well-defined problems; in contrast, wicked, complex, or ill-structured problems are difficult to define, involve complex systems, and do not stay solved (Gomez et al., 2016). Problems in education are almost always this latter type. Education is a complex system comprised of dynamic and interconnected subsystems whose interactions contribute to the problem of practice, where “events and actions have multiple causes and consequences, and … order and structure coexist at many different scales of time, space, and organization” (Jacobson & Wilensky, 2006, p.12). City et al. (2009) use the term problem of practice to situate the undesired state in the routines and actions of the people in the organization.

From this perspective, a problem of practice might more aptly be considered a property of the complex social system, rather than an entity in and of itself. For this reason, any problem of practice is an initial step into learning, understanding, mapping, and ultimately changing the system. Furthermore, articulating a problem of practice is done by people who are themselves part of the system. Even as researchers wade into practitioner conversations, they become part of the system. Consequently, individuals’ social and organizational positions, identities, experiences, and prior or current knowledge of the system will impact what they recognize as a problem, what represents an ideal state, and what change ideas they have (Philip et al., 2018). Due to this constructed and complex nature of problems of practice, identifying a problem of practice for a NIC requires multiple perspectives from people in different positions in the system.

Research Design

In this paper, I ask the question, How is the problem-identification process prior to the initiation of a NIC part of the broader social reorganization of research and development proposed by the NIC model? I use the design case of a university-based research team that used participatory design methods to identify a problem of practice to initiate PiPNIC, the Personalization in Practice – Networked Improvement Community. In this section, I begin with the analytic approach of a design case, then describe the partnership context, the data collection, and the data analysis. Finally, I reflect on my positionality as PiPNIC project manager.

Design Case

Presenting the theoretical and pragmatic insights from a design experiment as a case is well established in the field of the learning sciences (Cobb et al., 2003). Design cases are “a specialized and critical form of design knowledge” (Boling, 2010, p.1) that aim to describe the real events, relationships, and activities that happened in connection to the artifact that was designed, as well as the design moves that were made and by whom. Design cases are not a recipe for future action; rather, they communicate the kinds of precedent knowledge that designers use to reason about and inform future design (Lawson, 2004).

I bound the case by those activities related specifically to the problem-identification process. In writing the case, I include a description of the partnership context, an explanation of what data were collected and how they were analyzed, and a description of what I determined to be the two parts of the PiPNIC problem-identification process. I aim to provide the reader, who was not present in the activities, the necessary information to evaluate my argument. Through attention to multiple perspectives, transparency in data collection and analysis, and self-reflection on my own positionality, I aim to build a trustworthy design case that, through its particulars, communicates design knowledge about the social reorganization of research and development in the work prior to NIC initiation.

The design case of PiPNIC is meant as an illustrative case to examine problem identification prior to the initiation of a NIC. PiPNIC was a successful case in that our initiation team identified a problem, recruited participants, and initiated the NIC in the spring of 2017. There is much to be learned from successful cases, though the goal is not to say that the design decisions in PiPNIC generalize to all problem-identification processes or to all NIC initiations. For instance, the scale of PiPNIC is much smaller than many of the networks or NICs that have been used as instrumental cases in other NIC research (e.g. Russell et al., 2017). I aim to use this smaller, more agile design case as a way to “learn fast” to understand NICs as a model for reorganizing educational R&D.

Partnership Context

PiPNIC grew out of an existing research alliance, the Personalization in Practice (PiP) partnership. PiP was a collaboration between education researchers at the University of Wisconsin-Madison, leaders at the Institute for Personalized Learning (IPL), and policymakers at the Wisconsin Department of Public Instruction (DPI). The goal of PiP was to study how public schools design and implement personalized learning (PL) strategies in K-12 schools. PL represents a range of approaches to redesign schooling around student interests, strengths, and needs. The PiP team’s initial research report described three domains of shifting practices: student agency over how their learning was organized; the co-construction of learning pathways through regular, data-driven conferring; and the use of technologies to support these learning pathways (Halverson et al., 2015).

The PiP team noticed that PL educators, operating at the leading edge of educational innovation, regularly encountered problems of practice with no clear solutions and little existing research to guide them. For example, educators commented that state and district standardized tests were not accurate measures of learning for their programs. Without accurate measures, PL educators struggled to document the impact of their programs, and thus those programs struggled to scale beyond small implementations. In the spring of 2016, the PiP team identified the Networked Improvement Community model as a way to structure a collective effort to share and improve these emerging practices at scale, beginning with the idea of developing accurate measures of learning in PL environments.

Participatory Design Approach to Partnership

We used contextual inquiry as a participatory design method to center educators in the problem-identification process. Participatory design is an iterative process of creating and revising solutions in which the user is intimately involved in both defining which problem to solve and solving the problem (Schuler & Namioka, 1993). Participatory design goes well beyond attendance or buy-in; it is a process “to uncover self-motivations, identities, and interests and to construct meaningful engagements by working together with participants” (DiSalvo & DiSalvo, 2014, p.1). Indeed, participatory design “takes work, and new ways of thinking, and new kinds and methods of openness …. to bring [in] users’ knowledges and perspectives” (Muller & Druin, 2007, p.3).

Contextual inquiry (CI) is a design research technique (Koskinen et al., 2011) to “[work] with users to help them articulate their current work practices, system practices, and associated experiences” (Holtzblatt & Jones, 1993, p.177). Holtzblatt and Jones (1993) describe contextual inquiry as a set of three practices: one, be in the work context to observe interactions and ask specific questions; two, position the user as the expert through genuine curiosity and open-ended questions; and three, intentionally widen your attention to observe offhand comments, movements, or diversions that might be important, and narrow your focus to follow up on potential opportunities.

Contextual inquiry is one of many field research techniques, such as user experience studies (Bullen & Bennett, 1990) and rapid ethnographic assessments (Harris et al., 1997). Its objective, however, differs. Where the goal of ethnography is to understand why people do what they do, contextual inquiry is meant to provide sufficient understanding of the current work system to articulate and act on a problem (Blomberg & Karasti, 2012). In this way, contextual inquiry is theoretically well aligned with the NIC model and positioned to support the social reorganization of R&D by putting the researcher into the work context of educators and providing practices for problem identification. How this is operationalized in practice is taken up in the description of the design case of PiPNIC.

Data Collection

Careful ethical considerations were made in the construction of the partnership, in the collection of the data, and in my reflections on my own positionality in the analysis of the project. In terms of data collection, all activities in the context of PiP and PiPNIC partnership activities were approved by the university’s institutional review board (Approval 2014-1567-CR006). The NIC initiation team began collecting data in the spring of 2016, eight months prior to the first official NIC meeting. Seven people formed the NIC initiation team: three members of the original PiP research team (one faculty Principal Investigator, or PI, and two graduate students) plus three additional university researchers and one new graduate student.

A participatory design approach to data collection has both a pragmatic and a theoretical orientation, leading us to generate a rich and complex array of longitudinal documentation of the process: meeting records, decisions, reflections, interactions, and artifacts. Because design experiments involve interventions in which the participants are actively making the changes, documentation both creates a record of the process and influences the process itself. Cobb and colleagues (2003) describe, “a central challenge in conducting retrospective analyses is to work systematically through the extensive, longitudinal data sets generated in the course of a design experiment so that the resulting claims are trustworthy” (p.13). To this end, in Table 1, I summarize the data collected and how each source was used in the design of the problem-identification process and in this analysis.

Table 1

Summary of data collected and its use in the design and analysis of the problem-identification process.


DATA TYPE | DESCRIPTION | USE IN DESIGN | USE IN ANALYSIS

Grant Application | Written document | Planning the partnership | Record of the initiation team’s plan for the NIC, with whom, and around what problem

Participant Observation | Initiation team observations | Record of ideas to pursue and people to contact | Used to identify features

Emailed reflections from convening participants | Follow-up emails asking for feedback on the meeting and asking participants to nominate schools or educators that we should follow up with | Nominated connections were contacted for listening conversations | Feedback was used to identify what features stood out to participants

Expert Convening Report | Summary of the organization of the meeting, attendees, and major themes | The summary was used for initial definitions and tensions | The description and summary provided a set of features and tracked evolving understanding of the problem-solution space

Google sheet with notes from phone calls and visits | After each listening conversation, we would fill out a form with the person’s answers, any other notes, and whether this contact would be a good follow-up for the NIC (see Appendix A for the question protocol) | This document was the focus of meetings to make sense of what we were hearing in the statewide conversations | The spreadsheet of responses provided a record of who from the initiation team was doing the phone calls and visits, who was contacted, and the content of their conversation

Meeting notes | One Google document, including when the meeting was, who was present, the agenda, and notes on the conversation, shared with the team for collaborative input and/or correction | Record of decisions, follow-up actions, and debriefs | The meeting notes provided a chronology of actions and the sequence of problem articulation

Research Group Meeting Presentation and Meeting Summary | The team created a set of slides to describe the listening process and present the three possible problem-solution pairs; after the meeting, I wrote up a description of the conversation | The slides generated conversation about which problem-solution pair would be selected; the meeting summary was used to share the decision with team members who were not present | The presentation slides plus the meeting summary provided the primary source of data for how we made the decision to select conferring protocols

Partnership Memo | At the conclusion of NIC activities, we wrote memos summarizing PiPNIC activities; I wrote a semi-structured, meta-design memo focused on network-level activities; the memo integrated the data sources described in this table, and the team cross-checked each memo for accuracy | Not applicable | Triangulated source of the chronology and features of activities

One limitation in our data collection is the lack of audio or video recordings of meetings and listening sessions with educators. The initiation team made the decision not to record the listening sessions because we thought it would create a barrier to open and honest conversations, and our goal was to get out into the field quickly. In future work, audio recordings of initiation team and, subsequently, hub team meetings could be used to study the shift in discourse around problems, solutions, relationships, or other issues.

Data Analysis

As described above, problem identification includes the construction of the problem-solution space and the selection of a problem-solution pair. In the analysis, I began by writing a detailed description of partnership activities in the eight months prior to the initiation of PiPNIC. I then constructed diagrams of the problem-solution space, tracing its evolution through the record of open-ended questions across grant documents, presentations, meeting records, and conversation notes. Open-ended questions, a key feature of contextual inquiry, are those that begin with operators such as “how”, “what”, or “why,” and increasing specificity of questions suggests a narrowing of a problem space (Halverson, 2002). I then connected the change in question specificity with the features of the problem-identification process. The connection between activities and the problem space is described in part one of the findings. Part two of the findings is a lightly edited memo written at the time of problem selection. This memo shows the process by which the initiation team weighed the opportunities and challenges of selecting each problem-solution pair.

Ethical Considerations

My role on the PiPNIC team was integral to partnership activities and impacts the design case presented in this paper. First, as the project director, I influenced decisions made during the process, though they were never wholly mine. I was a graduate student, younger and with less positional power than most of the PiPNIC team. Most decisions were made by consensus across at least three people, usually the PI, at least one university researcher, and myself. Relative to the other graduate students on the team, I had been in the program the longest. I had productive relationships with each of them: some from collaboration on PiP, some from classes taken together. My role became one of brokering within the team, facilitating and mediating questions, soliciting individuals’ feedback or ideas and bringing them to the larger group. My central involvement in the design of PiPNIC initiation is essential to my understanding of how the project unfolded and to the practical and theoretical claims I am able to make (Cobb et al., 2003).

Second, I was the person primarily responsible for managing the data collection throughout the process, and my personal investment in the success of the project introduces bias into data collection and analysis. For example, some of my actions certainly went undocumented because I did not recognize them as relevant. Additionally, the analysis is intertwined with my own beliefs about and memories of the project. This increases the risk of telling a “just-so” story, an overly optimistic design narrative created to fit the data (Shavelson et al., 2003). I aim to mitigate this bias in a few ways. First, during the design process, the partnership documents were collaboratively edited, and they document the negotiations that happened as the process unfolded. For example, in meetings, the meeting notes would be projected and shared so that everyone could (and did) edit them. Partnership documents reflect this collective input, lending validity to what was recorded. Second, in writing the analysis for this paper, two members of the initiation team provided significant feedback on my description and analysis of events. In fact, several times they shared a different recollection of the details, pushing me to verify and triangulate each claim.

Findings

In this section, I describe the two parts of problem identification: constructing the problem-solution space and selecting a problem-solution pair.

Part One: Constructing the Problem-solution Space

As previously described, PiPNIC grew out of an existing research alliance where the research team found that PL educators struggled to document the impact of their program, preventing their programs from gaining the legitimacy they needed to scale. The initiation team began with the idea that accurate, valid, and legitimate measures of student learning in PL programs could support these programs to scale.

An Expert Convening

During the summer of 2016, the initiation team convened a group of 23 experts for a day-long meeting. At this event, the initiation team facilitated a process for attendees to share their current perceptions of needs and opportunities in personalized learning across the state. In our meeting notes from the days of planning for this event, we wrote, “instead of beginning with designs to test, [let’s] begin with listening to what people are doing and what they know” [bold in original].

The invitees included people involved in education across the state, including practicing educators, university-based staff, leaders from intermediary organizations, and DPI policymakers. Invitees had experience in personalized learning environments, improvement methods, professional learning, and/or district policies and were selected in consultation with the PiP research team, research alliance partners, and grant collaborators.

The intention for the convening was to begin to construct the problem-solution space around measures of student learning. The PI opened the convening with two questions: “How do teachers know when students are learning?” and “What are the design opportunities for schools to improve their ability to know that students are learning?” The initiation team then facilitated small-group discussions and whole-group reflections, providing multiple opportunities for exchange amongst participants. The focus was on sparking discussion amongst attendees, not telling them what the research team thought.

While these questions might appear naïve in their simplicity, they were meant to open up access to the discussion broadly for educators, policymakers, and researchers to engage with each other. The purpose of public American education is contested (Labaree, 1997), and it is a contested foundation on which we build systems of measurement and improvement. Beginning with these broad questions was a move that brought everyone in, a participatory move toward constructing as large an initial problem-solution space as possible.

The convening affirmed the need for valid measures of personalized learning, while expanding our understanding of what educators would like to measure. An educator described the distinction between “good students” and “competent learners,” where good students comply with traditional instruction and often perform well on standardized assessments, whereas a competent learner can organize their own learning, acts on their strengths and needs, and develops the non-cognitive skills1 they need to be successful. To this educator, personalized learning was meant to foster the latter.

PL educators also shared that they wanted measures to improve their teaching practices, not just student outcomes. They asked questions about how to collect data to guide their instructional decisions and develop practices to use this data. For example, they felt they needed more accurate, up-to-date information on each student’s needs and interests. A local principal asked, “What are current data practices [in personalized learning programs]?”, indicating their desire to learn from each other. From their questions, we noticed that PL programs may lack the practices and tools to use new measures of student outcomes.

Importantly, some attendees challenged the initiation team with questions critical of personalized learning and even our approach. One researcher was skeptical that personalized learning was a reform that would be desirable beyond this subset of educators and students. One superintendent asked why we were not focusing on how schools improve the instructional systems they already have.

A Statewide Listening Tour

After the expert convening, the initiation team realized that we needed more perspectives to add definition to the problem-solution space. We organized a statewide listening tour, calling and visiting educators and leaders from different geographic areas and urbanicities. We spoke with educators at 49 schools across the state, visited 11 schools, had conversations with 10 regional support agencies and consulted with researchers at UW-Madison (See Figure 1). This included educators at large traditional high schools, teacher-leaders at project-based charter schools, coaches in professional organizations, even a school board member, among others.

Figure 1. Map of listening conversations: phone calls (green phones), data coach conversations (yellow balloons), visits (orange cars), and conferences (brown balloons).

We conducted the listening tour through phone calls and visits. We briefly considered a survey, which would have been faster and cheaper. We decided to do phone calls and visits for two reasons: one, to allow us to ask open-ended and probing questions while in educators’ work contexts, and two, to meet people in person, sparking new relationships or reactivating prior ones in anticipation of recruiting participants for the NIC.

The first question on our listening protocol was, “What kinds of things are you excited about in terms of student learning this fall?” (For the complete protocol, see Appendix A.) One team member questioned why we would include an open-ended question like this, commenting on one version of the protocol document, “I don’t think this first question is necessary.” Indeed, the responses to this question were varied, and not always related to our focus on data-driven instructional tools and practices. The question prompted a research stance of openness and curiosity, giving educators permission to direct the conversation and ask their own questions, which they did. Later in the protocol, we were more specific about asking about data practices and tools.

Over the course of four months of listening, our questions increased in specificity and showed evidence of taking insights from one listening conversation to another. Table 2 presents the sequence of questions the initiation team recorded.

Table 2

Sequence of questions that the initiation team asked in conversations with educators and in conversation with each other.


DATE | SOURCE | QUESTION | OPERATOR

7 Jun 2016 | Team Meeting Notes | “How do teachers know when students are learning?” | How

7 Jun 2016 | Team Meeting Notes | “What are the design opportunities for schools to improve their ability to know that students are learning?” | What

29 Jul 2016 | Convening Presentation Slides | “What kinds of data, systems, and tools should be included in a SLDS?” | What

29 Jul 2016 | Convening Presentation Slides | “What are current data practices?” | What

19 Aug 2016 | Listening Tour Questions | “What kinds of things are you excited about in terms of student learning this fall?” | What

19 Aug 2016 | Listening Tour Questions | “Tell us about the kinds of information your school collects to document student learning?” | Tell us about

19 Aug 2016 | Listening Tour Questions | “How does your team define student learning in your school?” | How

19 Aug 2016 | Listening Tour Questions | “What role do students play in those discussions of data?” | What

19 Aug 2016 | Listening Tour Questions | “How do students have an opportunity to show what they know?” | How

12 Sep 2016 | Initiation Team Meeting Notes | “What kinds of schools take the leap to helping students use data to guide their own learning?” | What

12 Sep 2016 | Initiation Team Meeting Notes | “What data do teachers collect about student learning and how are students using their own learning data?” | What and how

7 Nov 2016 | Initiation Team Meeting Notes | “What are the micropieces that [teachers] are already assessing in their goal setting/initial conversations, check in conversations, and project finalization meetings?” | What

From the first question, “How do teachers know when students are learning?” to, five months later, “What are the micropieces that [teachers] are already assessing in their goal-setting/initial conversations, check-in conversations, and project-finalization meetings?”, we can see a significant increase in the specificity of the question. This suggests that the initiation team was at the point of refining their understanding of the problem-solution space. Using what we were learning in subsequent questions also suggests an iterative refinement of the problem-solution space.

In the construction of the questions, educators and students were consistently included, e.g. “How do teachers know when students are learning?” After the expert convening, students were added as actors in the questions, e.g. “What role do students play in those discussions of data?”, suggesting that we had heard about the importance of students as partners in these data conversations. Subsequent questions continued to interrogate the relationship between the measures, the generation or use of the measures, and the people involved, such as the student, teacher, or school. Questions such as “How do students have the opportunity to show what they know?” and “What data do teachers collect about student learning and how are students using their own learning data?” probe for the nature of the relationships between students, teachers, and data in a complex assessment system.

Part Two: Selecting a Problem-Solution Pair

In December of 2016, the initiation team wrapped up the listening tour and took the opportunity of a research group meeting to pitch three potential problem-solution pairs. Attendees included the initiation team, along with four additional researchers from the broader PiP group and four researchers who were funded on the same grant. The initiation team prepared slides describing the three potential problem-solution pairs, articulating the possibilities and limitations of each one. The following paragraphs are lightly edited from a meeting summary that I wrote immediately following the meeting.

Problem-Solution Pair #1. Develop a survey of validated measures of non-cognitive learning

Personalized learning educators characterized the desired outcomes of their programs as students who have the agency and capacity to direct their own learning, yet they had no way to measure these outcomes in a form that could compete with traditional, standardized test scores. Educators shared throughout the listening tour that the lack of these measures limited their ability to report the impact of PL on student learning. A validated survey to measure non-cognitive skills had been proposed as early as the grant application, and we knew that researchers and local and national policymakers were interested in and working on this challenge (e.g. García, 2014). Developing validated measures for integration into the statewide data system seemed like an excellent opportunity.

However, the initiation team identified several barriers. Educators were hesitant to quantify non-cognitive skills, fearing the measures would be used out of context. Other organizations were developing surveys, rubrics, and self-tests, but they were encountering issues with reliability of implementation, cultural bias, and reference bias (for a summary of these issues, see Duckworth & Yeager, 2015). If the team took up previously developed measures, the work would focus on testing and validating those measures in the context of personalized learning, and it was unclear how educators would participate beyond giving the test. In addition, building reliable measures of non-cognitive skills is an emerging area of psychometric research that was beyond the expertise of the PiPNIC initiation team. Pursuing this problem-solution pair would likely mean the departure of several people, as well as the need to recruit others with this expertise. This would delay NIC initiation and disrupt the relationships that had already been built.

Problem-Solution Pair #2. Standardize the student personalized learning plan

The personalized learning plan (PLP) is a central document of PL programs and contains data about the learner and their learning pathway, but the format and content vary. A common PLP could create one way for educators and students to monitor and support personalized pathways and build consistency across programs. The educators we spoke with in the listening sessions shared that the PLP was in some cases a central and dynamic document; for others, it lacked integration with other systems and quickly became outdated. The initiation team envisioned developing a PLP that would articulate a customized learning program for every student based on measures of student strengths and needs. The position of the PLP as central to supporting instructional decisions and the potential to integrate common measures across programs made this an attractive option for the NIC.

However, if the NIC chose this problem-solution pair, there were several related challenges. First, the data included in PLPs varied across schools and even amongst teachers within the same school. For example, some PLPs included student-created goals, whereas others contained only standards selected for students. Second, PLP format varied widely and might be limited by schools’ technology infrastructure: some programs used binders, others worked with sophisticated learning relation management systems, and still others cobbled together several different tools that integrated with Google documents or sites. Third, the role of the PLP in the instructional system was not clear: whether it was a way to share student artifacts demonstrating competencies, like a portfolio, or a way to track progress, like a list of completed competencies.

Each of these challenges was also an opportunity: the PLP could align outcome and instructional data for individual, program, and statewide use, making it seem like the obvious choice. The conversation then turned back to the tension identified at the expert convening between developing tools versus practices. Because it was unclear how educators currently used the PLP, the team was concerned that the tool would be disconnected from practice and therefore see limited adoption. Another reservation was whether educators would have the autonomy to change their PLP, especially if they were part of a larger district. Likewise, there might not be much opportunity for iterative testing, because educators would rely on a stable version during the year and need to wait until the summer to make changes.

Problem-Solution Pair #3. Develop a common protocol for conferring

Conferring, the regular, one-on-one conversation between educator and learner that focuses on learning pathways, processes, or products, had been initially described in the PiP study (Halverson et al., 2015) and came to the fore again in the listening tour. Educators rated conferring as having the highest utility amongst all their teaching practices (Rutledge, 2017), and they were using conferring as a data-driven instructional strategy. Even the most traditional high school leaders we talked to wanted students to set their own goals and reflect on their learning process in conversation with their teachers. The problem was a lack of consensus as to its definition or use. Some educators connected it to the workshop model of teaching, others described it as a vehicle for student reflection and goal-setting, and others still considered it akin to the Individualized Education Program meeting for students in special education.

We identified that conferring was a process that happened regularly and could be iteratively improved with multiple design cycles. The research team also knew educators at several schools who were interested in defining and improving their conferring practices, and the PiPNIC team saw an opportunity to incorporate student learning data into these conversations.

The third problem-solution pair was ultimately selected as the focus of PiPNIC, which would be launched in January 2017. Table 3 summarizes the reasons for and against each problem-solution pair, while Figure 2 diagrams the three options. Interestingly, the initial problem was highly desirable to researchers and policymakers but was ultimately judged not desirable for educators, whereas conferring emerged as clearly the most viable, feasible, and desirable option.

Table 3

Summary of reasons for and against each problem-solution pair.


Develop a survey of validated measures of non-cognitive learning
  Reasons to select this problem:
  • Alignment with initial grant
  • High researcher and policymaker interest
  Reasons not to select this problem:
  • Educators were hesitant about the validity of quantifying non-cognitive learning
  • Need to recruit different NIC team members
  • Difficult task that others have attempted without success
  • Low potential for participation by educators in the design process itself

Standardize the student personalized learning plan (PLP) to incorporate state and local data
  Reasons to select this problem:
  • Central artifact in the implementation of personalized learning
  • Clear what the design would be
  Reasons not to select this problem:
  • Unsure whether educators would have the autonomy or willingness to change mid-year

Develop a common protocol for conferring
  Reasons to select this problem:
  • Educators perceive it as high-leverage
  • Practice happens regularly, so there would be many opportunities for design
  • Known interest from educators who would be willing to participate
  • Applicability beyond PL
  Reasons not to select this problem:
  • Unsure what would be designed
  • Unclear connection to grant funding

Figure 2. Map of the three problem-solution pairs, including their theory of action, or how each might impact the aim. The first pair, highlighted with a dashed line, was the problem initially conceived by the research team, whereas the third pair, highlighted with a dotted line, was the pair that the initiation team selected.

Discussion

The question this paper has asked is, How is the problem-identification process prior to the initiation of a NIC part of the broader social reorganization of research and development proposed by the NIC model? Through the design case of PiPNIC, I constructed a narrative of the PiPNIC problem-identification process, including the construction of the problem-solution space and the selection of one problem-solution pair. Next, I discuss three findings from this work: the impact of the problem-identification process and implications for the NIC initiation framework, the under-theorized relationship between problem and solution in education and specifically research-practice partnerships, and the way problem identification operationalizes the social part of reorganizing research and development in education.

First, the work that was done in the eight months prior had a profound consequence for the course of the partnership. While eight months might feel like a luxury, had we stayed with the development of a validated survey of non-cognitive skills, might we have failed to recruit practitioners, or might they have rejected what we designed? The latter is the typical story of the research-practice gap. In those eight months, we obtained the information we needed to select a viable problem of practice, and we built relationships that would later be important for recruitment. We positioned educators as necessary experts when we went to their work context and listened as they shared struggles, questions, and ideas. We selected the problem-solution pair that was more interesting to them than to the researchers and policymakers. All of this contributed to the iterative construction and narrowing of a problem-solution space that then allowed the initiation team to select a different problem-solution pair than initially conceived.

Put simply, both the fact that the problem emerged from the field and the way it was identified differed from a traditional, linear conception of the R&D process; the problem-identification process is therefore part of the social reorganization of R&D. I suggest incorporating problem identification as a necessary task in the NIC initiation framework (Russell et al., 2017). Including the problem-identification process in this framework could hasten the development of strategies and tools for problem identification and improve the theoretical understanding of the complex challenge of initiating a NIC.

Second, and perhaps because of this lack of attention to problem identification, the relationship between problems and solutions in the context of research-practice partnerships is under-theorized. While improvement science advocates are relentless in their focus on problem analysis before a change idea is posited, designers and educators are solution-oriented and often want to talk exclusively about solutions. There is a real tension between problem-centered approaches like improvement science (Bryk et al., 2015) and the solution-focused orientations of educators, and this tension becomes a challenge to collaboration across research and practice.

I propose that the concept of a need-solution pair (von Hippel & von Krogh, 2016) in a problem-solution space can resolve this tension by giving problems and solutions a relationship. In the problem-solution space, improvers and educators can see their problems and solutions as connected, each providing different insights about the system. In PiPNIC, we were able to take the solutions that were presented (a non-cognitive skills survey, the redesign of the PLP, and the common protocol for conferring) and trace these change ideas to their roots in a problem of practice (the lack of measures for impact, the lack of a common format for collecting data, and the lack of standard processes). As our conversations went back and forth between problems and solutions, we were able to use both to deepen our understanding of the system.

Making these problem-solution relationships explicit is therefore consistent with systems thinking, where problems of practice are epiphenomenal to, or properties of, the system, and the way we conceptualize problems and solutions has real impacts on our ability to tackle these challenges (e.g. Gomez et al., 2018). Conceiving of problems and solutions as equally valid entry points into systems change and explicating their relationships might provide common ground for education researchers, practitioners, and policymakers as they engage in partnership initiation. The risk, to be sure, is that the problem and solution become fixed together, and the NIC then takes up the initial solution as a given. The tendency of education reformers to jump to solutions is what improvement science was built to resist (Bryk et al., 2015). Any problem-solution pair should be held lightly and decoupled as the NIC takes up the more robust problem-analysis phase at the outset of NIC activities.

Finally, just as problems and solutions cannot be understood separate from the system that produces them, they cannot be understood separate from the people describing them. In addition to conceiving of problems and solutions as related to each other, a social understanding of the problem-identification process points us to the network of relationships surrounding them. In PiPNIC, the phone calls and visits created new ties or reactivated and affirmed old ones. We tacitly (or sometimes directly) gauged potential interest in NIC participation. The selection of conferring protocols as the problem of practice was based in part on our research team’s expertise and on the fact that we knew educators who would work with us on it. When it came to recruitment in the spring of 2017, all the schools we invited were ones we had connected with during the listening process. The problem-identification process is inextricably linked with a network of people and relationships.

One implication of a social perspective on the problem-identification process is that the people initiating a NIC, in this case researchers, are put in the position of needing to build this social infrastructure. Similar to Cannata and colleagues’ (2017) descriptions of how NICs shift traditional roles, seeing improvement science as a social endeavor is not a traditional stance for researchers. Drawing on participatory design and design experiments, as we did in the case of PiPNIC and as others do in design-based RPPs, provides a wealth of methodological and epistemological precepts to support this approach. Specifically, the continual attention to and negotiation of who participates, how the process goes, and what is focused on, at each stage of the partnership, is needed for the kind of social reorganization of R&D that the NIC model proposes. Attention to participation is especially important for educator partners, who are traditionally marginalized in efforts to change education. For them to be seen and valued, and for their participation to be prioritized throughout the process, is how NIC leaders will build the diverse colleagueship of expertise needed to tackle the most challenging problems of practice in education.

A second implication of a social understanding of people and problems is that there is almost certainly no one “right problem,” “right people,” or “right place” to initiate a NIC. Instead, the place to start is a messy intersection of problems, solutions, people, and context, and each one provides a starting place that will be iteratively refined and changed as the project continues. This is the social reorganization of R&D.

Conclusion

In this paper, I present a design case of a university-based research team’s problem-identification process in the eight months prior to the initiation of a Networked Improvement Community. I use the design case of PiPNIC to explore how the initiation team constructed a rich problem-solution space and ultimately selected a field-initiated problem-solution pair. The case demonstrates the significant impact of the work that happens prior to NIC initiation. I argue for including problem identification as a task in the NIC initiation framework and for furthering conceptual understanding of the relationship between problems, solutions, and people in an education RPP. These findings contribute to the growing body of research on how NICs, and research-practice partnerships more broadly, are implemented in a range of contexts.

Notes

1 Non-cognitive here refers to skills that are not specifically rooted in academic disciplines, such as collaboration, communication, problem-solving, leadership, and strategic thinking, among others (Rutledge, 2017).

Funding Information

The research reported here was supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R372A150031. The opinions expressed are those of the author and do not represent views of the Institute or the U.S. Department of Education.

Competing Interests

The author received graduate student funding on the IES Grant listed above.

References

  1. Apple, M. (1985). Teaching and “Women’s Work”: A Comparative Historical and Ideological Analysis. Teachers College Record, 86(3): 455–473. DOI: https://doi.org/10.1177/016146818508600306 

  2. Baer, M., Dirks, K. T., & Nickerson, J. A. (2013). Microfoundations of strategic problem formulation. Strategic Management Journal, 34(2), 197–214. DOI: https://doi.org/10.1002/smj.2004 

  3. Baron, K. (2017, May 5). Five Years & 20,000 Students: How a NIC Succeeded Where Others Failed. Retrieved February 25, 2020, from https://www.carnegiefoundation.org/blog/five-years-20000-students-how-a-nic-succeeded-where-others-failed/ 

  4. Biesta, G. J. J. (2007). Bridging the gap between educational research and educational practice: The need for critical distance. Educational Research and Evaluation, 13(3), 295–301. DOI: https://doi.org/10.1080/13803610701640227 

  5. Blomberg, J., & Karasti, H. (2012). Ethnography: Positioning Ethnography within Participatory Design. In: J. Simonsen & T. Robertson (Eds.) Routledge International Handbook of Participatory Design. Routledge: New York, NY, USA. 

  6. Boling, E. (2010). The Need for Design Cases: Disseminating Design Knowledge. International Journal of Designs for Learning, 1(1), 1–8. Retrieved from http://scholarworks.iu.edu/journals/index.php/ijdl/index. DOI: https://doi.org/10.14434/ijdl.v1i1.919 

  7. Bryk, A. S., & Gomez, L. M. (2007). Ruminations on reinventing an R&D capacity for educational improvement. The future of educational entrepreneurship: Possibilities of school reform (pp. 181–206). 

  8. Bryk, A. S., Gomez, L. M., & Grunow, A. (2011). Getting ideas into action: Building networked improvement communities in education. Frontiers in sociology of education, 1, 127–162. DOI: https://doi.org/10.1007/978-94-007-1576-9_7 

  9. Bryk, A. S., Gomez, L. M., Grunow, A., & LeMahieu, P. G. (2015). Learning to improve: How America’s schools can get better at getting better. Cambridge, MA: Harvard Education Press. 

  10. Bullen, C. V., & Bennett, J. L. (1990, September). Learning from user experience with groupware. In Proceedings of the 1990 ACM conference on Computer-supported cooperative work (pp. 291–302). DOI: https://doi.org/10.1145/99332.99362 

  11. Cain, T. (2015). Teachers’ engagement with research texts: Beyond instrumental, conceptual or strategic use. Journal of Education for Teaching, 41(5), 478–492. DOI: https://doi.org/10.1080/02607476.2015.1105536 

  12. Cannata, M., Cohen-Vogel, L., & Sorum, M. (2017). Partnering for improvement: Improvement communities and their role in scale up. Peabody Journal of Education, 92(5), 569–588. DOI: https://doi.org/10.1080/0161956X.2017.1368633 

  13. City, E. A., Elmore, R. F., Fiarman, S. E., & Teitel, L. (2009). Instructional rounds in education. Cambridge, MA: Harvard Education Press. 

  14. Cobb, P., Confrey, J., DiSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13. DOI: https://doi.org/10.3102/0013189X032001009 

  15. Coburn, C. E., & Penuel, W. R. (2016). Research-Practice Partnerships in Education: Outcomes, Dynamics, and Open Questions. Educational Researcher, 45(1). DOI: https://doi.org/10.3102/0013189X16631750 

  16. Crow, R., Hinnant-Crawford, B. N., & Spaulding, D. T. (Eds.). (2019). The Educational Leader’s Guide to Improvement Science: Data, Design and Cases for Reflection. Stylus Publishing, LLC. 

  17. DiSalvo, B., & DiSalvo, C. (2014). Designing for democracy in education: Participatory design and the learning sciences. Boulder, CO: International Society of the Learning Sciences. 

  18. DiSalvo, B., Yip, J., Bonsignore, E., & Carl, D. (2017). Participatory design for learning. Routledge. DOI: https://doi.org/10.4324/9781315630830 

  19. Donovan, S. (2011, April). The SERP approach to research, design, and development: A different role for research and researchers. Paper presented at the Annual Meeting of the American Educational Research Association, New Orleans, LA. 

  20. Duckworth, A. L., & Yeager, D. S. (2015). Measurement matters: Assessing personal qualities other than cognitive ability for educational purposes. Educational Researcher, 44(4), 237–251. DOI: https://doi.org/10.3102/0013189X15584327 

  21. García, E. (2014). The need to address noncognitive skills in the education policy agenda. Economic Policy Institute Briefing Paper #386, Washington DC. Retrieved from: www.epi.org. 

  22. Gomez, L. M., Russell, J. L., Bryk, A. S., LeMahieu, P. G., & Mejia, E. M. (2016). The right network for the right problem. Phi Delta Kappan, 98(3), 8–15. DOI: https://doi.org/10.1177/0031721716677256 

  23. Halverson, R. (2002). Representing phronesis: Supporting instructional leadership practice in schools (Doctoral dissertation, Northwestern University). 

  24. Halverson, R., Barnicle, A., Hackett, S., Rawat, T., Rutledge, J., Kallio, J., Mould, C., & Mertes, J. (2015). Personalization in practice: Observations from the field. Working Paper. Wisconsin Center for Education Research. 

  25. Harris, K. J., Jerome, N. W., & Fawcett, S. B. (1997). Rapid assessment procedures: a review and critique. Human Organization, 56(3), 375–378. DOI: https://doi.org/10.17730/humo.56.3.w525025611458003 

  26. Holtzblatt, K., & Jones, S. (1993). Contextual inquiry: A participatory design technique for system design. In: D. Schuler & A. Namioka, (Eds). Participatory Design: Principles and practices. Englewood Cliffs, NJ: Prentice-Hall. 

  27. Jacobson, M., & Wilensky, U. (2006). Complex Systems in Education: Scientific and Educational Importance and Implications for the Learning Sciences. The Journal of the Learning Sciences, 15(1), 11–34. DOI: https://doi.org/10.1207/s15327809jls1501_4 

  28. Kivel, L. (2015). The Problem with Solutions. Retrieved from https://www.carnegiefoundation.org/blog/the-problem-with-solutions/ 

  29. Koskinen, I., Zimmerman, J., Binder, T., Redstrom, J., & Wensveen, S. (2011). Design research through practice: From the lab, field, and showroom. Elsevier. DOI: https://doi.org/10.1016/B978-0-12-385502-2.00006-7 

  30. Labaree, D. F. (1997). Public goods, private goods: The American struggle over educational goals. American educational research journal, 34(1), 39–81. DOI: https://doi.org/10.3102/00028312034001039 

  31. Lawson, B. (2004). Schemata, gambits and precedent: some factors in design expertise. Design studies, 25(5), 443–457. DOI: https://doi.org/10.1016/j.destud.2004.05.001 

  32. LeMahieu, P. G., Bryk, A. S., Grunow, A., & Gomez, L. M. (2017). Working to improve: seven approaches to improvement science in education, Quality Assurance in Education, 25(1), 2–4. DOI: https://doi.org/10.1108/QAE-12-2016-0086 

  33. Mintrop, R., & Zumpe, E. (2016). Defining and Framing Problems of Practice, in R. Mintrop, Design-based school improvement: A practical guide for education leaders. Harvard Education Press. 

  34. Muller, M. J., & Druin, A. (2007). Participatory design: the third space in HCI. In The human-computer interaction handbook (pp. 1087–1108). CRC press. DOI: https://doi.org/10.1201/9781410615862.ch54 

  35. Newell, A., & Simon, H. A. (1972). Human Problem Solving. Englewood Cliffs, NJ: Prentice-Hall. 

  36. Nickerson, J., Yen, C. J., & Mahoney, J. T. (2012). Exploring the problem-finding and problem-solving approach for designing organizations. Academy of Management Perspectives, 26(1). DOI: https://doi.org/10.5465/amp.2011.0106 

  37. Penuel, W. R., Allen, A. R., Coburn, C. E., & Farrell, C. (2015). Conceptualizing research–practice partnerships as joint work at boundaries. Journal of Education for Students Placed at Risk (JESPAR), 20(1–2), 182–197. DOI: https://doi.org/10.1080/10824669.2014.988334 

  38. Philip, T. M., Bang, M., & Jackson, K. (2018). Articulating the “how,” the “for what,” the “for whom,” and the “with whom” in concert: A call to broaden the benchmarks of our scholarship. Cognition & Instruction, 36(2), 83–88. DOI: https://doi.org/10.1080/07370008.2018.1413530 

  39. Roegman, R., & Woulfin, S. (2019). Got theory?: Reconceptualizing the nature of the theory-practice gap in K-12 educational leadership, Journal of Educational Administration, 57(1), 2–20. DOI: https://doi.org/10.1108/JEA-01-2018-0002 

  40. Russell, J. L., Bryk, A. S., Dolle, J., Gomez, L. M., LeMahieu, P., & Grunow, A. (2017). A Framework for the Initiation of Networked Improvement Communities. Teachers College Record, 119(7). DOI: https://doi.org/10.1177/016146811711900501 

  41. Rutledge, J. (2017). Measuring What Matters: How Noncognitive Skills are Captured, Stored, and Utilized in Personalized Learning Environments. [Doctoral Dissertation]. The University of Wisconsin-Madison. 

  42. Schuler, D., & Namioka, A. (Eds.). (1993). Participatory design: Principles and practices. CRC Press. 

  43. Shavelson, R. J., Phillips, D. C., Towne, L., & Feuer, M. J. (2003). On the science of education design studies. Educational researcher, 32(1), 25–28. DOI: https://doi.org/10.3102/0013189X032001025 

  44. Stokes, D. E. (2011). Pasteur’s quadrant: Basic science and technological innovation. Brookings Institution Press. 

  45. von Hippel, E., & von Krogh, G. (2016). Identifying Viable “Need–Solution Pairs”: Problem Solving Without Problem Formulation. Organization Science 2016, 27(1), 207–221. DOI: https://doi.org/10.1287/orsc.2015.1023 
