
A Practical Guide to Collaborative Qualitative Data Analysis

K. Andrew R. Richards, University of Alabama

Michael A. Hemphill, University of North Carolina at Greensboro

The purpose of this article is to provide an overview of a structured, rigorous approach to collaborative qualitative analysis while attending to challenges associated with working in team environments. The method is rooted in qualitative data analysis literature related to thematic analysis, as well as the constant comparative method. It seeks to capitalize on the benefits of coordinating qualitative data analysis in groups, while controlling for some of the challenges introduced when working with multiple analysts. The method includes the following six phases: (a) preliminary organization and planning, (b) open and axial coding, (c) development of a preliminary codebook, (d) pilot testing the codebook, (e) the final coding process, and (f) reviewing the codebook and finalizing themes. These phases are supported by strategies to enhance trustworthiness, such as (a) peer debriefing, (b) researcher and data triangulation, (c) an audit trail and researcher journal, and (d) a search for negative cases.

Keywords: multiple analysts, qualitative methods, researcher training, trustworthiness

While qualitative research has traditionally been discussed as an individual undertaking (Richards, 1999), research reports have in general become increasingly multi-authored (Cornish, Gillespie, & Zittoun, 2014; Hall, Long, Bermback, Jordan, & Patterson, 2005), and the field of physical education is no exception (Hemphill, Richards, Templin, & Blankenship, 2012; Rhoades, Woods, Daum, Ellison, & Trendowski, 2016). Proponents of collaborative data analysis note benefits related to integrating the perspectives provided by multiple researchers, which is often viewed as one way to enhance trustworthiness (Patton, 2015). Collaborative data analysis also allows researchers to effectively manage large datasets while drawing upon diverse perspectives and counteracting individual biases (Olson, McAllister, Grinnell, Walters, & Appunn, 2016). Further, collaborative approaches have been presented as one way to effectively mentor new and developing qualitative researchers (Cornish et al., 2014).

Despite the potential benefits associated with collaborative qualitative data analysis, coordination among analysts can be challenging and time consuming (Miles & Huberman, 1994). The need to plan, negotiate, and manage the complexity of integrating multiple interpretations, while balancing diverse goals for involvement in research, presents further challenges when working in group environments (Hall et al., 2005; Richards, 1999). Concerns have also been voiced about the extent to which qualitative data analysis involving multiple analysts is truly integrative and collaborative, rather than reflective of multiple researchers working in relative isolation to produce different accounts or understandings of the data (Moran-Ellis et al., 2006).

Challenges associated with collaboration become compounded when also considering the need for transparency in qualitative data analysis. Analysts need to develop, implement, and report robust, systematic, and defensible plans for analyzing qualitative data so as to build trustworthiness in both the process and findings of research (Sin, 2007). Authors, however, often prioritize results in research manuscripts, which limits the space available for discussing methods. This leads to short descriptions of data analysis procedures in which broad methods are named without an explanation of how they were implemented (Moravcsik, 2014), and can limit the availability of exemplar data analysis methods in the published literature. This has given rise to calls for increased transparency in the data collection, analysis, and presentation aspects of qualitative research (e.g., Kapiszewski & Kirilova, 2014). The American Political Science Association (APSA, 2012), for example, recently published formal recommendations for higher transparency standards in qualitative research that call for detailed descriptions of data analysis procedures and require authors to support all assertions with examples from the dataset.

To help address the aforementioned challenges, scholars across a variety of disciplines have published reports on best practices related to qualitative data analysis (e.g., Braun & Clarke, 2006; Cornish et al., 2014; Hall et al., 2005). Many of these approaches are rooted in theories and epistemologies of qualitative research that guide practice (e.g., Boyatzis, 1998; Glaser & Strauss, 1967; Lincoln & Guba, 1985; Strauss & Corbin, 2015). Braun and Clarke's (2006) highly referenced article provides a step-by-step approach to completing thematic analysis that helps to demystify the process with practical examples. In a similar vein, Hall and colleagues (2005) tackle challenges related to collaborative data analysis and discuss processes related to (a) building an analysis team, (b) developing reflexivity and theoretical sensitivity, (c) addressing analytic procedures, and (d) preparing to publish findings. Cornish and colleagues (2014) further this discussion by noting several dimensions of collaboration that are beneficial in

Richards is with the Department of Kinesiology, University of Alabama, Tuscaloosa, AL. Hemphill is with the Department of Kinesiology, University of North Carolina at Greensboro, Greensboro, NC. Address author correspondence to K. Andrew R. Richards at [email protected].


Journal of Teaching in Physical Education, 2018, 37, 225–231
https://doi.org/10.1123/jtpe.2017-0084
© 2018 Human Kinetics, Inc.

RESEARCH NOTE

qualitative data analysis. The rigor and quality of the methodology may benefit, for example, when research teams include insider and outsider perspectives, multiple disciplines, academics and practitioners, international perspectives, or senior and junior faculty members.

In this paper, we contribute to the growing literature that seeks to provide practical approaches to qualitative data analysis by overviewing a six-step approach to conducting collaborative qualitative analysis (CQA), which is grounded in qualitative methods and data analysis literature (e.g., Glaser & Strauss, 1967; Lincoln & Guba, 1985; Patton, 2015). While some practical guides in the literature provide an overview of data analysis procedures, such as thematic analysis (Braun & Clarke, 2006), and others discuss issues related to collaboration (Hall et al., 2005), we seek to address both by overviewing a structured, rigorous approach to CQA while attending to challenges that stem from working in team environments. We close by making the case that the CQA process can be employed when working with students, novice researchers, and scholars new to qualitative inquiry.

Collaborative Qualitative Analysis: Building Upon the Literature

In our collaborative work, we began employing a CQA process in response to a need to balance rigor, transparency, and trustworthiness in data analysis while managing the challenges associated with analyzing qualitative data in research teams. Our goal was to integrate the existing literature related to qualitative theory, methods, and data analysis (Glaser & Strauss, 1967; Patton, 2015; Strauss & Corbin, 2015) into procedures that allowed us to develop consistency and agreement in the coding process without quantifying intercoder reliability (Patton, 2015). Drawing from recommendations presented in other guides for conducting qualitative data analysis (Braun & Clarke, 2006; Hall et al., 2005), researchers adopting CQA work in teams to collaboratively develop a codebook (Gibbert, Ruigrok, & Wicki, 2008) through open and axial coding, and subsequently test that codebook against previously uncoded data before applying it to the entire dataset. Steps are embedded to capitalize on perspectives offered by members of the research team (i.e., researcher triangulation; Lincoln & Guba, 1985), and the process culminates in a set of themes and subthemes that form the basis for study results. The CQA process also embraces the tradition of constant comparison (Glaser & Strauss, 1967) as newly coded data are compared with existing coding structures and modifications are made to those structures throughout the coding process. This provides flexibility to modify generative themes¹ in light of challenging or contradictory data.

The CQA process is grounded in thematic analysis, a process for identifying, analyzing, and reporting patterns in qualitative data (Boyatzis, 1998). Typically, thematic analysis culminates in a set of themes that describe the most prominent patterns in the data. These themes can be identified using inductive approaches, whereby the researcher seeks patterns in the data themselves without any preexisting frame of reference, or through deductive approaches in which a theoretical or conceptual framework provides a guiding structure (Braun & Clarke, 2006; Taylor, Bogdan, & DeVault, 2015). Alternatively, thematic analysis can combine inductive and deductive analysis. In such an approach, the research topic, questions, and methods may be informed by a particular theory, and that theory may also

guide the initial analysis of data. Researchers are then intentional in seeking new ideas that challenge or extend the theoretical perspectives adopted, which makes the process simultaneously inductive (Patton, 2015). The particular approach adopted by a research team will relate to the goals of the project, and particularly the extent to which the research questions and methods are informed by previous research and theory.

Trustworthiness is at the center of CQA, and methodological decisions are made during the research design phase to address Guba's (1981) four criteria of credibility, confirmability, dependability, and transferability. In particular, we find that triangulation, peer debriefing, an audit trail, negative case analysis, and thick description fold into CQA quite naturally. In addition to the aforementioned researcher triangulation, data triangulation is often a central feature of design decisions as researchers seek to draw from multiple data sources to enhance dependability (Brewer & Hunter, 1989), and an outside peer debriefer (Shenton, 2004) can be invited to comment upon ongoing analysis to add credibility. An audit trail can be maintained in a collaborative researcher journal to enhance confirmability (Miles & Huberman, 1994), and a negative case analysis can highlight data that contradict the main findings to enhance credibility (Lincoln & Guba, 1985). Transferability is addressed by providing a detailed account of the study context and through rich description in the presentation of results (Shenton, 2004).

Overview of the Collaborative Constant Comparative Qualitative Analysis Process

The CQA process includes a series of six progressive steps that begin following the collection and transcription of qualitative data, and culminate with the development of themes and subthemes that summarize the data (see Figure 1). These steps include (a) preliminary organization and planning, (b) open and axial coding, (c) the development of a preliminary codebook, (d) pilot testing the codebook, (e) the final coding process, and (f) review of the codebook and finalizing the themes. While the process can be employed with teams of various sizes, we have found teams of two to four analysts to be most effective because they capitalize on the integration of multiple perspectives, while also limiting variability due to inconsistencies in coding (Olson et al., 2016). In larger teams, some members may serve as peer debriefers.

When considering the initiation of teamwork, we concur with the recommendations of Hall and colleagues (2005) related to the development of rapport among team members prior to beginning analysis. A lack of comfort may lead team members to hold back critique and dissenting viewpoints that could be important to data analysis. This is particularly true of faculty members working with graduate students, where the implied power relationship can discourage students from being completely forthright. As a result, we recommend that groups engage in initial conversations unrelated to the data analysis to get to know one another and their relational preferences. This could include a discussion of communication styles, previous qualitative research experience, and epistemological views related to qualitative inquiry (Hall et al., 2005). The team leader may also provide an overview of the CQA process, particularly when working with team members who have not used it previously. As part of this process, it should be made clear that all perspectives and voices are valued, and that all team members have an important contribution to make in the data analysis process.


Phase One: Preliminary Organization and Planning

Following the collection and transcription of data, the CQA process begins with an initial team meeting to discuss project logistics and create an overarching plan for analysis. This includes writing a brief description of the project, listing all qualitative data sources to be included, acknowledging any theoretical or conceptual frameworks utilized, and considering the research questions to be addressed. Members of the data analysis team should also discuss and negotiate topics such as the target journal, anticipated authorship, and a flexible week-by-week plan for analysis.

Figure 1 — Overview of the six steps involved in collaborative qualitative analysis. Strategies for enhancing trustworthiness underpin the analysis process.


The weekly plan includes a reference to the data analysis phase, coding assignments for each team member, and space for additional notes and clarification (see Figure 2). Decisions related to the target journal and authorship, as well as the weekly plan for analysis, will likely evolve over time, but we find it helpful to begin such conversations early to ensure that all team members are on the same page.
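Teams that like to keep these logistics in machine-readable form can do so with very little code. The following is a minimal Python sketch, assuming hypothetical class and field names (CQA itself prescribes no particular tooling), of the project overview and weekly plan shown in Figure 2:

```python
from dataclasses import dataclass, field

@dataclass
class WeeklyPlanEntry:
    """One row of the flexible week-by-week plan (see Figure 2)."""
    week: str            # e.g., "August 1, 2016"
    coding_phase: str    # e.g., "Open Coding 1"
    assignments: dict    # coder name -> list of transcript codes
    notes: str = ""

@dataclass
class ProjectPlan:
    """Overview agreed upon in the initial team meeting (Phase One)."""
    description: str
    theoretical_framework: str
    target_journal: str
    authorship: list
    data_sources: list
    research_questions: list
    weekly_plan: list = field(default_factory=list)

# Entry mirroring Figure 2.
plan = ProjectPlan(
    description="How PE teachers navigate sociopolitical realities of their contexts",
    theoretical_framework="Occupational socialization theory",
    target_journal="Journal of Teaching in Physical Education",
    authorship=["Researcher 1", "Researcher 2", "Researcher 3"],
    data_sources=["30 individual interviews", "5 focus groups", "field notes"],
    research_questions=["How do PE teachers perceive that they matter?"],
)
plan.weekly_plan.append(WeeklyPlanEntry(
    week="August 1, 2016",
    coding_phase="Open Coding 1",
    assignments={"Researcher 1": ["1001", "1002"],
                 "Researcher 2": ["1003", "1004"],
                 "Researcher 3": ["1005", "1006"]},
    notes="Open code each transcript; identify 3-4 generative themes; write a 1-page memo.",
))
```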

Phase Two: Open and Axial Coding

To begin the data analysis process, we use open coding to identify discrete concepts and patterns in the data, and axial coding to make connections between those patterns (Corbin & Strauss, 1990). While open and axial coding are distinct analytical procedures,

we embrace Strauss and Corbin's (2015) recommendation that they can occur simultaneously as researchers identify patterns and then begin to note how those patterns fit together. Specifically, each member of the research team reads two to three different data transcripts (e.g., field notes, interviews, reflection journal entries) and codes them into generative categories using their preferred method (e.g., qualitative data analysis software, manual coding). The goal is to identify patterns common across transcripts, or to note deviant cases that appear.

Depending on the approach to thematic analysis adopted, a theoretical framework and research questions could frame this process. We find it helpful, however, to retain at least some inductive elements so as to remain open to generative themes that may not fit with theory. Following each round of coding, team members write memos in a researcher journal, preferably through a shared online platform (e.g., Google Docs), in which they overview the coding and describe two or three generative themes supported by data excerpts.

Project Overview and Data Analysis Timeline

Project Overview: To understand how physical education teachers navigate the sociopolitical realities of the contexts in which they work and derive meaning through interactions with administrators, colleagues, parents, and students. This work is a qualitative follow-up to a large-scale survey that was completed by over 400 physical education teachers from the US Midwest.

1. Theoretical Framework: Occupational socialization theory
2. Target Journal: Physical education pedagogy specific journal, such as the Journal of Teaching in Physical Education or Research Quarterly for Exercise and Sport
3. Anticipated Authorship: Researcher 1, Researcher 2, Researcher 3
4. Data Sources: 30 individual interviews, 5 focus group interviews, field notes from observations of teachers
5. Research Questions:
   a. How do physical education teachers perceive that they matter given the marginalized nature of their subject?
   b. How do interactions with administrators, colleagues, parents, and students influence physical educators' perceptions of mattering and marginalization?
   c. How do physical education teachers' perceptions of mattering and marginalization influence feelings of role stress and burnout?

Weekly Plan for Data Analysis:

Week of July 11, 2016 | Initial Meeting | Coding assignment: None | Notes: Discuss the plan for analysis and review the data analysis timeline. Make changes and adjustments to the plan as necessary. Discuss the various phases of analysis and prepare to begin open coding.

Week of August 1, 2016 | Open Coding 1 | Coding assignment: Researcher 1: 1001, 1002; Researcher 2: 1003, 1004; Researcher 3: 1005, 1006 | Notes: Open coding of each transcript into categories. Following coding, identify 3-4 generative themes and write a 1-page memo.

Week of August 8, 2016 | Open Coding 2 | Coding assignment: Researcher 1: 1022, 1023; Researcher 2: 1024, 1025; Researcher 3: 1007, 1027 | Notes: Open coding of each transcript into categories. Following coding, identify 3-4 generative themes and write a 1-page memo.

Figure 2 — Example of a project overview and data analysis timeline; code numbers (e.g., 1001) refer to interview transcripts.


During research meetings, team members overview their coding with reference to the memos they wrote, and the team discusses the coding process more generally. Phase two continues for three to four iterations, or until the research team feels they have seen and agree upon a variety of generative themes related to the research questions. The exact number of transcripts coded depends on the size of the dataset and the level of initial agreement established among the researchers. The team can move on when all coders feel comfortable advancing to the development of a codebook. In our experience, this usually involves coding approximately 30% of all transcripts, but could be less when working with large datasets.
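Because the memo is the unit of record in this phase, it can help to give it a consistent structure. Purely as an illustration (the field names below are our own, and a shared document works just as well), one round's memo might be captured as:

```python
from dataclasses import dataclass, field

@dataclass
class GenerativeTheme:
    """A candidate pattern noticed during open coding, with supporting
    excerpts; axial notes record how patterns fit together."""
    label: str
    supporting_excerpts: list = field(default_factory=list)  # verbatim quotes
    axial_notes: str = ""

@dataclass
class CodingMemo:
    """One analyst's memo for one round of open coding, written in the
    shared researcher journal after the round is complete."""
    analyst: str
    transcripts: list  # e.g., ["1001", "1002"]
    themes: list = field(default_factory=list)  # two or three GenerativeThemes

memo = CodingMemo(analyst="Researcher 1", transcripts=["1001", "1002"])
memo.themes.append(GenerativeTheme(
    label="Subject marginalization",
    supporting_excerpts=["It's kind of rough because I don't have my own classroom ..."],
    axial_notes="May connect to perceived lack of administrator support.",
))
```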

Phase Three: Development of a Preliminary Codebook

After the completion of phase two, one team member reviews the memos and develops a preliminary codebook (Richards, Gaudreault, Starck, & Woods, in press). An example codebook is included in Figure 3; a codebook typically includes first- and second-order themes, definitions for all themes, and space to code quotations from the transcripts. Theme definitions provide the criteria against which quotations are judged for inclusion in the codebook, and thus should be clear and specific. We code by copying and pasting excerpts from the transcript files into the codebook and flagging each with the participant's code number, the line numbers in the transcript file, and a reference to the data source (e.g., Interview 1001, 102–105).

This allows for reference back to the data source to gain additional context for quotations as needed. We always include a "General (Uncoded)" category where researchers can place quotations that are relevant, but do not fit anywhere in the existing coding structure. These quotations can then be discussed during team meetings. Once compiled, the draft codebook is circulated to the research team for review and discussed during a subsequent team meeting. Changes are made based on the team discussion, and a preliminary codebook is finalized. At this stage, we enlist the assistance of a researcher who is familiar with the project, but not involved in the data analysis, to serve as a peer debriefer (Lincoln & Guba, 1985). This individual reviews and comments on the initial codebook, and appropriate adjustments are made before proceeding.
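To make the codebook structure concrete, here is a minimal sketch (a hypothetical illustration, not software the method prescribes) in which each quotation carries its participant code, line numbers, and data source, and anything that fails the definitional criteria falls into "General (Uncoded)" for later discussion:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Quotation:
    """An excerpt flagged with participant code, transcript line numbers,
    and data source, e.g., ("1019", "210-217", "individual interview")."""
    text: str
    participant: str
    lines: str
    source: str

class Codebook:
    """Sketch of a preliminary codebook: (theme, subtheme) keys map to
    definitional criteria and collect coded quotations."""
    def __init__(self, definitions):
        self.definitions = definitions    # (theme, subtheme) -> criteria text
        self.entries = defaultdict(list)  # (theme, subtheme) -> [Quotation]

    def code(self, quotation, theme=None, subtheme=None):
        # Relevant quotations that fit no defined theme land in
        # "General (Uncoded)" and are discussed at the next team meeting.
        key = (theme, subtheme)
        if key not in self.definitions:
            key = ("General (Uncoded)", None)
        self.entries[key].append(quotation)

definitions = {
    ("Subject Marginalization", "Lack of communication"):
        "Teacher believes PE does not matter due to lack of communication "
        "about issues that affect the physical education environment.",
}
book = Codebook(definitions)
book.code(
    Quotation("My stressful day, um probably when things pop up ...",
              participant="1019", lines="210-217", source="individual interview"),
    theme="Subject Marginalization", subtheme="Lack of communication",
)
```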

Phase Four: Pilot Testing the Codebook

After the initial codebook has been developed, it is tested against previously uncoded data. During this step, the researchers all code the same two to three transcripts and make notes in the researcher journal related to interesting trends or problems with the codebook. Weekly research team meetings provide a platform for researchers to overview and compare their coding, and discrepancies are discussed until consensus is reached. Entries in the researcher journal are also discussed. These discussions lead to the development of coding conventions, which function as rules that guide subsequent coding decisions. A convention may be created, for example, for double coding excerpts into two generative themes in rare instances when both capture the content of a single quotation and that quotation cannot be divided in a meaningful way.

Perceived Mattering Codebook (Themes | Subthemes | Definitions | Examples from Transcripts)

Theme: Subject Marginalization

Subtheme: Lack of communication
Definition: Teacher believes physical education does not matter due to lack of communication about issues that affect the physical education environment.
Example: "My stressful day, um probably when things pop up that are not…A lot of my stresses get raised from being an activities director. If the school calls me and says now they have to— they have kids who are not coming, they change times, or I have a different schedule. My stuff is very organized and if it's not where I think it's supposed to be and I need it, that's very stressful for me" (1019, 210–217, individual interview)

Subtheme: Lack of time and resources
Definition: Teacher believes physical education does not matter due to lack of teaching contact time and resources such as materials, equipment for PE, or teaching facilities.
Examples: "It's kind of rough because I don't have my own classroom. I don't have my own computer up there. I don't have a room that I can make into a welcoming environment so that's kind of rough" (1018, 110–112, individual interview)
"Right now that class is more just like babysitting. It's just a study hall, kind of boring. I don't have a classroom I'm in the gym balcony where the bleachers are at. I don't have space the kids complain" (1018, 120–122, focus group)

Subtheme: Lack of support
Definition: Teacher believes physical education does not matter due to situations in which the physical educator does not feel support for ideas or initiatives.
Examples: "I think the colleagues, it wouldn't matter either way outside of the P.E. teachers, and I think the administration wouldn't care either way." (1018, 348–350, individual interview)
"At the elementary level that would be a big issue. As they get a little older, you know middle school, high school it's not as much probably fun. They don't see it in their eyes as much fun. The students themselves probably wouldn't care, there'd be a handful." (1019, 307–309, focus group)

Figure 3 — Example codebook including themes, subthemes, definitions of subthemes, and quotations from the dataset.


Conventions can also specify priority in the use of generative themes. In Figure 3, for example, there are generative themes for both "lack of support" and "lack of communication" related to subject marginalization. Lack of communication could be considered a way in which support is limited, but because there is a specific category for lack of communication, it would receive priority when coding. Modifications are made to the codebook as needed during these meetings, and an updated codebook is produced to guide subsequent analysis. Pilot testing continues for three to four rounds of coding, or until the research team feels confident in the codebook. Once the team feels ready to move on, they have a final discussion of the codebook in light of the pilot testing and make adjustments. The peer debriefer (Lincoln & Guba, 1985) then reviews the evolving codebook and recommends changes prior to the final coding process.
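Because conventions of this kind are essentially ordered rules, writing them down explicitly keeps them auditable. A minimal sketch, using the theme labels from Figure 3 (the function and list names are ours):

```python
# Coding conventions written down as explicit, ordered rules. The order
# encodes the convention above: the specific "lack of communication"
# outranks the broader "lack of support" when both could apply.
THEME_PRIORITY = [
    ("Subject Marginalization", "Lack of communication"),
    ("Subject Marginalization", "Lack of time and resources"),
    ("Subject Marginalization", "Lack of support"),
]

def resolve(candidates, allow_double_coding=False):
    """Return the theme(s) to apply from the candidates a coder flagged.
    Double coding stays the rare exception, for quotations that cannot
    be divided in a meaningful way."""
    ranked = sorted(candidates, key=THEME_PRIORITY.index)
    return ranked[:2] if allow_double_coding else ranked[:1]

print(resolve([
    ("Subject Marginalization", "Lack of support"),
    ("Subject Marginalization", "Lack of communication"),
]))
# -> [('Subject Marginalization', 'Lack of communication')]
```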

Phase Five: Final Coding Process

In the final phase of coding, the adjusted codebook is applied to all project data, including data that had been previously coded during the formative phases of codebook development. While the researcher triangulation involved when using multiple coders can increase "validity"² in qualitative research, some have argued that it has the potential to reduce "reliability" because of inconsistencies in coding across analysts (Olson et al., 2016). As a result, some qualitative researchers have introduced measures of intercoder reliability in an attempt to quantify agreement between coders (Neuendorf, 2017). While acknowledging these perspectives, we struggle with efforts to apply the quantitative principles of reliability and validity to qualitative data analysis (Patton, 2015). We prefer to approach the issue of coder agreement, and the broader notions of trustworthiness and credibility, by establishing a clear protocol and codebook (Gibbert et al., 2008) through the previous steps of CQA, and then dialoguing through and reaching consensus on coded data. This is done through either consensus coding or split coding. Regardless of the strategy chosen, coding conventions developed during previous phases are applied to the coding process. Analysts continue to make notes in the researcher journal related to problems with the generative themes, or interesting patterns in the data, and issues are discussed during weekly research meetings. We continue to apply the constant comparative method (Strauss & Corbin, 2015) at this stage as modifications are made to the codebook to reflect ongoing insights developed in the coding process.

Consensus coding is the more rigorous, but more time-consuming, form of final coding. It is likely the more effective approach when working in larger groups, where coding consistency concerns are more abundant (Olson et al., 2016). During each iteration of coding, team members code the same two to three transcripts into the codebook. Then, during research team meetings, each coded statement is compared across members of the research team. Disagreements are discussed until the group reaches consensus. Split coding relies more heavily on the establishment of clarity through the preliminary coding phases and the coding conventions that have been developed (Gibbert et al., 2008). While less rigorous than consensus coding, split coding is less time consuming and more manageable within smaller teams. During each iteration of coding, team members code two to three different transcripts. As a result, only one member of the team will code each transcript. Then, during research meetings, questions or concerns related to particular excerpts are discussed. Split coding culminates with each team member reviewing all coded excerpts in the codebook, and disagreements are discussed to consensus.
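Under consensus coding, the weekly meeting needs a list of the excerpts on which analysts differ. A sketch of that comparison, assuming each analyst's coding is exported as a mapping from excerpt identifiers to (theme, subtheme) pairs (a hypothetical representation):

```python
def disagreements(coder_a, coder_b):
    """Compare two analysts' codes for the same transcript. Inputs map an
    excerpt identifier to its (theme, subtheme) assignment; the output
    lists excerpts to talk through until the group reaches consensus."""
    excerpts = set(coder_a) | set(coder_b)
    return {
        e: (coder_a.get(e), coder_b.get(e))
        for e in excerpts
        if coder_a.get(e) != coder_b.get(e)
    }

a = {"1018:110-112": ("Subject Marginalization", "Lack of time and resources")}
b = {"1018:110-112": ("Subject Marginalization", "Lack of support")}
# Disagreements are surfaced for discussion, not fed into a reliability index.
print(disagreements(a, b))
```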

Phase Six: Review the Codebook and Finalize the Themes

After all of the transcripts have been coded using consensus coding or split coding, the research team meets one final time to review the codebook. During the meeting, the codebook is developed into a thematic structure composed of themes and associated subthemes that describe participants' perspectives. The thematic structure is reviewed and approved by all members of the research team, and the final agreed-upon structure forms the basis for the results that will be presented as part of the manuscript. Importantly, through the earlier stages of CQA, all members of the research team have had a hand in shaping, and agree upon, the themes that are presented. This process, therefore, capitalizes on the enhanced trustworthiness provided by multiple analysts, while minimizing issues related to coder variability, without attempting to quantify the qualitative data analysis process (Patton, 2015).
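As a final illustrative sketch (again assuming the hypothetical codebook mapping from the earlier examples), the finished codebook collapses naturally into the theme and subtheme outline the team reviews:

```python
def thematic_structure(entries):
    """Collapse the finished codebook into the theme -> subtheme outline
    the team reviews, with excerpt counts as a rough (non-definitive)
    gauge of how much data sits under each subtheme."""
    outline = {}
    for (theme, subtheme), quotations in entries.items():
        outline.setdefault(theme, {})[subtheme] = len(quotations)
    return outline

entries = {
    ("Subject Marginalization", "Lack of communication"): ["q1", "q2"],
    ("Subject Marginalization", "Lack of support"): ["q3"],
}
print(thematic_structure(entries))
# {'Subject Marginalization': {'Lack of communication': 2, 'Lack of support': 1}}
```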

Conclusions and Final Thoughts

The purpose of this article is to provide an overview of a structured, rigorous approach to CQA while attending to challenges that stem from working in team environments. While this article has focused primarily on the data analysis process, effective analysis begins at the design phase, when researchers pose research questions, decide on methods, and identify participants (Patton, 2015). After data have been collected, the six-phase CQA process is adopted to make meaning through the formation of generative themes. This process integrates existing approaches to qualitative research (Glaser & Strauss, 1967; Miles & Huberman, 1994; Patton, 2015), and contributes to the emerging literature that seeks to provide practical examples of qualitative data analysis (e.g., Braun & Clarke, 2006; Cornish et al., 2014; Hall et al., 2005). It provides a structured and rigorous approach that enhances transparency throughout the data analysis process (e.g., Kapiszewski & Kirilova, 2014; Moravcsik, 2014), while capitalizing on the development of a codebook and multiple researchers' perspectives (Gibbert et al., 2008).

In considering qualitative data analysis, Woods and Graber (2016) explain, "ultimately, it is the responsibility of the investigator to select those procedures that best meet the philosophic orientation of the study, the purpose of the investigation, and the methods that were used to collect the data" (p. 30). Regardless of the particular approach taken, all qualitative researchers are challenged to ensure methodological rigor and transparency, and CQA provides one way to demonstrate inclusive collaboration among researchers. The coding, memoing, and pilot testing of the codebook provide multiple layers where all researchers have opportunities to share their perspectives. The audit trail maintained through ongoing discussions and the researcher journal also enhances transparency and allows for the process to be documented and adapted for use across multiple research projects.

We find that CQA can aid in the management of large qualitative datasets by providing a structured and phasic approach to analysis. This can be particularly helpful for graduate students, early career researchers, and diverse research teams who may be struggling to identify rigorous data analysis procedures that meet the needs of all researchers (Cornish et al., 2014). The step-by-step nature of the approach also has applicability for those coordinating groups of researchers, or analysts who want to adopt a rigorous, systematic, and defensible process that can be implemented with fidelity on a consistent basis. The process can further be adapted for those who prefer to analyze data manually, or through qualitative data analysis software.


To enhance transparency, researchers should be specific about the methods used when analyzing data (Moravcsik, 2014). This can be done, in part, by identifying and implementing with fidelity a practical guide to analysis, such as the one advocated in this paper, or other examples in the literature (e.g., Braun & Clarke, 2006; Cornish et al., 2014; Hall et al., 2005). The process can then be specifically identified and cited in the methods, along with an explanation of any adaptations or deviations from its original articulation. To further transparency, researchers may also communicate why they use collaboration in qualitative research, and how they believe it enhances study results. In future qualitative methodology discussions, researchers should continue to consider more nuanced understandings of how collaboration enhances qualitative research. These conversations have the potential to capitalize on the benefits associated with multiple analysts, and thus could aid the design of future research.

Notes

1. While many researchers use terms such as "emergent" or "emerging" when discussing themes and the processes through which they are developed (Taylor & Ussher, 2001), this language implies that the researcher plays a generally passive role in the creation of themes, or "if we just look hard enough they will 'emerge' like Venus on the half shell" (Ely, Vinz, Downing, & Anzul, 1997, p. 205). We, therefore, refer to themes as being generative to emphasize the active role researchers play in generating them through qualitative data analysis.

2. While we agree with the perspective of Patton (2015), who is reluctant to apply the quantitatively oriented terms "reliability" and "validity" to discussions of qualitative data analysis, we use them here because they are adopted by Olson and colleagues (2016). Our intent is to differentiate our desire to enhance trustworthiness and credibility from intercoder agreement, which is more quantitatively driven.

References

American Political Science Association. (2012). Guide to professional ethics in political science (2nd ed.). Washington, DC: Author.

Boyatzis, R.E. (1998). Transforming qualitative information: Thematic analysis and code development. Thousand Oaks, CA: Sage.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3, 77–101. doi:10.1191/1478088706qp063oa

Brewer, J., & Hunter, A. (1989). Multimethod research: A synthesis of styles. Thousand Oaks, CA: Sage.

Corbin, J., & Strauss, A. (1990). Grounded theory research: Procedures, canons, and evaluative criteria. Qualitative Sociology, 13, 3–21. doi:10.1007/BF00988593

Cornish, F., Gillespie, A., & Zittoun, T. (2014). Collaborative analysis of qualitative data. In U. Flick (Ed.), The Sage handbook of qualitative data analysis (pp. 79–93). Thousand Oaks, CA: Sage.

Ely, M., Vinz, R., Downing, M., & Anzul, M. (1997). On writing qualitative research: Living by words. London, UK: Routledge.

Gibbert, M., Ruigrok, W., & Wicki, B. (2008). What passes as a rigorous case study? Strategic Management Journal, 29, 1465–1474. doi:10.1002/smj.722

Glaser, B.G., & Strauss, A. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago, IL: Aldine.

Guba, E. (1981). Criteria for assessing the trustworthiness of naturalistic inquiry. Educational Technology Research and Development, 29(2), 75–91.

Hall, W.A., Long, B., Bermback, N., Jordan, S., & Patterson, K. (2005). Qualitative teamwork issues and strategies: Coordination through mutual adjustment. Qualitative Health Research, 15, 394–410. doi:10.1177/1049732304272015

Hemphill, M.A., Richards, K.A.R., Templin, T.J., & Blankenship, B.T. (2012). A content analysis of qualitative research in the Journal of Teaching in Physical Education from 1998 to 2008. Journal of Teaching in Physical Education, 31, 279–287. doi:10.1123/jtpe.31.3.279

Kapiszewski, D., & Kirilova, D. (2014). Transparency in qualitative security studies research: Standards, beliefs, and challenges. Security Studies, 23, 699–707. doi:10.1080/09636412.2014.970408

Lincoln, Y.S., & Guba, E. (1985). Naturalistic inquiry. New York, NY: Sage.

Miles, M.B., & Huberman, A.M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage.

Moran-Ellis, J., Alexander, V.D., Cronin, A., Dickinson, M., Fielding, J., Sleney, J., & Thomas, H. (2006). Triangulation and integration: Processes, claims and implications. Qualitative Research, 6, 45–59. doi:10.1177/1468794106058870

Moravcsik, A. (2014). Transparency: The revolution in qualitative research. Political Science and Politics, 47, 48–53. doi:10.1017/S1049096513001789

Neuendorf, K. (2017). The content analysis guidebook (2nd ed.). Thousand Oaks, CA: Sage.

Olson, J.D., McAllister, C., Grinnell, L.D., Walters, K.G., & Appunn, F. (2016). Applying constant comparative method with multiple investigators and inter-coder reliability. The Qualitative Report, 21(1), 26–42.

Patton, M.Q. (2015). Qualitative research and evaluation methods (4th ed.). Thousand Oaks, CA: Sage.

Rhoades, J.L., Woods, A.M., Daum, D.N., Ellison, D., & Trendowski, T.N. (2016). JTPE: A 30-year retrospective of published research. Journal of Teaching in Physical Education, 35, 4–15. doi:10.1123/jtpe.2014-0112

Richards, K.A.R., Gaudreault, K.L., Starck, J.R., & Woods, A.M. (in press). Physical education teachers' perceptions of perceived mattering and marginalization. Physical Education and Sport Pedagogy.

Richards, L. (1999). Qualitative teamwork: Making it work. Qualitative Health Research, 9, 7–10. doi:10.1177/104973299129121659

Shenton, A.K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22, 63–75. doi:10.3233/EFI-2004-22201

Sin, C.H. (2007). Using software to open up the "black box" of qualitative data analysis in evaluations. Evaluation, 13, 110–120. doi:10.1177/1356389007073684

Strauss, A., & Corbin, J. (2015). Basics of qualitative research: Techniques and procedures for developing grounded theory (4th ed.). New York, NY: Sage.

Taylor, G.W., & Ussher, J.M. (2001). Making sense of S&M: A discourse analytic account. Sexualities, 4, 293–314. doi:10.1177/136346001004003002

Taylor, S., Bogdan, R., & DeVault, M.L. (2015). Introduction to qualitative research methods: A guidebook and resource (4th ed.). New York, NY: Wiley.

Woods, A.M., & Graber, K. (2016). Interpretive and critical research: A view through a qualitative lens. In C.D. Ennis (Ed.), Routledge handbook of physical education pedagogies (pp. 21–33). New York, NY: Routledge.

