
Active Recall: The #1 Study Technique Behind Every A+ Exam

Why doesn’t anyone teach us how to study properly for exams?

I know students from my time at university who studied for 10 hours every day and still failed the exam. How is that possible?

What do people do differently to achieve better results in less time?

One thing I can clarify upfront: they’re not necessarily smarter.

They simply have a better study method at their disposal.

In this article, I’ll explain the Active Recall study method, which has been found to be the key to peak academic performance in numerous scientific studies.

Passive vs. Active Learning

When it comes to excelling in exams, it’s not just about the number of hours you spend studying, but rather the effectiveness of your study methods.

In a comprehensive 58-page meta-analysis of various learning techniques, Dunlosky et al. (2013)* found that commonly used approaches such as re-reading, highlighting, or summarizing notes often do not yield the desired results.

But why do these methods remain so popular?

The answer is quite simple: they are straightforward and have been the traditional way of studying for a long time.

Reading and highlighting notes are very convenient.

And who hasn’t experienced this: after reading something multiple times, you might even feel well-prepared.

However, a word of caution!

When suddenly asked for specific details, many find themselves at a loss. There’s a difference between merely recognizing information and actually recalling it.


The Study Secret: How Our Brain Functions

To understand the best way to study, let’s take a brief journey into the realm of neuroscience. To make this more relatable, I’ll illustrate it using the example of Belinda.

Picture Belinda as she prepares for her upcoming exam, diligently reviewing her lecture slides repeatedly. At this very moment, various regions of her brain are operating at full capacity.

The occipital lobe is busy creating mental images of what she’s currently perusing, while the angular gyrus and the fusiform cortex are hard at work, deciphering the meanings of the words she’s reading.

Once the information has been processed, the brain dispatches it to the hippocampus – essentially the brain’s memory hub.

But here’s the twist: if you merely read through notes, only a fraction of the information tends to stick. Think of it like strengthening your muscles: it takes targeted exercises.

Similarly, your memory – especially the hippocampus – needs the right ‘workout regimen.’

This is where the Active Recall study method comes into play.

While straightforward reading primarily activates the visual areas of your brain, the hippocampus often gets sidelined.

Hence, passive study techniques like reading scripts or highlighting notes pale in comparison to Active Recall.

What Is the Active Recall Study Method?

Active Recall operates by compelling our brains to actively engage in the study process. Instead of passively absorbing information through reading or listening, Active Recall encourages us – as the name implies – to actively retrieve information from our memory.

The act of actively recalling information trains the hippocampus and increases the likelihood that you’ll remember the information when you need to recall it later (e.g., during an exam).

When employing the Active Recall study method, you can retain what you’ve learned for a significantly extended period and apply it in various contexts.

Let’s explore five ways to incorporate the Active Recall learning method into your exam preparation.

#1 Stop and Recite

After reading a section, set your study materials aside for a moment. Attempt to express the content in your own words.

Afterwards, retrieve the script and compare: What did you manage to remember, and where are the gaps? Fill in those information gaps using the script, and repeat the process until you’ve confidently internalized the content.


#2 Flashcards

Flashcards have been a tried-and-true study method for many students for quite some time. Even the act of creating the cards, where you must articulate information precisely, serves as an effective part of the exam prep.

In today’s digital age, you can leverage tools like Anki to craft flashcards. Anki incorporates another study method that works particularly well for memorization, known as “spaced repetition.” This means that flashcards are revisited at specific intervals.

These intervals are carefully calibrated to enhance your ability to remember the content effectively.
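If you like to think in concrete terms, here is a minimal sketch of how such a growing-interval schedule could be computed. It illustrates the general idea only – it is not Anki’s actual scheduling algorithm, and the starting interval and growth factor are assumptions for the example:

```python
from datetime import date, timedelta

def review_schedule(start: date, reviews: int = 5,
                    first_interval_days: float = 1, factor: float = 2.0) -> list[date]:
    """Return review dates whose spacing grows by `factor` after each review."""
    schedule, interval, current = [], first_interval_days, start
    for _ in range(reviews):
        current += timedelta(days=round(interval))
        schedule.append(current)
        interval *= factor  # each review pushes the next one further into the future
    return schedule

# Reviews land roughly 1, 2, 4, 8, and 16 days after today.
print(review_schedule(date.today()))
```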

However, a word of caution is in order: when using flashcards for studying, it’s crucial not to overlook active recall.

What do I mean by that? If you can’t explain a term or concept, simply flipping the flashcard to reveal the answer right away isn’t enough.

This approach does not align with active recall because it doesn’t provide your brain with the opportunity to recall the information from memory. Instead, take a moment to challenge yourself to explain as much of the answer in your own words as possible before checking the back of the flashcard.

Additionally, a small caveat regarding flashcards:

They often require you to condense information significantly. This can lead to a situation where you may remember numerous isolated facts but struggle to grasp the broader context or how these facts interconnect.

#3 Create Questions: Your Path to Profound Understanding

Instead of relying solely on your study materials and flashcards, try crafting your own questions related to the content you’re studying.

This strategy encourages you to critically evaluate what you’ve learned and ensures you maintain a firm grasp of the overall context.

Creating questions actively engages your learning process, allowing you to forge stronger connections between the new information and your existing knowledge.

Following each chapter or section, make a note of important questions, and later, challenge yourself to answer them without consulting your notes.


#4 Engage an Audience

A highly effective way to master the material is by teaching it to others.

This process activates various cognitive pathways in your brain. Try explaining the subject of your exam to your classmates, friends, or family members.

Doing so compels you to think deeply about the topic and structure your explanation logically. Furthermore, if you encounter difficulties during your explanation, it serves as an immediate indicator of areas where you need improvement.

Pro tip: In the absence of a live audience, you can even imagine one or engage in a chat with ChatGPT. You can prepare the AI with a prompt to ask you specific questions.

If you’re interested in a dedicated tutorial on using ChatGPT for exam preparation, feel free to leave a comment below this article.

#5 Utilize Past Exams as a Recipe for Success

Reviewing past exams is like a trial run for the real test. You become familiar with the question formats and develop an understanding of what to anticipate in the exam.

When you tackle previous exams under timed conditions, you’ll also hone your time management skills and gauge your level of readiness.

An additional benefit: The more you engage in this practice, the more at ease you’ll feel during the actual exam because you’ll possess insights into what lies ahead.

The Drawbacks of Active Recall

Certainly, Active Recall boasts numerous advantages, but are there any downsides? Undoubtedly!

This method can be rather demanding. It necessitates true engagement and cognitive effort, which is substantially more challenging than simply skimming through your notes or reading the script.

Particularly with complex subjects, it can be frustrating when answers don’t readily come to mind. This study approach requires stepping out of your comfort zone and demonstrating substantial initiative.

Yes, the allure of quickly consulting your notes may be strong, but trust me, the additional effort required by Active Recall is well worth it!

*https://journals.sagepub.com/doi/abs/10.1177/1529100612453266


Finding Articles for your Literature Review: The 5 Biggest Mistakes

You can’t figure out the process of finding articles for your literature review?

I hear this a lot.

The most common question I get from students is this: “The topic I’m writing about is so new, there’s just no literature on it. What should I do?”

In this article, we’ll get to the bottom of this. To do this, we’ll discuss the 5 biggest mistakes in searching literature for your research.

If you stick with it until the end, I even have an example for you on how you can craft an amazing literature review section, even if at first glance there seems to be no literature about the topic.

https://youtu.be/an5eqrgdd7U

#1 Too narrowly defined search terms

The first mistake students make when trying to figure out how to search for literature effectively is using too narrowly defined search terms.

The results of a literature search can only be as good as the filtering mechanisms you use.

Let’s use a simple example. For this, we need a topic that is so new that there is little or no literature to be found on it.

The example is: Universal Wallets

Universal Wallets are storage places for digital valuables. Maybe you have your own digital wallet to store cryptocurrencies or NFTs in.

In a Universal Wallet, however, you can store not only NFTs and coins but also other things like digital identity documents or other proofs of your identity.

Boolean Operators

When searching literature databases, finding literature can be as simple as entering search terms. Using Boolean operators helps you search more effectively.

If you are doing a systematic search process that you want to document in your methods section, this is an absolute must, because it ensures replicability.

But even if you keep your search terms to yourself and write a “regular” literature review section, Boolean operators can help you big time.

Especially the “OR” operator. For example, with “Universal Wallets,” you would get a low number of hits for the search term “Universal Wallet” in most databases.

In Google Scholar, however, an algorithm automatically combines your search term with synonyms and also shows you closely related results.

But if you’re searching on another database, that’s of no help to you.

Through the Google Scholar search results, though, I came across synonyms that I would never have searched for.

Wireless Wallets, Cloud Wallets, Electronic Wallets, Hardware Wallets, Wallet System, and Mobile Wallet seem to be related terms.

From this, you could build the search string (“wireless” OR “cloud” OR “electronic” OR “hardware” OR “mobile”) AND “wallet” for your database search on Scopus, Web of Science, and so on.

You’d have to search for Wallet System separately or use brackets to include the term “wallet system.” It’s just a bit cumbersome because here “wallet” is the first word and not the second.
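If you build such search strings more often, it can help to assemble them programmatically so the brackets always end up in the right place. The following sketch is purely illustrative and reuses the synonyms from above; the exact syntax for phrases, wildcards, and field codes differs between Scopus, Web of Science, and other databases, so check their documentation:

```python
# Assemble a bracketed Boolean search string from a list of synonyms.
synonyms = ["wireless", "cloud", "electronic", "hardware", "mobile"]

or_block = " OR ".join(f'"{term}"' for term in synonyms)
query = f'(({or_block}) AND "wallet") OR "wallet system" OR "universal wallet"'

print(query)
# (("wireless" OR "cloud" OR "electronic" OR "hardware" OR "mobile") AND "wallet")
#   OR "wallet system" OR "universal wallet"
```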


Forward and Backward Search

If a refined database search yields no results, then you need to delve into some other tricks.

With a forward and backward search, you can find the sources upon which the few hits you did find are based, or sources that have cited them afterwards.

The best hit for the term “Universal Wallet” was the paper “Universal Wallets” by Jørgensen and Beck (2021). For the backward search, we simply look at the bibliography and see what we can find there.

Ha!

One paper cited there uses the term “Digital Wallet.” That’s something I could have thought of myself.

But when I’m actually searching for literature, such obvious synonyms often don’t come to mind.

Now, searching for “Digital Wallet” yields endless results. A digital wallet is, of course, not what is meant by “Universal Wallet,” but it is still a related technology.

Thus, it’s perfectly feasible to build on the term “Digital Wallet” to describe what exactly is new about a “Universal Wallet” and why we cannot simply transfer the research on digital wallets to this more nuanced version of the technology.

#2 Inappropriate Databases for Finding Literature

The search terms are one thing. But if the database you are searching on simply does not cover enough sources, then even the best search terms and operators won’t help.

When you use databases, primarily use those in which the research of your own discipline is indexed.

A medical researcher searches on PubMed, a computer scientist on IEEE, and a psychologist on PsycNet. However, sometimes it pays to look beyond the confines of a discipline.

So make sure you also check out interdisciplinary databases. This could be something like Google Scholar, Scopus, or Web of Science.

Or you could look directly into another discipline if you already have an idea of who else might be interested in your topic outside your research field.

The more you can cover with your databases, the better. In the worst case, you find nothing there. But at least you’ve tried.

#3 Old Wine in New Bottles

Regarding “Universal Wallets,” there is certainly a difference compared to “Digital Wallets” or the good old-fashioned wallet.

However, this may not always be the case with other terms.

Take, for example, the topic of Digital Transformation.

The term has only been a trend topic for a few years, and since then, there has been a lot of research on it.

But what’s actually new about it?

Haven’t companies been introducing technology to improve their processes for 50 years?

Yes, they have. And there’s plenty of research literature on that. It’s just not found under the label “Digital Transformation.”

But if you search for “IT-enabled Organizational Transformation,” you get some hits from the last millennium!

And these hits might be relevant for someone wanting to study Digital Transformation.

So, when it comes to your topic, ask yourself: Is the topic really as new as the term describing it?

#4 Too Much Description

This mistake deals with the goal of your literature search. You want to write a literature review section on your topic and diligently cite relevant literature.

This approach isn’t necessarily wrong.

But you write a truly excellent literature section when you change your goal.

A poor literature review section merely recounts research on a topic but fails to gather enough relevant literature.

A mediocre literature review section manages to do so, but only describes what has been researched on the topic.

A fantastic literature review section gathers relevant literature, explains what has been done so far, and critiques or interprets what this means.

So, especially with a new topic, don’t focus too much on finding literature and citing every single source that somewhat fits the topic.

Instead, try to develop your own argument using a handful of relevant findings. For example, you could explain the difference between the new term and existing ones.

Or you could further develop the argument you used for motivation in your introduction.

We’ll look at an example of this at the end of this article.


#5 Your Doughnut Has No Hole

If you’re writing a non-systematic literature review section, completeness is not a criterion. So your “cheese” of literature can have holes, as long as you develop an exciting argument.

But if you have a sweet tooth, try imagining the metaphor of a doughnut.

Let’s say you’ve convinced me, and your topic is indeed so new that there’s little research on it. And finding literature isn’t always straightforward.

What you can do then is not look for the hole in the middle, but for the surrounding dough. What are the terms and phenomena that are closely related to your topic?

Work your way through the databases, gathering literature on somewhat broader topics and terms, and approach the doughnut’s hole argumentatively.

For “Universal Wallets,” you could start with the topic of Digital Wallets, explain what Crypto-Wallets are, and from there move on to Universal Wallets.

An Example of a Literature Review Section on a “Brand New” Topic

As luck would have it, I found an excellent example for you. In their paper on “Universal Wallets,” Jørgensen and Beck (2021) did almost exactly that.

They start not with the topic of Digital Wallets but directly with Crypto-Wallets. That’s how they begin their literature review section.

When it comes to “Universal Wallets,” they continue to cite literature.

But if we take a closer look, we realize that this literature has nothing directly to do with the term.

The chapter is a perfect doughnut!

The first source they cite is titled “Digital Lifestyle.” This source is somewhat more broadly related to the topic.

Then, the authors cite sources on “The Token Economy” and “Identity Ecosystems.” These two sources are closer to “Universal Wallets,” but neither directly refers to it.

Further sources revolve around “Blockchain Identity Management Systems” and “The potential of blockchain in education and health care.”

The last source is a report from the World Wide Web Consortium on “Universal Wallets.” When the academic literature is not yet extensive, reports from industry can also be helpful.

From this example, you can see how to write a literature review section on a topic without using a single academic source on that topic!

By avoiding the 5 mistakes I mentioned, you now know how to search for literature effectively.

So from now on, “My topic is too new” is no longer an excuse. And if you meet someone who uses this excuse in your presence, send them this article! 🙂


Ontology, Epistemology and Methodology (simply explained)

The strange-sounding words ‘Ontology, Epistemology, and Methodology‘ often appear in social science texts or lectures.

Have you ever asked yourself what these terms mean, and how they relate to each other?

In this article, I’ll explain the individual meanings and the connection between Ontology, Epistemology, and Methodology. This will equip you well for discussions in any academic field and help you better understand these basic philosophy of science concepts.

In my opinion, understanding these three terms is the key to taking your academic journey to the next level.

Why are Ontology, Epistemology, and Methodology Such Important Concepts?

As you’ve likely noticed, we are delving into the realm of philosophy of science. This field, which is part of philosophy, serves as a meta-discipline with implications for all other scientific fields.

The terms Ontology, Epistemology, and Methodology are inherently philosophical and concern how we understand science.

This is crucial because without philosophical underpinnings, scientific work as you know it wouldn’t be possible.

For instance, when you’re developing a survey questionnaire for your master’s thesis, which 150 individuals will fill out, and then you conduct statistical tests in SPSS, this scientific endeavor adheres to specific ontological and epistemological assumptions.

These assumptions shape the research approach you are following, regardless of whether you are aware of them or not.

If you’re conducting interviews for your research, for example, your work will be rooted in a different ontology, a different epistemology, and, of course, a different methodology.

Without an understanding of the philosophical foundations underpinning your research, you might find yourself caught off guard when someone knowledgeable in this area poses a targeted question.

A fitting metaphor for the relationship between Ontology, Epistemology, and Methodology is an iceberg.

Your methodology and methods are on the surface and visible. For example, you describe them in your methods section.

Ontology and Epistemology, on the other hand, lie beneath the surface, intricately connected to the visible part of the iceberg.

In a PhD thesis, students are sometimes asked to make a statement about their ontology and epistemology. But in a regular academic paper, you would not do so, because they shine through implicitly.

But if you are aware of them, you can fully understand what you are doing and know what others are talking about when they mention these things.

So, let’s take a look behind the curtain.


Ontology

Ontology resides at the deepest point underwater. This concept pertains to the philosophical exploration of ‘being’ itself or ‘how things exist.’

In other words, it addresses how we understand reality.

This might initially sound quite confusing. Why would there be differences in this regard?

Well, there are plenty of differences!

Ontology in the Natural Sciences

Let’s say you’re an atomic physicist working at CERN with a particle accelerator. Your scientific understanding of reality likely follows the belief that 1+1=2, and that the atoms observable in the particle accelerator are objectively real.

So, if a colleague or an alien were to look into the particle accelerator, the atoms would still be just as real, independent of you as the subject.

This ontology, following the tenets of objectivism, is accepted by the majority of researchers in the natural sciences. Admittedly, it would be quite challenging to make progress otherwise.

Ontology in the Social Sciences

However, when we delve into the realm of social sciences, things become a bit more difficult. Here, we don’t observe atoms or other natural scientific phenomena; rather, we study social phenomena.

These aren’t influenced by the laws of physics but are shaped by human interactions.

Can we then assume the same ontology here?

Some say yes, we can. In psychology, for instance, a natural scientific ontology has also prevailed. It is widely believed that psychological phenomena can be generalized and are objectively real, similar to the natural sciences. This, in turn, affects which methods for acquiring knowledge are accepted – but we’ll get to that shortly.

Other social scientists are dismayed by this ontology. For them, it’s evident that social phenomena are in the eye of the beholder and socially constructed.

How we, as humans, perceive reality is closely tied to how we interpret it. So, there’s no objective reality, but rather a subjective one. This ontology falls under the philosophical stream of constructivism.

Since the 1970s, a third prominent position, known as ‘Critical Realism,’ has emerged in the social sciences, which mediates somewhat between the two.

You can ask yourself:

Is a chair a chair because it has four legs and a backrest? Or is a chair a chair because we use it for sitting?

The answer to this question can tell you more about the ontology through which you see the world.

Epistemology

Now, let’s move a bit closer to the surface to explore epistemology.

Here, the question is: How is it even possible to acquire knowledge about the world?

What methods of gaining knowledge are accepted in a scientific discipline?

When our CERN scientist conducts measurements in the particle accelerator, she is convinced that new knowledge can be generated through this process. Additionally, she is aware of the ‘nature’ of this knowledge – that it is concrete, tangible, and objective. This epistemology is also known as positivism.

In the realm of the social sciences, things again become more difficult. Here, knowledge could just as easily be characterized as personal, subjective, and unique.

This, in turn, has significant implications for how we, as researchers, can acquire new knowledge and which methods we can or cannot accept. This epistemological position is also called interpretivism.

The Relationship between Ontology and Epistemology

As you may have noticed, there is always a specific epistemology that aligns with an underlying ontology.

A positivist epistemology corresponds to an objectivist ontology.

An interpretivist epistemology aligns with a constructivist ontology.

There are, of course, other positions like critical realism, but that would be the subject of another video.


Methodology

The philosophical assumptions you make, specifically the ontology and epistemology that shape your understanding of reality and knowledge, determine the methodology you employ in your research.

Broadly speaking, there are once again two opposing positions in this area that dominate the social sciences. These methodological approaches are the quantitative and qualitative research paradigms.

Traditionally, a quantitative approach aligns with the objectivist-positivist position, while a qualitative approach corresponds better to the constructivist-interpretivist position.

At this level of the iceberg, however, the possibilities are much more flexible, at least in most social sciences. There are methods that combine both approaches or cross over, creating methodological pluralism.

Throughout the history of science, there have been (relatively) intense debates and disputes about which philosophical assumptions are the right ones for each discipline.

Fortunately, today, it is possible to be successful with less dogmatic positions and contribute to the diversity of a discipline by acknowledging the value of each position.


Operationalization of Variables in Quantitative Research

Have you encountered the term Operationalization in the realm of empirical social research during a lecture or while reading a methods book?

Maybe you’ve been assigned the task of operationalizing one or more variables for an assignment or research project?

But you just don’t know what on earth all these people are talking about?

The issue lies in the fact that many university instructors are so well-acquainted with this term that they often struggle to empathize with beginners.

They use terms like variables, concepts, constructs, and operationalization without offering the fundamental knowledge that someone new to this type of work requires to understand how these things are interconnected.

In this article, my goal is to clarify these terms and elucidate, in the most straightforward language, how they are related. We will also explore what operationalization entails and how you can put it into practice with regard to variables in your own study.


The World of Quantitative Research

In the realm of empirical social research, one of the fundamental distinctions lies between the qualitative and quantitative research paradigms.

This division finds its origins in the philosophical underpinnings of the social sciences, which I’ve explored in depth in my article on Ontology, Epistemology, and Methodology.

Operationalization is an important task within the realm of quantitative social research.

The quantitative paradigm is characterized by its goal of testing theoretical assumptions, mostly through the use of statistical methods.

These statistical methods rest on quantitative empirical data, such as survey responses, experimental outcomes, or digital trace data.


Theoretical Building Blocks (Concepts and Constructs)

The currency of the social sciences is theory. Social science theory relies on linguistic elements, even within the quantitative paradigm. In contrast, mathematicians and physicists build their theories with numbers and equations, reflecting the philosophical assumptions and nature of these disciplines.

Social science theories require the use of ‘concepts’ as foundational elements. These concepts serve as the vocabulary employed by researchers when describing existing theories or building new ones.

Qualitative researchers tend to be more at ease with this aspect, as they enjoy crafting new concepts to enrich the theoretical landscape and to describe emerging social phenomena.

On the other hand, quantitative researchers find the conceptual level less satisfying, often considering it too ambiguous. For instance, the concept of ‘intelligence’ can have diverse interpretations. Not everybody agrees on what ‘intelligence’ means.

In the context of quantitative research, however, concepts are transformed into ‘constructs.’ Constructs are concepts made measurable, and this is precisely what a quantitative researcher aims to do (Döring & Bortz, 2016).

Operationalization

The process of making concepts measurable is referred to as ‘operationalization,’ and it introduces a new component – variables.

A construct can encompass one or more variables; depending on this, constructs are called unidimensional or multidimensional.

Unidimensional Constructs

An instance of a construct that can be determined by measuring a single variable is ‘weight.’ If we can measure weight in kilograms using a scale, then assessing the construct ‘weight’ is relatively straightforward.

Multidimensional Constructs

However, many other constructs that researchers aim to measure are more complex.

For instance, the construct ‘intelligence’ cannot be assessed through a single variable. To make assertions about intelligence, researchers may consider variables such as ‘abstract thinking,’ ‘communication skills,’ ‘learning,’ ‘problem-solving,’ and more.

During the operationalization of a multidimensional construct, researchers must decide which variables are relevant to the concept and which ones should be included in their study.

It is also important to note that a single construct can be operationalized in various ways. For example, a study that solely employs the IQ variable to measure intelligence might face criticism, because intelligence involves more than just the result of an IQ test.

At the same time, even if a researcher picks a handful of variables to measure ‘intelligence,’ another researcher might pick 5 other variables, for example.

Measurement Instruments

In the realm of quantitative research within the social sciences, researchers often rely on so-called ‘items’ or ‘item batteries’ for data collection.

These item batteries consist of pre-designed sets of questions that can be incorporated into a questionnaire.

Researchers can either create their own item batteries or utilize existing ones from the literature.

If you are new to all of this, I would suggest the latter option. Many experienced researchers have already put in the effort to test and evaluate these item batteries.

This also means that you can measure a single variable in various ways.

For instance, if you intend to measure ‘abstract thinking,’ there might be multiple item batteries or scoring systems provided by different authors to consider.

In the process of operationalization, it is crucial to make well-informed selections and provide strong justifications for your choices.

You must consider what different batteries cover and which measurement instruments are widely accepted within the research community.

One indicator of this is the number of citations for the publication where the measurement instrument is made available.

Additionally, the quality of operationalization can be assessed by examining the reliability and validity of the measurement instruments.

If you’d like to delve deeper into this topic, take a look at my tutorial on Reliability, Validity, and Objectivity.
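To give you an idea of what such a reliability check can look like in practice, here is a minimal sketch that computes Cronbach’s alpha for a hypothetical four-item battery. The response data are invented for illustration; in a real project you would use your own survey data and usually a statistics package such as SPSS or R:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of scores."""
    k = items.shape[1]                                # number of items in the battery
    item_variances = items.var(axis=0, ddof=1).sum()  # sum of the single-item variances
    total_variance = items.sum(axis=1).var(ddof=1)    # variance of the respondents' sum scores
    return k / (k - 1) * (1 - item_variances / total_variance)

# Invented answers of five respondents to a four-item battery on a 5-point scale.
responses = np.array([
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [1, 2, 1, 2],
])
print(round(cronbach_alpha(responses), 2))  # values above roughly 0.7 are commonly seen as acceptable
```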

Beyond item batteries for surveys, there are various other methods of operationalization. The core principle remains the same, even if your method involves other types of data collection.

In any case, it is essential to engage with the existing literature and determine how you can gain meaningful insights about the variables you are interested in.

Theoretical Assumptions (Propositions and Hypotheses) for Operationalization

To complete this tutorial, we must address the following question:

After identifying your measurement instruments and conducting your analysis, what comes next?

In addition to the theoretical building blocks, which are your constructs, there are the connections or relationships that hold them together.

In quantitative research, the goal is not necessarily to discover new building blocks, but to provide insights about the relationships between them.

Theoretical relationships are tested by identifying causal relationships (primarily based on experiments) or correlational relationships (e.g., through surveys). These relationships are typically assessed for statistical significance.

Theoretical assumptions guide what should be tested. These assumptions are derived from the existing literature.

In this context, propositions are assumptions about how concepts are related, while hypotheses are assumptions about how measurable variables or constructs are related.

When formulating hypotheses at the start of your study, you are not only selecting the theoretical building blocks (e.g., Variable A and Variable B are relevant) but can also make predictions about their relationship (e.g., Variable A positively affects Variable B).
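As a small illustration of what testing such a hypothesis can look like, here is a sketch using a correlation test on invented data. The variable names, the simulated relationship, and the use of Python instead of SPSS are assumptions for the example; a real analysis would use your collected data and the test that matches your design:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
variable_a = rng.normal(size=30)                                # e.g., 30 survey respondents
variable_b = 0.5 * variable_a + rng.normal(scale=0.8, size=30)  # simulated positive relationship

r, p_two_sided = stats.pearsonr(variable_a, variable_b)
p_one_sided = p_two_sided / 2 if r > 0 else 1 - p_two_sided / 2  # directional hypothesis: A affects B positively

print(f"r = {r:.2f}, one-sided p = {p_one_sided:.3f}")
# A small p-value is consistent with the hypothesis; a correlation alone does not establish causality.
```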

For more information on hypotheses, you can refer to my other tutorials on hypothesis development.

Conclusion on Operationalization

If you now have a basic understanding of what operationalization entails, this article has fulfilled its purpose. However, it’s crucial to delve further into this topic. As the next step, I recommend reading a methods book. A good starting point is Discovering Statistics by Andy Field.


Thematic Analysis in Qualitative Research (Braun & Clarke 2006)

You’ve come across Thematic Analysis according to Braun and Clarke (2006) and are wondering what this qualitative research method is all about?

No problem.

In this article, you’ll learn:

  1. The 6 steps of Thematic Analysis according to Braun and Clarke.
  2. The different types of Thematic Analysis you can do.
  3. The difference between Thematic Analysis and qualitative content analysis.
  4. The types of research projects for which Thematic Analysis is particularly well-suited.

By the end of this article, you won’t just have added another qualitative method to your toolkit, but you’ll also know when to best employ each one.

Thematic Analysis according to Braun and Clarke (2006)

Thematic analysis is one of the most popular qualitative research methods out there. Since Braun and Clarke published their paper “Using Thematic Analysis in Psychology” in 2006, it has been cited over 150,000 times. Therefore, the method has gained recognition far beyond the realms of psychology and is used across various disciplines.

The reasons for the popularity of thematic analysis are manifold.

Unlike Grounded Theory, it represents a specific method rather than a methodological approach. This means that there are concrete steps for its execution that have been clearly and explicitly defined.

Moreover, thematic analysis still offers a certain flexibility, which is essential for qualitative approaches.

The method has evolved very little since 2006, meaning the guidelines by Braun and Clarke are still highly relevant.

However, since then, the duo has distinguished between three different kinds of thematic analysis. This differentiation arose because the method was sometimes interpreted in ways different from what they originally intended.

The three types are as follows:

Reflexive Thematic Analysis

This is the method as Braun and Clarke envisioned it. It’s based on a constructivist mindset, meaning subjective interpretations of the data are at the forefront.

Positivist Thematic Analysis

In this variant, researchers compute a reliability measure to check for agreement among coders. This version follows more of a positivist mindset and isn’t quite what the two originally had in mind.

Thematic Analysis with a Codebook

This third variant is neither one extreme nor the other. It involves working with a codebook, which can contain predefined categories but can also be expanded spontaneously.

What is a “Theme”?

A Theme is either…

…a summary of the content

or

…a central concept that encapsulates the meaning of similar content.

Themes cannot be discovered or found within the content. They have to be generated by you. So never write: “I identified 5 themes…” but instead say “I developed 5 themes…”


What follows are the 6 steps of reflexive thematic analysis as proposed by Braun and Clarke (2006).

#1 Familiarize yourself with the data

First, transcribe your data if you have it only in audio or video format.

Then, read the entire dataset twice from start to finish. This gives you a good overview of all your material. It’s better than starting to evaluate a transcript without knowing the rest.

Try to fully immerse yourself in the situation described in the transcripts. However, always maintain an analytical perspective.

Take notes as you read. You can also take notes right after conducting an interview or while visiting a company on-site if that’s your research context. All your notes are for your personal use; you don’t need to share them later. However, they will assist you in the evaluation later on.

In your notes, jot down your initial reactions. These can be analytical or purely intuitive.

#2 Generate initial codes

Now it’s time to start coding. The codes that emerge at this stage are categories, but not themes yet!

What are categories?

Try to code all your data according to the same schema. That is, find categories on a consistently similar level of abstraction.

An example of a category would be “democratic decision-making within the team.”

Another category on the same level could be “open discussion about the integration of new technologies.”

Two categories that aren’t on the same level might be “hierarchy” (too abstract) or “weekly meetings where personnel decisions are made” (not abstract enough).

With the categories, you can certainly venture a preliminary interpretation, like “democratic decision-making”. This exact phrase wasn’t in the data; it’s something you interpreted.

But what’s the purpose of the categories?

The categories reduce the volume of your data and group your analytical units.

What’s essential for Thematic Analysis as per Braun and Clarke is that you don’t code EVERYTHING. Instead, you should only form categories that are relevant to your research question.

In the data, you’ll find many sections that just aren’t interesting and won’t help answer your research question. You don’t need to code these sections.

The naming of your categories should be chosen such that they precisely describe what’s relevant to your question. A category doesn’t have to consist of just one word; it can be a bit longer (3-6 words).

How to Code?

You can either work digitally with software like NVivo or use pen and sticky notes. I’m more of a software person. But everyone has their own preferences.

Even while coding, you can and should continue to take notes that you can use later on.
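If you prefer working without dedicated software, even a very simple data structure can keep your codes organized. The following sketch reuses the example categories from above; the excerpts and their locations are invented for illustration:

```python
# Map each category (code) to the excerpts it covers, together with their location in the data.
codes: dict[str, list[tuple[str, str]]] = {
    "democratic decision-making within the team": [
        ("Interview 3, line 42", "We usually vote before adopting a new tool."),
    ],
    "open discussion about the integration of new technologies": [
        ("Interview 1, line 17", "Everyone could raise concerns about the new system."),
    ],
}

def add_excerpt(code: str, location: str, excerpt: str) -> None:
    """Attach another excerpt to a category, creating the category if needed."""
    codes.setdefault(code, []).append((location, excerpt))

add_excerpt("democratic decision-making within the team",
            "Interview 5, line 8",
            "The team decided together to postpone the rollout.")
```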

#3 Generate the First Themes

The reflexive Thematic Analysis by Braun and Clarke operates inductively. Your themes should arise exclusively based on your data and, at this point, based on your codes.

Now, group the codes. Which ones are thematically related? This will lead to clusters of categories. Each cluster will then become a theme.

Here, you can also work with mind maps and visually develop the clusters. It’s also possible that within a larger cluster, you have smaller clusters (or “subthemes”). However, try not to make it too complicated.

In the end, having 3 to 6 themes is a good amount to work with.

Avoid Thematic Buckets

The biggest mistake in coding and also in generating themes is the use of so-called “buckets”. A classic bucket includes categories like “Advantages”, “Disadvantages”, “Barriers”, and “Challenges”. It’s crucial to steer clear of these.

#4 Review Your Themes

Once you’ve finalized all the themes, create a final mind map featuring all the themes, potential subthemes, and categories.

Check if everything forms a coherent overall picture and accurately reflects the content of the data. Ask yourself the following questions for each theme:

  • Is this more than just a category?
  • Does this theme encapsulate multiple categories?
  • How does the theme relate to the research question?
  • Are there overlaps between themes?
  • Is there sufficient data supporting the theme?
  • Is the theme too broad or too specific?

If you encounter issues with these questions, such as overlapping themes, take a step back and rephrase the themes or rearrange the structure.

The Thematic Analysis by Braun and Clarke isn’t a linear process; you can always move forward and backward as needed.

#5 Define and Name Your Themes

Write a detailed description for each theme, comprising 5 to 6 sentences.

Also, finalize the specific designation for each theme.

If you encounter issues while describing or naming, it typically indicates that the theme isn’t distinct enough. In such cases, revert a step or two and reconsider.


#6 Write Down Your Findings

The final step for your Thematic Analysis according to Braun and Clarke involves drafting your report.

In most instances, this will be an academic paper.

Now, you will integrate your findings with existing literature and align the motivation, research question, results, and discussion.

In your methodology section, ensure to cite Braun and Clarke (2006) and explain how you approached your thematic analysis.

In the results section, introduce all the themes at a glance and then delve deeper into each specific theme. Provide quotes from your data that represent each theme.

By all means, let the quotes speak for themselves. They can even be a bit lengthy. However, simply stringing together quotes isn’t sufficient. Between them, you must weave in your interpretation and establish the connection between the data and the theme.

What’s the difference between Braun and Clarke’s Thematic Analysis and Qualitative Content Analysis?

Answering this question isn’t that straightforward. The approach suggested by Braun and Clarke is simply their perspective on systematically evaluating qualitative data.

With qualitative content analysis, there are different variants too, for example those by the German social scientists Mayring and Kuckartz.

A content analysis would probably be better suited if you have a big qualitative data set and would like to count your categories, or if you want to develop a codebook for other researchers to use.

The procedures of thematic analysis and inductive content analysis are quite similar and differ by maybe one or two steps and their respective labels.

When selecting your method, consider the target audience for your research. For more interpretative research and an English-speaking audience, choose thematic analysis.

For a more structured approach and some quantification of your qualitative data, choose content analysis.


The Netnography Methodology by Kozinets (12-Step Tutorial)

Netnography is a qualitative research approach focused on studying online communities and their behaviors. The method was first developed by Robert Kozinets and has since become an indispensable tool for many social sciences.

In this article, I’ll walk you through the fundamentals of netnography based on Kozinets’ 12-phase process and demonstrate how you can effortlessly implement it. By aligning your investigation with this established methodology, you’ll hopefully yield insightful findings related to the online phenomenon you’re examining.

What is Netnography?

Netnography is a modification of the term ethnography, representing a research approach that deals with field research – but on the Internet.

The architect of this methodology is Robert Kozinets, who extensively details the approach in his 2010 book, “Netnography.” Now there is also an updated edition from 2015.

Up until that point, online research was largely centered on conducting precise, quantitative investigations on social media or within online forums.

How often was a tweet liked, and how did that correlate with its reach? How do networks form on social media? What statistical correlations can be derived from user behavior?

While all these questions are vital and relevant, they aren’t the focus of netnography. Instead, netnography seeks to understand the context of such behaviors, shedding light on who exhibits certain behaviors and why.

If you familiarize yourself with the objectives of classic field research (ethnography), you’ll instantly grasp the netnographic approach. It revolves around observing groups and the behaviors of their members. Netnography simply transposes this research methodology onto the online domain.

The Phases of Netnography According to Kozinets

In the second edition of his book from 2015, Kozinets breaks down the netnographic research approach into 12 phases. Let’s delve into these together.

#1 Introspection Phase

Before you whip out your smartphone and get lost forever in the depths of Reddit, there are some fundamental questions about yourself you must address.

This step is inherently tied to the nature of ethnography. This type of research is entirely dependent on you, the researcher. This means that all your prior knowledge on the topic, your background, and your personal motivations will influence the design of your netnography.

Document the current state of your prior knowledge so that you can describe and reflect upon it in your netnography project.

#2 Investigation Phase

If you’re drafting an extended abstract or proposal for your netnography, this is the appropriate section to record answers to the following questions. If you don’t require such documentation, find another means to note them down for your own reference.

  • What phenomenon do you wish to investigate? (Frame a research question)
  • How do you plan to examine this phenomenon?
  • What kind of data are you looking to collect for this?
  • How do you intend to analyze the data?
  • How will the analysis contribute to addressing the research question?
  • What role do you see yourself playing during the netnography?
  • Under what ethical considerations are you approaching your netnography?
  • What advantages does netnography offer over other research designs?
  • What risks does netnography pose in your specific case (for research subjects)?

Especially crucial to advancing with your netnography is pinpointing the “study sites” – the venues where you plan to carry out your investigation. Is it a Facebook group? A Reddit forum? A YouTube channel?

Who are the individuals frequenting these sites? What would the ideal setting look like for your research to be most effectively conducted?

#3 Information Phase

Engaging with ethical considerations before you even start is of paramount importance. With netnography, you might be entering a protected space.

Users present in that space have certain expectations about it and their privacy. When you’re collecting data from there, it’s crucial to do so ethically.

The top priority is the so-called “Informed Consent.” This means you inform members of the online community about the nature of your research and obtain their approval. At this juncture, you can draft a text or document explaining all this.

You might need to register your study with your university’s ethics committee. To pass the ethical review, you’ll need a draft of your “Informed Consent” document.

Won’t the research results be distorted if users are aware they’re being observed? Yes, this is known as Consent-Bias and is a limitation of netnography. However, you have little choice in this matter because observing without the users’ consent is deemed unethical by almost every ethics committee in the world.

#4 Initial Interview Phase

In this phase, you begin researching communities and online platforms. You have two options for this:

  1. Directly search for the communities, e.g., through search engines.
  2. Initially identify the individuals that can inform your research and then determine in which communities or at which locations they engage with others.

For both approaches, Kozinets recommends initial interviews. So, you converse with the individuals or, for instance, the admin of a community to better understand them.

By the end of this phase, you should have a list of potential communities suitable for your netnography. The interviews with members of the communities that make it to your netnography can be repurposed later.

#5 Inspection Phase

When conducting netnography, you’re spoilt for choice. Unlike traditional ethnography, the entire internet is at your disposal, with all its possible forums, groups, and platforms.

Therefore, compare the list of potential communities with your research question and decide which communities you want to study. Think carefully about why you’re making this choice so that you can justify this decision in your research design chapter.


#6 Interaction Entry Phase

Let’s see if Kozinets manages to start all phases with an “I”.

In this phase, the focus is on devising a strategy on how you’ll interact with the users of the community.

As you might know from my other tutorials, there are participatory and non-participatory observations. This principle of minimal to maximal involvement by you, the researcher, also applies to netnographies.

A low level of participation could involve merely reading in an online forum and not interacting with users. High participation might entail conducting interviews with the community members or even posting content yourself.

However, since direct interaction at the community level can often be too intrusive, Kozinets suggests creating an Interaction Research Website. In this scenario, you design a separate website independent of the actual community.

This website then facilitates the interaction between you and the individuals. The advantage is that you’ve secured consent from all participants, and portions of the community can interact with you, while others can abstain if they choose not to engage.

You can find numerous examples of such websites in Kozinets’ book.

You don’t necessarily have to code your interaction website from scratch; you can leverage existing platforms and tailor them to your needs.

#7 Immersion Phase

In the seventh phase, you venture into the field, beginning to interact regularly with the data, topics, and individuals.

The duration and specifics of your immersion depend entirely on you and your study. For inspiration, I recommend reading the research design chapters of published netnographies. There, researchers detail exactly how this phase unfolded for them.

Don’t stress at the beginning of this phase. There’s no right or wrong, no too much or too little. Your understanding of your community and your netnography project will grow over time. The more you immerse yourself, the clearer the overall picture becomes, and the next steps naturally present themselves.

#8 Indexing Phase

This phase is about organizing your data. Often, you’ll have more data at your disposal than you can analyze.

So, the task is to compile a manageable yet meaningful amount of data to proceed with your analysis.

To do this, you’ll need to assess your data. What are particularly important data sources, and what can be neglected?

It’s advisable here to select fewer but high-quality data sources rather than many of low quality.

#9 Interpretation Phase

You’re free to choose your analysis method. Remember that netnography is a methodology, which means you still have to choose the methods you want to use for data analysis within this research approach.

Qualitative methods that follow an interpretivist approach are suitable. This includes all methods that operate on the principle of hermeneutics.

Phenomenological, existential, or humanistic approaches are also possible.

Typical analysis methods that align well with netnography are Grounded Theory techniques. On my channel, you’ll find plenty of tutorials on various qualitative research methods.

#10 Iteration Phase

If you’re already familiar with hermeneutics, Grounded Theory, or other qualitative approaches, then this phase will seem familiar.

Netnography does not proceed linearly. This means the steps you learn here are not simply carried out in sequence from 1 to 12, after which you’re done.

Netnography is meant to proceed in iterations, which is why these steps should be understood as a cycle. You can (and should) revisit any phase of your netnography and repeat steps when you deem it meaningful.

For instance, if you’ve gained a significant insight through coding with Grounded Theory, return to the Immersion Phase and look for this specific insight. That’s just one example of an iteration, and there are many more. So, perceive these phases not as rigid but flexible.

#11 Instantiation Phase

In the penultimate phase, the focus is on shaping your netnography into a tangible form. For most of us, this means producing a research paper. So, consider the best possible structure to present your netnography.

As always, my tip is: don’t reinvent the wheel, but draw inspiration from examples you enjoy reading.

According to Kozinets, there are four different forms of netnography instantiation. These are:

  • Symbolic
  • Digital
  • Auto
  • Humanistic

Discussing all these forms here would be beyond the scope. Thus, I trust you’ll dive into Kozinets’ book to acquaint yourself with them, assign your netnography to one of these types, and align your research paper accordingly.

However, the presentation of your netnography doesn’t always have to be (just) an academic text. You can let your creativity run free.


#12 Integration Phase

What happens now that you’ve completed your netnography? All the hard work shouldn’t just be read by academics; it should also have an impact on the world out there.

This could be in the form of a publication, a YouTube video, a workshop, or an event with the community you worked with.

  • What can you give back to this community?
  • How can your results make the (online) world a little better in a lasting way?

This phase doesn’t end with the submission of your research paper.

The netnography is now a part of you and your story.


Exam Preparation? This STUDY METHOD is all you need.

Are you deep into exam preparation and need to memorize your study materials as best as you can?

Then this article is tailor-made for you – I’ll introduce you to the ultimate study method that will engrave the material into your memory and let you recall everything effortlessly on exam day.

This study method is grounded in two core principles, proven over decades of educational research, to be essential for outstanding exam performance.

By implementing the method from this article, you won’t need to second-guess whether you’ll perform well in the exam in the future.

Insights from 100 Years of Education Research (The Short Version)

For many decades, researchers have studied individual learning success as the dependent variable and sought to understand how our brain retains information.

From this research, you would expect that we can derive recommendations for your exam preparation, leading to an optimal learning strategy. Right?

Absolutely.

The first thing you need to know is the “Two-Stage Memory System,” the most widely recognized memory model in the literature. It is also known to many as the distinction between “short-term memory” and “long-term memory”.

The reason we quickly forget things is “catastrophic interference”: new pieces of information overwrite old ones.

Only through repeated and active recall can the information be transferred to the second system, commonly referred to as long-term memory (Rasch & Born, 2013).

The Forgetting Curve

The German researcher Hermann Ebbinghaus made a significant mark in psychology with a series of self-experiments. He was the pioneer in describing the so-called “forgetting curve.”


The curve shows that we forget the majority of information shortly after learning it. However, by incorporating regular repetitions, the curve becomes less steep. Consequently, the intervals between repetitions can gradually increase over time.
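A common way to make this more tangible (a simplification used in the spaced-repetition literature, not Ebbinghaus’ original formulation) is to model retention R as an exponential decay over the time t since learning, with S standing for the stability of the memory:

R(t) = e^(−t/S)

Every successful repetition effectively increases S, which is exactly why the curve flattens and the intervals between reviews can grow.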

For instance, if you’re studying an important slide deck for an exam, it makes sense to schedule a revision the very next day. Then, the next repetition could be after 2 days, followed by 4 days, and so on.

What Ebbinghaus did not know back then: The steepness of the curve depends on the type of content. Principles or laws are retained for longer than, for example, numbers and dates or, as in Ebbinghaus’ experiments, unrelated syllables.

The more abstract and numerical your study material, the more repetitions you’ll initially need.

Building upon Ebbinghaus’s discoveries, the concepts of “Active Recall” and “Spaced Repetition” emerged later as the ultimate study techniques.

Active Recall for Exam Preparation (Principle #1)

In contrast to Active Recall, “Passive Intake” refers to the mere passive consumption of information, such as quietly reading a script or a textbook and making annotations.

While this method may feel relatively effortless and relaxing at the outset, it’s not the most efficient way to retain the content. This is because very little actually sticks in your memory.

The more effective method is to actively retrieve the study material.

Here’s how it works:

Pose a question to yourself, similar to one you might encounter in your exam, and attempt to answer it aloud, relying solely on your memory.

Flashcards are an ideal tool for this technique.

Write the question on the front and the answer on the back. So, when working with a script or book, your main goal should be to quickly transfer the content you need to learn into this question-and-answer format on flashcards.

If you can’t fully answer a question, revisit the provided solution.

To devise a study schedule that factors in the forgetting curve, let’s delve into the second study technique.

Spaced Repetition for Exam Preparation (Principle #2)

Essentially, I’ve already touched upon the principle. It’s about scheduling repetitions just before you’re about to forget the content.

Given that the forgetting curve becomes flatter over time, the intervals between repetitions can be extended.

This approach often contrasts with typical study habits. Many of us briefly review content once, then, driven by pre-exam panic, attempt to cram as many repetitions as possible in a short period.

Ideally, the process should be the reverse: You should study at shorter intervals at the beginning of your preparation and only need a single review right before the exam.

Especially when facing multiple exams in a brief timeframe, plan your reviews such that the intervals between them are synchronized across the various exams. This approach maximizes the benefits of the Spaced Repetition principle.

Algorithmic Personalization in Exam Preparation

Using both Active Recall and Spaced Repetition in your exam preparation is undoubtedly powerful.

However, research on this subject has introduced another method on top of them that significantly boosts learning outcomes.

Here’s What the Science Says

In their study, Lindsey et al. (2014) examined three learning methods.

The first method is something we’re all too familiar with from school and university. Each week, a topic or chapter was covered, which was then assessed via a quiz.

The second method was based on the Spaced Repetition principle. All participants in the study were presented with topic content in their learning app, with increasing intervals between repetitions. These intervals were standardized for everyone.

For the third method, participants were given a learning app that personalized the intervals with a simple algorithm. If a question was answered incorrectly, the next repetition came sooner than for the questions that were answered correctly.

The app’s algorithm thus adjusted the intervals and the number of repetitions individually for each participant during their exam preparation.

The results were conclusive.


Method 3, that is, personalized Spaced Repetition, significantly outperformed the other two. Even a month after the exam, the contents were best recalled by this group.

Software for Algorithmic Personalization

The easiest way to implement algorithmic personalization for your exam preparation is through software.

There are various flashcard apps available. When choosing one, make sure the app adjusts repetition intervals based on your answers. This typically happens through a feedback mechanism.

You can indicate how well you were able to answer the question on a flashcard. If you rate your answer as poor, the app will present the card to you again sooner than others.

This way, you compel your brain to perform at its optimal memory capacity.
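
The exact algorithms differ from app to app, but the core idea can be sketched in a few lines: a poor self-rating shortens the next interval, a good one stretches it. The thresholds and multipliers below are purely illustrative assumptions, not any particular app’s formula.

```python
def next_interval(current_interval_days, rating):
    """Adjust the next repetition interval based on a self-rating
    from 0 (blackout) to 5 (perfect recall). Illustrative values only."""
    if rating <= 2:                       # poor answer -> show the card again soon
        return 1
    if rating == 3:                       # shaky answer -> keep the interval
        return current_interval_days
    return round(current_interval_days * 2.5)   # solid recall -> stretch the interval

# Example: a card that is currently on a 4-day interval
print(next_interval(4, rating=1))   # 1  -> repeat tomorrow
print(next_interval(4, rating=5))   # 10 -> repeat in ten days
```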

Examples of apps that implement this feature include Anki, Brainscape, Quizlet, Cram, and IDoRecall.

Categories
Uncategorized

How to Write a Methodology Section 

Would you like to know how to write a methodology section effectively?

In this article, you’ll discover how to do so, regardless of whether you are working on a paper, a thesis, or a dissertation.

By “methodology section,” I’m referring to the section that you position after the literature review and before presenting your results.

This means that this tutorial is for you if you are working with empirical data or a systematic approach to analysing literature.

We will cover everything from quantitative, qualitative, and mixed methods approaches in this article.

The difference between methodology and method

A very common rookie mistake is to confuse the concepts of methodology and method.

Don’t do that.

A methodology describes an overarching research strategy, which can involve multiple methods. An empirical method is a single procedure that can help you to collect or analyse data.

An example of a methodology would be case study research. And under the umbrella of this methodology, you can apply qualitative methods, for example.

Writing your Methodology Section

In this very important section, you outline your research design, which means the complete process you apply in this particular study to gain new knowledge and answer your research question or questions.

You need to demonstrate how well-thought-out your research design is and how neatly you’ve aligned your data collection and analysis methods.

For every empirical research project, this is one of the most important criteria for anyone who reviews your work.

A good thing is that the fundamental principles for writing a methodology section remain the same. This means that you can learn a formula that you can follow again and again.

After internalizing the three fundamental principles that I am about to show you, I strongly recommend delving into two, three, or even four papers that follow a similar methodological approach.

You’ll quickly recognize these fundamental principles and can apply them to your own work.

#1 Start with the Research Paradigm or a Framework

To start your methodology section, you should state the paradigm you are following (i.e., quantitative, qualitative, mixed methods, review, design science, action research, etc.).

You then name a specific framework or author whose guidelines you follow for your data collection and analysis.

After many decades of empirical research in various disciplines, some works and authors have emerged who have developed a sort of scientific consensus.

So, you don’t need to do anything other than follow their guidance.

Research the most cited methodological frameworks in your discipline and compare which one is best suited for your study.

For example:

  • “Case Study Research – Design & Methods” by Robert Yin is a standard reference for conducting case studies.

However, if you are planning a more interpretative case study, another framework is probably more suitable, as Yin’s approach is best suited for quantitative research within the case study paradigm.

Once you have chosen a framework, follow the suggested approach as consistently as possible, and be sure to cite it in your methodology section.

The steps outlined in your chosen framework will now determine how you structure the rest of your methodology section.

To do this, you simply describe the steps in chronological order.

#2 Stay True to the Process

After you have clarified your overall approach, you begin to describe each method. You should structure this part in a way that explains your workflow step by step, in chronological order.

You typically begin with the data collection method.

For example, for a quantitative online survey:

  • How was the questionnaire developed?
  • How was the survey created (technically)?
  • Was there a pretest? How was it conducted, and what were the results?
  • How were participants recruited?
  • How was the composition and size of the sample determined?
  • What analyses (statistical tests) were calculated and using what software?
  • Are there reliability measures you can report?

Example for qualitative interviews:

  • What is a good description of your case? (only for case studies)
  • How was the interview guide developed?
  • How were the interviewees recruited?
  • What was the composition and size of the final sample? Why did you make these choices?

You then continue with the analysis method.

  • What analyses (e.g., content analysis) were conducted and using what software?
  • What coding scheme did your analysis follow?
  • Are there reliability measures you can report?

You can divide things like case description, data collection, and analysis into different sub-sections to structure your methodology section. It is important to use very standardized headings like “Data Collection”.

Nothing fancy here.

Additionally, you should not reveal any results at this point.

The methodology section only describes your research design so that it could be replicated by anyone.

Transparency is crucial, and even unexpected incidents such as a failed pretest or a revised codebook are not a problem.

They contribute to adding depth to your approach and show that you have followed good scientific practices.


Secret Tip: Create a Figure

Before we dive into the third and final fundamental principle, let’s explore a secret tip: illustrating your research design.

This means creating a visual representation of your research design that is both clear and engaging.

Build on the framework you referred to in the beginning, but apply it specifically to the combination of methods that you have chosen.

For example:

[Figure: example research design for a qualitative content analysis of Twitter posts, steps 1-4]

In this study, a qualitative content analysis of Twitter posts was conducted. Following the figure, the individual steps 1-4 would then be described in chronological order in the text.

The figure just makes it more appealing and allows the reader to see the whole process at a glance.

#3 Connect Each Step with a Justification

Even if you follow a framework meticulously, you may sometimes encounter unplanned changes, dilemmas, or too many options. In such cases, you need to make tough decisions.

The methodology section is not only a description of your research design but also a continuous justification of your choices.

As you learned before, you should explain each step sequentially and ideally refer to a scientifically recognized framework.

Now, in the final step, ensure that you logically link the individual steps with justifications. You can roughly remember it like this:

For each step you describe, write another sentence explaining why this step was the best and most logical choice. To practice this, you can review your logical chain at the end. You can also back up your justifications with literature.

Here’s an example:

“We chose a multi-case design as it allows the development of more robust insights by identifying key practices of organisations that go beyond the idiosyncrasies of individual cases (Eisenhardt & Graebner, 2007).”

Why a Case Study?

“Due to the fact that Augmented Reality has only been sparsely adopted in the crafts industry (reference here), and Company XY has been actively promoting the use of AR glasses in production for two years, participating in the pilot project ABC, which was the first to implement DEF in practice in Germany, a case study of this company is suitable to investigate (content of the research question).”

Why Qualitative Interviews?

“The identified needs for theory-building in relation to this topic in the literature, combined with the novelty of the technology in the context of the crafts industry, necessitate the creation of a qualitative data foundation to identify requirements, motivations, and barriers.”

Why a Content Analysis?

And so on…

I think you get the idea.

As you can see, the methodology section consists of a sequence of steps and their justifications.

The quality of this chapter will be largely judged based on how well you are able to justify your decisions.


One Last Note

Sometimes, you may have to make a decision where you’re not sure which direction to go.

That’s perfectly fine.

And sometimes, there’s no right or wrong; there’s only well-justified and poorly-justified.

The key is to make a decision and then describe, in a comprehensible manner, why that decision makes sense both methodologically and conceptually.

If you can do that, there’s nothing standing in the way of a high score for this section in the evaluation of your research project.

Categories
Uncategorized

Qualitative Data Analysis with ChatGPT (extremely time-saving)

Qualitative data analysis with ChatGPT is the new kid on the block in research.

This includes thematic analysis, grounded theory, and content analysis. All these methods can benefit from AI.

Because, unfortunately, qualitative data analysis is extremely labor-intensive.

For example, interview transcripts sometimes need to be analyzed sentence by sentence.

Wouldn’t this be a task where you could wonderfully get support from ChatGPT?

Qualitative Data Analysis with ChatGPT

Qualitative data analysis can be used to summarize or structure large amounts of qualitative data, e.g., interview transcripts, documents, or social media postings.

Summarizing means abstracting the content by forming categories, a process also known as “coding.” You assign a “label” to small units of analysis, such as an answer to an interview question, and can thus present hundreds of pages of text in a single category system.

Summarizing also means that you don’t know in advance what categories will emerge. This approach follows an inductive logic, from specific (the data) to general (the categories).

Structuring means assigning the content to pre-defined categories. These categories can, for example, be derived from the literature. This results in a so-called codebook that specifies when an analytical unit corresponds to a certain category.

Both tasks are extremely labor-intensive and repetitive.

That means they are tailor-made for artificial intelligence, or even better, for a language model like GPT.

So, how can we best integrate ChatGPT into the qualitative data analysis process? In the following, we’ll look at 7 steps to turn ChatGPT into your personal research assistant.

The following steps are loosely based on the working paper by Zhang et al. (2023) from Penn State University, where the authors summarized the best practices in prompt engineering from 17 qualitative researchers.

In case you did not know, prompt engineering refers to crafting the instructions you give to ChatGPT and other large language models.


#1 Start your Prompt with a Role-play

Several experiments with ChatGPT have shown that assigning a role can significantly improve the quality of results.

Let’s look at an example.

In our case, ChatGPT is supposed to conduct a thematic analysis with interview data we have collected. So, think of a role that fits this context. For example:

“Imagine you are a researcher and an expert in the evaluation of qualitative data, such as interview transcripts.”

#2 Provide ChatGPT with Context

To perform a qualitative data analysis with ChatGPT, a two-line instruction won’t get you very far. You need to formulate your prompt in as much detail as possible to get the best results.

Just imagine you’re talking to someone who is very intelligent but is doing the task for the first time. For our thematic analysis, it could look like this:

“I will provide you with an interview transcript that you should analyze using a method called ‘thematic analysis.’ Your task is to assign categories to the answers in the transcript. A category is an abstract summary of the content. You should not assign categories to the questions themselves.”

#3 Give ChatGPT a Detailed Description of your Qualitative Method

The prompt should describe as precisely as possible what the goal of the task is. This also includes aligning the prompt with the specific goal you have in mind. Let’s say your interviews are about the topic of Remote Work and Stress.

Then you should formulate your prompt accordingly. Always keep in mind that ChatGPT only knows as much about the task as the information you provide to the AI.

“The interview transcripts focus on the topic of remote work. Specifically, the categories you create based on the answers should relate to how remote workers experience stress and what countermeasures they take in this context.”

You can hardly be too detailed here. It’s important not to rush; take your time when composing the prompt.


#4 Specify What the Results Should Look Like

If you’re working with pre-defined categories or a codebook, now is the time to explain to ChatGPT into which categories the data should be classified. You can also include a complete codebook if you have one.

However, for our example, we’ll continue with an inductive analysis, which means that we develop the categories from scratch.

Here, we want to keep ChatGPT’s categories from becoming too abstract. I would recommend using the AI only for the labor-intensive tasks, which is typically the first round of coding. This step is often referred to as “open coding”.

Later, when we develop more theoretical codes or themes, we involve our brain a bit more and use our creativity.

For the open coding, we want ChatGPT to give us some examples from the data that represent these categories well.

“Don’t give me more than 10 categories for the first 3 transcripts and, for each category, find a text example of at least three sentences that best reflects the category.”

#5 Structure Your Data

ChatGPT delivers better results when you structure the transcripts.

That means not just copying and pasting but making sure the format of question and answer is consistent.

Minor spelling or punctuation errors do not noticeably affect ChatGPT’s performance.

The easiest way to explain the structure of the transcripts to ChatGPT is as follows:

The transcripts are structured as follows:

<Interviewer (indicated by the abbreviation ‘I’)> : <Question (ending with ‘?’)>

<Participant (indicated by the abbreviation ‘P’)> : <Answer>

#6 Define the Format of the Results

To make it easier for you to process the results and continue to work with them, consider how ChatGPT should present them.

A table would be a good idea in most cases. You could integrate it into your prompt like this:

“Please present the results in table form. The first column contains the categories, the second column contains a description of the categories (at least three sentences), the third column contains the example quote, and the last column contains the number of participants who mentioned the topic.”
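
If you prefer to work via the API instead of the chat window, the pieces from steps #1 to #6 can be assembled into a single request. Here is a minimal sketch, assuming the official openai Python package; the model name, the file name, and the prompt wording are placeholders you would adapt to your own project.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

role = ("Imagine you are a researcher and an expert in the evaluation of "
        "qualitative data, such as interview transcripts.")
task = ("I will provide you with an interview transcript that you should analyze "
        "using a method called 'thematic analysis'. Assign categories to the answers, "
        "not to the questions. The transcripts focus on remote work; the categories "
        "should relate to how remote workers experience stress and what "
        "countermeasures they take.")
output_format = ("Present the results as a table with four columns: category, "
                 "description (at least three sentences), example quote, and the "
                 "number of participants who mentioned the topic.")

# Transcript in the structure described above: 'I: <question?>' / 'P: <answer>'
with open("interview_01.txt", encoding="utf-8") as f:
    transcript = f.read()

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; use whichever model you have access to
    messages=[
        {"role": "system", "content": role},
        {"role": "user", "content": f"{task}\n\n{output_format}\n\nTranscript:\n{transcript}"},
    ],
)
print(response.choices[0].message.content)
```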

#7 Adjust as Needed

After you’ve entered your monster prompt and fed ChatGPT with the transcripts, you can still make adjustments as necessary.

If you feel like the 10 categories are too few, you can ask for more. If you think there are too many, you can instruct ChatGPT to prioritize the categories. For example:

“Reduce the categories to 8 and arrange the table so that the categories mentioned by the most participants are listed first.”

You can then transfer the output table to Excel, for example, and continue working with it.

Qualitative Data Analysis with ChatGPT: Things to Keep in Mind

Are you convinced by the results?

That’s great, but hold on a second. You should be aware of some important limitations of ChatGPT when it comes to qualitative data analysis.


#1 Transparency of Analysis

It’s often challenging to understand how ChatGPT has formed the categories. The authors of the working paper found in their tests that two additional instructions not only improve the results but also enhance transparency.

“Analyze the transcripts sentence by sentence.”

With this instruction, you prompt ChatGPT to analyze the transcripts from beginning to end, just as you would manually. This makes it less likely that ChatGPT skips or neglects parts of the data. So, be sure to include this short sentence.

Another recommendation is to include the following sentence:

“Provide a brief explanation for each category, explaining how you arrived at the category.”

ChatGPT will then provide a description of how the categories relate to the data, making it easier for you to understand how the categories were generated.

#2 ChatGPT Can Get “Tired”

ChatGPT is a black box, meaning we can’t see what’s happening under the hood. If you overburden ChatGPT with many different instructions for an extended period, the quality of results may decline.

The AI is likely to improve over time, and differences between the Pro and free versions may exist, among other factors.

What you should do is craft the entire prompt as well as possible, let ChatGPT process it only once, and make only minor adjustments afterwards.

#3 Reliability in Qualitative Data Analysis with ChatGPT

Experiments in which the API of ChatGPT is queried with the same prompts multiple times show that the results are slightly different each time.

This variability is governed by the “temperature” of the large language model.

The higher the temperature, the more varied the results.

As an average user, you rely on the default settings, where the temperature is not too high. However, what you can (and should) do is perform a manual reliability check, just as you would when coding with another person.
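
If you access the model via the API, you can lower the temperature yourself and re-run the identical prompt to see how stable the categories are. A minimal sketch, again assuming the openai package and a placeholder model name; comparing the outputs is still a manual job, just like comparing codings with a human co-coder.

```python
from openai import OpenAI

client = OpenAI()

def run_twice(prompt, model="gpt-4o"):   # placeholder model name
    """Query the model twice with identical settings so the two category
    lists can be compared as a rough, manual reliability check."""
    outputs = []
    for _ in range(2):
        response = client.chat.completions.create(
            model=model,
            temperature=0,   # low temperature -> less variation between runs
            messages=[{"role": "user", "content": prompt}],
        )
        outputs.append(response.choices[0].message.content)
    return outputs

first_run, second_run = run_twice("...your full analysis prompt...")
```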

If you’re wondering why I’m not discussing research ethics and plagiarism in this video, simply check out my ChatGPT plagiarism video. There, we cover everything you need to know to use ChatGPT correctly as a tool for your academic projects.

Categories
Research Methods

The Gioia Method for Grounded Theory (simply explained)

Have you stumbled upon the Gioia Method while looking for a suitable research method?

Now you surely want to know if this approach is a fit for your qualitative study and how the method distinguishes itself from conventional Grounded Theory methods.

Great that you’re here! Because that’s exactly what you’re going to learn about in this article.

After learning where the Gioia Method originated and what its purpose is, I’ll explain to you how to use the Gioia Method in 5 steps.

After this article, you can immediately start to analyze your qualitative data and impress your supervisor.

So, “sit back, relax, and enjoy the show!”

Where does the Gioia Method come from?

The Gioia Method was named after Professor Dennis Gioia. In the 1990s, the management scholar began, along with various co-authors, to use Grounded Theory in his research.

Glaser and Strauss had already developed the Grounded Theory approach in the 1960s. However, it took time for it to establish itself in disciplines other than sociology.

Gioia and his co-authors consistently received the same feedback from the review panels of management journals:

The article is wonderfully written, and the theoretical value sounds promising, but how do we know whether this is truly a result based on your interview data or if you’ve just made it up?

The application of Grounded Theory was largely limited to researchers who, following the paradigm of interpretivism, sought to counteract the prevailing positivist paradigm, or in other words, the dominance of quantitative research.

These early Grounded Theory studies were fundamentally different compared to what quantitative researchers perceived as scientific.

And that was a problem.

Gioia responded by developing his own way of doing Grounded Theory.

What is now called the Gioia Method aims to address the poor reputation of qualitative research by introducing more rigor into the theory development process.

In 2013, Gioia et al. published an article that explained the approach from A to Z.


When should you use the Gioia Method?

The Gioia Method is suitable for inductive, qualitative-interpretivist research.

In most cases, the foundation for this is data from interviews.

However, the Gioia Method can also be applied to other types of data, such as during literature reviews or for the analysis of documents or social media posts.

Those who find classic Grounded Theory a bit elusive might take a liking to the Gioia Method.

In my view, it’s somewhat easier to grasp and provides a more explicit pathway on how to progress from interview data to your very own theory.

The Gioia Method in 5 Steps

#1 1st-Order Concepts

The data analysis starts with the formation of so-called “1st-Order Concepts”. This step is comparable to the open coding of the original Grounded Theory approach.

Here, you categorize the data and primarily use the language you find within the data.

You do not have to find very abstract categories or do a lot of interpretation. As coding units, you take single statements. You can also code every sentence if you want to be really thorough about it.

According to Gioia, from just 10 interviews, one could derive 50 to 100 1st-Order Concepts. These concepts don’t have to be just a single word; they can also be short sentences. I’ll show you an example of this in Step 4.

#2 2nd-Order Themes

Next, you’ll sift through the 1st-Order Concepts, attempting to group them logically.

Can you see a pattern here?

If so, you can now identify more abstract categories that consolidate several 1st-Order Concepts.

With these abstract categories, you’ll distance yourself from the exact wording of the data, crafting themes with your own language.

If you already have an idea of how these emerging 2nd-Order Themes relate to each other, all the better. You can also begin sketching that out. This process is analogous to axial coding as introduced by Strauss and Corbin (1998).

Once you reach this juncture, it’s time to gather some fresh data.

Here, you can specifically seek experts who can tell you more about what you already found.

This is called theoretical sampling and is a hallmark of classic Grounded Theory.

Data collection ceases when you no longer uncover new 2nd-Order Themes (also known as “Theoretical Saturation”).

#3 Aggregated Dimensions

Once the data collection is complete and you have coded all the data, you consolidate your 25-30 2nd-Order Themes once more.

This results in approximately 3-5 theoretical dimensions.

Ideally, these should be original and describe your observed phenomenon in a way no one else has done before.

#4 Form a Data Structure

Now comes the step that sets the Gioia Method apart. From the formed 1st-Order Concepts, 2nd-Order Themes, and aggregated dimensions, you’ll create what’s known as a data structure.

This is essentially a horizontal diagram that demonstrates how the 2nd-Order Themes emerged from the 1st-Order Concepts and how the aggregated dimensions arose from the 2nd-Order Themes. You’ll then incorporate this as a figure in your methodology chapter.

This data structure allows readers to better understand how theoretical concepts have been derived from the data.
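
If it helps to think about it programmatically, the data structure is essentially a mapping from aggregated dimensions to 2nd-Order Themes to 1st-Order Concepts. Here is a minimal sketch in Python; the entries are invented placeholders, not Gioia et al.’s example.

```python
# Aggregated dimension -> 2nd-Order Themes -> 1st-Order Concepts (all placeholders)
data_structure = {
    "Sensemaking during change": {
        "Interpreting new routines": [
            "We didn't know what the new process meant for our team",
            "Everyone read the announcement differently",
        ],
        "Comparing with the old way of working": [
            "Before, we simply asked the foreman",
        ],
    },
}

for dimension, themes in data_structure.items():
    print(dimension)
    for theme, concepts in themes.items():
        print(f"  - {theme} ({len(concepts)} 1st-Order Concepts)")
```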


#5 Develop a Grounded Theory Model

However, the data structure is not the final outcome of the Grounded Theory study as per the Gioia Method.

While it does showcase all the theoretical components of the new theory, it remains static.

What’s missing now is integrating the dynamics and processes you’ve observed.

This means that your focus will now shift to the relationships between the concepts. Typically, a number of arrows assist in this phase. 😉

In Gioia et al.’s example on page 23, you can see how the aggregated dimensions and the 2nd-Order Themes reappear in the model and how they are connected to each other.

Oftentimes such a model is an inductive process model, which explains a process by showing the practices involved in it.

It is not necessary that the first-order concepts appear in the model, but there should be a clear connection between the data structure and the final model.

For example, the aggregated dimensions could be three phases within the model or three central practices.

If you can arrange your themes and dimensions in a way that answers your research question, you have successfully applied the Gioia method and derived your very own grounded theory!

Conclusion

Gioia et al. emphasize in their article that the approach is more akin to a methodology than a concrete method, even if it’s referred to as such.

This implies that deviations from this process are not only possible but also intended.

Each study is unique, and you shouldn’t rigidly adhere to individual steps if, from your perspective, a deviation from the guidelines makes sense.

However, according to Gioia et al., it’s crucial in such cases to meticulously describe the exact process of your analysis in the methodology section of your work.

Even if deviations are permissible, the defining characteristic of the Gioia Method is its structured and systematic approach.