14.3 Unobtrusive data collected by you

Learning Objectives

  • Define content analysis
  • Describe the kinds of texts that content analysts analyze
  • Outline the differences between manifest content and latent content
  • Discuss the differences between qualitative and quantitative content analysis
  • Describe code sheets and their purpose

 

This section focuses on how to gather data unobtrusively and outlines what you should do with the collected data. There are two main ways of gathering data unobtrusively: conducting a content analysis of existing texts and analyzing physical traces of human behavior. We’ll explore both approaches.

Content analysis

One way of conducting unobtrusive research is to analyze texts. Content analysis is a type of unobtrusive research that involves the study of texts and their meaning. At its core, content analysis addresses the questions of “Who says what, to whom, why, how, and with what effect?” (Babbie, 2010, pp. 328–329). [1] Here we use a more liberal definition of text than you might find in your dictionary. The texts that content analysts investigate include actual written copy (e.g., newspapers or letters) as well as content that we might see or hear (e.g., speeches or other performances). Content analysts might also investigate more visual representations of human communication, such as television shows, advertisements, or movies. Table 14.1 provides a few specific examples of the kinds of data that content analysts have examined in prior studies. Which of these sources of data might be of interest to you?

Table 14.1 Content analysis examples

| Data | Research question | Author(s) (year) |
|---|---|---|
| Spam e-mails | What is the form, content, and quantity of unsolicited e-mails? | Berzins (2009) [2] |
| James Bond films | How are female characters portrayed in James Bond films, and what broader lessons can be drawn from these portrayals? | Neuendorf, Gore, Dalessandro, Janstova, and Snyder-Suhy (2010) [3] |
| Console video games | How is male and female sexuality portrayed in the best-selling console video games? | Downs and Smith (2010) [4] |
| Newspaper articles | How do newspapers cover closed-circuit television surveillance in Canada, and what are the implications of coverage for public opinion and policymaking? | Greenberg and Hier (2009) [5] |
| Pro-eating disorder websites | What are the features of pro-eating disorder websites, and what are the messages to which users may be exposed? | Borzekowski, Schenk, Wilson, and Peebles (2010) [6] |

You might notice that the data sources in Table 14.1 represent primary sources, which means they are the original documents created by people who observed the event or produced the content in question. Secondary sources, on the other hand, are works that analyze primary sources; they are created by examining primary sources and interpreting their contents. We reviewed the difference between primary and secondary sources in Chapter 2.

In her methods text, Shulamit Reinharz offers a helpful way to distinguish between these two types of sources. She explains that while primary sources represent the “‘raw’ materials of history,” secondary sources are the “‘cooked’ analyses of those materials” (1992, p. 155). [7] The distinction between primary and secondary sources is important for many aspects of social science, but it is especially important to understand when conducting content analysis. While there are instances where secondary sources are analyzed in content analysis, it is more common for content analysts to analyze primary sources.

In those instances where secondary sources are analyzed, the researcher’s focus is usually on how the original analyst or presenter of data reached their conclusions or on how they chose to present the data. For example, James Loewen (2007) [8] conducted a content analysis of high school history textbooks. His aim was not to learn about history, but to understand how students are taught American history in high school. His inquiry uncovered that the books often glossed over issues of racism, leaving students with an incomplete understanding of the trans-Atlantic slave trade, the extermination of Indigenous peoples, and the civil rights movement.

Sometimes students who are new to research methods struggle to grasp the difference between a content analysis of secondary sources and a literature review, discussed in Chapter 4. In a literature review, researchers analyze theoretical, practical, and empirical sources to try to understand what we know, and what we don’t know, about a particular topic. The sources used in a scholarly review of the literature are typically peer-reviewed, written by trained scholars, and published in an academic journal or by an academic press. These sources are selected to help the researcher arrive at some conclusion about our overall knowledge of a topic. Findings from the sources are generally taken at face value.

Conversely, a content analysis of scholarly literature would raise questions not addressed in a literature review. A content analyst who examines scholarly articles would try to learn something about the authors (e.g., who publishes what and where), publication outlets (e.g., how well do different journals represent the diversity of the discipline), or topics (e.g., how has the popularity of topics shifted over time). A content analysis of scholarly articles would be a “study of the studies” as opposed to a “review of studies.” Perhaps, for example, a researcher wishes to know whether more male or female authors are published in the top-ranking journals in the discipline. The researcher could conduct a content analysis of different journals and count authors by gender (though this may be a tricky prospect if relying only on names to indicate gender). Researchers may also want to learn whether or how various topics of investigation go in and out of style. To do so, they could investigate changes over time in topical coverage in various journals. In these latter two instances, the researcher is not aiming to summarize the content of the articles, as in a literature review, but instead is looking to learn something about how, why, or by whom particular articles came to be published.

Content analysis can be qualitative or quantitative, and researchers often use both strategies to strengthen their investigations. In qualitative content analysis, the aim is to identify themes in the text being analyzed and to identify the underlying meaning of those themes. For example, Alyssa Goolsby (2007) [9] conducted qualitative content analysis in her study of national identity in the United States. To understand how the boundaries of citizenship were constructed in the United States, she conducted a qualitative content analysis of key historical congressional debates focused on immigration law.

Quantitative content analysis, on the other hand, involves assigning numerical values to raw data so that they can be analyzed statistically. Jason Houle (2008) conducted a quantitative content analysis of song lyrics. Inspired by an article on the connections between fame, chronic self-consciousness (as measured by frequent use of first-person pronouns), and self-destructive behavior (Schaller, 1997), [10] Houle counted first-person pronouns in Elliott Smith song lyrics. His analysis found that Smith’s use of self-referential pronouns increased steadily from the time of his first album release in 1994 until his suicide in 2003 (Houle, 2008). [11] In the final portion of this section, we’ll elaborate on how qualitative and quantitative researchers collect, code, and analyze unobtrusive data.
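To make the mechanics of this kind of counting concrete, here is a minimal Python sketch of a pronoun tally. It assumes, purely for illustration, that lyrics are stored as one plain-text file per album in a lyrics/ folder; the file layout, the pronoun list, and the choice to normalize counts per 1,000 words are our assumptions, not details of Houle’s actual procedure.

```python
import re
from pathlib import Path

# First-person singular pronouns to tally. This list is an illustrative
# assumption; the chapter does not specify Houle's exact coding scheme.
FIRST_PERSON = {"i", "me", "my", "mine", "myself"}

def pronoun_rate(text: str) -> float:
    """Return first-person pronouns per 1,000 words of text."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word in FIRST_PERSON)
    return 1000 * hits / len(words)

# Hypothetical layout: one file per album, e.g., lyrics/1994_roman_candle.txt
for path in sorted(Path("lyrics").glob("*.txt")):
    rate = pronoun_rate(path.read_text(encoding="utf-8"))
    print(f"{path.stem}: {rate:.1f} first-person pronouns per 1,000 words")
```

Normalizing by word count (rather than reporting raw tallies) lets albums of different lengths be compared on the same scale, which is one common design choice in quantitative coding.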

Indirect measures

Texts are not the only sort of data that researchers can collect unobtrusively. Unobtrusive researchers might also be interested in analyzing the evidence that humans leave behind, which can tell us something about who they are or what they do. This kind of evidence includes the physical traces left by humans and the material artifacts that reveal something about their beliefs, values, or norms. Physical traces include such things as worn paths across campus, the materials in a landfill or in someone’s trash can (a data source that William Rathje and colleagues [Rathje, 1992; Rathje & Murthy, 1992] [12] have used), indentations in furniture, or empty shelves in the grocery store. Examples of material artifacts include video games and video game equipment, sculptures, mementos left on gravestones, housing structures, flyers for an event, or even kitchen utensils. What kinds of physical traces or material artifacts might be of interest to you?

The original author of this text, Dr. Blackstone, relates the following example of material artifacts:

I recently visited the National Museum of American History in Washington, DC. While there I saw an exhibit displaying chef Julia Child’s home kitchen, where she filmed many of her famous cooking shows. Seeing the kitchen made me wonder how cooking has changed over the past few decades since Child’s shows were on air. I wondered how the layout of our kitchens and the utensils and appliances they contain might influence how we entertain guests, how much time we spend preparing meals, and how much time we spend cleaning up afterward. Our use of particular kitchen gadgets and utensils might even indicate something about our social class identities. [13] Answers to these questions have bearing on our norms and interactions as humans; thus, they are just the sorts of questions researchers using unobtrusive methods might be interested in answering. I snapped a few photos of the kitchen while at the museum. Though the glass surrounding the exhibit prevents ideal picture taking, I hope the photos in Figure 14.1 give you an idea of what I saw. Might the organizational scheme used in this kitchen, or the appliances that are either present or missing from it, shape the answers to the questions I pose above about human behaviors and interactions? (Blackstone, n.d.)

 

[Image: a kitchen with many pots and pans organized on a pegboard]

Figure 14.1 A visit to chef Julia Child’s kitchen at the National Museum of American History [14]

One challenge with analyzing physical traces and material artifacts is that you generally don’t have access to the people who left the traces or created the artifacts you are analyzing. (And if you did find a way to contact them, then your research would no longer qualify as unobtrusive!) It can be especially tricky to analyze the meanings of these materials if they come from a historical or cultural context other than your own. It is not always easy to situate the traces or artifacts you wish to analyze both in their original contexts and in your own, and failing to do so can lead to problems during data analysis. How do you know that you are viewing an object or physical trace in the way that it was intended to be viewed? Do you have the necessary background knowledge about its original creators or users to understand where they were coming from when they created it?

Imagine an alien trying to understand some aspect of Western human culture simply by examining our artifacts. Cartoonist Mark Parisi demonstrates the misunderstanding that could ensue in his drawing featuring three very small aliens standing atop a toilet. One alien says, “Since water is the life-blood on this planet, this must be a temple of some sort…Let’s stick around and see how they show their respect” (1989). [15] Without a contextual understanding of Western human culture, the aliens have misidentified the purpose of the toilet, and they will be in for quite a surprise when someone shows up to use it!

The point is that while physical traces and material artifacts make excellent sources of data, analyzing their meaning takes more than simply trying to understand them from your own contextual position. You must also be aware of who left the physical trace or created the artifact, when they created it, why they created it, and for whom they created it. Answering these questions will require accessing materials beyond the traces or artifacts themselves. It may require accessing historical documents or, if analyzing a contemporary trace or artifact, perhaps another method of data collection such as interviews with its creators.

Analysis of unobtrusive data collected by you

Once you have identified the set of texts, physical traces, or artifacts that you would like to analyze, the next step is to figure out how you’ll analyze them. This step requires that you determine your procedures for coding, differentiate between manifest and latent content, and understand how to identify patterns across your coded data. We’ll begin by discussing procedures for coding.

You might recall being introduced to coding procedures in Chapter 13, where we discussed the coding of qualitative interview data. While the coding procedures used for written documents obtained unobtrusively may resemble those used to code interview data, many sources of unobtrusive data differ dramatically from written documents or transcripts. What if your data are sculptures or paths in the snow? The idea of conducting open coding and focused coding on these sources as you would for a written document sounds a little silly, not to mention impossible. So how do we begin to identify patterns across the sculptures or worn paths or utensils we wish to analyze? One option is to take field notes as we observe our data and then code patterns in those notes. Let’s say, for example, that we’d like to analyze how people use kitchen utensils, as in Figure 14.1. Taking field notes might be a useful approach if we were conducting observations of people actually using utensils in a documentary or on a television program. (Remember, if we’re observing people in person, then our method is no longer unobtrusive.)

If our data include a collection of actual utensils, rather than observations of people in documentaries or television shows, note-taking may not be the most effective way to record our observations. Instead, we could create a code sheet to record details about the utensils in our sample. A code sheet, sometimes referred to as a tally sheet in quantitative coding, is the instrument an unobtrusive researcher uses to record observations.

In the example of kitchen utensils, perhaps we’re interested in how utensils have changed over time. If we had access to sales records for utensils over the past 50 years, we could analyze the top-selling utensil for each year. To do so, we’d want to make some notes about each of the 50 utensils included in our sample. For each top-selling utensil, we might note its name, its purpose, and perhaps its price in current dollar amounts. We might also want to make some assessment of how easy or difficult it is to use, or some other qualitative assessment of the utensil’s purpose. To rate difficulty of use, we could use a 5-point scale, with 1 being very easy to use and 5 being very difficult to use. We could also record other notes or observations about the utensils that may not occur to us until we actually see them. Our code sheet might look something like the sample shown in Table 14.2.

Note that the sample sheet contains columns for only 10 years’ worth of utensils. If you were to conduct this project, you’d need to create a code sheet that allows you to record observations for each of the 50 items in your sample.

Table 14.2 Sample code sheet for study of kitchen utensil popularity over time

| | 1961 | 1962 | 1963 | 1964 | 1965 | 1966 | 1967 | 1968 | 1969 | 1970 |
|---|---|---|---|---|---|---|---|---|---|---|
| Utensil name | | | | | | | | | | |
| Utensil purpose | | | | | | | | | | |
| Price (in 2011 $) | | | | | | | | | | |
| Ease of use (1–5 scale) | | | | | | | | | | |
| Other notes | | | | | | | | | | |
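If you prefer to keep a code sheet in machine-readable form from the start, the structure of Table 14.2 translates naturally into rows of a spreadsheet or CSV file. Below is a minimal Python sketch; the field names mirror the rows of Table 14.2, and the example records are invented for illustration, not data from a real study.

```python
import csv

# One record per sampled utensil; field names mirror the rows of Table 14.2.
FIELDS = ["year", "name", "purpose", "price_2011_usd", "ease_of_use_1to5", "notes"]

# Invented example observations, for illustration only.
observations = [
    {"year": 1961, "name": "hand eggbeater", "purpose": "mixing",
     "price_2011_usd": 12.50, "ease_of_use_1to5": 2, "notes": "crank stiffens with use"},
    {"year": 1962, "name": "fondue set", "purpose": "entertaining",
     "price_2011_usd": 45.00, "ease_of_use_1to5": 4, "notes": "many parts to clean"},
]

# Writing the code sheet to a CSV file keeps the quantitative fields ready
# for statistical analysis and the notes available for qualitative coding.
with open("code_sheet.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(observations)
```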

As you can see, our code sheet will contain both qualitative and quantitative data. Our “ease of use” rating is a quantitative assessment; therefore, we can conduct statistical analysis of the patterns it reveals. Perhaps we could note the mean ease-of-use value for each decade we’ve observed. We could do the same with the data collected in the row labeled “price,” which is also quantitative. The final row of our sample code sheet, containing notes about our impressions of the utensils, will contain qualitative data. We may conduct open and focused coding on these notes to identify patterns across them. In both cases, whether the data being coded are quantitative or qualitative, the aim is to identify patterns across the coded data.
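Continuing the sketch above, computing a mean ease-of-use rating per decade is a simple grouping operation. This snippet reads the hypothetical code_sheet.csv written earlier; again, the file and field names are our illustrative assumptions.

```python
import csv
from collections import defaultdict
from statistics import mean

# Group the quantitative "ease of use" ratings by decade.
by_decade = defaultdict(list)
with open("code_sheet.csv", newline="") as f:
    for row in csv.DictReader(f):
        decade = (int(row["year"]) // 10) * 10   # e.g., 1965 -> 1960
        by_decade[decade].append(int(row["ease_of_use_1to5"]))

# Report the mean rating for each decade in the sample.
for decade in sorted(by_decade):
    print(f"{decade}s: mean ease of use = {mean(by_decade[decade]):.2f}")
```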

The “purpose” row in our sample code sheet provides an opportunity to assess both manifest and latent content. Manifest content is the content we observe that is most apparent; it is the surface content. This is in contrast to latent content, which is less obvious. Latent content refers to the underlying meaning of the surface content we observe. In the example of utensil purpose, we might say a utensil’s manifest content is its stated purpose. The latent content would be our assessment of what it means that a utensil with a particular purpose is a top seller. Perhaps after coding the manifest content in this category we see patterns that tell us something about the meanings of utensil purpose. Perhaps the meanings of top-selling utensils across five decades tell us something about sociocultural history. A shift from utensils designed for entertaining in the 1960s to utensils designed for kitchen efficiency in the 1980s, for example, might reflect a shift in how much time people spend in their homes.

Kathleen Denny’s (2011) [16] study of scouting manuals offers another excellent example of the differences between manifest and latent content. Denny compared Boy Scout and Girl Scout handbooks to understand gender socialization among scouts. By counting the activity types described in the manuals (manifest content), Denny learned that boys are offered more individual-based, scientific activities, while girls are offered more group-based, artistic activities. Denny also analyzed the latent meaning of the messages the handbooks convey about gender; she found that girls were encouraged to become “up-to-date traditional women” while boys were urged to adopt “an assertive heteronormative masculinity” (Denny, 2011, p. 27).

 

Key Takeaways

  • Content analysts interpret texts.
  • The texts that content analysts analyze include actual written texts such as newspapers or journal entries, as well as visual and auditory sources such as television shows, advertisements, or movies.
  • Content analysts most typically analyze primary sources, though in some instances they may analyze secondary sources.
  • Content analysts examine indirect measures, which include physical traces and material artifacts.
  • Manifest content is apparent; latent content is underlying.
  • Content analysts use code sheets to collect data.

 

Glossary

Code sheet – the instrument an unobtrusive researcher uses to record observations

Content analysis – a type of unobtrusive research that involves the study of texts and their meaning

Latent content – the underlying meaning of the surface content

Manifest content – the most apparent, surface-level content in a communication

 


  1. Babbie, E. (2010). The practice of social research (12th ed.). Belmont, CA: Wadsworth.
  2. Berzins, M. (2009). Spams, scams, and shams: Content analysis of unsolicited email. International Journal of Technology, Knowledge, and Society, 5, 143–154.
  3. Neuendorf, K. A., Gore, T. D., Dalessandro, A., Janstova, P., & Snyder-Suhy, S. (2010). Shaken and stirred: A content analysis of women’s portrayals in James Bond films. Sex Roles, 62, 747–761.
  4. Downs, E., & Smith, S. L. (2010). Keeping abreast of hypersexuality: A video game character content analysis. Sex Roles, 62, 721–733.
  5. Greenberg, J., & Hier, S. (2009). CCTV surveillance and the poverty of media discourse: A content analysis of Canadian newspaper coverage. Canadian Journal of Communication, 34, 461–486.
  6. Borzekowski, D. L. G., Schenk, S., Wilson, J. L., & Peebles, R. (2010). e-Ana and e-Mia: A content analysis of pro-eating disorder Web sites. American Journal of Public Health, 100, 1526–1534.
  7. Reinharz, S. (1992). Feminist methods in social research. New York, NY: Oxford University Press.
  8. Loewen, J. W. (2007). Lies my teacher told me: Everything your American history textbook got wrong. Greenwich, CT: Touchstone.
  9. Goolsby, A. (2007). U.S. immigration policy in the regulatory era: Meaning and morality in state discourses of citizenship (Unpublished master’s thesis). Department of Sociology, University of Minnesota, Minneapolis, MN.
  10. Schaller, M. (1997). The psychological consequences of fame: Three tests of the self-consciousness hypothesis. Journal of Personality, 65, 291–309.
  11. Houle, J. (2008). Elliott Smith’s self-referential pronouns by album/year. Prepared for teaching SOC 207, Research Methods, at Pennsylvania State University, Department of Sociology.
  12. Rathje, W. (1992). How much alcohol do we drink? It’s a question…so to speak. Garbage, 4, 18–19; Rathje, W., & Murthy, C. (1992). Garbage demographics. American Demographics, 14, 50–55.
  13. Watch the following clip, featuring satirist Joe Queenan, from the PBS documentary People Like Us on social class in the United States: http://www.youtube.com/watch?v=j_Rtl3Y4EuI. The clip aptly demonstrates the sociological relevance of kitchen gadgets.
  14. Figure 14.1 copied from Blackstone, A. (2012) Principles of sociological inquiry: Qualitative and quantitative methods. Saylor Foundation. Retrieved from: https://saylordotorg.github.io/text_principles-of-sociological-inquiry-qualitative-and-quantitative-methods/ Shared under a CC-BY-NC-SA 3.0 license (https://creativecommons.org/licenses/by-nc-sa/3.0/)
  15. Parisi, M. (1989). Alien cartoon 6. Off the Mark. Retrieved from: http://www.offthemark.com/System/2006-05-30.
  16. Denny, K. (2011). Gender in context, content, and approach: Comparing gender messages in Girl Scout and Boy Scout handbooks. Gender & Society, 25, 27–47.

License


Scientific Inquiry in Social Work Copyright © 2018 by Matthew DeCarlo is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.
