First Monday

Some clarifications on the Facebook-GPA study and Karpinski’s response

 


 

Karpinski’s reply rightly concludes that the Facebook–GPA relationship represents a fruitful area for academic research. It is important to examine whether Facebook use may be diverting time that might otherwise be spent in more academically and occupationally productive ways. Our study was motivated not only by the need for significant research in this area but also by the surge of public interest generated by the media following the press release on Karpinski’s data [1]. Our primary criticism of the Karpinski study was not in reference to her results, but rather to the process of alerting the mass media without probability–based sampling, replication, or comprehensive peer review.

Karpinski’s response to our paper raised several issues that require additional clarification:

Sampling. Karpinski asks that we clarify the nature of our sample. Yet, the sample of students at the University of Illinois at Chicago was clearly stated as that of “1,060 first–year students” from the “mandatory first–year writing class.” While not a complete census of first–years, the sample was representative of this group. The value of this particular sample was that it accounted for potential differences between colleges, which could not be fully addressed in the general population.

Importantly, the 45 percent response rate for the NASY study, which was also a subject of Karpinski’s critique, is unusually high for a telephone sample. As Karpinski should be aware, a response rate is not the same as a refusal rate, and the rates reported use AAPOR formula 3, which also counts failures to reach likely households as non–respondents. Further, researchers have been unable to find significant reductions in data quality for response rates far lower than 45 percent (e.g., Keeter et al., 2000).
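For readers less familiar with AAPOR’s standards, Response Rate 3 can be sketched roughly as follows (a standard rendering of the formula, not a quotation from the AAPOR definitions):

\[ \mathrm{RR3} = \frac{I}{(I + P) + (R + NC + O) + e\,(UH + UO)} \]

where I denotes complete interviews, P partial interviews, R refusals and break–offs, NC non–contacts, O other eligible non–interviews, UH and UO cases of unknown eligibility, and e the estimated proportion of unknown–eligibility cases that are in fact eligible. Because non–contacts and a share of the unknown–eligibility cases sit in the denominator, a 45 percent response rate implies far fewer than 55 percent refusals.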

Coding of GPA. Recoding GPA onto a zero–to–one scale meant that individuals were assigned to scores proportionally between zero and one (e.g., a GPA of ‘some As and some Bs’ became a .75). The critique rightly notes that an ordinal regression might have been statistically more appropriate than the ordinary least squares regression used. The results, however, would also have been more difficult to interpret and less comparable between the four– and eight–point GPA measures [2]. We regret that in our initial manuscript this point was not sufficiently clear.
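As a concrete illustration of the proportional recoding, the short sketch below maps an ordered set of GPA response options onto evenly spaced scores between zero and one. The category labels and function name are hypothetical, chosen only so that ‘some As and some Bs’ lands at .75 as in the example above; this is not the code used in our analyses.

def proportional_recode(ordered_categories):
    # Map an ordered list of GPA response options onto evenly spaced
    # scores in [0, 1]: lowest category = 0, highest category = 1.
    k = len(ordered_categories)
    return {category: i / (k - 1) for i, category in enumerate(ordered_categories)}

# Hypothetical five-option ordering, for illustration only.
scores = proportional_recode([
    "Mostly Cs or below",
    "Some Bs and some Cs",
    "Mostly Bs",
    "Some As and some Bs",
    "Mostly As",
])
print(scores["Some As and some Bs"])  # 0.75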

Assumptions of the Lagged Study. Karpinski’s critique of our lagged study presumes that Facebook use has an impact on GPA at a single point in time. It is possible that a negative relationship either occurs over the course of Facebook exposure or when individuals join the site. While our analysis was constrained by available data, such that we were not able to distinguish new users from old users, the current analysis strategy should show significant effects if either process is operating. Indeed, the only ways that a null result should emerge are if new and old users experience opposing effects or if there is no relationship. While we cannot disentangle these possibilities, neither fits with Karpinski’s original claim.
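To make the logic concrete, the sketch below estimates a generic lagged specification of the kind described: later GPA regressed on earlier Facebook use, controlling for earlier GPA. The variable names and the synthetic data are assumptions for illustration only, not our actual model or dataset.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic two-wave panel with hypothetical variable names.
rng = np.random.default_rng(0)
n = 500
facebook_use_t1 = rng.integers(0, 2, n)          # 1 = Facebook user at wave 1
gpa_t1 = rng.normal(3.0, 0.5, n)                 # GPA at wave 1
gpa_t2 = 0.8 * gpa_t1 + rng.normal(0, 0.3, n)    # GPA at wave 2 (no use effect built in)
panel = pd.DataFrame({"gpa_t2": gpa_t2, "gpa_t1": gpa_t1,
                      "facebook_use_t1": facebook_use_t1})

# If Facebook use depressed grades either at adoption or cumulatively over
# exposure, the coefficient on facebook_use_t1 should come out negative.
model = smf.ols("gpa_t2 ~ facebook_use_t1 + gpa_t1", data=panel).fit()
print(model.summary())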

Setting the Record Straight. Karpinski rightly notes that in our attempt to “set the record straight” we do conclude that more research is needed. We specifically note that the study does not provide a “definitive answer on the implications of a medium.” At our strongest, we find the two variables “likely unrelated.” In the wake of a media feeding frenzy, setting the record straight is not a process of delivering definitive results, but instead one of putting on the brakes. We believe our article shows why such a reexamination is necessary and provides the most robust evidence to date on what is actually going on.

We look forward to a continued rigorous academic dialogue on these issues.

 

About the authors

Josh Pasek (joshpasek.com) is a Ph.D. candidate studying political communication at Stanford University. Josh’s papers have been published in Communication Research, Political Communication, the American Journal of Education, and the Journal of Applied Developmental Science and have been presented at meetings of the American Political Science Association, the International Communication Association, the American Association for Public Opinion Research, and the Association for Consumer Research, among other venues. He recently finished serving as the Assistant Editor for Political Communication and is co–director of the Methods of Analysis Program in the Social Sciences at Stanford University. His research interests include political socialization, civic education, the role of media as a democratic institution, survey design, public opinion, and civic engagement.

eian more is Senior Research Coordinator for the Adolescent Risk Communication Institute within The Annenberg Public Policy Center of the Annenberg School for Communication at the University of Pennsylvania. He received both his BA and MA from the University of Pennsylvania. His previous publications focus on youth behaviors related to new media, online civic engagement, health behavioral trends depicted in media across time, and risky behaviors among adolescents. Eian’s research interests include survey methodology, the relationship between youth behavior and their media consumption, globalization, and identity performance in virtual communities.

Eszter Hargittai (eszter.com) is Associate Professor of Communication Studies and Faculty Associate of the Institute for Policy Research at Northwestern University, where she heads the Web Use Project. In 2008/09, she is also a Fellow at the Berkman Center for Internet & Society at Harvard University. Her research focuses on the social and policy implications of information technologies with a particular interest in how IT may contribute to or alleviate social inequalities. Her research projects have looked at differences in people’s Web–use skills, the evolution of search engines and the organization and presentation of online content, political uses of information technologies, and how IT are influencing the types of cultural products people consume. Her papers are available at www.webuse.org/papers.

 

Notes

1. The urgency we felt in responding to the original Karpinski study unfortunately meant that the paper was written in less than a week and went through revision and proofs simultaneously, within a window of fewer than four days, after acceptance. We apologize again to Karpinski for the short time offered for a response.

2. The decision was motivated in part by a much broader understanding of coefficients in OLS regression among the wide variety of individuals who read First Monday. The results were not substantively different for any analyses when using an ordinal regression.

 

Reference

Scott Keeter, Carolyn Miller, Andrew Kohut, Robert M. Groves, and Stanley Presser, 2000. “Consequences of reducing nonresponse in a national telephone survey,” Public Opinion Quarterly, volume 64, number 2, pp. 125–148.
http://dx.doi.org/10.1086/317759

 


Editorial history

Paper received 1 May 2009; accepted 3 May 2009.


Copyright © 2009, First Monday.

Copyright © 2009, Josh Pasek, eian more, and Eszter Hargittai.

Some clarifications on the Facebook–GPA study and Karpinski’s response
by Josh Pasek, eian more, and Eszter Hargittai
First Monday, Volume 14, Number 5 - 4 May 2009
https://firstmonday.org/ojs/index.php/fm/article/download/2504/2187