On that Stanford information literacy study

A Stanford University team won a lot of attention this week by releasing a study on how badly teenagers assess information online.  “Evaluating Information: The Cornerstone of Civic Online Reasoning” examined more than 7,000 students to check their information literacy skills.  The results?

at each level—middle school, high school, and college… young people’s ability to reason about the information on the Internet can be summed up in one word: bleak.

[W]hen it comes to evaluating information that flows through social media channels, they are easily duped. (emphasis in original)

It’s an interesting and useful article (and I look forward to more from that team), with important implications for information and digital literacy.  Let me pull out some points and themes that struck me.

What literacy?  Weirdly, information literacy as a term doesn’t show up in the report, despite being evoked as a concept all over the place.  Digital literacy only appears once, at the very end, in a discussion of next steps: “We hope to produce a series of high-quality web videos to showcase the depth of the problem revealed by students’ performance on our tasks and demonstrate the link between digital literacy and citizenship.”  In its coverage, NPR mentions no form of literacy at all. The WSJ piece mentions media literacy, but not information literacy, which is probably a better term there, especially since the same article complains about a lack of librarians in schools.  This doesn’t speak well to the presence of digital/media/information literacy in late 2016.

Political literacy Some of the questions presume a bit of political awareness on the part of students.  For example, one asks undergraduates to assess the quality of a poll tweeted by MoveOn.org.  Assessment includes understanding MoveOn’s nature as a partisan advocacy group, plus a similar awareness of the politics of the linked Center for American Progress:

students must acknowledge how the political motivations of the Center for American Progress and MoveOn.org, both of which support stronger gun control measures, may have shaped the structure of the poll and how its results were publicized.

Is this fair?  I honestly don’t know if most traditional-age undergraduates would be that aware of advocacy groups and think tanks.  It does suggest information or digital literacy requires a political awareness.

Sponsored research This seems to be a particular blind spot, at least for one exercise.  That speaks well to its power and deviousness as a business outreach approach.


More than 80% of students believed that the native advertisement, identified by the words “sponsored content,” was a real news story. Some students even mentioned that it was sponsored content but still believed that it was a news article. This suggests that many students have no idea what “sponsored content” means…

Why is this happening? The report quietly recommends improvements in education, naming teachers and curriculum without casting blame, but it’s clear from the document that K-12 has failed these students.  Higher ed, too, has its share of failure, based on the undergraduate responses.  A good question to ask: why is K-20 so bad at teaching information literacy?

The paper doesn’t speculate on non-scholastic causes, beyond this mildly dismissive account of students’ digital practice: “Our ‘digital natives’ may be able to flit between Facebook and Twitter while simultaneously uploading a selfie to Instagram…”  They don’t focus exclusively on social media in their questions, however, devoting at least as much attention to home pages (Slate.com and, alas, CNN) as to tweets and Facebook.  Would the researchers like to argue that contemporary social media is structured to challenge information literacy?

We could also reach back to classic media literacy and select advertising as a major culprit.  I would add tv, but that seems out of scope here.

The article Bravo to the Stanford team for paying attention to info lit challenges in underresourced schools.  (There’s an important little note about using paper-and-pencil, rather than computer-based, exercises.)  Kudos to them as well for sharing their exercise materials openly, including media samples, rubrics, etc.  It’s useful to see samples of student responses arranged by rubric positions, too.

Digital literacy Thinking about this research in the context of the NMC digital literacy briefing, the Stanford research is much narrower than digital literacy.  Students are not producing much, beyond handwritten responses to survey-givers, and those responses don’t seem to be shared with anyone else – i.e., there’s no sign of students as producers within social networks.  Students did have access to technology to research the problems, which is, once more, within classic information literacy parameters.

I’d love to see how this survey would change if students could create, say, a critical version of that flower photograph, or record a podcast, or ping their social networks for thoughts and feedback.  That would be a very different study.

How worried should we be?  Plenty.  This research should be a serious spur to any educational institution considering information, media, or digital literacy.

But adults shouldn’t rest easy.  We know that older adults – the people more likely to vote than the young – are more likely than these students to rely on tv “news” for information, which is a problem of equal salience and danger.  They have also experienced less training in info/media/digital literacy, which they’ll need as they gradually explore new digital domains.


17 Responses to On that Stanford information literacy study

  1. Pingback: IL in the air | raptnrent.me

  2. “A good question to ask: why is K-12 so bad at teaching information literacy?”
    Where do K-12 teachers come from?
    What induces them to such a career?
    What literacy standards do they have to meet in order to enter, compete and complete their programs?
    I suggest the exact same study done with K-12 teachers as the subjects and then see where we are.
    And while we’re at it, move on to my community college writing and library science instructor colleagues.
    C’mon, Stanford, let’s go for the really big numbers and see where we get.
    My prediction? I’m with poet William Stafford, “The darkness around us is deep.”

  3. Matthew Henry says:

    Two items for further thoughts.

    1) As Sandy mentions, how would those finished with K-12 do? How would we fare? If we can’t identify fake stories, how can we teach our young people?

    2) What are the solutions? I’ve seen a project from a hackathon at Purdue where some young people built a tool that plugs into Chrome and immediately labels stories as questionable versus verified. Good work! Also, the Chronicle had a story last week of a professor making a list with good suggestions. Her list had strong bias; maybe all information literacy is going to have bias?

  4. Dave Crusoe says:


    While I agree with your assessment and have observed students’ shallow information techniques first-hand (qualitatively and quantitatively), I want to push us all a bit on the discourse.

    (1) What does an appropriate developmental timeframe for learning to tell truth from fiction look like? Yes, I understand that ‘state standards exist,’ but I believe you’d agree that standards don’t necessarily speak to human development. I’d argue that development of a sense for truth v fiction is as much instructional as it is experiential; and that much experience is gained post-adolescence through observation of the world first-hand. So, then, is 80% bad? Good? What’s the frame of reference here? What should it be, and when?

    (2) Is this new? Do you think, for example, that the pattern of understanding that youth display about the veracity and/or source for the media is fundamentally different than, say, youth understanding of the veracity and/or sourcing for content would have been within the television generation? What can we point to to indicate that young people are applying less effort now than before, or that youth are believing more about the information they find online now than they have in the past?

    (3) What role does the internet play in youth development of information literacy, more broadly? Ultimately, does it matter that youth perceive the information ‘on the internet’ in a certain way? Do they view all information on the internet through the larger filter of ‘information on the internet is inherently unreliable’? This is essential to understand; if one is skeptical of all information ‘on the internet,’ this may preclude any specific information from being particularly convincing. We might call this: “the human filter bubble.” Does it exist and if so, what does it do?

    These are some simple questions we can ask – I’m sure there are others. Glad to hear your responses & would make an interesting chat over coffee.


    twitter @davecrusoe

    • Great questions, Dave.

      To 1: I don’t have dates to pin down, except to say “earlier”. Experiential now includes a growing amount of online life; not sure if you’re including that.

      To 2: Is it new… Great question, and perhaps hard to answer. My dim understanding is that media literacy arose in response to a perceived excess of credulity among tv watchers and magazine readers. If it succeeded, said credulity should have declined… but how would we measure that, beyond anecdotes? What a vast project.

      To 3: Not sure I follow. Are you suggesting that teaching a healthy skepticism could yield too much distrust in a massively digitally intermediated world?

      • dhcrusoe says:


        Easier to summarize as one thought – and a fascinating dialogue.

        Great point that ‘experiential’ and ‘internet’ are entwined even from early in life. We can at least get a start in understanding this, thus: Prior (2005; see below) makes the point that people may voluntarily segment themselves by their consumption patterns, though it was written about television and not the internet. We may further adapt his mention of Klein’s “Theory of Least Objectionable Programming”: given the simple nature of switching web media, we might instead posit a “Theory of the Highest-Gain/Quickest Emotional Reward”. A number of studies explore this; see ‘internet’ ‘gratifications’, e.g., Dimmick et al. (2004).

        But, Petty et al. (2002, p. 126) suggest that it isn’t “… the amount or direction of information per se that produces persuasion, but rather, people’s idiosyncratic reactions to the information.” That’s interesting to us, as educators (among other things), because it means that our goal is, perhaps, to help people understand not how to select information to consume, and perhaps not even how to decipher the advertising within it, but instead how to react to it more globally. Do view the matrix input construct (p. 128) that Petty et al. (link below) propose and wonder: what, within that process, does an educator target?

        For example, perhaps this would imply that helping readers understand why sourcing counts (a cry of professional journalists everywhere) rather than helping them to identify all the elements of an image to understand whether it’s an ad (or not) would be a more productive use of time.

        On the other hand, people may be stronger persuaders than information alone; this is where (mis)fact is an especially pernicious weapon. So then, does a critical media literacy not just need to talk about how to read the media (or its sourcing) and instead about how to read the motives of peers?

        I’m afraid this doesn’t directly answer the three questions + statements above, but the overall point was: “Yes, the Stanford group is right. Youth (and people) have trouble discerning truth from fiction online. But what matters about this finding? What do we do to support young people’s understanding of what they read; when do we do that; and how? By heading back to the research, we may find that due to significant shifts in the media themselves, including that they’re socialized, a new look at how we define and/or teach ‘media literacy’ is needed.”

        But – ha – that’s just my read of the media (without peer influence). Maybe you have critique of that reasoning?

        twitter @davecrusoe


        Dimmick, J., Chen, Y., & Li, Z. (2004). Competition between the Internet and traditional news media: The gratification-opportunities niche dimension. The Journal of Media Economics, 17(1), 19-33.

        Prior, M. (2005). News vs. entertainment: How increasing media choice widens gaps in political knowledge and turnout. American Journal of Political Science, 49(3), 577-592. http://acme.highpoint.edu/~msetzler/SeniorSem/SampleWork/SoftnewsAndPolKnow05.pdf
        Petty, R. E., Priester, J. R., & Brinol, P. (2002). Mass media attitude change: Implications of the elaboration likelihood model of persuasion. Media effects: Advances in theory and research, 2, 155-198. https://www.uam.es/otros/persuasion/papers/2009MediaChapterPettyBrinolPriester.pdf

      • Belated greetings and thanks, Dave. Sorry I didn’t reply more quickly, but had to travel to lead a day-long seminar.

        Thanks, too, for the bibliography. I know there’s a world of media lit lit I need to dive into.

        How to react more globally… now there’s a vast range of possibility. On the one hand it brings us back to the literary roots of literacy, helping learners read better. On the other it becomes something like critical theory, teaching students to trace the networks within which media objects situate themselves. (Btw, are you following our Horton/Freire reading? Very germane.)

        I suspect people being better persuaders brings us away from just thinking about social media (composed, like Soylent Green, of people) and returns us to older media, specifically tv and radio. Hosts there were (are) huge presences, unlike in newspapers or magazines. We can see that now with figures like Rush Limbaugh or Rachel Maddow, outsize personalities who strongly influence viewers.

        Agreed that a new look at media literacy is needed. That’s why I like bringing together media lit with information lit and digital lit.

        See? You are persuasive.

      • Zack says:

        How does one form a model of reality or truth? Information or the processing of perceived information, idk… not my field 🙂 but, to point 3, perhaps something like… corroborating multiple (ideally but rarely) independent sources of (something like information) with some intuitively assigned weight or relevance and overall sense of veracity… anyway. First-hand experience, travel is good… bump that up against “common knowledge” and whatever the currently observed media or ad is trying to do and… well, how do you handle “the human filter bubble”?

  5. A thought about the lack of information literacy instruction in K-12. The lack is directly linked to the cutting of accredited librarians in the schools. Budget cuts and legislative action have taken librarians out of the schools in many states.

  6. Pingback: Information Literacy: The Core of a College Education – The Center for Excellence in Teaching & Learning

  7. Pingback: It’s time to upgrade our CRAAP detectors | Bryan Alexander

  8. Shane Horn says:

    I’m not sure if this has been mentioned already or not, but the assessments at the collegiate level were more ‘real world’ than those at the middle and high school levels, which were paper-based (wherein one could question the validity of testing for digital literacy with paper-based assessments, but this argument is considered and discussed in the SHEG’s executive summary). That is, participants had access to the internet and could search for MoveOn.org to see who they were and what agenda they may have on the topic under consideration.

    This seems to me to be more on par with testing for information literacy than relying on previous knowledge of advocacy groups and think tanks. You’re right that students would need some awareness of the nature and purpose of political interest groups, but if we include considering the source, and any vested interest a source may have in the information produced, within the fold of ‘information literacy’ (which I would), then I think it’s testing more along those lines than necessarily requiring any sort of pre-existing political knowledge.
