A Stanford University team won a lot of attention this week by releasing a study on how badly teenagers assess information online. “Evaluating Information: The Cornerstone of Civic Online Reasoning” examined more than 7,000 students to check their information literacy skills. The results?
[A]t each level—middle school, high school, and college… young people’s ability to reason about the information on the Internet can be summed up in one word: bleak.
[W]hen it comes to evaluating information that flows through social media channels, they are easily duped. (emphasis in original)
It’s an interesting and useful article (and I look forward to more from that team), with important implications for information and digital literacy. Let me pull out some points and themes that struck me.
What literacy? Weirdly, information literacy as a term doesn’t show up in the report, despite being evoked as a concept all over the place. Digital literacy only appears once, at the very end, in a discussion of next steps: “We hope to produce a series of high-quality web videos to showcase the depth of the problem revealed by students’ performance on our tasks and demonstrate the link between digital literacy and citizenship.” In their coverage, NPR mentions no form of literacy at all. The WSJ piece mentions media literacy, but not information literacy, which is probably the better term there, especially since the same article complains about a lack of librarians in schools. None of this speaks well to the visibility of digital/media/information literacy in late 2016.
Political literacy Some of the questions presume a bit of political awareness on the part of students. For example, one asks undergraduates to assess the quality of a poll tweeted by MoveOn.org. Assessment includes understanding MoveOn’s nature as a partisan advocacy group, plus a similar awareness of the politics of the linked Center for American Progress:
students must acknowledge how the political motivations of the Center for American Progress and MoveOn.org, both of which support stronger gun control measures, may have shaped the structure of the poll and how its results were publicized.
Is this fair? I honestly don’t know if most traditional-age undergraduates would be that aware of advocacy groups and think tanks. It does suggest information or digital literacy requires a political awareness.
Sponsored research This seems to be a particular blind spot, at least for one exercise. That speaks well to its power and deviousness as a business outreach approach.
More than 80% of students believed that the native advertisement, identified by the words “sponsored content,” was a real news story. Some students even mentioned that it was sponsored content but still believed that it was a news article. This suggests that many students have no idea what “sponsored content” means…
Why is this happening? The report quietly recommends improvements in education, naming teachers and curriculum without casting blame, but it’s clear from the document that K-12 has failed these students. Higher ed, too, has its share of failure, based on the undergraduate responses. A good question to ask: why is K-20 so bad at teaching information literacy?
The paper doesn’t speculate on non-scholastic causes, beyond this mildly dismissive account of students’ digital practice: “Our ‘digital natives’ may be able to flit between Facebook and Twitter while simultaneously uploading a selfie to Instagram…” The questions don’t focus exclusively on social media, however, giving at least as much attention to home pages (Slate.com and, alas, CNN) as to tweets and Facebook posts. Would the researchers like to argue that contemporary social media is structured to challenge information literacy?
We could also reach back to classic media literacy and select advertising as a major culprit. I would add TV, but that seems out of scope here.
The article Bravo to the Stanford team for paying attention to info lit challenges in underresourced schools. (There’s an important little note about using paper-and-pencil exercises rather than computer-based ones.) Kudos to them as well for sharing their exercise materials openly, including media samples, rubrics, etc. It’s useful to see samples of student responses arranged by rubric position, too.
Digital literacy Considered in the context of the NMC digital literacy briefing, the Stanford research is much narrower than digital literacy. Students are not producing much, beyond handwritten responses to survey-givers, and those responses don’t seem to be shared with anyone else – i.e., there’s no sign of students as producers within social networks. Students did have access to technology to research the problems, which is, once more, within classic information literacy parameters.
I’d love to see how this survey would change if students could create, say, a critical version of that flower photograph, or record a podcast, or ping their social networks for thoughts and feedback. That would be a very different study.
How worried should we be? Plenty. This research should be a serious spur to any educational institution considering information, media, or digital literacy.
But adults shouldn’t rest easy. We know that older adults – the people more likely to vote than the young – are more likely than these students to rely on TV “news” for information, a problem of equal salience and danger. They have also had less training in info/media/digital literacy, which they’ll need as they gradually explore new digital domains.