Media Literacy Takes on the Digital Morass


April 01, 2018

Creating districtwide curricula to help students and their teachers assess the credibility of information on the web


Joel Breakstone, standing, leading a training workshop for social studies teachers and administrators from across the country at the Stanford Summer Teaching Institute.

We’ve witnessed a year of unprecedented attention to fake news, misinformation, disinformation and propaganda on the web. Countless articles have detailed Russian bots spreading misinformation, Macedonian teenagers profiting from fictional Facebook posts and Americans’ general inability to evaluate digital content.

As a result, state legislators have rushed to draft legislation requiring media literacy in K-12 education, according to Education Week. Connecticut, New Mexico, Rhode Island and Washington have passed new laws. The California legislature mandated the creation of media arts standards. Another law proposed in California would require the creation of a model media literacy curriculum.

Media literacy is coming to a district near you.


Verifying Claims

The need is acute, yet few tools truly help students navigate the digital morass. When the Pew Research Center asked more than 2,400 teachers which resources their students were most likely to use for research, the most common responses were search engines, Wikipedia, YouTube and social media sites. Teachers reported their students rarely looked beyond the first search result, blithely accepted information when it appeared on multiple websites and credulously concluded that “because it’s on the Internet, it’s right,” according to Pew’s report, “How Teens Do Research in a Digital World.”

Our research at the Stanford History Education Group adds to teachers’ worries. Over 18 months, we developed 56 tasks that assessed students’ ability to judge the information that streams across their smartphones, tablets and computers. In total, we collected 7,804 student responses in 12 states.

From middle school through college, young people struggled to perform basic evaluations of online content. They couldn’t distinguish between news articles and advertisements, had trouble verifying stories on social media and floundered when trying to identify the sources behind websites. One of our assessments asked middle school students to explain whether items on a news website’s homepage were advertisements. More than 80 percent identified a piece of “sponsored content” as news.

Students performed no better on tasks that involved searching the web. Another assessment asked students to examine an article on minimumwage.com. The article claimed that paying American workers more would lead to increased food prices and unemployment. Students were told they could use any online resource to determine whether the website was a reliable source of information.

Minimumwage.com is a professional-looking website with “research” and “media” pages that are filled with links to news stories and research reports. The site’s About page indicates it is a project of the Employment Policies Institute, “a nonprofit research organization dedicated to studying public policy issues surrounding employment growth.” The article we asked students to read includes links to reputable sources like The New York Times and the Columbia Journalism Review.

The vast majority of students accepted the site as trustworthy, pointing to its research and parent organization as evidence of credibility. Most never bothered to leave the website to investigate what other sources said about the site or its sponsor. Had they done so, they would have learned that minimumwage.com is “run by a public relations firm that also represents the restaurant industry” and that the firm’s owner has a record of creating “official-sounding nonprofit groups” to promote the interests of corporate clients, according to coverage in The New York Times. Only 6 percent of college students and 9 percent of high school students reached this conclusion.

Evaluation Skills

Students’ inability to navigate the web threatens the future of our democracy. We encounter the contemporary world via screens, and sophisticated evaluation skills are required to separate wheat from chaff. For every social, cultural and political issue, there are scores of groups seeking to persuade us. If students are unable to identify the interests behind the information they consume, they are easy marks for dubious causes of all kinds.

Democracy atrophies without a robust and informed citizenry. In the face of unprecedented access to information, school districts should be attending to the development of sophisticated online evaluation skills.

Discernment will be crucial in making wise decisions about the options available among digital citizenship, news literacy and media literacy programs. Currently, the dominant curricular approach is to give students checklists for evaluating unfamiliar online content. Composed of anywhere from 10 to 30 questions, such lists focus students’ attention on the most easily manipulated features of a website: Is a contact person provided? Are the sources of information identified? Is the website a .com (supposedly bad) or a .org (supposedly good)?

These questions may have mattered in the web’s infancy. But in an era of astroturfing (sites that appear to be grassroots efforts but are craftily organized by corporate interests) and sock puppetry (false web identities often used to troll comment sections), checklists create a false sense of confidence that may actually do more harm than good. Checklist questions often are cumbersome, repetitive and, at their worst, misleading.

Educators must move beyond checklist approaches. Our research with professional fact checkers from the country’s leading news outlets identified three key questions to ask when evaluating online content: Who’s behind the information? What’s the evidence? What do other sources say?

In contrast to other users, who approached websites by closely examining the information on them, our research group discovered that fact checkers did the opposite: They read laterally, opening multiple browser tabs to investigate what others said about an unfamiliar organization. Paradoxically, they evaluated a site first by leaving it.

Wide Relevancy

These questions and strategies do not represent a panacea for the maelstrom of misleading information online, but they provide a foundation for web credibility instruction. Using these questions, teachers can model approaches to evaluating online information and guide students in practicing these strategies.

Such lessons cannot be limited to one-off sessions with school librarians. This problem implicates the science classroom (think anti-vaccination groups and climate change deniers), the English/language arts classroom (think rhetoric and the techniques of propaganda) and government and economics classes (think the use and misuse of polling data and statistics of all kinds). The need to distinguish quality information from sham leaves no corner of the curriculum untouched.

As school districts consider digital curriculum options, they should ask three questions:

» What is the research behind the program they are considering?

» Who conducted that research?

» What data attest to student learning?

With the proliferation of legislation calling for digital literacy instruction, curriculum developers of all kinds are rushing to bring materials — along with broad claims about their effectiveness — to market. School leaders will want to consider the basis for such claims. Are they founded on self-reports (either by teachers or by students) or on actual assessments of what students do when faced with online content? Are the evaluations conducted by hired-hand evaluators or by third-party researchers without a vested interest in the outcome? Has the research been submitted for peer review and published in reputable outlets? Preparing students to meet the digital future is too important to rely on unproven curricular tools.

Staff Needs Too

Finally, digital literacy cannot be geared exclusively toward students. Too many teachers regard their students as experts, mistaking students’ fluency with digital devices for sophistication at judging the information such devices provide. Our group’s research suggests that many teachers’ ideas of how to navigate the web are dangerously outdated. For instance, many students told us that their teachers taught them that .org sites are more trustworthy than .com sites, yet any organization today can register a .org domain. Even the 501(c)(3) designation has lost its cachet, according to a report by the Stanford Center on Philanthropy and Civil Society.

Given the challenges facing our democracy, teachers along with students need help in mastering strategies for evaluating online information. On top of that, teachers need help figuring out how to integrate these concepts into the curriculum as well as time to plan with colleagues to ensure integration across subjects. This is the best way to ensure students develop the digital literacy skills they so desperately need.


SARAH MCGREW co-directs the Civic Online Reasoning project for the Stanford History Education Group.

TERESA ORTEGA serves as the project manager for the Stanford History Education Group.

MARK SMITH is director of assessment for the Stanford History Education Group.

SAM WINEBURG is the Margaret Jacks Professor at Stanford University and the founder of the Stanford History Education Group.

Additional Resources

» The Stanford History Education Group website includes assessments of digital literacy, rubrics and sample student responses.

» The executive summary of the group’s research describes how students answered assessments of online reasoning.

» The Civic Engagement Research Group at the University of California, Riverside, has collected materials titled Educating for Democracy in the Digital Age.

» An excellent blog on digital literacy education is maintained by Michael Caulfield, director of blended and networked learning at Washington State University, Vancouver.
