Alan Seder

Play-Testing Information Literacy and Content Objects

As I approach play-testing information literacy and content objects, I am struck by how great the need is for instruction in this area. Far too many students appear at best baffled by, and at worst unaware of, the many pitfalls of sourcing information in cyberspace. Although my field is science, and many scientific information resources on the internet are politicized, biased, misleading, fabricated, and occasionally downright untrue, I feel the issues and needs of students are larger than scientific information literacy alone. As I have stated before, few of my students will ever become scientists, but they will all become citizens who will need to observe, think critically, and make important life choices, hopefully with discerning taste in their information sources. So, I have eschewed narrowing my play-testing to those information literacy and content objects focused on science alone. I feel the play-tested applications that follow address a fair cross-section of the life skills my students will need, and will ultimately help students develop the underlying thinking skills that will make them more selective in their scientific information sources as well.

Informable is a fairly simple game for the Android and Apple iOS platforms designed to help students explore four media literacy challenges: how to spot ads, distinguish news from opinion, detect faulty evidence, and identify fact-based versus opinion-based statements. The game has four modes with binary choices, one for each of these challenge types, plus a mixed mode with a random selection from all four. There are multiple levels of increasing difficulty, each presenting a series of authentic media clips in print or video, with the student having to make a binary choice: ad or not, evidence or not, news or opinion, checkable or not. At the end of each level, each of the student's responses is scored correct or incorrect, with a button labeled "Why?" that opens an explanation for the score.

Informable has all the characteristics of a good online quiz game that will draw students in. At the same time, it has authenticity and provides insight into media literacy through the feedback that the student must absorb in order to improve their scores. Passive play is not rewarded, but engaged, close observation and critical thinking are. The game is short, with a limited number of questions, and is probably best used as the front end to a deeper, more expansive curriculum on media literacy.

Bad News is a web-based, text game that allows the player to be the villain while highlighting how social media can be manipulated to propagate fake news. At various decision points, the player makes choices about what to post, with feedback given as a credibility score and a follower count. Pedagogically, after each decision point the game debriefs the player on why an approach did or did not work. When the player reaches a particularly high level of manipulation of their followers, a badge is earned identifying the prime driver of the manipulation, such as discredit, conspiracy, trolling, polarization, impersonation, or emotion.

This is a short, engaging, text-only game that introduces the student to techniques that may be used to manipulate them on social media. Of course, they learn about these techniques by being the manipulator, so this is not a standalone information literacy tool. In spite of this, Bad News could be used to stimulate discussion about the consequences of the actions taken in the game in the context of more traditional information literacy materials. In particular, having students discuss how the posts in Bad News would be scored via the CRAAP test (currency, relevance, authority, accuracy, and purpose) could reinforce how to use this traditional information literacy tool effectively.

Checkology is similar to Bad News in that its focus is ferreting out fake news in media, but the student plays the role of a budding journalist just starting out. This is a fairly long, web-based game with high production values, featuring realistic news posts, social media posts, and videos of various mentors giving feedback. There are many decision points where the student is required to write evaluations or justifications for their choices, as well as ranking exercises, all before a check tool provides feedback. Four basic modules are offered for free, with more modules available through a paid subscription.

Checkology clearly has structured lessons as part of its pedagogy. These lessons use authentic news examples designed to help students learn about four main ideas: filtering news and information, exercising civic freedoms, navigating today's information landscape, and knowing what to believe. The choose-justify-check approach has the potential to help students use critical thinking to evaluate a news source's credibility. This is clearly less of a "guilty pleasure" than Bad News and more of an "eat-your-vegetables" approach, but the presentation is engaging, entertaining, and exudes credibility due to the video feedback from actual journalists.

Diigo is not an information literacy tool per se, but more of a digital tool to help readers easily collect, annotate, organize, and share information from the web. Its main functionality is highlighting, creating a personal library of excerpts, tagging, outlining, archiving web pages, annotating, and bookmarking. As such, Diigo provides a way to collect and organize multiple internet sources on a targeted subject. This collection can then be used to identify connections and inconsistencies, and instruction on information literacy concepts can use this collection of student-generated sources as fodder for analysis.

One way to use Diigo is to require students to analyze and annotate their internet source selections using an information literacy assessment tool like the CRAAP test. One powerful aspect of using Diigo in this manner, compared with the foregoing game applications, is that the material analyzed is largely unscripted, with a commensurate level of ambiguity. This ambiguity is authentic and has the potential to stimulate both deep dialogue and critical thinking amongst students as they share and discuss their findings.

ThinkCerca is a completely different type of information literacy tool from those previously play-tested, targeting the combined development of critical reading and writing skills. This web-based program is a highly integrated, comprehensive combination of content and pedagogy designed to scaffold a student's development of critical thinking and argumentative writing. CERCA stands for claims, evidence, reasoning, counterarguments, and audience, which constitutes the structure students use to analyze their reading and writing. From the readings provided, students have multiple modes to analyze the material, receive feedback, and make claims about it. Students are then prompted to support their claims as they walk through each element of CERCA. The nonfiction readings are generally well curated, if not always timely, with a well-orchestrated, cross-discipline selection of materials.

Although there are many programs with some structural similarities to ThinkCerca, most are more outline and analysis tools, lacking curated materials, which must instead be procured by the student through exploration of identified resource sites. Many universities offer programs like Research+ from Long Island University to help students select topics, research credible material sources, and evaluate critical writing skills in an effort to write effective project reports, but few attain the comprehensiveness of ThinkCerca's total-package approach. For all its strengths, ThinkCerca feels a bit stilted, without much passion evident for collaborative, constructivist student interaction. As a teacher, I would feel compelled to supplement ThinkCerca with other information literacy tools and perhaps deviate from the script from time to time to ensure that collaborative, constructivist student interaction comes more into play.

For an example of student work that fits the enhancement end of the SAMR model, possibly up to the level of augmentation, I would continue an exercise I have done each year: having students bring in what they perceive as "science" articles to be shared once a week in class and then briefly discussed to analyze their credibility. What I would do differently is frontload the process with the binary exercises in Informable to give the students a simple analysis structure to help them think critically about the article presented. Once the students have that training through Informable, I would use an online opinion tool to survey the class on each of the four binary axes for the specific article that week. The results of the survey could then be used to drive a class discussion where students can practice their critical argumentation. I would also explore resurveying opinions after the discussion to show students the impact of their arguments.

From the TPCK model perspective, the above student work fits nicely with my pedagogical approach of collaborative constructivism while providing ample opportunities for Socratic questioning. The content aspect is to build skills in observation, critical thinking, and reasoned argument. The affordance of the Informable technology is to overlay a common structure to use as a scaffold for observation, critical thinking, and reasoned argument in a manner that draws students into collaborative constructivist discussion. The addition of opinion survey technology gives students a mechanism for feedback, coming collectively from their peers, on the effectiveness of their reasoned argumentation skills.

A student work approach that would fit the transformation end of the SAMR model, likely at least approaching the redefinition level, is to have the students do a group inquiry project on media coverage of a controversial scientific topic with social impact, for example global warming. The transformation comes into play in that the students would deliver a work product that converts their research into a tutorial on the subject in the style of Checkology. After playing through Checkology, the students would be asked to collect media pieces relevant to their controversial science topic and develop a multimedia tutorial that helps their peers build skills in filtering news and information, exercising civic freedoms, navigating today's information landscape, and knowing what to believe.

From the TPCK model perspective, the above student work aligns well with my pedagogical approach of using project-based learning to drive collaborative student inquiry, while achieving an even higher level of authenticity. The content is elevated from simply observation, critical thinking, and reasoned argument to information literacy itself, taught through peer-to-peer instruction of that very same information literacy just learned. The Checkology technology gives students instruction in information literacy, a particular framework for segmenting it, and a model that affords them a way to use these inputs to creatively deliver information literacy instruction on a specific controversial scientific topic to their peers. In effect, this project invokes the old adage that learning is made more concrete through "Watch one, do one, teach one…".

Upon reflection, the play-testing of information literacy and content objects may be the most stimulating play-testing investigation to date. Although information literacy is part and parcel of scientific inquiry, the opportunities for broader cross-disciplinary applications were somewhat surprising to me. Pursuing instructional opportunities along the lines outlined above stands a real chance of enhancing my students' skills as both scientific thinkers and soon-to-be astute citizens.
