Learning to Measure Media Literacy Outcomes

July 08, 2016
By Educator Innovator

To measure the impact of its media literacy programs and tools, our partner The LAMP developed an assessment rubric that analyzes the work students create and how that work, as a form of activism, shapes the role each student plays in a participatory digital culture.

On average, youth spend nine hours a day consuming entertainment media. Our primary goal at The LAMP is to help people comprehend, create, critique, and ultimately challenge the media they consume on a daily basis, but how can we measure the impact of our media literacy programs and tools?

Researchers have long found it difficult to measure how media literacy training changes students’ attitudes towards media and influences their choices. We have administered surveys to track students’ changes in attitude after a program, but most of the time students are tired of surveys asking what they learned, so we typically receive little or no response. We have wanted to do things differently for quite some time, but finally an opportunity presented itself: an Outcomes and Measurement series from the Support Center/Partnership in Philanthropy and the Department of Youth and Community Development. Beginning in October, our team collaborated with outcomes and measurement specialists to help us evaluate our education programs. Initially, we wanted to create a tool that measured critical thinking skills (I’m sure you can imagine how stressful that is). We then decided to narrow our focus to our education programs and, in particular, to the programming tracks and the key concepts they encompass.

Education Department Staff Zenzele Johnson and Alan Berry hold the certificate of completion for the Outcomes and Measurement program.

Our Media Literacy Competency Assessment Rubric uses students’ final projects as indicators of their competency levels across a variety of skills, and measures how well students can put their new media literacy skills into practice. Every program track has a set of central skills-based competencies that fall under The LAMP’s broader intended outcomes, and the rubric can be adapted to measure these competencies according to the program. In designing the rubric, we had to clarify these outcomes, and we determined that activism is key to The LAMP learning experience. So instead of trying to measure attitudes, our assessment tool evaluates the work students actively create, and how it enables the roles they play in a participatory digital culture. For many students, the very act of producing their own work and challenging the media is a form of activism, and our rubric treats this as a key goal. We seek to uplift their voices so they feel comfortable challenging the media, and the projects they create are where their newly learned media literacy concepts and active media-making skills can be seen. The final projects show us not only how well students understand the concepts being taught, but also whether they feel empowered enough to make their own active media messages.

Broadly, it was important to create a system based on the skill set students are expected to learn, and a rubric built around what we believe to be the levels learners move through as they develop media competencies. While we want our students to reach the Activist level, the highest on the scale, we believe in meeting students where they are and raising their media competencies through levels of understanding. Many of our students start as passive consumers, constantly bombarded by media they never challenge, but we want them to become active and curious, asking about representation, inclusion, and other questions around how, why, and for whom media messages are produced.
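To make the idea of an ordered scale concrete, here is a minimal, hypothetical sketch in Python of how a rubric like this could be encoded and used to roll up a project score. Only "passive consumer" (the starting point) and "Activist" (the top of the scale) come from the post; the intermediate level names, the example competencies, and the roll-up rule are assumptions for illustration, not The LAMP's actual rubric.

```python
from dataclasses import dataclass

# Hypothetical ordered scale: the middle labels are placeholders.
LEVELS = ["Passive Consumer", "Aware", "Analyzer", "Activist"]

@dataclass
class CompetencyScore:
    competency: str  # e.g. "decodes and deconstructs media messages"
    level: str       # one of LEVELS

def project_level(scores: list[CompetencyScore]) -> str:
    """Summarize a student project as the lowest level reached across all
    scored competencies (a conservative roll-up; a median or average
    would be just as plausible a choice)."""
    indices = [LEVELS.index(s.level) for s in scores]
    return LEVELS[min(indices)]

# Example: an ad-remix project scored on two assumed competencies.
scores = [
    CompetencyScore("decodes and deconstructs media messages", "Analyzer"),
    CompetencyScore("produces an active media response", "Activist"),
]
print(project_level(scores))  # -> "Analyzer"
```

In a sketch like this, the level labels and the set of competencies would change per program track, which mirrors how the rubric is meant to be adapted from one program to another.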

This process of questioning media raises awareness, and helps students actively decode and deconstruct messages. We then want them to feel comfortable sharing this newfound awareness and insight with their peers, and to use what they learned in the program to take an active stand on the media they encounter. This might take the form of remixing a print advertisement, using social media to question a piece of problematic media, or being vocal about their choice to engage with media in a way they never had before. To make these processes possible, we as an organization had to confront our perceptions of what our programming actually does.

Currently, we are tailoring our Media Literacy Competency Assessment Rubric to fit all of our program tracks. We are also looking forward to implementing the rubric for the first time over the summer, and expect to make changes once we put it into practice.

Measuring media literacy is a huge undertaking, partly because working examples at the K-12 level are so few. But we want to know whether our programs work and achieve what we set out to do every day across New York City and beyond. While we are constantly improving this effort, we hope that it serves as a strong tool for media literacy assessment, and look forward to sharing our progress.

By Zenzele Johnson
Originally published at The LAMP