Caty Borum Chattoo,
Creative Director, Center for Media & Social Impact, American University
Associate Director, Center for Media & Social Impact, American University
With editorial review by:
University Professor, American University School of Communication
Director, Center for Media & Social Impact, American University
Research Director, Media Impact Funders
Founder/Director, Dot Connector Studio
Founder, Active Voice
Director, AV Lab for Story & Strategy
The resources available to assess the social impact of issue-focused documentaries have increased during the digital era. And yet, deciding which research methods (and tools) to use to examine the social impact of storytelling remains a challenge for the ecosystem of creators and strategists working in pursuit of storytelling for social change. At the same time, research methods from social science – in the fields of communication/media studies, social psychology, political science and sociology – have been tested in decades of published studies. This white paper provides a breakdown of social science and market research methods, clearly explaining the benefits and limitations of using each one to understand issue-focused documentaries in particular. We examine a group of branded media-impact tools now available, dissecting their underlying research approaches and the ways in which they work optimally to help tell a story about the social impact of storytelling.
This paper is based on a key premise: If strategists and producers are to build on current conversations to continually evolve impact assessment work, it’s crucial to begin with a shared language and understanding about how impact evaluation can fit in the cycle of work, and how to ensure it is optimally useful. Additionally, this report maps research methods in service of a particular position: For many projects, an ideal impact assessment approach layers several research methods and clearly maps appropriate approaches to the definition of impact, depending upon whether individual behavior change, public interest (policy) change, or institutional (corporate) change is the goal.
If a golden age for studying the social impact of documentary storytelling ever existed, it is not only here; it may be just beginning. An “Ecosystem of Change” – funders and philanthropists, filmmakers, researchers and strategists – is deeply engaged in questions about how storytelling influences individual attitudes and behavior, the collective actions that lead to policy or other institutional change, or the media framing and agenda-setting effects that contribute to shifts in public opinion. While the idea that documentary film and media can have a profound impact is not new, the discussion about evaluation continues to evolve.
A specific formula for investigating and articulating impact does not exist. There is no single way to perform impact evaluation; there are many. Indeed, “A comprehensive approach to impact assessment typically requires the application of multiple methodological approaches that address different levels of analysis that reflect the different spheres of potential impact (e.g., on individual attitudes/behaviors, on media debate/discussion, on public policy)” (Napoli, 2014, p. 4).
Not all documentaries strive to create social change, nor should they. But for those that do, understanding research approaches and tools – and identifying them appropriately in the beginning phases of a project, not only the final stages – is invaluable for capturing social impact. For the individuals and organizations that produce documentaries designed for social impact, along with the issue interest groups and individuals (NGOs, policy makers) who use them strategically within targeted advocacy efforts, articulating the core impact questions and mapping them to appropriate research methods is key. For some documentary projects, individual-level attitude and behavior change may be the goal, thus pointing to a particular impact research method. But in other documentary projects, the goals may be either more diffuse or broadly institutional; the methods used to examine individual attitude change are distinct from those used to examine institutional policy change.
Although the depth of digital data is still evolving and tools to mine it (and make sense of it) are increasingly prolific, the most promising research tools and systems for documentary impact assessment are based on classic social science research methods in communication studies that have been tested and used to study media effects: surveys, experiments, content analysis and focus groups. Mapping the existing research tools within the framework of the underlying social science research methods – with a clear articulation of the kinds of questions each method answers and does not answer – may be a helpful contribution to the development of shared language around this work.
This white paper is designed as a roadmap for the creators, impact producers, funders and thought-leaders – and students – working deeply in the increasingly sophisticated industry of media designed for social impact. Its purpose is to continue to build a shared language and understanding between creative teams, producers, evaluators and the funders who work in this space, with the hope that it contributes to moving a collective conversation forward. Most importantly, this work is designed to be practical, rather than largely theoretical, for the individuals and groups working in this field.
In this paper, we provide a landscape of the various research methods and tools now available to evaluate the effect of a social-issue documentary across different ends of the “impact spectrum” – from individual change to cultural shifts. Each method and tool is anchored by a set of questions: What is the underlying research method? How does it work, in practical terms? What research questions does it answer?
Within the context of documentary storytelling designed for some kind of social change, the definition of “impact” from the Learning for Action report, Deepening Engagement: A Framework for Measuring Media Performance and Results, is both precise and inclusive:
We define impact as change that happens to individuals, groups, organizations, systems, and social or physical conditions. Typically long-term and affected by many variables, impact represents the ultimate purpose of community-focused media efforts – it’s how the world is different as a result of our work. (Learning for Action, 2013, p. 1)
Notably, this articulation allows for a full spectrum of ways in which a storytelling project – and associated action campaigns – can lead to social impact, from individual behavior change to cultural or normative shifts. And, importantly, this articulation points to underlying research methods that can be used, built upon, and often layered with others to assess the social impact an issue-focused documentary hopes to have.
The Learning for Action articulation shares the traits of the Media Impact Project’s “social value” reference (in Measuring Media Impact: An Overview of the Field) in the context of documentary storytelling designed for social impact:
Social value in this context refers to analytical approaches that extend beyond financial measures of success to take into account criteria such as improving the well-being of individuals and communities across a wide range of dimensions that are central goals of most public interest media initiatives. (Napoli, 2014, p. 6)
By “social impact,” then, used throughout this paper, we operate from the same premise and principles as the two definitions above, which are:
As the Center for Social Media’s report, Social Justice Documentary: Designing for Impact states, “the primary goals of social issue media projects are to inform, engage, and motivate publics” (Clark & Abrash, 2011, p. 8). The shared premise is strategically flexible, allowing room for customization of the tactics used for “motivating publics” for individual projects. Publics in this perspective are “groups of users for whom the film and related campaigns serve as a catalyst for debate—as well as advocates who seize upon the film as a hub for action” (Clark & Abrash, 2011, p. 4).
Further, “People come in as participants in a media project and leave recognizing themselves as members of a public—a group of people commonly affected by an issue. They have found each other and exchanged information on an issue in which they all see themselves as having a stake. In some cases, they take action based on this transformative act of communication” (Clark & Aufderheide, 2009, p. 11). In other words, while publics are distinct and defined strategically for each storytelling project and campaign, motivating their intentions and actions is the key.
According to the report and the work of many in this field (see case studies from BRITDOC and Active Voice as examples), both qualitative and quantitative measurement approaches to evaluating these kinds of projects are valuable, from user engagement and viewer metrics (number of viewers, social conversations) to changes to individuals (attitudes, behaviors/actions) to cultural/discourse changes (media framing and public discourse) and institutional changes (policies, laws).
Another seminal report in the field – the Fledgling Fund’s Assessing Creative Media’s Social Impact – provides a critical articulation of the social change process that is possible with an issue-focused documentary and campaign (Barrett & Leddy, 2008). The model uses a “compelling story” (a high-quality documentary) as the core component from which an effect ripples out from individual-level impact to institutional social change, understanding that not every documentary needs to – or is able to – achieve all levels in the ripple effect. The model, a kind of “life cycle” understanding of an impact documentary project, establishes the importance of:
Additionally, the authors of the report underscore a crucial notion in the pursuit of social change around a core social issue:
After reviewing case studies as well as models of individual and community change, we believe it is critically important to understand the state of the movement and where an issue is in the public consciousness in order to set realistic expectations for impact. It is not reasonable to expect broad social change if there is little public awareness that a problem exists. In some cases, just getting audiences to see the film, connect with the story and better understand an issue is enough. This awareness is the first step to social change. If an issue is incredibly complex or not well understood, the goal of the film and its outreach campaign may focus simply on raising awareness and stimulating dialogue. On the other hand, if an issue is well-understood and there are clear solutions, we would hope the goals of the campaign would shift to something more concrete than simply dialogue. There needs to be an infrastructure in place that encourages individuals, organizations, and/or communities to act. (Barrett & Leddy, 2008, p. 14)
In other words, social issues are idiosyncratic and complex, and so are their life cycles in the court of public opinion – the final stop in many articulations of impact for a social-issue documentary. For some issues, social impact may mean moving a nascent social issue from relative obscurity to public awareness. And for other social issues, in which critical societal tipping points may already exist – i.e., key publics are already informed, and policy or other solutions are clear and straightforward – impact may mean mobilizing key publics to take specific individual and collective action.
Qualitative and quantitative research methods used to study media impact vary depending upon the research question and type of media impact (and media content) in question. Most media impact research is based on one of these underlying methods from foundational media effects research:[i]
Together, the research methods able to capture a spectrum of “impact” in issue-focused documentaries – from individual to public interest to institutional – fall into four main categories:
Each method can be used alone, as appropriate for the objectives of a particular film project and associated campaign. Importantly, and as many successful film case studies have illustrated, including “The End of the Line” and “Budrus,” among others, research methods can also be layered for a comprehensive view of a film and campaign’s impact. In fact, layering research methods may be the ideal scenario for most projects in order to capture a full story of impact.[ii]
The charts in the following pages provide a scan of the underlying social science and/or market and media research methods, organized by the research questions each answers and the branded tools currently available, along with several “do it yourself” (DIY) methods and tools.
In this section, we provide overview information and available case studies about several of the new storytelling assessment tools designed with social impact evaluation as a central focus (i.e., many of the research tools included in the charts in the previous section of this paper). Each overview below reflects a combination of publicly-available information from each project’s website, along with verbatim quotes and insights, as available, from the projects’ founders and key strategists, based on interviews conducted by phone and email in August and September 2014. (Note: This section is not exhaustive of all tools and methods included in the charts above, and it should be considered the start of an evolving list.)
Created by Jana Diesner, PhD, a computer science scholar at the University of Illinois Urbana-Champaign (UIUC), ConText is an open-source network mapping tool that allows users to examine the content and discourse of social issues as they connect with social issue documentaries. From the ConText website: “ConText stands for Connections and Texts. This is our short way of saying that ConText supports: The construction of network data from structured and unstructured natural language text data, a process also known as relation extraction: The joint analysis of text data and network data.”
Diesner’s research team conducted an analysis on the Sundance-award-winning documentary, “The House I Live In,” a film that challenges any positive effects of the war on drugs. The team identified mandatory minimum sentencing and prisons as the baseline issues for their computation, and social media as indicators for measuring impact. They found that public discourse for the film did change over time. Comments on social media evolved from discussion of the film as an individual art product – screening times and credits – to deeper connections and framing to the major issues. Eventually, conversation also evolved to include connections with additional related social issues.
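The kind of discourse shift the ConText team observed can be illustrated with a toy sketch. This is not ConText itself – which performs relation extraction over full network and text data – and the keyword lists and comments below are invented assumptions purely for illustration: counting “film-as-product” versus “issue-frame” terms in comments from two time periods.

```python
from collections import Counter

# Hypothetical frame-keyword lists (assumptions, not ConText's method):
PRODUCT_TERMS = {"screening", "tickets", "credits", "director"}
ISSUE_TERMS = {"sentencing", "prisons", "drugs", "policy"}

def frame_counts(comments):
    """Tally product-frame vs. issue-frame terms in a batch of comments."""
    counts = Counter()
    for comment in comments:
        for word in comment.lower().split():
            if word in PRODUCT_TERMS:
                counts["product"] += 1
            elif word in ISSUE_TERMS:
                counts["issue"] += 1
    return counts

# Invented example comments from two periods around a film's release.
week_1 = ["loved the screening last night", "amazing director and credits"]
week_8 = ["mandatory sentencing ruins lives", "our prisons policy must change"]

early, late = frame_counts(week_1), frame_counts(week_8)
print(early["product"], early["issue"])  # product-frame terms dominate early
print(late["issue"], late["product"])    # issue-frame terms dominate later
```

A real analysis would of course work over thousands of posts and richer linguistic features; the point is only that frame evolution can be made countable once the frames are defined.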
Harvis is a mobile survey application designed for use by storytellers or individuals facilitating film screenings, video clips, debates, presentations and the like. The customizable application collects real-time information from participating audience members; the information includes demographics, comments, survey responses, and motivational or emotional responses. Any form of presentation, such as a film, can be mapped to identify topics and then compared as a cross-section with emotional responses during the screening. Emotional responses are pre-set with two potential options, recorded by swiping either up or down on the application screen. Information collected can be used to identify entry points for conversation, post-screening analysis, and ongoing audience engagement through mobile communication. According to Andrew DeVigal, Harvis co-creator, “When used over time, we hope to track shifts in sentiment around the subject of the film. And we can easily track and sort the engagement data based on demographic and survey information.”
A Fourth Act premiered Harvis at three screenings of the feature documentary “American Promise,” including a classroom at Oregon State University, a theater screening with panel discussion including the filmmaker and black youth, and a theater screening with a panel discussion including community advocates. Audiences were asked to provide emotional responses of swiping up when “motivated to act” and swiping down when “I feel helpless.” “We were able to identify immediate topics of conversation to help shape the post-screening facilitated dialogue. We were also able to show how on-screen moderated comments became entry points of conversation,” says co-creator DeVigal.
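The facilitation value of swipe data like this comes from tying responses to moments in the film. Here is a minimal sketch of that idea; the event format, segment length, and all numbers are assumptions for illustration, not the Harvis application or its data model.

```python
from collections import defaultdict

# Each hypothetical event: (timestamp in seconds, +1 = "motivated to act",
# -1 = "I feel helpless") -- an invented stand-in for Harvis swipe records.
def sentiment_by_segment(events, segment_len=300):
    """Net sentiment per 5-minute segment: positive = motivating,
    negative = helpless. Segments with strong negatives suggest entry
    points for post-screening dialogue."""
    segments = defaultdict(int)
    for timestamp, swipe in events:
        segments[timestamp // segment_len] += swipe
    return dict(segments)

# Invented screening data covering the first 15 minutes of a film.
events = [(30, +1), (90, +1), (400, -1), (420, -1), (450, -1), (700, +1)]
net = sentiment_by_segment(events)
print(net)  # segment 1 (minutes 5-10) reads net-helpless
```

A facilitator could then open discussion on whatever happened on screen during the most negative segment, which is the kind of “entry point” DeVigal describes.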
Media Cloud is a Web-based open source data platform created as a joint project of the Harvard Berkman Center for Internet & Society and the MIT Center for Civic Media. Media Cloud is designed for and by academic researchers to analyze news streams from online sources. Functioning as an online news database, users can query content through the Media Cloud API or the Media Meter Dashboard. Researchers can run searches and conduct analysis by specifying media sources, date ranges and key words. According to the Media Cloud website (see “About”): “Using Media Cloud, academic researchers, journalism critics, policy advocates, media scholars, and others can examine which media sources cover which stories, what language different media outlets use in conjunction with different stories, and how stories spread from one media outlet to another.” According to an interview with Ethan Zuckerman, director of the MIT Center for Civic Media and developer of Media Cloud: “The basic research question that underlies Media Cloud is ‘how is the network public sphere – blogs, Twitter, Facebook, new online publications – changing public debate over key issues?’ In other words, with more people able to participate in conversations about public issues, are we getting a more diverse, inclusive conversation on issues of local, national and international importance?”
According to Zuckerman: “Media Cloud is just starting to be used outside the MIT and Berkman context. The best case studies for how it has been used are a pair of papers - Yochai Benkler's paper on activism around SOPA/PIPA and Erhardt Graeff's paper understanding the spread of the Trayvon Martin story. The two papers show the key uses of the system – it allows you to map and understand patterns of influence as represented in hyperlinks between stories, and allows you to explore how a story is presented and framed at different points in its lifespan by understanding the language used to represent stories.” See the SOPA/PIPA paper (http://mediacloud.org/2013/07/25/mapping-the-sopa-pipa-debate/) and the Trayvon Martin paper (http://mediacloud.org/2014/02/03/the-battle-for-trayvon-martin/).
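The general shape of this kind of query-and-analyze workflow can be sketched in a few lines. Note that the endpoint URL, parameter names, and response format below are invented stand-ins for illustration only; they are not the documented Media Cloud API, whose actual interface should be consulted directly.

```python
import json
from urllib.parse import urlencode

# Hypothetical endpoint, standing in for a real news-archive API.
BASE = "https://api.mediacloud.example/stories/count"

def build_query(keyword, media_id, start, end):
    """Assemble a keyword-count query URL for one source and date range."""
    return BASE + "?" + urlencode({
        "q": keyword, "media_id": media_id,
        "start_date": start, "end_date": end,
    })

def peak_coverage(response_text):
    """Given a JSON payload of daily story counts, return the date with
    the heaviest coverage -- e.g., to locate a spike in attention."""
    counts = json.loads(response_text)["counts"]
    return max(counts, key=lambda day: day["count"])["date"]

url = build_query("net neutrality", 1, "2012-01-01", "2012-03-01")

# Invented response standing in for the result of an actual API call.
fake_response = json.dumps({"counts": [
    {"date": "2012-01-17", "count": 40},
    {"date": "2012-01-18", "count": 310},  # a coverage spike
    {"date": "2012-01-19", "count": 95},
]})
print(peak_coverage(fake_response))  # → 2012-01-18
```

Comparing such counts across sources and date ranges is, in simplified form, the kind of analysis behind the SOPA/PIPA and Trayvon Martin mapping papers cited above.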
OVEE is a platform used to facilitate a social video-watching and evaluation experience. Users can host a virtual film or video simulcast that incorporates audience interaction. Videos can be selected from free streaming PBS content, YouTube or UStream. Screenings can be limited to private invitation-only audiences or remain open to the public. Virtual audiences can interact during the screening through live chats, polls, emoticators, and social media as designed by the screening host. OVEE also offers space on the screening platform for branding and calls to action. Screenings can accommodate up to 1,000 participants, and hosts have access to a metrics report at the end of the screening experience. According to the organization, OVEE is not a research framework, but rather a tool that can be used to conduct impact evaluation. Through OVEE, a screening moderator can gather data about the composition and level of engagement of the audience, how the audience responds to a story emotionally, and how the audience shares information and takes action.
According to Sharan Sklar, business development director for OVEE: “Sierra Club hosted an OVEE screening of the short film, ‘Plastic Bag,’ with a moderated discussion that brought together key environmental stakeholders from across California to support the organization’s work forwarding SB270, the statewide legislation banning single-use plastic bags. The goals of the screening were to increase awareness of the issue and disseminate successful organizing strategies through information sharing, create behavior change through individual action, and connect viewers to the Sierra Club. Using the metrics report and the chat transcript, Sierra Club was able to evaluate the emotional response of the audience to the film, examine the organizing strategies shared, and to track how many people clicked through to the Sierra Club website. Using the polling feature, they were able to survey: 1) how likely a viewer would be to recommend the film to a friend or family member, and 2) if watching the film impacted a viewer’s decision to take a plastic bag from a store (e.g. grocery store, take-out restaurant, quick-mart, etc.).”
The Participant Index (TPI) is a media-impact research system from Participant Media that examines the social impact of entertainment on its audience. Through a mixed-dataset method that compiles social media conversations, viewership information and audience opinion data, TPI provides insights about what an audience learns (knowledge), feels (attitudes) and does (behaviors and actions) in response to viewing a piece of social-issue-focused entertainment in four categories: Narrative film, documentary film, TV (narrative and reality/unscripted), and short online video (narrative, documentary, branded entertainment, corporate social responsibility [CSR] content). The methodology was developed and tested in 2013 in consultation with market researchers and university collaborators, including the USC Norman Lear Center’s Media Impact Project and the American University School of Communication’s Center for Media & Social Impact. During the inaugural run of TPI, which was completed in June 2014, the team examined 36 individual entertainment titles across documentary film, narrative film, TV and online videos. The system gathers data about up to 35 entertainment titles twice a year.
In its first study, among many other titles, The Participant Index examined the gender-equality documentary “Girl Rising,” which premiered theatrically in early 2013, and later in the year on CNN. According to TPI results, the audience found the film to be highly emotionally engaging and impactful, and they indicated a high degree of social action. Specifically, nearly nine in 10 viewers came away from the film understanding more about the attached social issues than they did before watching, three in four viewers experienced a high level of emotional engagement, and nearly half of those who watched said they engaged in some kind of community-oriented activity as a direct result of the viewing. Of four levels of social actions measured by TPI (individual information seeking, individual information sharing, individual action, encouraging community action), the highest levels of social action were seen in the individual information seeking and sharing categories; this is, perhaps, both appropriate and intuitive for a broad, diffuse social issue with a variety of “solutions.”
Sparkwise is a cloud-based platform for aggregating and showcasing data and information collected through existing sources. Widgets enable users to pull in numbers from Google Analytics, Facebook, Twitter and other web-based sources. Users can also input qualitative information including anecdotes and videos for a rich media experience. Data visualization can be customized for each source through graphs, charts, maps or direct numbers, and overall presentation is manipulated in a user-friendly drag-and-drop module format (individual boxes in various sizes in a similar style to Pinterest). Aggregated modules come together on a single Web page for a presentation-friendly story of impact that allows for visual comparison of numbers. Sparkwise is designed for “civic engagement, public media, business and social change initiatives,” and is tailored for use by individuals. According to Wendy Levy, co-founder: “Sparkwise was not designed to answer any research questions. This is an important distinction between Sparkwise and other tools. Sparkwise is designed as a practical data and storytelling platform to enable a wide range of users to collect, track and share data and rich media in context, and provide tools to action that data for even deeper impact. It is about strategic storytelling, insight, impact and action.”
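The aggregation pattern Sparkwise embodies – pulling numbers from several sources into one view – can be sketched simply. The fetcher functions below return canned values that stand in for real Google Analytics, Facebook, and Twitter API calls; the function names and metric values are assumptions for illustration, not Sparkwise’s implementation.

```python
# Stand-in fetchers; in practice each would call an external API.
def fetch_google_analytics():
    return {"pageviews": 12840}

def fetch_facebook():
    return {"fans": 3200}

def fetch_twitter():
    return {"followers": 1875}

def build_dashboard(fetchers):
    """Merge each source's metrics into one flat dashboard dictionary,
    namespacing each metric by its source name."""
    dashboard = {}
    for name, fetch in fetchers.items():
        for metric, value in fetch().items():
            dashboard[f"{name}.{metric}"] = value
    return dashboard

dashboard = build_dashboard({
    "analytics": fetch_google_analytics,
    "facebook": fetch_facebook,
    "twitter": fetch_twitter,
})
print(dashboard["facebook.fans"])  # → 3200
```

In Sparkwise, the equivalent of this merged dictionary is then rendered as a visual, drag-and-drop page rather than raw numbers.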
Sparkwise is a project of Tomorrow Partners, a strategic design firm based in Berkeley, California. The John D. and Catherine T. MacArthur Foundation, the Ford Foundation, the Wyncote Foundation, and the Fledgling Fund supported earliest phases of the project known as the Impact Dashboard, which was housed at the Bay Area Video Coalition. Sparkwise credits the Bill & Melinda Gates Foundation and the Bay Area Video Coalition as partners in the research and Alpha test phases. Sparkwise is now in a Beta phase.
Sparkwise presentation is visual. In the case of the Sparkwise page created for “Rich Hill,” a documentary about youth in rural impoverished America, the public can see how specific numbers and story content elements are emphasized. The film’s trailer takes up a major space at the top. Numbers are emphasized with minimal text to explain their context, such as the number of Facebook fans. Content boxes also have subtle borderline color-coding: “story” content (trailer, articles, quotes) is outlined in orange, Facebook has its signature blue and Twitter uses a complementary green. A dollar amount for SNAP payments shows up in pink. All content provides links for deeper exploration.
StoryPilot is a project of the Harmony Institute, a non-profit research center dedicated to understanding the impact of entertainment. StoryPilot (formerly known as ImpactSpace) uses a solar-system-inspired model for visualizing the metrics on the social impact of documentary film. At the macro solar system level, users can examine the media context of social issues identified/curated by the makers (currently there are 24 social issues to choose from). Users can zoom in to see the system of strategies and metrics for individuals and films, including data about the film’s production and numbers related to impact. Further zoom provides audience analysis, identifying where there is conversation about the film and related action and impact. Datasets come from publicly available sources including but not limited to IMDB, film websites and social media. Filmmakers can access data and filter through customizable graphs and presentation formats for sharing, and they can also access an evaluation toolkit that draws on social science theories of impact evaluation to understand individual film success.
Currently in the beta phase and launching in early 2015, StoryPilot incorporates over 400 documentary film products and has identified 24 social issue categories (via its website): criminal justice, illegal drugs, organized crime, violent crime, banks and corporations, economic inequality, economic policy, educational policy, school climate and safety, teaching and learning, climate change, conservation, energy and sustainability, conditions and treatments, food and nutrition, healthcare, peace and reconciliation, protest and revolution, terrorism, war and military, children's rights, civil liberties, minority rights, women's rights.
Ideally, effective impact assessment requires the storytelling team to begin well before production with a deep understanding of the social issues it plans to tackle, examining existing media framing, public opinion, and the landscape of influencers (NGOs, government agencies, others) before identifying an opportunity for change or action. In a parallel fashion, setting the stage for the final social-impact evaluation of the project should also happen at an early phase in the project’s life cycle. Setting objectives for the project ideally includes counsel on the eventual research question and research method that will be used to assess social impact.
Additionally, it’s important to acknowledge that an individual media product cannot always take full credit for the “impact” seen culturally on a social issue, even though it may be a crucial part of the change that happens both immediately and over time. Similarly, the path to effective impact is paved by identifying the kind of change the story hopes to encourage, from individual change (knowledge, attitudes, behaviors) to public interest change (legislation, policy) to institutional change (corporate, systems). The goal is to calibrate the strategy and objectives of the project not only by the film’s wish list, but by the reality and moment of the social issue – and all of this comes together in the choice of research methods used to assess impact. The success of the strategy and the eventual research method – or methods – depends to some degree upon the team’s understanding of the social issue itself.
The attractive pull of big data – big numbers, big analytics – can lead to a conversation about documentary impact in which quantitative data and digital metrics are a near-exclusive focus. While quantitative methods are crucial and invaluable when used appropriately – e.g., to examine public opinion, to understand broad trends in discourse, to examine media coverage of an issue or media project – they do not necessarily articulate all social and cultural impact associated with storytelling. Qualitative data in the form of legislative victories, individual lives influenced in profound or minor ways, nuanced opinions about social issues, and community impact provide sophisticated and valued impact tracking alongside quantitative research methods. But, to be useful, qualitative data need to be collected and analyzed methodically and strategically based on the objectives of the project, just as quantitative data are – with a strategy that specifies which stories and qualitative data to capture, and how to analyze them methodically to surface insights.
Deciding upon a research method depends on understanding the basics of different approaches – quantitative and qualitative – and matching the appropriate method to the core research question. If strategists and producers are to build on current conversations to continually evolve impact assessment work, it’s crucial to begin with a shared language that clearly articulates which kinds of research methods are most appropriate for examining which kinds of impact research questions. The most valuable media impact framework is likely one that clearly maps several appropriate research methods and tools to specific projects, based on whether individual behavior change, public interest change or institutional change is the goal.
An ecosystem of media impact strategists, creators, funders and others has evolved over decades of work, with a heightened level of awareness over the past decade alongside the explosion of the social media and digital era. The number of papers, tools, case studies, websites and resources is reaching a dazzling volume. And yet, this very specific conversation may not yet be taking place in universities with film production programs in a formal, curriculum-based way that crosses disciplines (research methods in social science are certainly taught in classic scholarly disciplines within communication studies, sociology, political science and psychology). Is it time? If not as a formal curriculum, then perhaps simply as an informal series of conversations and workshops about the language and resources available for those who want to work in this niche area – and how to find individuals and groups with the skills to perform the evaluation work. If so, a certain amount of time and thought should be devoted to understanding and developing ways to train and cultivate future impact producers – college students in the disciplines of film, communication, marketing and others. If impact assessment projects are to be ultimately useful within the full life cycles of social-issue documentaries, then they must be translated and made available in practical, useful terms to today’s film and communication students who plan to work in this space in the future. This calls again for a shared language – methods, tactics and an informal curriculum to share both now and with future content creators – not only to share the work that already exists, but to translate it into practical, useful terms.
Many excellent tools are available for free, at least in part, or are open source, allowing technically proficient strategists and others to adapt them to particular projects. But customized research is expensive – upwards of $50,000 to $100,000 and much more, depending upon the research design – given the need for expert, trained consultants and researchers. Even research conducted in an academic setting can fall within this ballpark or higher, unless some of the work can be carried out by supervised graduate students over a longer period of time. If impact assessment is to be required of a growing number of impact-focused documentary projects, then the funding must be adequate and realistic – not only for the tools themselves, but for the specialists able to design and carry out a research project from strategic design through conclusion. A clearly defined set of objectives will ideally include a clearly defined set of research methods and a budget that can support them.
Content creators in the “impact media/documentary” professional arena are appropriately spending their time and talents creating the evocative, powerful, artistic stories that connect emotionally with audiences. Filmmakers (directors and producers) and content creators do need to understand the shared language of impact and assessment work, of course, given that these are ultimately their projects to direct – including the business ends of funding and evaluation – but individuals and organizations with social science, market research and strategy backgrounds should work alongside them to help define the research objectives and conduct the assessment work. Documentary film artistry requires specialized training and experience, and so do research and strategy.
Well-designed and well-executed impact projects may vary in their objectives and research methods, but they share common DNA. Specifically, a storytelling venture designed with impact and evaluation in mind is most useful when it builds on a sound knowledge of the social-issue landscape at the heart of the story, when the storytellers collaborate with or seek counsel from researchers in social science or market research during both the early and final phases of the project, and, often, when multiple assessment methods are combined to tell a complete impact story.
Barrett, D., & Leddy, S. (2008). Assessing creative media’s social impact. The Fledgling Fund. http://www.thefledglingfund.org/resources/impact
Clark, J., & Abrash, B. (2011). Social justice documentary: Designing for impact. The Center for Social Media (now Center for Media & Social Impact). http://www.cmsimpact.org/designing-impact
Clark, J., & Aufderheide, P. (2009). Public media 2.0: Dynamic, engaged publics. The Center for Social Media (now Center for Media & Social Impact). http://www.cmsimpact.org/future-public-media/documents/articles/public-media-20-dynamic-engaged-publics
Eurodiaconia. Measuring social value. http://www.eurodiaconia.org/files/Eurodiaconia_policy_papers_and_briefings/Briefing_-_Measuring_Social_Value.pdf
Learning for Action. (2013). Deepening engagement for lasting impact: A framework for measuring media performance and results. http://www.learningforaction.com/wp/wp-content/uploads/2014/08/Media-Measurement-Framework_Final_08_01_14.pdf
Napoli, P. (2014). Measuring media impact: An overview of the field. Media Impact Project, USC Annenberg Norman Lear Center. http://www.learcenter.org/pdf/measuringmedia.pdf
Wimmer, R. D., & Dominick, J. R. (2009). Mass Media Research: An Introduction. 9th Edition. Stamford, CT: Cengage Learning.
Active Voice Portfolio and Case Studies: http://www.activevoice.net/portfolio/
BRITDOC Impact Awards Case Studies: http://britdoc.org/real_good/puma-creative-impact-award
BRITDOC Impact Field Guide & Toolkit: From Art to Impact: www.impactguide.org
Media Impact Project Web Metrics: Basics for Journalists: http://www.mediaimpactproject.org/web-metrics-guide.html
Media Cloud: http://mediacloud.org/
Meltwater News: http://www.meltwater.com/
The Participant Index (TPI): www.theparticipantindex.com
Story Pilot: http://www.storypilot.org/
 The organization Active Voice coined the term “Ecosystem of Change” to refer to the researchers, filmmakers, funders, policymakers and philanthropists working in the creation of documentary storytelling and social action campaigns designed to motivate social change around social issues. See a broader discussion at http://www.activevoice.net/philosophy/.
 See BRITDOC’s collection of documentary impact case studies at http://britdoc.org/real_good/evaluation. Also see BRITDOC’s online resource, “The Impact Guide & Toolkit: From Art to Impact” at www.impactguide.org. See the Active Voice portfolio of projects and case studies at http://www.activevoice.net/portfolio/.
 See information about both “The End of the Line” and “Budrus” in the BRITDOC Impact Awards case studies: http://britdoc.org/real_good/puma-creative-impact-award
 This is not an exhaustive list of free and paid digital metrics tools; it is meant to provide a broad understanding of these tools and approaches in the context of digital metrics tracking – as well as a visualization of their place in an “impact assessment ecosystem” for storytelling designed for social change. For an extensive listing of digital engagement metrics and tools, please see the Learning for Action report, Deepening Engagement for Lasting Impact: A Framework for Measuring Media Performance and Results: http://www.learningforaction.com/wp/wp-content/uploads/2014/08/Media-Measurement-Framework_Final_08_01_14.pdf
 For a deeper dive into the use of Web analytics to track impact, see the USC Annenberg Norman Lear Center’s Media Impact Project online guide, Web Metrics: Basics for Journalists: http://www.mediaimpactproject.org/web-metrics-guide.html
 Disclosure: The primary author of this report is the lead consultant to Participant Media for the implementation and public writing for The Participant Index, a project of the Social Action & Advocacy Department of Participant.
 For a deep understanding of research methods used in media and communication, see: Wimmer, R. D., & Dominick, J. R. (2009). Mass Media Research: An Introduction. 9th Edition. Stamford, CT: Cengage Learning.
The Center for Media & Social Impact at American University, formerly the Center for Social Media, is an innovation lab and research center that studies, designs, and showcases media for social impact. The center is a project of the School of Communication, led by Jeffrey Rutenbeck, at American University in Washington, D.C.