New publication — Reciprocal journalism: A concept of mutual exchange between journalists and audiences

I’m excited to announce a new piece, “Reciprocal journalism: A concept of mutual exchange between journalists and audiences,” published in the peer-reviewed journal Journalism Practice (a non-paywalled PDF is available here). The article will appear in 2014 as part of a special issue on “community journalism midst media revolution,” guest-edited by Sue Robinson (see her terrific introduction to the issue).

I was lucky to work with two fantastic co-authors, Avery Holton of the University of Utah and Mark Coddington of the University of Texas (all three of us were/are Ph.D. students in the School of Journalism at UT-Austin). We developed the “reciprocal journalism” concept together last spring, drawing on theorizing about reciprocity from social psychology to imagine a new way of understanding the evolving relationship between journalists and audiences. While much of what is classified as participatory journalism works primarily in the service of the news organization, we see reciprocal journalism as a way of visualizing a process of mutual benefit between journalists and their communities of readers and followers—whether one-on-one in some instances, or more indirect and sustained over time.

Now that we have begun to develop the contours of this concept, the next step is to test it in practice: To what extent does reciprocity—or the perception of reciprocity—factor into the way journalists perceive their relationships with audiences? How are such beliefs about reciprocity connected to particular news work practices or forms of participatory journalism? We hope to begin answering those questions via a survey of U.S. journalists that we’re launching soon.

Below is the citation information and abstract. If you can’t access the paywalled PDF, just email me for a copy: sclewis@umn.edu.

Lewis, S. C., Holton, A. E., & Coddington, M. (2013). Reciprocal Journalism: A Concept of Mutual Exchange Between Journalists and Audiences. Journalism Practice, 1–13. doi:10.1080/17512786.2013.859840 (pre-print version)

Abstract

Reciprocity, a defining feature of social life, has long been considered a key component in the formation and perpetuation of vibrant communities. In recent years, scholars have applied the concept to understanding the social dynamics of online communities and social media. Yet, the function of and potential for reciprocity in (digital) journalism has yet to be examined. Drawing on a structural theory of reciprocity, this essay introduces the idea of reciprocal journalism: a way of imagining how journalists might develop more mutually beneficial relationships with audiences across three forms of exchange—direct, indirect, and sustained types of reciprocity. The perspective of reciprocal journalism highlights the shortcomings of most contemporary approaches to audience engagement and participatory journalism. It situates journalists as community-builders who, particularly in online spaces, might more readily catalyze patterns of reciprocal exchange—directly with readers, indirectly among community members, and repeatedly over time—that, in turn, may contribute to the development of greater trust, connectedness, and social capital. For scholars, reciprocal journalism provides a new analytical framework for evaluating the journalist–audience relationship, suggesting a set of diagnostic questions for studying the exchange of benefits as journalists and audiences increasingly engage one another in networked environments. We introduce this concept in the context of community journalism but also discuss its relevance for journalism broadly.


Call for papers: Journalism in an Era of Big Data (special issue)

I’m excited to be guest editing a special issue of Digital Journalism, an international, peer-reviewed journal (new in 2013) edited by Bob Franklin of Cardiff University, who also produces the renowned journals Journalism Studies and Journalism Practice. The topic, we believe, is a timely one: “Journalism in an Era of Big Data.”

The full Call for Papers can be found here and below. The deadline for abstracts is July 1, 2013, with publication expected in 2015 (though online-first publication will likely come sooner; still, as with any academic publishing, there’s a certain time lag involved).

Please spread the word to colleagues, and let me know if you have any questions. Thanks!

• • •

Call for Papers for a special issue of Digital Journalism:

Journalism in an Era of Big Data

Deadlines: July 1, 2013 (abstracts); January 1, 2014 (full papers for peer review); June 1, 2014 (revised full papers due)

Guest Editor: Seth C. Lewis of the University of Minnesota, USA (Digital Journalism Editor: Bob Franklin)

The term “Big Data” is often invoked to describe the overwhelming volume of information produced by and about human activity, made possible by the growing ubiquity of mobile devices, tracking tools, always-on sensors, and cheap computing storage. In combination with technological advances that facilitate the easy organizing, analyzing, and visualizing of such data streams, Big Data represents a social, cultural, and technological phenomenon with potentially major import for public knowledge and news information. How is journalism, like other social institutions, responding to this data abundance? What are the implications of Big Data for journalism’s norms, routines, and ethics? For its modes of production, distribution, and audience reception? For its business models and organizational arrangements? And for the overall sociology and epistemology of news in democratic society?

This special issue of the international journal Digital Journalism (Routledge, Taylor & Francis) brings together scholarly work that critically examines the evolving nature of journalism in an era of Big Data. This issue aims to explore a range of phenomena at the junction between journalism and the social, computer, and information sciences—including the contexts and practices around news-related algorithms, applications, sophisticated mapping, real-time analytics, automated information services, dynamic visualizations, and other computational approaches that rely on massive data sets and their maintenance. This special issue seeks not simply to describe these tools and their application in journalism, but rather to develop what Anderson (2012) calls a “sociological approach to computational journalism”—a frame of reference that acknowledges the trade-offs, embedded values, and power dynamics associated with technological change. This special issue thus encourages a range of critical engagements with the problems as well as opportunities associated with data and journalism.

The special issue welcomes articles drawing on a variety of theoretical and methodological approaches, with a preference for empirically driven or conceptually rich accounts. These papers might touch on a range of themes, including but not limited to the following:

  • The history (or histories) of computational forms of journalism;
  • The epistemological ramifications of “data” in contemporary newswork;
  • Norms, routines, and values associated with emerging forms of data-driven journalism, such as data visualizations, news applications, interactives, and alternative forms of storytelling;
  • The sociology of new actors connected to computational forms of journalism, within and beyond newsrooms (e.g., news application teams, programmer-journalists, tech entrepreneurs, web developers, and hackers);
  • The social, cultural, and technological roles of algorithms, automation, real-time analytics, and other forms of mechanization in contemporary newswork, and their implications for journalistic roles and routines;
  • The ethics of journalism in the context of Big Data;
  • The business, managerial, economic, and other labor-related issues associated with data-centric forms of newswork;
  • Approaches for conceptualizing the distinct nature of emerging journalisms (e.g., computational journalism, data journalism, algorithmic journalism, and programmer journalism);
  • The blurring boundaries between “news” and other types of information, and the role of Big Data and its related implications in that process.

Articles should be no more than 8,000 words in length, including references. Please submit an abstract of 600–800 words that clearly spells out the theoretical construct, research questions, and methods to be used. Also include the names, titles, and contact information for 2–3 suggested reviewers. Abstracts are due by July 1, 2013, to sclewis@umn.edu (with “DJ special issue” in the subject line). Provided that the abstract meets the criteria for the call, full manuscripts are due by January 1, 2014 (also to sclewis@umn.edu), at which point they will be peer-reviewed and considered for acceptance. The proposed date of publication is 2015. Please contact guest editor Seth C. Lewis with questions: sclewis@umn.edu. Manuscripts should conform to the guidelines for Digital Journalism.

New research publication: Content analysis in an era of Big Data

My latest article—along with Rodrigo Zamith, my Ph.D. advisee, and wonderful colleague Alfred Hermida—has been published in the most recent issue of the Journal of Broadcasting & Electronic Media. We discuss the challenges of doing content analysis in an era of Big Data and suggest a hybrid approach that blends computational and manual methods of data collection, filtering, coding, and analysis.

Here’s the full citation, including a link to a preprint version:

Lewis, S. C., Zamith, R., & Hermida, A. (2013). Content Analysis in an Era of Big Data: A Hybrid Approach to Computational and Manual Methods. Journal of Broadcasting & Electronic Media, 57(1), 34–52. doi:10.1080/08838151.2012.761702 (preprint version)

What’s especially exciting is that the paper is part of a special issue of the journal examining emerging methods in digital media research. A great team of guest editors, led by Jean Burgess, leads off the issue with this introduction.

Our paper shows how we used a combination of algorithmic and human-driven techniques to analyze Andy Carvin’s Twitter coverage of the Arab Spring—first by computationally parsing and cleaning the data, then by manually identifying source types, and finally by developing a Web-based interface to improve the accuracy of coding. As Alf mentioned on his site, a separate paper on our findings, “Sourcing the Arab Spring: A Case Study of Andy Carvin’s Sources on Twitter During the Tunisian and Egyptian Revolutions,” is forthcoming in the Journal of Computer-Mediated Communication, sometime in 2013.
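
To make that workflow concrete, here is a minimal, illustrative Python sketch of what the computational half of such a hybrid pipeline might look like: parsing and cleaning a raw tweet archive, then exporting a random sample to CSV for the manual source-coding step. The file names, field names, and sampling parameters are illustrative assumptions, not the actual procedure from the paper.

```python
import csv
import json
import random
import re

def load_tweets(path):
    """Parse raw tweets from a JSON Lines archive (one tweet object per line)."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

def clean(tweets):
    """Computational pass: drop retweets and duplicates, keep the fields of interest."""
    seen = set()
    for t in tweets:
        text = t.get("text", "")
        if text.startswith("RT ") or t.get("id") in seen:
            continue
        seen.add(t.get("id"))
        yield {
            "id": t.get("id"),
            "user": t.get("user", {}).get("screen_name", ""),
            "text": text,
            # Surface @-mentions as candidate sources for the human coders.
            "mentions": " ".join(re.findall(r"@(\w+)", text)),
        }

def export_for_manual_coding(rows, path, sample_size=500, seed=42):
    """Write a random sample to CSV; coders fill in source_type by hand."""
    random.seed(seed)
    rows = list(rows)
    sample = random.sample(rows, min(sample_size, len(rows)))
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(
            f, fieldnames=["id", "user", "text", "mentions", "source_type"]
        )
        writer.writeheader()
        for row in sample:
            row["source_type"] = ""  # left blank for the manual coding step
            writer.writerow(row)

if __name__ == "__main__":
    export_for_manual_coding(clean(load_tweets("tweets.jsonl")), "to_code.csv")
```

In the study itself a Web-based interface, not a raw CSV, supported the manual coding; either way, the division of labor is the point: computation handles scale and consistency, humans handle contextual judgment.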

Here’s the abstract from our JOBEM piece:

Massive datasets of communication are challenging traditional, human-driven approaches to content analysis. Computational methods present enticing solutions to these problems but in many cases are insufficient on their own. We argue that an approach blending computational and manual methods throughout the content analysis process may yield more fruitful results, and draw on a case study of news sourcing on Twitter to illustrate this hybrid approach in action. Careful combinations of computational and manual techniques can preserve the strengths of traditional content analysis, with its systematic rigor and contextual sensitivity, while also maximizing the large-scale capacity of Big Data and the algorithmic accuracy of computational methods.

“Why are you going to Africa?”

I’ve heard this question a lot in the past few days. So, let me try to explain.

[Photo: Sunset over the savannah in Masai Mara, a famous park reserve in Kenya. Gorgeous. Credit: Angela Sevin via Compfight]

The short answer: I’m going to Nairobi, Kenya, to conduct research (“fieldwork”) on three case studies at the intersection of journalism and open source / hacking / computer programming.

The longer answer: This work figures into the ongoing research that I’m doing with Nikki Usher on the rise of programmers and programming in the world of news and information — a book project we call “Hacking the News.” (Hey, we even have a working logo, designed by one of my research assistants, Jeff Hargarten.)

The three cases I plan to study are positioned at this nexus of news and code in different ways:

(1) Ushahidi (Swahili for “witness”) is a non-profit tech organization that has famously developed a free and open-source platform for crowdsourcing crisis information across media channels and visualizing it in different ways — perhaps most notably via “crowdmaps” like these (a toy sketch of aggregating such a report feed appears just after this list). (Incidentally, Ushahidi was an early Knight News Challenge grantee, so I interviewed founder Ory Okolloh during the course of my dissertation work.) The Ushahidi platform has gained all kinds of attention (Clay Shirky talks about it prominently), but for my purposes it’s interesting because it combines elements of participatory news, user-generated content, and a civic information mission with a good dose of open-source and technological activism — so, a useful study of media + code. Ushahidi’s team is spread across the globe, but its heart and soul is in Kenya, including its headquarters at …

(2) the iHub, which describes itself as “Nairobi’s Innovation Hub for the technology community [and] an open space for the technologists, investors, tech companies and hackers in the area. This space is a tech community facility with a focus on young entrepreneurs, web and mobile phone programmers, designers and researchers. It is part open community workspace (co-working), part vector for investors and VCs and part incubator.” That about sums it up, and should explain why I’m interested in observing and participating in this space — particularly in meeting with hackers (et al.) who are developing projects with a news/media focus. Why are they interested in news/media/journalism?

(3) Last, but certainly not least, I’m excited to learn more about the newly launched Code4Kenya initiative, co-sponsored by the World Bank and the African Media Initiative. This program embeds developer “fellows” in media organizations, including newsrooms. What’s interesting about this case is how it blends an emphasis on open data and coding technologies with the context of media and journalism. As one fellow (see them all) puts it on his LinkedIn profile: “I am embedded inside a host organisation and knighted with the ground shifting task of changing hack journalism. Incorporating developing applications that will increase public data awareness and disseminating to the citizens as well as improving data journalism skills and approach.” These fellows are being coordinated through a startup incubator called 88mph, which should be an interesting site for study all its own!

From the website for 88mph, a startup accelerator (a la Y Combinator) that "makes investments in early stage mobile-web companies targeting the African market; focusing purely on ideas with potential to scale across Africa."
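
Since Ushahidi’s platform is, at bottom, an aggregator of crowd reports, here is the toy sketch promised above: pulling reports from a JSON feed and tallying them by category, the kind of aggregate a crowdmap visualizes. The URL and field names are invented placeholders for illustration, not Ushahidi’s documented API.

```python
import json
from collections import Counter
from urllib.request import urlopen

# Placeholder endpoint -- an illustrative feed, NOT Ushahidi's documented API.
FEED_URL = "https://example-deployment.org/reports.json"

def fetch_reports(url=FEED_URL):
    """Download a JSON feed of crowd reports (assumed shape: {"reports": [...]})."""
    with urlopen(url) as resp:
        return json.load(resp)["reports"]

def count_by_category(reports):
    """Tally reports per category label."""
    counts = Counter()
    for report in reports:
        # Each report is assumed to carry a list of category labels.
        for category in report.get("categories", ["uncategorized"]):
            counts[category] += 1
    return counts

if __name__ == "__main__":
    for category, n in count_by_category(fetch_reports()).most_common():
        print(f"{category}: {n}")
```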


Oh, and in addition to all this, there’s a new Nairobi chapter of Hacks/Hackers. Hacks/Hackers — i.e., “hacks” for journalists, “hackers” for technologists — is a grassroots global network that obviously, in its very name, captures the intersection of journalism and hacking, and so it has been an important case that I’ve been studying during the past year.

They say that Nairobi is becoming the Silicon Valley of East Africa, and I’m excited to see why. So, 11 days, 3 cases to study, and 1 amazing trip ahead!

(I should add: This research — like previous fieldwork at the Mozilla Festival in London, and in newsrooms and at hackathons in New York, Chicago, and elsewhere — is being funded by a generous grant from the Office of the Vice President for Research at the University of Minnesota. Thank you, UMN!)