We all know that the "audience" analogy no longer represents the way journalism should work. We know that the people reading the news have opinions, perspectives, and facts that are relevant to the conversation. Some of them just have observations, but others are reporters at heart, or have the wordsmithing abilities of a columnist.
This post is about how the news system I've been blogging about can be driven by user generated content and collective intelligence. In a larger sense, however, it is about the way in which any news organization can make the move past the one-sided "audience" view of things and incorporate the voices and minds of its readers to better serve the public.
Users can contribute all sorts of information to a website (assuming they are willing). Understanding the nature of these potential contributions will help news organizations creatively incorporate them into current processes.
- Original reports - The system will provide a platform for journalists and citizens alike to add important, researched dialogue to the community agenda. This type of user generated content is meaningful in its own right, but it also broadens the conversation and helps ensure that all issues are covered. The essential and difficult task of ensuring quality in original reporting will largely rely on the critical ability of the readers and is discussed later in this post.
- Opinion pieces - People will voice their opinions about issues in a way that contributes something valuable to everyone... or they will just want to rant. Either way, these views make it possible for interested readers to get a sense of the overall attitude surrounding any given thread of conversation and potentially respond.
- Summary pieces - Readers can be passionate about information; maybe they are into politics, maybe they read everything they can about computer hardware, or maybe they are following all of the latest developments in local environmental policies. Regardless of the niche, there is probably somebody out there who is learning everything they can. Whoever they are, they can summarize the story and save time for those who aren't willing to find and read it all by hand.
- Facts and data - This is the category for first person accounts, observations, statistics, hearsay, or any other piece of information. On its own, this content might not be robust enough to be a story or an opinion piece, but it could add value to an existing report or inspire a journalist or citizen to dig deeper.
- External content - There will already be relevant knowledge out there. By allowing users to suggest links of interest and linking to or otherwise ethically referencing that content the site helps people access the information they want. (Think Digg)
Keep in mind that the person making a contribution generally can't be trusted to know which category it falls into! In particular, I'm sure we have all made the mistake of presenting (or believing) opinion as fact.
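For concreteness, the taxonomy above could be modeled as a small data structure in which the author's category is recorded only as a claim, to be confirmed or corrected by the community later. This is a hypothetical sketch; the names (`Submission`, `claimed_type`, and so on) are my own assumptions, not part of any existing system.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Optional

class ContentType(Enum):
    """The five contribution categories described above."""
    ORIGINAL_REPORT = auto()
    OPINION = auto()
    SUMMARY = auto()
    FACTS_AND_DATA = auto()
    EXTERNAL_LINK = auto()

@dataclass
class Submission:
    author: str
    title: str
    body: str
    # The author's own guess at the category -- not trusted
    # until the community confirms or corrects it.
    claimed_type: ContentType
    # Filled in by collective categorization; starts empty.
    confirmed_type: Optional[ContentType] = None
    tags: set = field(default_factory=set)
```

Keeping `claimed_type` and `confirmed_type` separate is the point: the system never has to take the contributor's word for what kind of piece it is.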
Eliminating the bad
The news is a domain where informational integrity is incredibly important. In English 101 at Carnegie Mellon I read a few papers by Hadley Cantril about the idea of Critical Ability (the individual's ability to critically analyze new information before deciding whether or not to accept it). The moderation processes I propose will rely heavily on collective critical ability, and will use computers to improve the odds by surfacing useful supporting data.
- Technique 1: Purgatory - New articles of any type will start in a section of the site dedicated to unchecked information. This content will not be 'elevated' to the mainstream area until it has been collectively rated and categorized, and meets a certain quality threshold. Placing content here explicitly triggers the users' critical abilities: they will be reading the content specifically to judge it. [A previous post on this subject]
- Technique 2: Context - The system's tagging process will make it possible to display potentially related articles for curious readers. During article purgatory this will help inform critical ability; a lone report about a huge explosion in Montana might not be credible, but seeing that there are 500 of them alongside links to a breaking story from the AP would make the piece much more believable.
- Technique 3: User history - Has the user contributed anything in the past? What is the average quality of those contributions? Has the user tended to write opinion or report pieces? The system can provide this information to readers, once again in the name of empowering critical ability.
- Technique 4: Intelligent systems - Spam is automatically caught by mail and forum filters all the time. Although our situation will still require human input, the system could flag particularly suspicious-looking or particularly good-looking content in order to help guide purgatory readers.
- Technique 5: Targeted moderation - Since people will define topical and geographic interests, new articles can be targeted during the moderation process. This would mean that Philadelphians would have more clout when judging a story that is relevant to Philadelphia, and that those who like nanotechnology would be more trusted to review the latest report on the nano-bot 5000. [A previous post on this subject]
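Techniques 1 and 5 together amount to a weighted-rating gate on purgatory. The sketch below is purely illustrative: the threshold values, the quorum size, the weighting formula, and the function names are all assumptions of mine, not a tuned design.

```python
ELEVATION_THRESHOLD = 3.5   # assumed mean weighted rating needed to leave purgatory
MIN_RATINGS = 10            # assumed quorum: collective judgment needs enough readers

def rating_weight(rater_interests, article_tags):
    """Technique 5: raters whose declared interests overlap the
    article's tags get more clout. (The +0.5 per match is arbitrary.)"""
    overlap = len(set(rater_interests) & set(article_tags))
    return 1.0 + 0.5 * overlap

def should_elevate(ratings, article_tags):
    """Technique 1: decide whether an article leaves purgatory.
    ratings is a list of (score 1-5, rater_interests) pairs."""
    if len(ratings) < MIN_RATINGS:
        # Not enough judgments yet -- stay in purgatory (pessimism by default).
        return False
    weights = [rating_weight(interests, article_tags) for _, interests in ratings]
    weighted_mean = sum(s * w for (s, _), w in zip(ratings, weights)) / sum(weights)
    return weighted_mean >= ELEVATION_THRESHOLD
```

Note that both failure modes (too few ratings, too low a score) keep the article in purgatory, which matches the pessimistic stance described below.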
In general when there is collective doubt, the system will be pessimistic; this is necessary because the consequences of misinformation being presented as news are incredibly dire.
Delivering the good
When all is said and done, after the people have spoken, and [insert other terribly cliché phrase I can't think of right now], the judged and categorized content will be displayed for everyone to see. By now the article will be labeled as an opinion piece, or a report, or a summary, etc. It will also have some numerical measure of quality based on the user feedback so far. Finally, it will have the appropriate tags and metadata associated with it, since those who have looked at it so far would have been able to suggest changes to these things.
The process won't be over, of course. People will continue to rank and rate. They will also be able to flag articles for "moderation," which would entail a harsher and more targeted version of the initial judgment process.
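This ongoing phase could look something like the sketch below: ratings keep adjusting the displayed quality number, and enough reader flags push the piece back into the harsher re-moderation pass. The class name, the flag threshold, and the simple averaging are all assumptions for illustration.

```python
FLAG_THRESHOLD = 5  # assumed number of flags that triggers re-moderation

class LiveArticle:
    """An elevated article that readers continue to rate and can flag."""

    def __init__(self):
        self.rating_sum = 0
        self.rating_count = 0
        self.flags = 0

    def rate(self, score):
        """Record another reader rating (1-5)."""
        self.rating_sum += score
        self.rating_count += 1

    @property
    def quality(self):
        """The numerical measure of quality displayed with the piece."""
        if self.rating_count == 0:
            return None
        return self.rating_sum / self.rating_count

    def flag(self):
        """Record a flag; returns True when the article should
        enter the harsher, targeted re-moderation process."""
        self.flags += 1
        return self.flags >= FLAG_THRESHOLD
```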
This probably all sounds like a lot to ask of Joe User, but it actually isn't so bad. It will just involve spending a minute or two reading an amusingly bad or refreshingly good article about a topic that is likely to have been targeted (i.e. of interest) to them. Combine that with the incentive mechanisms I mentioned in my last post and we should have a fully functional process for facilitating and moderating a reader driven news agenda.
(This post pertains to a bullet point from Tying it All Together - User aggregated/moderated content)