The New York Times’ Cascade: Data Visualization for Tweets [VIDEO]

[youtube yQBOF7XeCE0]

MASHABLE – by Jolie O’Dell

The research and development department of The New York Times has recently been pondering the life cycle of the paper’s news stories in social media — specifically, on Twitter. Cascade is a project that visually represents what happens when readers tweet about articles.

Cascade is more than just a nifty data visualization, however.

 

Infographics in the newsrooms, David McCandless [AUDIO]

Information Is Beautiful by David McCandless

 


There is no denying it: David McCandless is the undefeated guru of data visualization. A compilation of his work, “Information is Beautiful”, has been a success around the world, and his visualizations for The Guardian’s Data Blog are a good example of how pictures can sometimes speak better than words.

We met with him in a busy London cafe to discuss what news organisations need to do to embrace and adapt better to the emergence of open data…

[audio:https://www.datajournalismblog.com/wp-content/uploads/2011/04/David-McCandless1.mp3|titles=Infographics in the newsrooms, interview with David McCandless]

 

Announcing news:rewired – noise to signal, 27 May 2011

NEWS REWIRED

Logo from the News:Rewired website

Journalism.co.uk’s next News:Rewired event will take place on 27 May at Thomson Reuters’ London offices.

What’s it about?

news:rewired – noise to signal is a one-day event for journalists and communications professionals who want to learn more about the latest tools and strategies for filtering large datasets, social networks, and audience metrics into a clear signal for both the editorial and business sides of the news industry.

 

#ijf11: the rise of Open Data

source: Joel Gunter from Journalism.co.uk

Picture: "Where does my money go?" by the Open Knowledge Foundation

The open data movement, with the US and UK governments to the fore, is putting a vast and unprecedented quantity of republishable public data on the web. This information tsunami requires organisation, interpretation and elaboration by the media if anything eye-catching is to be made of it.

Experts gathered at the International Journalism Festival in Perugia last week to discuss what journalistic skills are required for data journalism.

Jonathan Gray, community coordinator for the Open Knowledge Foundation, spoke on an open data panel about the usability of data. “The key term in open data is ‘re-use’,” he told Joel Gunter from Journalism.co.uk.

Government data has been available online for years but locked up under an all rights reserved licence or a confusing mixture of different terms and conditions.

The Open Knowledge Foundation finds beneficial ways to apply that data in projects such as Where Does My Money Go which analyses data about UK public spending. “It is about giving people literacy with public information,” Gray said.

The key is allowing a lot more people to understand complex information quickly.

Along with its visualisation and analysis projects, the Open Knowledge Foundation has established opendefinition.org, which provides criteria for openness in relation to data, content and software services, and opendatasearch.org, which is aggregating open data sets from around the world.

“Tools so good that they are invisible. This is what the open data movement needs”, Gray said.

Some of the Google tools that millions use everyday are simple, effective open tools that we turn to without thinking, that are “so good we don’t even know that they are there”, he added.

Countries such as Italy and France are very enthusiastic about the future of open data. Georgia has launched its own open data portal, opendata.ge.

The US, with data.gov, spends £34 million a year maintaining its various open data sites. Others are cheap by comparison: the UK’s data.gov.uk reportedly cost £250,000 to set up.

 

The 4 Golden Rules of Data Journalism from the NYT

 
Data visualisation by the New York Times graphics team

source: Joel Gunter from Journalism.co.uk

The New York Times has one of the largest, most advanced graphics teams of any national newspaper in the world. NYT deputy graphics editor Matthew Ericson led a two-hour workshop at the International Journalism Festival last week about his team’s approach to visualising some of the data that flows through the paper’s stories every day. Here is a short guide to making good data journalism…

The New York Times data team follows four golden rules:

  • Provide context
  • Describe processes
  • Reveal patterns
  • Explain the geography

Provide context

Graphics should bring something new to the story, not just repeat the information in the lead. A graphics team that simply illustrates what the reporter has already told the audience is not doing its job properly. “A graphic can bring together a variety of stories and provide context,” Ericson argued, citing the NYT’s coverage of the Fukushima nuclear crisis.

“We would have reporters with information about the health risks, and some who were working on radiation levels, and then population, and we can bring these things together with graphics and show the context.”

Describe processes

The Fukushima nuclear crisis has spurred a lot of graphics work in news organisations across the world, and Ericson explained how The New York Times went about it.

“As we approach stories, we are not interested in a graphic showing how a standard nuclear reactor works, we want to show what is particular to a situation and what will help a reader understand this particular new story.

Like saying: You’ve been reading about these fuel rods all over the news, this is what they actually look like and how they work.”

Reveal patterns

This is perhaps the most famous guideline in data visualisation: take a dataset and reveal the patterns that may tell a story. Crime is going up here, population density is down there, immigration is changing over time, and so on.

These so-called “narrative graphics” are very close to what we have been seeing for a while in broadcast news bulletins.
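As a rough illustration of the “reveal patterns” idea, here is a minimal sketch of the kind of aggregation that usually precedes such a graphic: grouping raw records by year so the trend becomes visible. The dataset and figures below are entirely hypothetical, not from the NYT workshop.

```python
# A minimal sketch of "reveal patterns": aggregate raw records by year
# so the trend can be seen at a glance. All data here is hypothetical.
from collections import Counter

# Hypothetical incident records: (year, district)
incidents = [
    (2008, "North"), (2008, "South"), (2008, "North"),
    (2009, "North"), (2009, "South"), (2009, "South"), (2009, "North"),
    (2010, "South"), (2010, "South"), (2010, "North"),
    (2010, "South"), (2010, "South"), (2010, "North"),
]

# Count incidents per year to expose the overall pattern
per_year = Counter(year for year, _ in incidents)

# A crude text "chart": one block per incident
for year in sorted(per_year):
    print(year, "#" * per_year[year], per_year[year])
```

In a real newsroom workflow the counting step is the same; only the rendering changes, with the aggregated series handed to a charting tool instead of printed as text.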

Explain geography

The final main objective was to show the audience the geographical element of stories.

Examples for this section included mapping the flooding of New Orleans after Hurricane Katrina, and a feature showing the size and position of the oil slick in the Gulf after the BP Deepwater Horizon accident, comparing it with previous major oil spills.

Some of the tools in use by the NYT team, with examples:

Google Fusion Tables

Tableau Public: Power Hitters

Google Charts from New York State Test Scores – The New York Times

HTML, CSS and JavaScript: 2010 World Cup Rankings

jQuery: The Write Less, Do More, JavaScript Library

jQuery UI – Home

Protovis

Raphaël—JavaScript Library

The R Project for Statistical Computing

Processing.org

Joel Gunter from Journalism.co.uk wrote a very interesting article on the workshop led by Ericson. He spoke to Ericson after the session about the kind of people who make up his team (it includes cartographers!) and how they go about working on a story. Here is what he had to say.

Welcome to the Data Journalism Blog!

Wordle Art made by the DJB team

Journalists have tried for years to turn often complicated information into comprehensible news articles, graphs, or timelines. Some succeeded more than others.

What makes a difference today is the technology and design skills involved. After the recent Wikileaks scandals and the rise of “open knowledge”, journalists have to acquire new skills to keep up with the trend. Data journalism is often considered an essential tool for the future of news.

Whether you are a journalist, a designer or simply a data lover, the Data Journalism Blog brings you the latest news on data driven journalism with reviews, how-to guides, interviews and news features.

The DJB is also a platform where you can comment, share and add to the content you read. So feel free to join in and let us know what you think!