This is what the best of data journalism looks like

This article was originally published on the Data Journalism Awards Medium Publication managed by the Global Editors Network. You can find the original version right here.

________________________________________________________________________________________________

 

After a year of hard work, collecting and sifting through hundreds of data projects from around the world, the news is finally out. The thirteen winners (and one honourable mention) of the Data Journalism Awards 2018 competition were announced on 31 May in Lisbon. Together they are the best of what the world of data journalism had to offer in the past year. They also teach us a lot about the state of data journalism.

 

 

All of the work I have done over the past few months has given me a pretty good perspective of what’s going on in the world of data journalism. Managing the Data Journalism Awards competition is probably the greatest way to find out what everybody has been up to and to discover amazing projects from all over the world.

And today I want to share some of this with you! Most of the examples you will see in this article are projects that either won or got shortlisted for the Data Journalism Awards 2018 competition.

When a news organisation submits a project, they have to fill in a form asking them to describe their work, but also how they made it, what technology they used, what methodology… And all of this information is published on the website for everyone to see.

So if you're reading this article in the hope of finding some inspiration for your next project, as I am confident you are, then here is a good tip: on top of all of the examples I will show you here, you can take a look at all of the 630 projects from all over the world which were submitted this year, right on the competition website. You're welcome.

So what have we learned this year by going through hundreds of data journalism projects from around the world? What are the trends we’ve spotted?

 

Data journalism is still spreading internationally

And this is great news. We see more and more projects from countries that have never applied before, and this is a great indicator of the way journalists worldwide, regardless of their background, regardless of how accessible data is in their country, regardless of how data literate they are, are trying to tell stories with data.

 

Some topics are more popular than others

One of the first things we look at when we get the list of projects each year is which topics people tackled. And what we've learned from that is that some topics are more attractive than others.

Whether that's because it is easier to find data on those topics, easier to visualise them, or simply because they are the big stories everyone expects to see data on each year, we can't really say. It's probably a good mixture of all of this.

 

 

The refugee crisis

The first recurrent topic that we've seen this past year is the refugee crisis. And a great example of that is this project by Reuters called 'Life in the camps', which won the award for Data visualisation of the year at the Data Journalism Awards 2018.

This graphic provided the first detailed look at the dire living conditions inside the Rohingya refugee camps in Cox's Bazar. Using satellite imagery and data, the graphic documented the rapid expansion and lack of infrastructure in the largest camp cluster, Kutupalong. Makeshift toilets sit next to wells that are too shallow, contaminating the water supply.

This project incorporates data-driven graphics, photo and video. Reuters gained access to data from a group of aid agencies working together to document the location of infrastructure throughout the Kutupalong camp by using handheld GPS devices on the ground. The graphics team recognised that parts of the data set could be used to analyse the accessibility of basic water and sanitation facilities. After some preliminary analysis, they were able to see that some areas had water pumps located too close to makeshift toilets, raising major health issues.

They displayed this information in a narrative graphic format with each water pump and temporary latrine marked by a dot and overlaid on a diagram of the camp footprint. They compared these locations to the U.N.’s basic guidelines to illustrate the potential health risks. Reuters photographers then used these coordinates to visit specific sites and document real examples of latrines and water pumps in close proximity to each other.
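To make the method concrete, here is a minimal, hypothetical sketch of the kind of proximity check described above: flag water pumps that sit within an assumed guideline distance of a latrine. Reuters built theirs with Javascript and QGIS; the Python below, the coordinates, and the 30-metre threshold are all illustrative assumptions, not the team's actual code or data.

```python
# Hypothetical sketch of a pump-to-latrine proximity check.
# Coordinates and the 30 m guideline threshold are illustrative, not real data.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

GUIDELINE_M = 30  # assumed minimum spacing between a latrine and a water point

pumps = [(21.2101, 92.1635), (21.2112, 92.1650)]      # hypothetical pump locations
latrines = [(21.2102, 92.1636), (21.2200, 92.1700)]   # hypothetical latrine locations

for pump in pumps:
    nearest = min(haversine_m(*pump, *latrine) for latrine in latrines)
    if nearest < GUIDELINE_M:
        print(f"Pump at {pump} is only {nearest:.0f} m from the nearest latrine")
```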

Technologies used for this project: HTML, CSS, Javascript, QGIS and Illustrator.

 

 

Elections/Politics

The next topic that came up a lot this year was politics, and more specifically anything related to recent elections, not just in the US but also in many other countries. One great example of that was the Data Journalism Awards 2018 'News data app of the year' award winner, 'The atlas of redistricting', by FiveThirtyEight in the US.

There’s a lot of complaining about gerrymandering (the process of manipulating the boundaries of an electoral constituency so as to favour one party or class) and its effects on US politics. But a fundamental question is often missing from the conversation: What should political boundaries look like? There are a number of possible approaches to drawing districts, and each involves tradeoffs. For this project, the team at FiveThirtyEight looked at seven different redistricting schemes; and to quantify their tradeoffs and evaluate their political implications, they actually redrew every congressional district in the U.S. seven times. The Atlas of redistricting allows readers to explore each of these approaches — both for the nation as a whole and for their home state.

The scope of this project really makes it unique. No other news organization covering gerrymandering has taken on a project of this size before.

To make it happen, they took precinct-level presidential election results from 2012 and 2016 and reallocated them to 2010 Census voting districts. That enabled them to add more up-to-date political data to a free online redistricting tool called Dave’s Redistricting App. Once the data was in the app, they started the long process of drawing and redrawing all the districts in the country. Then, they downloaded their district boundaries from the app, analysed their political, racial and geometric characteristics, and ultimately evaluated the tradeoffs of the different redistricting approaches. Sources for data included Ryne Rohla/Decision Desk HQ, U.S. Census Bureau, and Brian Olson.
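As a rough illustration of the aggregation step described above (not FiveThirtyEight's actual code, which relied on Ruby and PostGIS, and with invented column names and numbers): once precinct-level votes have been assigned to the districts of a given plan, each district's partisan lean is essentially a grouped sum.

```python
# Hypothetical sketch: aggregate precinct votes into district-level margins.
# District names, columns and vote counts are illustrative.
import pandas as pd

precincts = pd.DataFrame({
    "district": ["GA-01", "GA-01", "GA-02"],
    "dem_votes": [12000, 8000, 15000],
    "rep_votes": [9000, 11000, 7000],
})

districts = precincts.groupby("district")[["dem_votes", "rep_votes"]].sum()
districts["dem_margin_pct"] = (
    100 * (districts["dem_votes"] - districts["rep_votes"])
    / (districts["dem_votes"] + districts["rep_votes"])
)
print(districts)
```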

Technologies used for this project: Ruby, PostGIS, Dave’s Redistricting App, Node, D3

 

 

Another great example of how politics and elections were covered this year comes from the Financial Times. It is called 'French election results: Macron's victory in charts' and was shortlisted for the Data Journalism Awards 2018 competition.

Let's face it: elections are a must for all data news teams around the world. That is probably the topic where audiences are most used to seeing data combined with maps, graphics and analysis.

Throughout 2017 and 2018, the Financial Times became an expert in:

  • producing rapid-response overnight analyses of elections,
  • leveraging their data collection and visualisation skills to turn around insightful and visually striking reports on several elections across Europe,
  • responding faster than other news organisations, both those in the UK and those based in the countries where these elections took place.

Over and above simply providing the top-line results, they have focused on adding insight by identifying and explaining voting patterns, highlighting significant associations between the characteristics of people and places, and the political causes they support.

To deliver this, the team developed highly versatile skills in data scraping and cleaning. They have also carried out 'election rehearsals': practice runs of election night to make sure their workflows for obtaining, cleaning and visualising data were polished and robust enough to handle any glitches that might come up on the night of the count.

The work has demonstrably paid off, with readers from continental Europe outnumbering those from Britain and the United States — typically far larger audiences for the FT — for the data team’s analyses of the French, German and Italian elections.

For each election, the team identified official data sources at the most granular possible level, with the guidance of local academic experts and the FT’s network of correspondents.

R scripts were written in advance to scrape the electoral results services in real time and attach them to the static, pre-sourced demographic data.

Scraping and analysis was primarily conducted in R, with most final projection graphics created in D3 — often adapting the Financial Times’ Visual Vocabulary library of data visualisation formats.
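The FT's pipeline was written in R, but the pattern it describes (scrape live results, then join them to static, pre-sourced demographics) is straightforward to sketch. The snippet below is a hypothetical Python analogue; the URL, file names and the 'area_code' join key are placeholders, not the FT's real sources.

```python
# Hypothetical Python analogue of the scrape-and-join pattern described above.
# RESULTS_URL, the file names and the join key are placeholders.
from io import StringIO

import pandas as pd
import requests

RESULTS_URL = "https://example.org/election/results.csv"

# 1. Pull the latest results from the (assumed) official results service
resp = requests.get(RESULTS_URL, timeout=30)
resp.raise_for_status()
results = pd.read_csv(StringIO(resp.text))          # e.g. one row per commune

# 2. Load the static demographic data sourced and cleaned in advance
demographics = pd.read_csv("demographics.csv")

# 3. Join on a shared area identifier so every result carries its demographics
merged = results.merge(demographics, on="area_code", how="left")
merged.to_csv("results_with_demographics.csv", index=False)
```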

Technologies used for this project: R, D3.

 

 

Crime

The last topic that I wanted to mention that was also recurrent this past year is crime. And to illustrate this, I’ve picked a project called ‘Deaths in custody’ by Malaysiakini in Malaysia.

This is an analysis of how deaths in police custody are reported, something that various teams around the world have been looking at recently. The team at Malaysiakini compared 15 years of official police statistics with data collected by a human rights organisation called Suaram. The latter is the sole comprehensive tracker of publicised deaths in police custody in the country.

The journalists behind this project found that deaths in Malaysian police custody are underreported overall, with only one in four deaths being reported to the media or to Suaram.

They also highlight the important role that families of victims play in holding the police accountable and pushing for the deaths to be investigated. They created an interactive news game and a guide on what to do if somebody is arrested, both of which accompany the main article, taking inspiration from The Uber Game that the Financial Times developed in 2017.

The game puts players in the shoes of a friend who is entangled in a custodial dilemma between a victim and the police. Along the way, there are fact boxes that teach players about their rights in custody. The real-life case that the game is based on is revealed at the end of the game.

Technologies used for this project: Tabula, OpenRefine, Google Sheets, HTML, CSS, Javascript, UI-Kit Framework, Adobe Photoshop.

 

We’ve changed the way we do maps

Another thing that we’ve learned by looking at all these data journalism projects is that we have changed the way we do maps.

Some newsrooms are really getting better at it. Maps are more interactive, more granular, prettier too, and integrated as part of a narrative instead of standing on their own, which suggests that more and more journalists are not making maps for the sake of making maps, but for good reasons.

 

 

 

An example of how data journalists have made use of maps this past year is this piece by the BBC called ‘Is anything left of Mosul?’

It is a visually-led piece on the devastation caused to Mosul, Iraq, as a result of the battle to rid the city of Islamic State (IS). The piece not only gives people a full picture of the devastating scale of destruction, it also connects them to the real people who live in the city — essential when trying to tell stories from places people may not instantly relate to.

It was also designed mobile-first, giving users on small screens the full, in-depth experience. The feature uses the latest data from Unosat, allowing the BBC team to map in detail which buildings had suffered damage over time, telling the narrative of the war through four maps.

The feature incorporates interactive sliders to show the contrast of life before the conflict and after — a way of giving the audience an element of control over the storytelling.

They also used the latest data from the UNHCR, which told them where and when displaced people in Iraq had fled to and from. They mapped this data using QGIS's heatmapping tools and visualised it using their in-house Google Maps Chrome extension. They produced three heatmaps of Mosul at different phases of the battle, again telling a narrative of how the fighting had shifted to residential targets as the war went on.
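The BBC team did this heatmapping in QGIS; a comparable sketch in Python could lean on the folium library's HeatMap plugin. The displacement points and weights below are invented stand-ins for the UNHCR records, not real figures.

```python
# Hypothetical sketch of a displacement heatmap; the points and weights are invented.
import folium
from folium.plugins import HeatMap

# Each entry is [latitude, longitude, weight], where the weight could be the
# number of displaced people recorded at that location.
points = [
    [36.34, 43.13, 120],
    [36.37, 43.15, 80],
    [36.31, 43.10, 200],
]

m = folium.Map(location=[36.34, 43.13], zoom_start=12)   # centred roughly on Mosul
HeatMap(points).add_to(m)
m.save("mosul_displacement_heatmap.html")
```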

The project got nearly half a million page views over several days in English. They also translated the feature into 10 other languages for BBC World Service audiences around the world.

Technologies used for this project: QGIS mapping software, Microsoft Excel, Adobe Illustrator, HTML, CSS, Javascript, Planet satellite imagery, DigitalGlobe images

 

 

Another example of how the data journalism community has changed the way it does maps, is this interactive piece by the South China Morning Post called ‘China’s Belt and Road Initiative’.

The aim of this infographic is to provide context to the railway initiative linking China to the West.

They combined classic long-form storytelling with maps, graphs, diagrams of land elevations, infrastructure and risk-measurement charts, motion graphics, user interaction, and other media. This variety of techniques was chosen to prevent the extensive data from appearing overwhelming. The split screen on the desktop version meant readers could refer to the route as they read the narrative.

We are not talking about boring static maps anymore. And this is an example of how news teams around the world, and not just in western countries, are aiming for more interactivity and a better user journey through data stories, even when the topic is complex. It is thanks to the interactivity of the piece and the diversity of elements put together that the experience becomes enticing.

They used data from the Economist Intelligence Unit (EIU). Using Google Earth, they plotted and traced the path of each initiative to obtain height profiles and elevations to explain the extreme geographical environments and conditions.

Technologies used for this project: Adobe Creative Suite (Illustrator, Photoshop…), QGIS, Brackets.io, Corel Painter, Microsoft Excel, Javascript, Canvas, jQuery, HTML, CSS/CSS3, JSON, CSV, SVG.

 

 

 

New innovative data storytelling practices have arrived

Another thing we saw was that data teams around the world are finding new ways to tell stories. New innovative storytelling practices have arrived and are being used more and more.

 

 

Machine learning

Machine learning is probably the most used term in current conversations about news innovation. It has also been used recently to help create data-driven projects, such as 'Hidden Spy Planes' by BuzzFeed News in the US, the winner of the JSK Fellowships award for innovation in data journalism at this year's Data Journalism Awards.

This project revealed the activities of aircraft that their operators didn't want to discuss, opening the lid on a black box of covert aerial surveillance by agencies of the US government, the military and its contractors, and local law enforcement agencies.

Some of these spy planes employed sophisticated surveillance technologies including devices to locate and track cell phones and satellite phones, or survey Wi-Fi networks.

Before these stories came out, most Americans would have been unaware of the extent and sophistication of these operations. Without employing machine learning to identify aircraft engaged in aerial surveillance, the activities of many of the aircraft deploying these devices would have remained hidden.

In recent years, there has been much discussion about the potential of machine learning and artificial intelligence in journalism, largely centred on classifying and organising content within a CMS, or on fact-checking, for example.

There have been relatively few stories that have used machine learning as a core tool for reporting, which is why this project is an important landmark.
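As a hedged sketch of the general technique (not BuzzFeed's actual code, which was written in R, and with an invented file, features and labels): label the aircraft already known to be surveillance planes, compute per-aircraft flight-track features, train a classifier, then rank the remaining aircraft by how surveillance-like their behaviour looks.

```python
# Hypothetical sketch: classify aircraft by flight-track features.
# The CSV, feature names and labels are illustrative, not BuzzFeed's data.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

flights = pd.read_csv("aircraft_features.csv")   # assumed: one row per aircraft
features = ["mean_speed", "mean_altitude", "turn_rate", "flight_duration"]

labelled = flights.dropna(subset=["is_surveillance"])
X = labelled[features]
y = labelled["is_surveillance"]                  # 1 = known surveillance aircraft

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# Score every aircraft and surface the most surveillance-like ones for reporting
flights["score"] = model.predict_proba(flights[features])[:, 1]
print(flights.sort_values("score", ascending=False).head(20))
```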

Technologies used for this project: R, RStudio, PostgreSQL, PostGIS, QGIS, OpenStreetMap

 

 

Drone journalism

Another innovative storytelling practice that we’ve noticed is drone journalism, and here is an example called ‘Roads to nowhere’ from The Guardian.

It is an investigation using drone technology, historical research and analysis, interviews, as well as photomosaic visualizations.

It was a project that specifically looked at infrastructure in the US and the root causes of how cities have been designed with segregation and separation as a fundamental principle. It shows through a variety of means how Redlining and the interstate highway system were in part tools to disenfranchise African-Americans.

People are still living with this segregation to this day.

Most of the photos and all of the videos were taken by drone in this project. This is innovative in that it is really the only way to truly appreciate some of the micro-scale planning decisions taken in urban communities throughout the US.

Technologies used for this project: a DJI Mavic Pro drone and a Canon 5D Mark III camera to take the photos, Shorthand, Adobe Photoshop, and Knight Lab's Juxtapose tool to bring the before-and-after sliders to life.

 

 

AR

Another innovative technique that has a lot of people talking at the moment is Augmented Reality, and to illustrate this in the context of data journalism, I am bringing you this project called ExtraPol by WeDoData in France.

Extrapol is an augmented reality app (iOS and Android) that was launched a month before the April 2017 French presidential election. Every day, the candidates' official posters could be turned into new live data visualisations informing the audience about the candidates. This data journalism project covered 30 data-driven topics, such as the candidates' travels across France during the campaign or the cumulative number of years they had held political office.

This is probably the first ephemeral daily data journalism news app to use augmented reality. It was also the first time that real-life materials, the candidates' official posters, were 'hacked' to deliver factual information about the politicians.

Technologies used for this project: Python, Javascript, HTML, CSS, PHP, jsFeat, TrackingWorker, Vuforia, GL Matrix, Open CV, Three.js, Adobe Illustrator, After Effect and Photoshop

 

 

Newsgames

Newsgames aren't a new trend, but more and more newsrooms are playing with them. And this example, called 'The Uber Game' by the Financial Times in the UK, has been a key player in the field this year, inspiring news teams around the world…

This game puts you into the shoes of a full-time Uber driver. Based on real reporting, including dozens of interviews with Uber drivers in San Francisco, it aims to convey an emotional understanding of what it is like to try to make a living in the gig economy.

It is an innovative attempt to present data reporting in a new, interactive format. It was the FT's third most-read piece by pageviews throughout 2017.

Roughly two-thirds of people who started the game finished it — even though this takes around 10 minutes and an average of 67 clicks.

Technologies used for this project: Ink to script the game, inkjs, anime.js, CSS, SCSS, NodeJS, Postgres database, Zeit Micro, Heroku 1X dynos, Standard-0 size Heroku Postgres database, Framer, Affinity Designer

 

 

Collaborations are still a big thing

And many organisations, in many regions around the world, have had a go at them.

Paradise Papers

Of course we have the Paradise Papers investigation (pictured above) coordinated by the ICIJ with 380 journalists worldwide.

Based on a massive leak, it exposes secret tax machinations of some of the world’s most powerful people and corporations. The project revealed offshore interests and activities of more than 120 politicians and world leaders, including Queen Elizabeth II, and 13 advisers, major donors and members of U.S. President Donald J. Trump’s administration. It exposed the tax engineering of more than 100 multinational corporations, including Apple, Nike, Glencore and Allergan, and much more.

If you want to know more about how this was done, go to the Data Journalism Awards 2018 website where that information is published.

The leak, at 13.4 million records, was even bigger than the Panama Papers and technically even more complex to manage.

The records came from an array of sources across 19 secrecy jurisdictions. The leak also contained more than 110,000 files in database or spreadsheet formats (Excel, CSVs and SQL). ICIJ's data unit used reverse-engineering techniques to reconstruct corporate databases. The team scraped the records in the files and created a database with information on the companies and the individuals behind them.

The team then used ‘fuzzy matching’ techniques and other algorithms to compare the names of the people and companies in all these databases to lists of individuals and companies of interest, including prominent politicians and America’s 500 largest publicly traded corporations.
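Here is a minimal sketch of the kind of fuzzy matching described above, using fuzzywuzzy, which ICIJ lists among its tools below. The names and the 90-point similarity threshold are purely illustrative, not ICIJ's actual parameters.

```python
# Hypothetical sketch of fuzzy name matching; names and threshold are illustrative.
from fuzzywuzzy import fuzz, process

leak_names = [
    "Jonathan Smith Holdings Ltd",
    "Acme Offshore S.A.",
    "J. Smith Holdings",
]
watchlist = ["Jonathan Smith Holdings Limited", "Globex Corporation"]

THRESHOLD = 90  # assumed similarity cut-off on a 0-100 scale

for name in watchlist:
    match, score = process.extractOne(name, leak_names, scorer=fuzz.token_sort_ratio)
    if score >= THRESHOLD:
        print(f"{name!r} matched {match!r} with a score of {score}")
```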

 

Technologies used for this project:

  • For data extraction and analysis: Talend Open Studio for Big Data, SQL Server, PostgreSQL, Python (nltk, beautifulsoup, pandas, csvkit, fuzzywuzzy), Google Maps API, Open Street Maps API, Microsoft Excel, Tesseract, RapidMiner, Extract
  • For the collaborative platforms: Linkurious, Neo4j, Apache Solr, Apache Tika, Blacklight, Xemx, Oxwall, MySQL and Semaphor.
  • For the interactive products: JavaScript, Webpack, Node.js, D3.js, Vue.js, Leaflet.js and HTML.
  • For security and sources protection: GPG, VeraCrypt, Tor, Tails, Google Authenticator, SSL (client certificates) and OpenVPN.

 

 

 

Monitor da violencia

Now here is another collaborative project that you may not know of but which is also quite impressive. It is called 'Monitor da Violencia', and it won the Microsoft award for public choice at this year's Data Journalism Awards. It was done by G1 in Brazil, in collaboration with the Center for the Study of Violence at the University of São Paulo (the largest university in Brazil) and the Brazilian Forum of Public Security (one of the most respected public security NGOs in Brazil).

This project is an unprecedented partnership which tackles violence in Brazil. To make it possible, G1 staff reporters all over Brazil kept track of violent deaths over the course of one week. Most of these are crimes that are generally forgotten: cases of homicide, robbery, death by police intervention, and suicide. There were 1,195 deaths in this period, one every 8 minutes on average.

All these stories were verified and written by more than 230 journalists spread throughout Brazil. This is a small sample compared with the roughly 60,000 homicides recorded in Brazil each year, but it paints a picture of the violence in the country.

The project aims to show the faces of the victims and to understand the causes of this epidemic of deaths. As a first step, a news piece was written for each one of the violent deaths. An interactive map, complete with search filters, showed the locations of the crimes as well as the victims' photos.

The second step was a collective and collaborative effort to find the names of unidentified victims. A campaign was launched online, on TV and on social media so that people could help identify many of the victims.

A database was assembled from scratch, containing information such as the victims’ name, age, race, and gender. Also, the day, time, weapon used, and the exact location of the crime, among others.
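A hypothetical sketch of what one record in such a database might look like (not G1's actual schema, just an illustration of the fields listed above, with invented values):

```python
# Hypothetical record structure for the database described above; not G1's schema.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ViolentDeathRecord:
    victim_name: Optional[str]     # None while the victim remains unidentified
    age: Optional[int]
    race: str
    gender: str
    occurred_at: datetime          # day and time of the crime
    weapon: str
    latitude: float                # exact location, for the interactive map
    longitude: float

record = ViolentDeathRecord(
    victim_name=None, age=24, race="parda", gender="male",
    occurred_at=datetime(2017, 8, 21, 23, 40),
    weapon="firearm", latitude=-23.5505, longitude=-46.6333,
)
print(record)
```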

Technologies used for this project: HTML, CSS, Javascript, Google Sheets, CARTO

 

 

 

 

Onwards and upwards for data journalism in 2018

The jury of the Data Journalism Awards, presided over by Paul Steiger, selected 13 winners (and one honorable mention) out of the 86 finalists for this year’s competition, and you can find the entire list, accompanied by comments from jury members, on the Data Journalism Awards website.

The insights I’ve listed in this article today show us that not only is the field ever-growing, it is also more impactful than ever, with many winning projects bringing change in their country.

Congratulations again to all of the winners and shortlisted projects, and also to all the journalists, news programmers, and NGOs pushing boundaries so that hard-to-reach data becomes engaging and impactful work for news audiences.


 

The competition, organised by the Global Editors Network, with support from the Google News Initiative, the John S. and James L. Knight Foundation, Microsoft, and in partnership with Chartbeat, received 630 submissions of the highest standards from 58 countries.

Now in its seventh year, the Data Journalism Awards competition was launched in 2012, when it received close to 200 projects. The first international awards recognising outstanding work in the field of data journalism, it has grown over the years, receiving the highest number of submissions in its history in 2018.

 

 



Marianne Bouchart is the founder and director of HEI-DA, a nonprofit organisation promoting news innovation, the future of data journalism and open data. She runs data journalism programmes in various regions around the world as well as HEI-DA’s Sensor Journalism Toolkit project and manages the Data Journalism Awards competition.

Before launching HEI-DA, Marianne spent 10 years in London where she worked as a web producer, data journalism and graphics editor for Bloomberg News, amongst others. She created the Data Journalism Blog in 2011 and gives lectures at journalism schools, in the UK and in France.

 

A data journalist’s microguide to environmental data

This article was originally published on the Data Journalism Awards Medium Publication managed by the Global Editors Network. You can find the original version right here.

_______________________________________________________________________________________________________________________

 

Lessons learned from an online discussion with experts

The COP23 conference is right round the corner (do I hear “climate change”?) and many data journalists around the world may wonder: How do you go about reporting on environmental data?

 

With the recent onslaught of hurricanes, such as Harvey, Irma, and Maria, and wildfires in Spain, Portugal and California, data journalists have been working hard to interpret scientific data, as well as getting creative to make it reader friendly.

The COP23 (do I hear climate change?) also serves as a great opportunity for data journalists to take a step back and ask:

What is the best way of reporting on data related to the environment? Where do you find the data in the first place? How do you make it relatable to the public and which challenges do you face along the way?

From top left to bottom right: Kate Marvel of NASA GISS (USA), James Anderson of Global Forest Watch (USA), Rina Tsubaki of European Forest Institute (Spain), Gustavo Faleiros of InfoAmazonia (Brazil), Elisabetta Tola of Formicablu (Italy), and Tim Meko of The Washington Post (USA)

 

We gathered seven amazing experts on the Data Journalism Awards Slack team on 5 October 2017 to tackle these questions. Tim Meko of The Washington Post (USA), Gustavo Faleiros of InfoAmazonia (Brazil), Rina Tsubaki of European Forest Institute (Spain), Kate Marvel of NASA GISS (USA), Elisabetta Tola of Formicablu (Italy), Octavia Payne and James Anderson of Global Forest Watch (USA), all took part in the discussion.

Here is a recap of what we’ve learned including tips and useful links.

 

Environmental data comes in many formats… some known only to scientists

 

When it comes to working with environmental data, both journalists and scientists seem to be facing challenges. The main issue seems not to come from scarcity of data but rather from what journalists can do with it, as Elisabetta Tola of Formicablu (Italy) explained:

‘Things are still quite complicated because we have more data available than before but it is often difficult to interpret and to use with journalistic tools’, she said.

There also seems to be a gap between the speed at which data formats evolve in that area and how fast journalists learn how to work with these formats.

‘I think we are still in a moment where we know just a little about data formats. We know about spreadsheets and geodata, but then there are all these other formats, used only by scientists. And I am not really sure how we could use those’, said Gustavo Faleiros of InfoAmazonia (Brazil).

Environmental data should be more accessible and easy to interpret and scientists and journalists should be encouraged to work hand-in-hand more often. The existing incentive structure makes that hard: ‘Scientists don’t get paid or promoted for talking to journalists, let alone helping process data’, said Kate Marvel of NASA GISS (USA).

 

So what could be done to make things better?

 

'We need to open up more channels between journalists and scientists: find more effective ways of communicating', said Elisabetta Tola of Formicablu.

We also need more collaboration not just among data journalism folks, but with larger communities.

‘Really, it is a question of rebuilding trust in media and journalism’, said Rina Tsubaki of European Forest Institute.

‘I think personalising stories, making them hyper-local and relevant, and keeping the whole process very transparent and open are key’, said James Anderson of Global Forest Watch.

Indeed, there seems to be a need to go further than just showing the data: ‘People feel powerless when presented with giant complex environmental or health problems. It would be great if reporting could go one step further and start to indicate ‘what’s the call to action’. That may involve protecting themselves, engaging government, responding to businesses’, said James Anderson of Global Forest Watch.

Top idea raised during the discussion: “It would be great to have something like Hacks&Hackers where scientists and journalists could work together. Building trust between these communities would improve the quality of environmental reporting but also the reward, at least in terms of public recognition, of scientists work.” Suggested by Elisabetta Tola of Formicablu.

 

To make environmental data more ‘relatable’, add a human angle to your story

 

As the use of environmental data has become much more mainstream, at least in American media markets, audiences can interact more directly with the data than ever before.

‘But we will have to find ways to keep innovating, to keep people’s attention, possibly with much more personalised data stories (what does the data say about your city, your life in particular, for example)’, said James Anderson of Global Forest Watch.

‘Characters! People respond to narratives, not data. Even abstract climate change concepts can be made engaging if they’re embedded in a story’, said Kate Marvel of NASA GISS.

For example, this project by Datasketch shows how Bogotá has changed radically in the last 30 years. 'One of the main transformations', the website says, 'is in the forestation of the city, as many of the trees that citizens grew up with have disappeared'.

This project by Datasketch shows how Bogotá has changed radically in the last 30 years and includes citizens' stories about its trees

 

With this project, Juan Pablo Marín and his team attached citizen stories to specific trees in their city. They mapped 1.2 million trees and enabled users to explore narrated stories by other citizens on a web app.

‘I like any citizen science efforts, because that gets a community of passionate people involved in actually collecting the data. They have a stake in it’, James Anderson of Global Forest Watch argued.

He pointed to this citizen science project, where scientists are tracking forest pests through people's social media posts.

One more idea for engaging storytelling on climate change: Using art to create a beautiful and visual interactive:
Illustrated Graphs: Using Art to Enliven Scientific Data by Science Friday
Shared by Rina Tsubaki of European Forest Institute

 

Tips on how to deal with climate change sceptics

 

‘Climate denial isn’t about science — we can’t just assume that more information will change minds’, said Kate Marvel of NASA GISS.

Most experts seem to agree. ‘It often is more of a tribal or cultural reaction, so more information might not stick. I personally think using language other than ‘climate change’, but keeping the message (and call to action to regulate emissions) can work’, said James Anderson of Global Forest Watch.

A great article about that, by Hiroko Tabuchi, and published by The New York Times earlier this year can be found here: In America’s Heartland, Discussing Climate Change Without Saying ‘Climate Change’

‘Keeping a high quality and a very transparent process can help people who look for information with an open mind or at least a critical attitude’, Elisabetta Tola of Formicablu added.

A great initiative where scientists are verifying media’s accuracy:
Climate Feedback
Shared by Rina Tsubaki of European Forest Institute

 

Places to find data on the environment

The Planet OS Datahub makes it easy to build data-driven applications and analyses by providing consistent, programmatic access to high-quality datasets from the world’s leading providers.

AQICN looks at air pollution in the world with a real-time air quality index.

Aqueduct by the World Resources Institute, for mapping water risk and floods around the world.

The Earth Observing System Data and Information System (EOSDIS) by NASA provides data from various sources — satellites, aircraft, field measurements, and various other programs.

FAOSTAT provides free access to food and agriculture data for over 245 countries and territories and covers all FAO regional groupings from 1961 to the most recent year available.

Global Forest Watch offers the latest data, technology and tools that empower people everywhere to better protect forests.

The Global Land Cover Facility (GLCF) provides earth science data and products to help everyone to better understand global environmental systems. In particular, the GLCF develops and distributes remotely sensed satellite data and products that explain land cover from the local to global scales.

Google Earth Engine's timelapse tool is useful for satellite imagery and enables you to map changes over time.

Planet Labs is also great for local imagery and monitoring. Their website features practical examples of where their maps and satellite images were used by news organisations.

 

News from our community: In a few months, James Anderson and the team at Global Forest Watch will launch an initiative called Resource Watch which will work as an aggregator and tackle a broader set of environmental issues.

“It was inspired by the idea that environmental issues intersect — for example forests affect water supply, and fires affect air quality. We wanted people to be able to see how interconnected these things are,” said Anderson.

 

What to do if there is no reliable data: the case of non-transparent government

 

It is not always easy or straightforward to get data on the environment, and the example of Nigeria was brought up during our discussion by a member of the DJA Slack team.

‘This is because of hypocrisy in governance’, a member argued.

‘I wish to say that press freedom is guaranteed in Nigeria on paper but not in reality.

You find that those in charge of information or data management are the first line of gatekeepers that will make it practically impossible for journalists to access such data.

I can tell you that, in Nigeria, there is no accurate data on forestry, population figure and so on’.

So what is the way out? Here are some tips from our experts:

'I would try using some external, non-official sources. You can try satellite imagery by NASA or Planet Labs or even Google, then distribute it via Google Earth or the Google News Lab. You can also download deforestation, forest fire and other datasets from the sites of the University of Maryland or the CGIAR Terra-i initiative', Gustavo Faleiros of InfoAmazonia suggested.

Here is an example:

Nigeria DMSP Visible Data By NOAA/NGDC Earth Observation Group

‘I think with non-transparent governments, it is sometimes useful to play both an “inside game” (work with the government to slowly [publish] more and more data under their own banner) and an “outside game” (start providing competing data that is better, and it will raise the bar for what people [should] expect)’, said James Anderson of Global Forest Watch.

‘It’s a really tough question. We’ve worked with six countries in the Congo Basin to have them improve their data collection, quality-control, and sharing. They now have key land data in a publicly-available portal. But it took two decades of hard work to build that partnership’, he added.

‘I think this is exactly the case when a good connection with local scientists can help’, said Elisabetta Tola of Formicablu. ‘There are often passionate scientists who really wish to see their data out. Especially if they feel it could be of use to the community. I started working on data about seismic safety over five years ago. I am still struggling to get the data that is hidden in tons of drawers and offices. I know it’s there’, she added.

‘For non-transparent governments, connect with people who are behind facilitating negotiations for programmes like REDD to get insider view’, added Rina Tsubaki of European Forest Institute.


 

What tools do you use when reporting on environmental data?

 

Here is what our data journalism community said they played with on a regular basis:

CARTO enriches your location data with versatile, relevant datasets, such as demographics and census, and advanced algorithms, all drawn from CARTO’s own Data Observatory and offered as Data as a Service.

QGIS is a free and open source geographic information system. It enables you to create, edit, visualise, analyse and publish geospatial information.

OpenStreetMap is a map of the world, created by members of the public and free to use under an open licence.

Google Earth Pro and Google Earth Engine help you create maps with advanced tools on PC, Mac, or Linux.

Datawrapper, an open source tool helping everyone to create simple, correct and embeddable charts in minutes.

R, Shiny and Leaflet with plugins were used to make these heatmaps of distribution of tree species in Bogotá.

D3js, a JavaScript library for visualizing data with HTML, SVG, and CSS.

Flourish makes it easy to turn your spreadsheets into world-class responsive visualisations, maps, interactives and presentations. It is also free for journalists.

 

Great examples of data journalism about the environment we’ve come across lately

 

How Much Warmer Was Your City in 2015?
By K.K. Rebecca Lai for The New York Times
Interactive chart showing high and low temperatures and precipitation for 3,116 cities around the world.
(shared by Gustavo Faleiros of InfoAmazonia)

 

What temperature in Bengaluru tells about global warming
By Shree DN for Citizen Matters
Temperature in Bengaluru was the highest ever in 2015. And February was the hottest. Do we need more proof of global warming?
(shared by Shree DN of Citizen Matters in India)

 

Data Science and Climate Change: An Audience Visualization
By Hannah Chapple for Affinio Blog
Climate change has already been a huge scientific and political topic in 2017. In 2016, one major win for supporters of climate action was the ratification of the Paris Agreement, an international landmark agreement to limit global warming.
(shared by Rina Tsubaki of European Forest Institute)

 

Google’s Street View cars can collect air pollution data, too
By Maria Gallucci for Mashable
“On the question of compelling environmental stories to prioritize, (this was a bit earlier in the thread) I feel like hyper-local air quality (what is happening on your street?) is powerful stuff. People care about what their family breathes in, and its an urgent health crisis. Google StreetView cars are now mapping this type of pollution in some places.”
(shared by James Anderson of Global Forest Watch)

 

This Is How Climate Change Will Shift the World’s Cities
By Brian Kahn for Climate Central
Billions of people call cities home, and those cities are going to get a lot hotter because of climate change.
(shared by Rina Tsubaki of European Forest Institute)

 

Treepedia :: MIT Senseable City Lab
Exploring the Green Canopy in cities around the world
(shared by Rina Tsubaki of European Forest Institute)

 

Losing Ground
By ProPublica and The Lens
Scientists say one of the greatest environmental and economic disasters in the nation’s history — the rapid land loss occurring in the Mississippi Delta — is rushing toward a catastrophic conclusion. ProPublica and The Lens explore why it’s happening and what we’ll all lose if nothing is done to stop it.
(shared by Elisabetta Tola of Formicablu)

 

Watergrabbing: A Story of Water
This project looks into the water-hoarding phenomenon. Each story explains a specific theme (transboundary waters, dams, hoarding for political and economic purposes) and shows the players involved, country by country. Take time to read and discover what water grabbing means, so that water can become a right for each country and every person.
(shared by Elisabetta Tola of Formicablu)

 

Ice and sky
By Wild-Touch
Discover the history and learn about climate change in this interactive documentary
(shared by Gustavo Faleiros of InfoAmazonia)

 

Extreme Weather
By Vischange.org
The resources in this toolkit will allow communicators to effectively communicate extreme weather using strategically framed visuals and narratives. Watch the video to see it in action!
(shared by Rina Tsubaki of European Forest Institute)

Plus, there is a new version of Bear 71 available for all browsers:
Bear 71 VR
Explore the intersection of humans, nature and technology in the interactive documentary. Questioning how we see the world through the lens of technology, this story blurs the lines between the wild world, and the wired one.
(shared by Gustavo Faleiros of InfoAmazonia)

 


 

To see the full discussion, check out previous ones and take part in future ones, join the Data Journalism Awards community on Slack!

 



Marianne Bouchart is the founder and director of HEI-DA, a nonprofit organisation promoting news innovation, the future of data journalism and open data. She runs data journalism programmes in various regions around the world as well as HEI-DA’s Sensor Journalism Toolkit project and manages the Data Journalism Awards competition.

Before launching HEI-DA, Marianne spent 10 years in London where she worked as a web producer, data journalism and graphics editor for Bloomberg News, amongst others. She created the Data Journalism Blog in 2011 and gives lectures at journalism schools, in the UK and in France.

 

How three women are influencing data journalism and what you can learn from them

This article was originally published on the Data Journalism Awards Medium Publication managed by the Global Editors Network. You can find the original version right here.

________________________________________________________________________________________________________________________

 

Stephanie Sy of Thinking Machines (Philippines), Yolanda Ma of Data Journalism China and Esra Dogramaci of Deutsche Welle, formerly Al Jazeera (Germany), new members of the Data Journalism Awards jury, talk innovation, data journalism in Asia and the Middle East, and women in news.

left to right: Yolanda Ma (Data Journalism China), Esra Dogramaci (Deutsche Welle, formerly BBC and Al Jazeera), and Stephanie Sy (Thinking Machines) join DJA Jury

 

We welcomed three new members to the Data Journalism Awards jury last year (pictured above). They are all strong-willed and inspiring women, and they represent two regions that are often overlooked in the world of data journalism: Asia and the Middle East.

What was your first project in data journalism or interactive news and what memory do you keep from it?

Esra Dogramaci: In 2012, Invisible Children launched a campaign to seek out Lord's Resistance Army (LRA) leader Joseph Kony and highlight the exploitation of child soldiers. Then, at Al Jazeera, we wanted to see what people in northern Uganda, who lived in one of the areas affected by the LRA, actually had to say about it. They would 'speak to tweet' and we would map their reactions on Ushahidi using a Google Fusion table in the background.

 
Uganda Speaks by Al Jazeera

 

Although Al Jazeera had started doing this kind of project back in 2009 during the war on Gaza (the experiments page of the Al Jazeera Lab website has now disappeared but can be viewed through WebArchive.org), it picked up steam during Egypt's 2011 Arab Spring where, due to the lack of broadcast media coverage, protesters were using social media to bring attention to what was happening.

Interactive story by Thinking Machines

 

Stephanie Sy: Our first data journalism project as a team at Thinking Machines was a series of interactive stories on traffic accidents in Metro Manila. We cleaned and analysed a set of Excel sheets of 90,000 road accidents spanning 10 years.

It was the first project we worked on as a mixed team of journalists, designers, and data scientists, and the first time we tried to build something from scratch with d3.js! I worked on the d3 charts, and remember being in utter despair at how hard it was to get the interactive transitions to render nicely across different browser types. It was surprisingly well received by the local civic community, and that positive feedback emboldened us to keep working.
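For flavour, here is a tiny, hypothetical sketch of the kind of clean-up step such a project starts with (not Thinking Machines' actual code; the file and column names are invented): load the spreadsheets, normalise the fields, and drop records that cannot be placed in time.

```python
# Hypothetical clean-up sketch; the file and column names are invented.
import pandas as pd

# Read every sheet of the accident workbook into one table
sheets = pd.read_excel("road_accidents.xlsx", sheet_name=None)   # dict of DataFrames
accidents = pd.concat(sheets.values(), ignore_index=True)

# Normalise messy text fields and parse dates, discarding unusable rows
accidents["accident_type"] = accidents["accident_type"].str.strip().str.lower()
accidents["date"] = pd.to_datetime(accidents["date"], errors="coerce")
accidents = accidents.dropna(subset=["date"])

print(accidents.groupby(accidents["date"].dt.year).size())       # accidents per year
```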

 
Connected China, Thomson Reuters

 

Yolanda Ma: One of my first projects was Connected China for Thomson Reuters, which tracked and visualised the people, institutions and relationships that form China’s elite power structure (learn more about it here).

This project taught me the importance of facts: every piece of data in it (thousands, if not millions in total) went through a rigorous fact-checking process (by human beings, not machines, unfortunately). I learned by doing that facts are the bones of data journalism, not fancy visualisations, even though this project turned out to be fancy and cool, which is good too.

 

Now, what was the latest project you worked on and how do the two compare?

 

ED: Towards the end of last year, I taught a data journalism module to City University London Master's students who were able to pull together their own data visualisation projects in the space of an hour. The biggest difference is how vastly the interfaces have improved and how quick and intuitive the design and interactive software are now. A lot more companies are switched on to storytelling beyond TV or text, so with all that knowledge combined, how do you stand out in the world of online news?

Complementary to that, Al Jazeera was always a front runner because they were willing to take risks and try something new when no one else was. In the newsrooms I've worked at or seen since, there is still a general aversion to risk-taking in favour of safety, though everyone knows that to survive and thrive in this digital media landscape, it is risk-taking and innovation that will push those boundaries and really get you places.

SS: Our latest related data story is a piece we put together visualising traffic jams across Metro Manila during the holiday rush season. This time we were looking at gigabytes of Waze jams data that we accessed through the Waze API. It definitely grew out of our early work in transit data stories, but it reflects a huge amount of growth in our ability to handle complex data and in our understanding of what appeals to our audience.

One big piece of learning we got from this is that our audience in the Philippines mainly interacts with the news through mobile phones and via Facebook, so complex d3 interactives don’t work for them. What we do now is to build gifs on top of the interactives, which we then share on Facebook. You can see an example of that in the linked story. That gets us a tremendous amount of reach, as we’re able to communicate complex results in a format that’s friendly for our audience.

YM: I've been doing data journalism training mostly in the past few years and helping others do their data projects, so nothing comparable really. The latest project I worked on is this Data Journalism MOOC with HKU in partnership with Google News Lab. It is tailor-made for practitioners in Asia, and it's re-starting again soon (begins March 6), so go on and register before it's too late!

 

What excites you about the future of data journalism and interactive news?

 

ED: The ability to tell stories in a cleaner, more engaging way. Literally everything can be turned into a story just by interrogating the data, being curious and asking questions. The digital news world has always been driven by data and it's exciting to see how 'traditional' journalism is embracing this more. I love this example from Berliner Morgenpost where they charted a bus line in Berlin, combining dash cam footage with various data such as demographics and voting patterns. It's an ingenious way of taking complex data and breaking it down in a meaningful, engaging way rather than resorting to pie charts.

M29 from Berliner Morgenpost

 

SS: There are tremendous amounts of data being generated in this digital age, and I think data journalism is a very natural evolution of the field. Investigative journalists should be able to use computer science skills to find their way through messy datasets and big data. It’s absolutely reasonable to expect that a news organization might get a 1 terabyte dump of files from a source.

YM: It excites me because it is the future. We live in the age of data, and the ever-increasing amount of data available means there is huge and growing potential for data journalism. People's news consumption is also changing, and I believe personalisation is one of the key characteristics of the new generation of consumers, which means interactive news (interactive in many different ways) will thrive.

 

How are Asian and Middle Eastern media organisations (depending on your experience) doing in terms of data journalism and interactive news compared to the rest of the world?

 

ED: I think Al Jazeera has always been a pioneer in this. They have a great interactive team that drew together people from various disciplines within the organisation — coders, video people, designers, journalists — before everyone else was doing it and they’ve been able to shed light on stories that wouldn’t usually be picked up on by mainstream media radars.

Example that illustrates my point: The project “Broken homes, a record year of home demolitions in occupied East Jerusalem” by Al Jazeera

“Broken homes, a record year of home demolitions in occupied East Jerusalem” by Al Jazeera

 

SS: We have a few media organisations like the Philippine Center for Investigative Journalism, Rappler, and Inquirer who have been integrating data analysis into their reporting, but there isn’t anyone regularly producing complex data journalism pieces.

Our key problem is the lack of useful datasets. A huge amount of work goes into acquiring, cleaning, and triple checking the raw data. Analysis is “garbage in, garbage out” and we can’t create good data journalism without the presence of good data. This is where the European and North American media organisations have an edge. Their governments and civic society organisations follow open data standards, and citizens can request data [via FOIA]! The Philippine government has been making serious progress towards more open data sharing, and I hope they’re able to sustain that commitment.

Example that illustrates my point: PCIJ’s Money Politics project is a great example of an organisation doing the data janitorial work of acquiring and validating hard-to-find data. During our last presidential elections in 2015, GMA News Network and Rappler both created hugely popular election tracking live data stories.

PCIJ’s Money Politics

 

YM: Media organisations in Asia are catching up on data journalism and interactive news. There are some challenges of course: for example, a lack of data in less developed countries, a lack of skills and talent (and limited training opportunities), and even poor infrastructure or unstable internet, especially in rural areas, which can limit the presentation of news stories. Despite the difficulties, we do see good work emerging, though not necessarily in English. Check out some of the stories from the last GIJN Investigative Journalism Conference held in Nepal and you'll get an idea.

Example that illustrates my point: This Caixin Media data story analysed and visualised the property market in China for the past few years.

 

Another New Normal, Caixin Media

 

What view do you have on the role of women in the world of news today? How is it being a woman in your respective work environment? Do you feel it makes a difference? If so, which one and why?

 

ED: Women are underrepresented not just in news coverage but in leadership positions too. I have to admit though that being at Deutsche Welle, I see a lot more women in senior management and it feels like a much more egalitarian working environment. However looking at my overall experience as a woman in news, you do face a lot of sexism and prejudice. Every woman I know has a story to tell and when the latest story about Uber came out a lot of my female colleagues around me were nodding their heads.

What got me through challenging times is having a fantastic network of female role models and mentors who are there to support you. That was one piece of advice I gave to prior teams: get a mentor. A lot of women feel isolated or feel the way they are treated is normal, but it's not. Women should also be aware that there is a real risk you will be punished if you speak up, challenge the status quo or refuse to toe the party line. If this happens, it's an environment or team you probably shouldn't be in anyway.

SS: It's alarming to see parties around the world trying to stifle the voices of anyone who doesn't belong and dismissing any news that doesn't flatter them as 'fake news'. It's important for us to speak up as women, and to practice intersectionality when it comes to other marginalised communities. As people who work with data, we can see past the aggregates and look at the complex, messy truth. We must be able to communicate that complexity in order for our work to make a difference.

YM: Most of the data journalism teams in China are led by women, and I think they are doing really well 🙂

 

What do you think makes a great data journalism project? What will you be looking for when marking projects for the Data Journalism Awards this year?

 

ED: Simplicity. It’s easy to get lost in data and try to do too much, but it’s often about taking something complex and making it accessible for a wider audience, getting them to think about something they haven’t or perhaps consider in a different way. I’ll be looking for the why — why does this matter, does this story or project make a dent in the universe?

After all, isn't that what telling stories is about? The other obvious thing that comes through is passion. You can tell when a person or team has cared and really invested in the work, versus projects being rolled off a conveyor belt.

SS: A great data journalism project involves three things: novel data, clever analytical methods, and great communication through the project’s medium of choice. I’m hoping to see a wide variety of mediums this year!

Will someone be submitting an audio data journalism project? With all the very exciting advances in the field of artificial intelligence this year, I’m also hoping to see projects that incorporate machine learning, and artificial intelligence.

YM: I believe data journalism is after all journalism — it has to reveal truth and tell stories, based or driven by data. I’ll be looking for stories that do make an impact in one way or another.

 

If you had one piece of advice for people applying for the Data Journalism Awards competition, what would it be?

 

ED: Don't be intimidated by the competition or past award winners. Focus on what you do best. I say this especially for those applying for the first time: I see a lot of hesitation and negative self-talk of 'I'm not good enough', etc. In every experience there's something to learn, so don't hesitate.

SS: Don’t forget to tell a story! With data science methods, it’s easy to get lost in fancy math and lose track of the narrative.

YM: Tell us a bit about the story behind your story — say, we may not know how hard it might be to get certain data in your country.

 

What was the best piece of advice you were ever given in your years of experience in the media industry?

 

ED: Take every opportunity. That’s related to a quote that has been coming up over and over again for the past week or so, “success is when preparation meets opportunity.”

SS: One of my best former bosses told me to imagine that a hungover, unhappy man with a million meetings that day was the only reader of my work. He haunts me to this day.

YM: I started my career with the ambition (like many idealistic young people) to change China. My first (and second) boss, Reg Chua, once said to me: don't worry about changing China, but focus on making small changes and work with a long-term vision. Sounds cliché.

He said that to me in 2012. The next year, together with two other friends, I started DJChina.org, which began in 2013 as a small blog and has now grown into one of the best educational platforms for data journalism practitioners in China. The year after, in 2014, Open Data China was launched (using a domain name I had registered a few years back), marking a bottom-up movement to push for more open data, which was incorporated into national policy within a year. So I guess all of this proved that Reg was right, and his advice can be applied anywhere, to anything. Think big, act small, one story (or project) at a time, and changes will happen.

 


left to right: Yolanda Ma (Data Journalism China), Esra Dogramaci (Deutsche Welle, formerly BBC and Al Jazeera), and Stephanie Sy (Thinking Machines)

 

Stephanie Sy is the founder of Thinking Machines, a data science and data engineering team based in the Philippines. She brings to the jury her expertise in data science, engineering and storytelling.

Yolanda Ma is the co-founder of Data Journalism China, one of the best educational platforms for data journalism practitioners in China. Not only representing the biggest country in Asia, she also has experience teaching data skills to journalists and a great knowledge of data journalism from her region.

Esra Dogramaci has now joined Deutsche Welle and formerly worked with the BBC, Al Jazeera in Qatar and Turkey, as well as the UN Headquarters and UNICEF. She brings to the DJA jury significant experience in digital transformation across news and current affairs, particularly in social video and off platform growth and development.

 


The Data Journalism Awards are the first international awards recognising outstanding work in the field of data journalism worldwide. Started in 2012, the competition is organised by the Global Editors Network, with support from the Google News Lab, the John S. and James L. Knight Foundation, and in partnership with Chartbeat. More info about cash prizes, categories and more, can be found on the DJA 2017 website.



Marianne Bouchart is the founder and director of HEI-DA, a nonprofit organisation promoting news innovation, the future of data journalism and open data. She runs data journalism programmes in various regions around the world as well as HEI-DA’s Sensor Journalism Toolkit project and manages the Data Journalism Awards competition.

Before launching HEI-DA, Marianne spent 10 years in London where she worked as a web producer, data journalism and graphics editor for Bloomberg News, amongst others. She created the Data Journalism Blog in 2011 and gives lectures at journalism schools, in the UK and in France.