What is data and computational journalism?

30 Mar 2015

What does the Reuters data and computational journalism team do? For that matter, what is data and computational journalism, anyway? We asked Maurice Tamman, Editor in Charge of Data and Computational Journalism, to explain.


Let’s start by saying what it isn’t. It’s not doing a Google search and copying and pasting a couple of numbers from a PDF; it’s not looking up an indictment online; it’s not opening a spreadsheet attached to an email, sorting a column and finding the largest or smallest value — although those are all useful skills to have.

At its core, data journalism is reporting, albeit a different kind from doorstepping politicians or cold-calling bankers. But it’s just as hard-core and relentless, and it requires a unique combination of reporting instincts and technical skills. That combination has created new capabilities that let us extend Reuters journalism in fresh and important ways.

In the last few months, Reuters has published three remarkable sets of stories that illustrate how we have become one of the global leaders in this journalism specialty. These are stories that would not have existed but for the technical skills of data and computational journalism team members Janet Roberts, Ryan McNeill, Charlie Szymanski and Mike Pell. All of them, working alongside other reporters, created compelling narratives that formed the flesh around a backbone of data reporting.

And in each example, they told stories around subjects that had never been told so precisely and with data that had largely been ignored.

Reuters Investigates: Water’s Edge


Starting in September, the “Water’s Edge” series examined the global issue of rising seas and sinking land. At the heart of the series was Ryan McNeill’s analysis of hundreds of tidal gauges from around the world, some of which have been documenting the daily ebb and flood tides for a century. The sea levels they measure have been religiously recorded but few have bothered to examine what they have to say.

Ryan did. He looked at tens of millions of records and found that some locations have seen sea level rises measured in feet over the last 50 years. He was able to identify the places that have seen the greatest impact in the US and elsewhere, and reporting from those locations – along the eastern US seaboard, Texas, south England and Jakarta – documented the economic, social and personal consequences of this shift in the seas.

Going beyond the politics and contentious debates about the causes of climate change, the series focused on the measurable data and offered the clearest evidence of the issues already confronting governments and companies – key information that both our financial and media customers need.

The series was awarded third place in the prestigious Phil Meyer Award, the data journalism contest run by Investigative Reporters & Editors.

Special Report: The Echo Chamber

Then in December, the “Echo Chamber” series examined the growing influence of a small cadre of lawyers on the U.S. Supreme Court. To do that, Janet Roberts examined about 15,000 petitions to the court, accessing the data through our sister company Westlaw. She created a database from those records that allowed her to track the lawyers, their law firms, the issues raised in each petition, and whether the petition was granted by the court.

Again, this was something that had never been documented with such precision. She and Charlie Szymanski then deployed algorithms hardly ever used in journalism to “read” each petition and assign specific topics to each case.

Working with other reporters, they produced a series that documented the insularity and influence of the Supreme Court bar – critical information not just for our legal customers, but for anyone whose business or livelihood might be affected by a Supreme Court ruling. The series also included interviews with eight of the nine justices.

The series received widespread attention in both the legal and mainstream press and is one of six finalists for the prestigious Goldsmith Award for investigative journalism.

Special Report: Pemex Contracts

And in mid-January, Michael Pell used a Mexican federal government database of audits flagging suspect contracts granted by Pemex, the government-owned oil company, and compared those audits with the cases taken up by theoretically independent investigators working inside the oil giant. He found that the investigators almost never acted.

Michael’s use of the data was a first in Mexico and underscores the opportunities to employ data journalism skills beyond the U.S., in what should be a core competitive advantage for a global news organization such as Reuters. All of the data was publicly available in Mexico but no one, until Mike, connected the dots.

Using the data to identify particularly egregious examples, Mike and two other reporters illustrated the dysfunction of the Pemex contracting system and how the company is turning a blind eye to wrongdoing.

Within a few weeks of publishing, the company announced it was revamping its contract reviews.

Those are just a few examples of how data journalism is changing Reuters and our profession; as data proliferates and computing power increases, the opportunities for us to surface more such insights and tell better stories continue to grow.
