Expert Commentary

Research chat: CUNY’s Barbara Gray on Web-based research

Interview with CUNY teacher and librarian Barbara Gray about research techniques and tips.

Barbara Gray (CUNY)

Barbara Gray is Distinguished Lecturer and Interim Chief Librarian at the CUNY Graduate School of Journalism, where she oversees the school’s Research Center and coordinates the research components of its curriculum. She has served as news research director at the New York Times.

In both her professional and educational work, Gray has continually sought the latest and most efficient tools that help reporters push their stories further and find the key sources, details and context that enrich the experience and understanding of the audience. In the digital age, she says, this means constantly updating and evaluating research techniques. As part of our “Research chat” series, Journalist’s Resource recently sat down with her at CUNY to get her thoughts on the research process. (See her quick tips on tools at bottom.)

 ____

Journalist’s Resource: What does good research look like from the inside?

Barbara Gray: At the professional level, it usually means good collaboration between researchers, reporters and editors — when possible, of course, and not every news outlet can afford to have a research staff. Professional researchers can really make a difference, though, because that’s what we do all day. We don’t have to worry about who to call to get a scoop, or writing. We’re just keyed in on the best and the newest and fastest way to get things.

As for individual articles or news items, I tell my students that in any well-researched piece, I love to see some data, some documents, and some experts. This is what you see in a Times story, typically. I don’t like seeing stories where the documents aren’t available for the audience to further the reporting and the conversation. And then you have to find people — the right people to bring the story to life. That is also a big part of research, showing how the background meets up with the real world.

JR: What are some of the research tools that journalists should use?

Barbara: If you’re a student, get plugged in to your school library and your state library. There are a lot of databases you can dive into. I would recommend getting a paid subscriber account on Spokeo: you can find phone numbers, emails and addresses. At a basic level of research, if you are doing people-finding, I think that’s a great thing. If you marry that with a commercial database that you can access through your school — ReferenceUSA, for instance — that’s a powerful combination, and you can find the right people you need for stories.

I also like Twitter; it’s a great RSS feed, and it’s a great way to communicate with people and to stay on top of your beat. You need to figure out who is “in the know” in your community or district. And there are great tools to use on Twitter, like Snapbird, which lets you search older tweets that aren’t in Twitter’s regular 10-day archive. I also love Twiangulate as a tool for finding connections among people. If you’re starting a story or a new beat — say politics in Brooklyn, or you’re generally covering a certain area in Los Angeles or Boston — go on Twiangulate and find the local newscasters and politicians. See who those people follow in common — who follows them, and who they follow. You can get to know the influencers and the people who know the news there really quickly using that.

This doesn’t replace good, on-the-street shoe-leather reporting. Ian Fisher, the day editor at the New York Times, calls it the “new media way of traditional reporting.” When you combine social media with boots on the ground — going into the neighborhood and talking with people — you just have so much more information, so much more quickly.

JR: You mentioned data earlier. Let’s say a young journalist is in a new town or city and just getting used to things. What are some of the local sources of information that you would recommend locating?

Barbara: In New York City, for example, real property records are all online, meaning that you can see the complete record of ownership going back to the 1960s — including the actual documents of ownership. That means you can also find spouses, addresses, previous addresses, business names and partners, middle initials — details that allow you to go even deeper. That’s what good researchers and reporters do really well. They can find bankruptcy papers, for instance, and use the information there — how many children someone has, their spouse, their health insurer, their employer — to open other avenues of information.

Always check out your local government website and search it for “databases” to see what comes up. Look for databases that relate to your beat. If you’re covering business and real estate, go to the Department of Planning databases or their statistical tools, or whatever you can find. Then go to the researcher or statistician or librarian, or whoever works in the area you’re covering, and talk to that person. Then go out and find local experts, people in area institutions who use information, and ask them which databases they use and what sorts of data are available within the beat you’re researching.

The amount of information is almost endless. The Web is around 500 times bigger than what Google can crawl, apparently. I’ve heard that LexisNexis, Factiva and databases like GuideStar, which provides information on nonprofit organizations, uncover only about another 1% of what Google can’t search. So with all the rest of that structured data and paywall-protected data, you have to find a way into it. You should also look for open data sites like those powered by Socrata — Data.dc.gov, for instance.
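
Many of the portals Gray points to here (Socrata-powered sites such as Data.dc.gov) expose each dataset through a simple JSON endpoint. As a rough illustration, a reporter comfortable with a little scripting could pull rows directly; note that the portal domain and dataset ID below are hypothetical placeholders, not anything Gray names:

    # Minimal sketch of pulling rows from a Socrata-powered open data portal.
    # The domain and dataset ID below are hypothetical placeholders; look up
    # the real ones on the portal covering your beat.
    import json
    import urllib.request

    PORTAL = "https://data.example.gov"      # e.g., a city's Socrata portal
    DATASET_ID = "abcd-1234"                 # each Socrata dataset has a short ID
    URL = f"{PORTAL}/resource/{DATASET_ID}.json?$limit=50"

    with urllib.request.urlopen(URL) as response:
        rows = json.load(response)           # the endpoint returns a list of JSON objects

    for row in rows[:5]:
        print(row)                           # inspect field names before filtering further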

Just as an example of how this works, the Times has an excellent computer-assisted reporting team, and they recently found 11,000 records of oil and gas leases related to hydrofracking in counties around the nation. This great series, “Drilling Down,” was done by reporter Ian Urbina and Jo McGinty, a computer-assisted reporting specialist. It was the kind of data that was public but not publicly available, so they had to use the Freedom of Information Act to obtain it. We should all be looking for places where information that has emerged through the FOIA process can be made available and shared. DocumentCloud is one great example.

JR: What does your general research checklist look like for young journalists reporting a story?

Barbara: First, survey what has already been written. Go into Lexis, Factiva and the EBSCO databases and see what has been reported on your topic before, so you’re not reporting in a vacuum. That’s the first thing. Second, try to find experts and stakeholders and ask about further data — where to find it and how to interpret it. Third, find people who are going to help drive the story.

Find the statistics that show how an issue is affecting your community generally and then locate people who are being affected specifically. And finally, show your work to your audience. If you get documents, make them available so the readership and community can continue the dialogue.

____

Barbara Gray’s quick list of recommended free, or almost free, sites for journalists:

  • Spokeo: For finding people; the paid version includes phone numbers and emails.
  • Open CRS: An archive of Congressional Research Service reports that have been released to the public. These reports contain a lot of sourced statistics, data and analysis.
  • Wayback Machine: A webpage archive where you can find interesting material, such as the restricted records the New York Times found in an archived version of Newt Gingrich’s Center for Health Transformation website. (A short script for checking archived snapshots programmatically appears after this list.) Note: Also check the caches of multiple search engines like Google, Bing, Blekko and DuckDuckGo; they all have different webpage caches.
  • Twitter search tools: Topsy and Snapbird for searching archived tweets.
  • Twiangulate: Site for finding out who follows whom.
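
For the Wayback Machine entry above, the Internet Archive also exposes a simple availability API that returns the archived snapshot closest to a given date. The sketch below uses only Python’s standard library; the target URL and date are placeholders for illustration:

    # Sketch: query the Internet Archive's Wayback Machine availability API
    # for the snapshot closest to a given date. The target URL is a placeholder.
    import json
    import urllib.parse
    import urllib.request

    def closest_snapshot(url, timestamp="20110101"):
        """Return info on the closest archived snapshot for `url`, or None."""
        query = urllib.parse.urlencode({"url": url, "timestamp": timestamp})
        api = f"https://archive.org/wayback/available?{query}"
        with urllib.request.urlopen(api) as response:
            data = json.load(response)
        return data.get("archived_snapshots", {}).get("closest")

    snapshot = closest_snapshot("example.com")
    if snapshot:
        print(snapshot["url"], snapshot["timestamp"])
    else:
        print("No archived copy found.")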
