ScraperWiki Tries to Turn Journalists Into Hackers

June 23, 2012  |  All Things Digital

More and more, regular people are learning how to program: Codecademy is drawing hundreds of thousands of students; hackathons have become a seemingly everyday happening across tech-savvy cities like San Francisco; and now some are even calling for children to learn to program in elementary school. So, are journalists setting themselves up to get left behind by technology again? Not this weekend, at least.

At the offices of the San Francisco Chronicle, a hackathon called NewsHack Day kicked off yesterday with a day-long session aimed at teaching reporters how to scrape data for their stories. Led primarily by ScraperWiki’s Thomas Levine, attendees got a crash course in how to find and pull down data, clean it up, and apply it to projects. “Computers can do anything that a team of interns can do,” Levine said.

My project? Instantly scraping Securities and Exchange Commission filings for the “compensation tables” that detail what company execs get each year in salary and perks. Done manually, digging those tables out of multiple documents can be slow, but they often contain interesting insights.

Even if some of the drudgery of document-reporting can be outsourced to a computer, though, technical know-how isn’t enough to make a strong story. Levine used a Department of Labor Web page that linked to unions’ collective bargaining agreements as part of his introductory lesson. But, while looking at the site, he said he didn’t know the difference between “private sector” and “public sector” unions.
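To give a flavor of what that kind of table-scraping involves, here is a minimal sketch using only Python's standard-library HTML parser. The sample markup and the `TableScraper` class are purely illustrative assumptions, not ScraperWiki's actual tooling; real SEC proxy statements are far messier than this.

```python
from html.parser import HTMLParser

# Hypothetical stand-in for a compensation table in a filing;
# a real document would need locating and cleanup first.
SAMPLE = """
<table>
  <tr><th>Name</th><th>Salary</th><th>Bonus</th></tr>
  <tr><td>J. Doe, CEO</td><td>$1,000,000</td><td>$250,000</td></tr>
  <tr><td>A. Smith, CFO</td><td>$600,000</td><td>$120,000</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    """Collect the cell text of every <tr> into a list of rows."""

    def __init__(self):
        super().__init__()
        self.rows = []        # completed rows
        self._row = None      # row currently being filled
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        # Only keep text that appears inside a table cell.
        if self._in_cell and self._row is not None:
            self._row.append(data.strip())

scraper = TableScraper()
scraper.feed(SAMPLE)
for row in scraper.rows:
    print(row)
```

Pointed at many filings in a loop, the same few dozen lines do the repetitive digging that would otherwise fall to that team of interns.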

