
ChicagoTalks Looks & Sees


What this means is that Google Analytics can calculate Average Time on Page and Average Visit Duration based only on the difference between timestamps for successive page requests on your site. Average Time on Page is calculated as the average difference between the request timestamp for that page and the request timestamp for the next pageview that occurred within your site. If only a single page is viewed during a visit, that pageview does not figure into Average Time on Page, since there is no second timestamp to subtract from. (via Average Time on Page, Average Visit Duration, and Browser Timestamps | Google Analytics Tip of the Day)
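A minimal sketch of that timestamp-difference logic, using hypothetical hit data (this is an illustration of the calculation described above, not Google Analytics' actual implementation):

```python
from collections import defaultdict

# Hypothetical pageview hits: (visit_id, page, timestamp in seconds).
hits = [
    ("visit1", "/home",    0),
    ("visit1", "/article", 40),
    ("visit1", "/about",   100),
    ("visit2", "/article", 0),   # single-pageview visit
]

# Group hits by visit, in request order.
visits = defaultdict(list)
for visit_id, page, ts in hits:
    visits[visit_id].append((page, ts))

time_on_page = defaultdict(list)
for pageviews in visits.values():
    # Time on a page = the next request's timestamp minus this one's.
    # The last pageview of a visit (and any single-pageview visit)
    # has no "next" timestamp, so it contributes nothing to the average.
    for (page, ts), (_, next_ts) in zip(pageviews, pageviews[1:]):
        time_on_page[page].append(next_ts - ts)

for page, durations in time_on_page.items():
    print(page, sum(durations) / len(durations), "seconds on average")
```

Note that "/about" (the last page of visit1) and the lone pageview in visit2 never appear in the output, which is exactly why single-page visits don't figure into Average Time on Page.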

People traditionally think that the only thing that does well on Facebook is ‘top 10 cats.’ Actually our serious journalism does really well as well. People are realizing that looking at what people read is not evil. It shouldn’t be the only thing you chase and it’s merely one input into the editorial process but it’s not necessarily a negative thing. (via The Guardian Already Has An In-House Tool for “Attention Analytics.” Do You? - 10,000 Words)

Medium’s metric that matters: Total Time Reading (via Medium’s metric that matters: Total Time Reading — Medium Data Lab — Medium)

Factchecking of statements made by politicians and pundits is emerging as a much more common practice in political reporting. The American Press Institute will use its extensive networks within the news media, along with its credibility as a research group, to deliver research that improves the practice of fact checking. The institute will also work with outlets to significantly increase the adoption of fact checking practices as well as contribute to public debates on the topic. (via American Press Institute: Democracy Fund)

Consumers today have “contextual” analysis coming out of their ears. What they’re getting less of is the hard information — “what’s happening” — around which context is built. There are fewer reporters from general-interest publications covering city halls and statehouses, fewer devoted to issues such as the environment and healthcare, fewer keeping an eye on state and federal regulators. (via Supply of news is dwindling amid the digital media transformation - latimes.com)

Verification Handbook: Necessary tools for breaking news
The Verification Handbook, released yesterday by the European Journalism Centre, features tools and advice on verifying content in breaking news situations (via Verification Handbook: Necessary tools for breaking news | Media news | Journalism.co.uk)

very few members of the public would be interested in reading an “unvarnished” data journalism story so traditional journalism skills will always be necessary in finding and telling the stories. Where data journalists might be very good at finding the “what” of a story they need to be aware of the “why”, he said, and that is where collaboration and communication in the newsroom can lead to the best results. (via Are journalists being ‘intellectually outgunned’? | Media news | Journalism.co.uk)

Why Is This Year’s Flu So Dangerous for Young Adults? (via Why Is This Year’s Flu So Dangerous for Young Adults? | Mother Jones)

winter sky over Chicago http://ift.tt/1mKWrq2 (photo by biverson on Flickr)

Mining Deep Twitter To Turn History into a Storify

socialmediadesk:


This week marks the fifth anniversary of the horrific terrorist attacks in Mumbai, India. Over the course of three days, chaos ruled across the city formerly known as Bombay, as terrorists targeted hotels, cafes and train stations. It was one of the first major international news stories to break out across Twitter, before the Arab Spring and the 2009 Iranian election protests – and for me, it became one of the most difficult stories to reconstruct so many years after the fact.

I’ve been using Storify to create social media narratives since late 2010, and put it to use frequently during the protests and revolutions around the Arab World. While it was sometimes time-consuming, it was relatively straightforward, as I could construct each Storify in real time, or soon after a particular event took place. Social media has always been extremely ephemeral, so the faster I could build out a Storify, the greater likelihood I’d be able to capture whatever social media I was interested in documenting.

Back in November 2008, though, things were very different. There weren’t any social media archiving tools like Storify. Tweets contained relatively limited amounts of metadata. And it didn’t occur to most of us to make note of all of these historic tweets so we could utilize them later.

Fast forward to November 2013. All of those tweets from five years ago still exist. They’re part of Deep Twitter, buried in the archives and very difficult to surface through typical search results. Nonetheless, I wanted to reconstruct what happened that fateful week using Storify. This is how I went about doing it – and it wasn’t particularly easy.
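One way to dig those tweets out of Deep Twitter is to lean on the date-bounded since:/until: operators that Twitter's own search supports, generating a day-by-day set of search URLs to review by hand before pulling the best tweets into Storify. A minimal sketch of that idea follows; the query term and date range are illustrative, and this is not necessarily the exact workflow described behind the cut.

```python
from datetime import date, timedelta
from urllib.parse import urlencode

def daily_search_urls(query, start, end):
    """Yield one Twitter search URL per day, bounded with since:/until:."""
    day = start
    while day <= end:
        q = f"{query} since:{day} until:{day + timedelta(days=1)}"
        yield "https://twitter.com/search?" + urlencode({"q": q})
        day += timedelta(days=1)

# The days of the November 2008 Mumbai attacks (illustrative query term).
for url in daily_search_urls("Mumbai", date(2008, 11, 26), date(2008, 11, 29)):
    print(url)
```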
