Pages

Thursday, February 23, 2012

How ODI uses digital tools for measuring success in research uptake

A fascinating peer exchange at DFID last week discussed the challenges of monitoring and evaluating research uptake and communication efforts, and some possible solutions. The session, facilitated by my colleague Pete Cranston, was the first in a series of three meetings organised as part of the R4D project to address different aspects of social media and engagement in and around development research.

The conversation was opened by an excellent presentation from ODI's Nick Scott. Over the past months, Nick has put a lot of effort into looking at how ODI does its own M&E of research uptake. As a result, the organisation has adopted a new analytical framework, and a new set of tools to track usage and uptake of its research outputs. The ODI approach and dashboard are well described by Nick in a recent post and presented in the slides below; if you missed them and are interested in the subject, make sure you read both.





What ODI's experience shows is why and how research organisations should escape the “tyranny of downloads”. Downloads and pageviews alone are simply not enough to capture the different ways in which users interact with digital content on today's social web. For ODI, for example, a good 10% of traffic is generated by social media. Tweets, shares and Facebook likes all add a new layer of information that needs to be taken into account. Using digital tools efficiently can help capture much more of this information and support a better assessment of the uptake and impact of research outputs.
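To make the point concrete, here is a minimal sketch of how you might measure the share of traffic coming from social media, starting from a list of referrer domains. The domain list, referrer data and function name are all invented for illustration; real figures would come from an analytics export.

```python
from collections import Counter

# Illustrative list of social media referrer domains; not ODI's actual setup.
SOCIAL_DOMAINS = {"twitter.com", "t.co", "facebook.com", "linkedin.com"}

def social_share(referrers):
    """Return the fraction of visits referred by social media sites."""
    counts = Counter(referrers)
    total = sum(counts.values())
    social = sum(n for domain, n in counts.items() if domain in SOCIAL_DOMAINS)
    return social / total if total else 0.0

# Hypothetical referrer log for ten visits.
visits = ["google.com", "t.co", "facebook.com", "google.com",
          "twitter.com", "bing.com", "direct", "t.co", "google.com", "direct"]
print(f"{social_share(visits):.0%} of traffic came from social media")
```

Counting referrals this way is what lets you say "10% of our traffic is social" rather than lumping everything into a single download or pageview total.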

Additionally, even when you are able to track all these elements, you need to make up your mind and decide which metrics you are actually going to focus on. With Google Analytics, for example, ODI focuses on just a few of the many things you can track: unique visitors, entry pages, country of origin and a handful of others. But the choice of metrics and indicators depends very much on the goals you want to achieve, and on how the information gathered can be put to use.
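The idea of narrowing a large pool of tracked statistics down to a chosen few can be sketched as below. The metric names and figures are made up for illustration and are not ODI's real numbers; the point is that the focus list, not the tracking tool, should drive the report.

```python
# Everything the analytics tool happens to track (invented figures).
tracked = {
    "unique_visitors": 12450,
    "pageviews": 48200,
    "entry_pages": {"/publications/report-1": 900, "/blog/post-2": 640},
    "top_country": "United Kingdom",
    "bounce_rate": 0.52,
    "avg_time_on_page_secs": 94,
}

# The indicators you actually report on, chosen to match your goals.
FOCUS_METRICS = ["unique_visitors", "entry_pages", "top_country"]

# Build the report from the focus list, ignoring the rest.
report = {name: tracked[name] for name in FOCUS_METRICS}
for name, value in report.items():
    print(f"{name}: {value}")
```

Anything not on the focus list is still collected, but it never clutters the report, which keeps the M&E output readable.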

But why track stats and pile up information if you can't make use of it? ODI's M&E reports are now more intuitive to read, more colourful and graphic, and able to present insights in accessible formats. They are shared throughout the organisation in a much more open way. Further, because the different statistics are tracked at the level of individual outputs, they give ODI's communications team the evidence to back up what it already knew, but could not always prove: using a balanced mix of channels and a more comprehensive communication approach creates a virtuous circle that gives more mileage to research outputs. ODI researchers are now more inclined to write opinion pieces and blog posts that complement research reports, or to tweet about their new publications.

Finally, the specific technical implementation that ODI has put in place is rather advanced and complex. It uses APIs, server logs and business intelligence applications to combine a set of different statistics into a structured dataset that can be queried at any moment through a dashboard interface. This is the result of six months of work and quite some investment in time and technology, so it probably wouldn't be a viable solution for smaller organisations. However, as Nick suggests in the video below, there is a huge need for a ‘think tank M&E’ tool that is affordable and allows organisations to track the footprint of research outputs across the social web. Hopefully we won't have to wait long before such an application is available.
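The core idea behind such a system, combining per-output statistics from several sources into one dataset that can be queried on demand, can be sketched in a few lines. This is not ODI's actual implementation: the output names, figures and table schema are all invented, and a real system would pull the numbers from analytics APIs and server logs rather than hard-coded dictionaries.

```python
import sqlite3

# Hypothetical per-output statistics from three different sources.
analytics = {"report-a": 1200, "report-b": 340}   # pageviews (web analytics)
social = {"report-a": 85, "report-b": 12}         # tweets + shares (social APIs)
downloads = {"report-a": 410, "report-b": 95}     # downloads (server logs)

# Combine them into one structured, queryable dataset.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE uptake
                (output TEXT PRIMARY KEY,
                 pageviews INTEGER, social INTEGER, downloads INTEGER)""")
for output in analytics:
    conn.execute("INSERT INTO uptake VALUES (?, ?, ?, ?)",
                 (output, analytics[output], social[output], downloads[output]))

# A dashboard front end would issue ad-hoc queries like this one,
# ranking outputs by combined reach.
for row in conn.execute("SELECT output, pageviews + downloads AS reach "
                        "FROM uptake ORDER BY reach DESC"):
    print(row)
```

Even this toy version shows why the approach pays off: once the statistics live in one dataset keyed by output, any question a dashboard needs to answer becomes a query rather than a manual collation exercise.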