In the previous post, I described the importance of having a digital analytics measurement plan and presented some essential elements for correct and efficient use of Google Analytics (GA). However, recent work I’ve conducted makes me wonder how advanced (or not) the use of digital analytics - and GA specifically - is amongst development organizations. My recent experience is limited to six organizations (different in size, resources and capacity), so the sample is clearly small. But some of the trends I observed are probably more common than not.
Google Analytics is not always used to its full potential
In reviewing how different websites use GA, I discovered huge differences. For some organizations, the setup of Analytics is far from optimal. For example, one organization didn’t have a clear understanding of the differences between a GA account, property and profile, which resulted in an unstructured proliferation of accounts.

The use of different reporting views, as well as filters and advanced segments, is also not very common. This means that Analytics data are only analysed in aggregate, without telling you much about the specific audience you intend to reach. For example, if your website targets users in North Africa and the Middle East, you need to be able to single out traffic from these regions to better analyse your target audience.
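To illustrate why a segment matters, here is a minimal sketch in Python. The session records and country list are made up for illustration - this is not the GA API, just the logic a regional segment applies to otherwise aggregate data:

```python
# Hypothetical session records, standing in for data exported from GA.
sessions = [
    {"country": "Egypt", "pageviews": 3},
    {"country": "Morocco", "pageviews": 5},
    {"country": "United States", "pageviews": 2},
    {"country": "Jordan", "pageviews": 4},
]

# A "segment" for the target regions (an assumed, partial country list).
TARGET_COUNTRIES = {"Egypt", "Morocco", "Jordan", "Lebanon", "Tunisia"}

# The segment keeps only traffic from the audience you intend to reach.
target = [s for s in sessions if s["country"] in TARGET_COUNTRIES]

total_pageviews = sum(s["pageviews"] for s in sessions)
target_pageviews = sum(s["pageviews"] for s in target)

print(f"All sessions: {len(sessions)}, pageviews: {total_pageviews}")
print(f"Target segment: {len(target)}, pageviews: {target_pageviews}")
```

The aggregate figures alone would hide that the target regions account for most - or very little - of the traffic; the segment makes that visible.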
Tracking goals and ‘conversions’ is not always common practice. Goals can be set up in various ways in GA to track users’ interaction with the website - when they scroll on a page, click a link, print a page, comment, or spend a certain amount of time on a page. This can provide a great deal of information to website managers and editors, helping them improve the way information is presented and webpages are organized, as well as increase users’ engagement.
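In spirit, a goal boils down to a rule over sessions and a conversion rate. A minimal sketch, with invented engagement numbers rather than GA’s own calculation:

```python
# Hypothetical per-session engagement data (times in seconds).
sessions = [
    {"time_on_page": 12, "clicked_link": False},
    {"time_on_page": 95, "clicked_link": True},
    {"time_on_page": 240, "clicked_link": False},
    {"time_on_page": 30, "clicked_link": True},
]

GOAL_SECONDS = 60  # example goal: user spends at least a minute on the page

def conversion_rate(sessions, predicate):
    """Share of sessions that completed a goal, defined by `predicate`."""
    converted = sum(1 for s in sessions if predicate(s))
    return converted / len(sessions)

time_goal = conversion_rate(sessions, lambda s: s["time_on_page"] >= GOAL_SECONDS)
click_goal = conversion_rate(sessions, lambda s: s["clicked_link"])

print(f"Time-on-page goal: {time_goal:.0%}")
print(f"Link-click goal: {click_goal:.0%}")
```

Each goal is just a different predicate over the same sessions, which is why a single site can track several goals - scrolls, clicks, prints, comments, time on page - side by side.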
Only one of the six organizations stood out in using an advanced configuration of Google Analytics to create different reporting views, filter data, and track goals and conversions.
Google Analytics too often stands alone
I have highlighted before the importance of a digital analytics and measurement plan - and how Google Analytics may eventually be just one part (even if the most important one) of your data collection and analysis system. Unfortunately, I didn’t find much of this in the websites I’ve recently reviewed.

On the one hand, while Analytics is the default tool to track digital analytics, in most cases it is also the only monitoring tool. On the other hand, when digital analytics are collected from different sources (e.g. website, newsletter, RSS feeds, social media), more often than not they are not presented and analysed together. Finally, not all organizations regularly produce actionable reports on the basis of their analytics, to inform future actions and improvements to the website.
Only one organization presented a more advanced understanding of its digital analytics process, with multiple data collection points (e.g. website, newsletter, RSS feeds, social media) feeding into a dashboard spreadsheet that used formulas and calculations to avoid double counting and over-reporting of metrics. Even if there was no document describing a strategy, this is already a great step towards more efficient use of digital analytics.
What can be done?
I think a lot could be achieved through the availability of more specific content for international development, and the open exchange of experiences around digital analytics for the development sector.

The majority of information and guidance available online, while comprehensive, tends to focus on e-commerce and more business-oriented websites. Other sources, such as the Digital Analytics Programme (DAP), provide a good example of guidance, best practices, training and support in digital analytics. However, the target audience is also very specific, DAP being designed for US government agencies that provide information to the public. Ultimately, not much is available that focuses specifically on digital analytics for development - and for information and knowledge services in particular.
Secondly, I think website administrators and managers should be more open about how they do digital analytics, as ODI has been doing by sharing its M&E dashboard. Knowledge-sharing and learning opportunities should be created for practitioners to exchange notes, learn from each other, and identify good practices and examples that can be replicated. Ideally, web managers should also be open about the actual numbers behind their website stats. Especially for publicly-funded websites, this would mean more transparency and the possibility to compare and benchmark different websites.
Finally, I think donors should play their part in fostering better use of digital analytics in the projects and programs they fund. Besides acting as conveners for peer-learning initiatives around good use of digital analytics, donors should provide stronger guidance and support in this area, to standardize data tracking and collection across different projects. Ideally, for donor-funded websites and knowledge services, there should be more than a mention of a few, poorly selected web metrics in the project logframe: a digital analytics and measurement plan should be developed as part of the project inception phase.
In the next post in this series, I’ll look specifically at what metrics and indicators could be most useful, amongst the dozens available, for development websites and knowledge services.