
From our Tech Guru: Digital Platforms and Reporting
Thursday, 18 August 2016 12:46

Universities in the DRUSSA programme have been actively growing their Research Uptake Communication material, in the form of stories about the uptake of evidence-based development research happening at their institutions. An important element of that process is gauging the website audience's response to the digitally published content. Site manager Caite McCann shares her experience of digital platforms, with a focus on useful insights for reporting.

The Communications and Engagement unit of the DRUSSA project has worked with the member universities' Research Uptake Communication teams on a common digital platform throughout the project. As a result, the blog site is a record of the work of these teams: over five years it has published hundreds of blogs, eighteen e-Digests, and hundreds of documents in the document index. The platform was designed so that relevant usage data could be drawn from it. We have evaluated and reported on that data every quarter, which has given us some experience to share.

Digital platforms are great when you want to keep a geographically diverse audience up to date on activities and to share experiences. Because they are not face to face, though, it is not so easy to judge how effective they are – are people interested in what you are sending out? Do they find the stories shared on the blog site useful?

These are issues all digital Research Uptake Communicators will face, and it’s good to be able to share some experience and basic insights that may be useful to you as you tackle digital reporting.

  1. Terms you need to know – what are Descriptive and Inferential Statistics?
  2. Tips for planning your reporting
  3. Translating the data into a meaningful story
  4. Sharing our learnings: A quick overview of the data we reported on

1. Terms you need to know – what are Descriptive and Inferential Statistics?

Descriptive Statistics are basic statistics that describe what your users are up to: how often individual users visit your site (frequency), how long they stay on your site (duration) and where they go on your site (usage – which information is of use to them).

This is your proof that the site is used and that your audience finds it useful, and it gives you an indication of which information holds the most value for them. Of course, what you actually have is a series of numbers showing frequency, duration and usage. You still have to interpret this data so that it makes sense to your reader.
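As an illustration, frequency, duration and usage can each be read straight off a raw page-view log. The sketch below is a minimal example in Python; the log records, user IDs and page paths are hypothetical, not drawn from any particular analytics export:

```python
from collections import Counter, defaultdict

# Hypothetical page-view log: (user_id, page, seconds_on_page).
# The values are illustrative only.
log = [
    ("u1", "/blog/uptake-story", 120),
    ("u1", "/digest/18", 300),
    ("u2", "/blog/uptake-story", 45),
    ("u2", "/blog/uptake-story", 60),
    ("u3", "/documents/index", 500),
]

# Frequency: how often each individual user visits.
frequency = Counter(user for user, _, _ in log)

# Duration: total time each user spends on the site.
duration = defaultdict(int)
for user, _, seconds in log:
    duration[user] += seconds

# Usage: which pages attract the most views.
usage = Counter(page for _, page, _ in log)

print(frequency.most_common())
print(dict(duration))
print(usage.most_common(1))
```

In practice an analytics engine produces these numbers for you; the point of the sketch is that each descriptive statistic is a simple aggregation over the same visit data.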

"Over time you can compare usage against previous reporting periods to judge whether use is improving or declining"

Inferential Statistics: One way of doing this is to use inferential statistics. With inferential statistics you try to reach conclusions that extend beyond the immediate data alone, and you have some options here. If your site has different levels of users, you can use those levels for comparison. Another comparison is between registered and unregistered visitors to your site. Over time you can compare usage against previous reporting periods to judge whether use is improving or declining. How and what you compare depends on what your site is meant to do.
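A period-on-period comparison of this kind boils down to a percentage change between reporting periods, optionally split by user group. A minimal sketch, with made-up quarterly figures (not DRUSSA.net's actual numbers):

```python
# Hypothetical quarterly visit counts, split by user group.
# All figures are illustrative only.
quarters = {
    "Q1": {"registered": 320, "unregistered": 1200},
    "Q2": {"registered": 400, "unregistered": 1350},
}

def pct_change(old, new):
    """Percentage change between two reporting periods."""
    return round(100 * (new - old) / old, 1)

prev, curr = quarters["Q1"], quarters["Q2"]
for group in ("registered", "unregistered"):
    print(group, pct_change(prev[group], curr[group]))
```

Comparing the growth rates of the two groups against each other, rather than reading either in isolation, is what lets you infer something about your audience beyond the raw counts.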


2. Tips for planning your reporting
Usage-data collection for your site must be enabled from the moment the site goes live on the Internet. An effective manager works with the website developer to make sure that the site is built in such a way that you can track what is happening, that you can generate reports showing the site is fulfilling its purpose, and that any external analytics engine you wish to use, such as Google Analytics, is correctly linked. This requires forethought and strategic goal setting: what do you want your website to achieve, and what data do you need to evaluate whether you are achieving your objectives? If you need to, find a specialist in site analytics and ask them to help you create a site that allows you to automate reports with measurable outcomes. Make sure that the person who draws those reports has a good understanding of the field and can help you provide evidence that your website is achieving the goals you have set.

"In telling the story you need to think about your report-reader audience, as you translate data into a written report"

3. Translating the data into a meaningful story

You need to work out the story the data is telling, and interpret what it says about your content and your website audience in relation to your goals. In telling the story, think about your report-reader audience as you translate data into a written report. As with all statistical data, analyse it from every angle and consider the contextual circumstances; that will give you an interesting and nuanced story to share, and to draw on as you strategise for future development.


4. Sharing our learnings: A quick overview of the data we reported on

Without direct feedback, as one would get at a conference or workshop, DRUSSA.net has relied on usage statistics as a proxy for feedback, and compares data on quarterly and annual cycles.

For descriptive statistical data we track:

  • Unique users, that is, the number of distinct people who have visited the site at least once in a given period.
  • The number of unique users over an extended period, such as a year or three years, which is a measure of how the user-group network has widened as the project matured.
  • Various user groups, the most important of which is the self-registered group, which shows steady growth in interest. From the registration details we can check that the people in this group are in fact our target audience. Given that the only reason to sign up is to receive the DRUSSA Digest and eAlerts, and that the population of that group is small and specialised, our signup rate has been very good over a three-year period (see graph below for reference).


  • Contributions of blogs and documents. Since a number of contributors come from our primary stakeholders, we track and report on these as a measure of participation in the programme.
  • We use social media as another way of tracking interest and participation through open and closed groups, sign up rates, reach of posts, likes, follows and contributions.
  • Stats (provided by Mailchimp) for our quarterly email DRUSSA Digest, which show the open rate, the click rate (how many readers followed a link through to a site) and the amount of time spent reading the email. Mailchimp also allows you to include a Google Analytics tag in selected emails and track the site usage related to each email. This allows us to gauge the level of interest and to see which topics engage our readers the most.
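The email tagging mentioned in the last point works by appending Google Analytics' standard UTM parameters (utm_source, utm_medium, utm_campaign) to each link in the email; site visits carrying those parameters can then be attributed to that campaign. A minimal sketch of building such a link – the URL and campaign name below are illustrative, not real DRUSSA.net addresses:

```python
from urllib.parse import urlencode

def tag_link(url, campaign, source="mailchimp", medium="email"):
    """Append Google Analytics UTM parameters to a link so that
    site visits can be attributed to a specific email campaign."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{url}?{params}"

# Hypothetical digest link and campaign name, for illustration only.
print(tag_link("https://example.org/blog/uptake-story", "digest-q3"))
```

Mailchimp can add these tags for you automatically; the sketch only shows what the tagged links look like under the hood.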


For inferential statistics:

  • From our site we can track the number of page views, which shows us how much traffic the site receives, and by combining this with the average duration of a visit we can infer whether the site is interesting to readers.
  • Using IP address logging we can see whether visitors are returning users or new users. Returning users mean the site is interesting enough that people come back to it, which implies use of the information on the site. By tracking the numbers of new and returning users we can also measure audience growth.
  • Avg. Pages per Session gives us an idea of whether visitors have run across the site by accident or are actually interested in its content: the higher the number, the more we are reaching our target audience.
  • The same is true of Avg. Session Duration: the longer the duration, the higher the level of interest and participation.
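The metrics above are straightforward to compute once you have the underlying records: the session metrics are averages over session logs, and the new-versus-returning split is a set comparison of visitor identifiers between periods. A minimal sketch under the assumption of a toy session log and IP sets (analytics engines compute all of this for you):

```python
# Hypothetical session records: (pages_viewed, duration_seconds).
sessions = [(1, 15), (4, 310), (6, 540), (2, 95)]

avg_pages = sum(p for p, _ in sessions) / len(sessions)     # Avg. pages per session
avg_duration = sum(d for _, d in sessions) / len(sessions)  # Avg. session duration (s)

# Hypothetical visitor IPs from two reporting periods.
previous_ips = {"196.0.0.1", "196.0.0.2"}
current_ips = {"196.0.0.1", "196.0.0.3", "196.0.0.4"}

returning = current_ips & previous_ips   # seen before: evidence of repeat use
new_users = current_ips - previous_ips   # not seen before: audience growth

print(avg_pages, avg_duration, len(returning), len(new_users))
```

Note that IP addresses are only a rough proxy for individual users (shared networks and dynamic addresses blur the picture), which is one reason these figures support inference rather than exact counts.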


Caite McCann is the Information Systems Manager for OSD and the DRUSSA Programme