Published on March 6, 2014
Outline
- About EPRC
- Results Framework at EPRC
- Why Measure Communication
- What We Measure
- How We Measure
- Challenges of Impact Measurement
- Lessons Learned
About EPRC
- EPRC is a Uganda-based policy think tank.
- Mission: to foster sustainable growth and development of the Ugandan economy by advancing the role of research in policy processes.
- Conducts research and policy analysis, provides policy advice, and engages in policy outreach and engagement.
- Has a four-year strategic plan that guides its programmes, and a Policy Engagement and Communication (PEC) Strategy.
EPRC Results Management Framework
- To improve performance and implementation of the Strategic Plan, EPRC adopted a results management culture that focuses on outcomes and impact.
- EPRC has developed an in-house M&E data collection tool for tracking and reporting on results.
- The M&E function and tool are administered by the Information and Dissemination Unit.
- Indicators for measuring communication are integrated into the tool.
Why Measure Communication?
To ensure that our information products and services remain of the highest quality and reach our target audiences in the most effective manner. Specifically, to:
- Have well-crafted information products and services
- Improve management of product development, production and distribution
- Ensure we are reaching the intended audiences, in the right way and at the right time
- Assess the effect or impact of our products and services
- Increase use of the products and services and, in turn, improve uptake of research in policy processes
What We Measure
- Reach: the extent to which information is distributed, redistributed and referred to, i.e. breadth and saturation
- Usefulness: the quality of products and services, i.e. whether they are appropriate, applicable and practical
- Use: what is done with knowledge gained from our information products and/or services
How We Measure
- We monitor whether products and services meet the requirements that make them effective and useful to policy makers.
- A monitoring tool is used to track outputs and outcomes.
- Ten communication-related indicators are incorporated within the tool.
- We rely on other specific tactics for collecting information for each indicator, as will be discussed.
Indicators—Reach

1. Number of copies of a product distributed to people on the existing mailing list (hard copy and/or electronic form)
   - Data captured: number of copies sent; number of people on the mailing list
   - Data source: EPRC contacts database
   - Significance: helps to reach out to many people with our information products

2. Number of copies of a product distributed through additional distribution, e.g. trainings, workshops or meetings
   - Data captured: number of copies, venue and date of distribution
   - Data source: administrative record in the form of a distribution form
   - Significance: helps track distribution that occurs in tandem with related events, which increases the chances of information being understood and applied; helps build demand for the products
Indicators—Reach & Demand

3. Number of products distributed in response to orders/user-initiated requests
   - Data captured: number of phone orders; number of email requests; number of interpersonal requests
   - Data source: administrative records/forms kept by the Knowledge Management Specialist
   - Significance: helps to measure demand and thus indirectly gauges perceived value, appropriateness and quality

4. Number of file downloads in a time period (an internet user's transfer of content from the EPRC website to their own storage medium)
   - Data source: web server log files; web analysis software (Google Analytics) run by IT specialists
   - Significance: helps to identify which information products and topics are used most on the website, and which countries or regions use the website most
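As a minimal sketch of how download counts (indicator 4) could be tallied directly from raw web server logs, supplementing an analytics package: the log lines, file paths and request pattern below are illustrative assumptions, not EPRC's actual setup.

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache "combined" log format; in practice
# these would be read from the web server's access log file.
LOG_LINES = [
    '1.2.3.4 - - [05/Mar/2014:10:12:01 +0300] "GET /pubs/policy-brief-34.pdf HTTP/1.1" 200 51234 "-" "Mozilla"',
    '5.6.7.8 - - [05/Mar/2014:11:30:45 +0300] "GET /pubs/research-series-112.pdf HTTP/1.1" 200 88123 "-" "Mozilla"',
    '9.8.7.6 - - [06/Mar/2014:09:02:13 +0300] "GET /pubs/policy-brief-34.pdf HTTP/1.1" 200 51234 "-" "Mozilla"',
    '9.8.7.6 - - [06/Mar/2014:09:05:00 +0300] "GET /index.html HTTP/1.1" 200 1024 "-" "Mozilla"',
]

# Match successful (HTTP 200) GET requests for PDF publications only.
REQUEST = re.compile(r'"GET (\S+\.pdf) HTTP/[\d.]+" 200 ')

def count_downloads(lines):
    """Count successful PDF downloads per file path."""
    counts = Counter()
    for line in lines:
        match = REQUEST.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

print(count_downloads(LOG_LINES))
# /pubs/policy-brief-34.pdf is counted twice; the HTML page is ignored
```

A real pipeline would also filter by date range and deduplicate repeat requests from the same visitor, which is part of what tools like Google Analytics handle automatically.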
Indicators—Reach & Demand (continued)

5. Media mentions of EPRC events, staff and products
   - Data captured: media outlets; number and description of articles and radio/TV broadcasts; audience reach of the media outlets
   - Data source: administrative records/forms kept by the Knowledge Management Specialist
   - Significance: helps to measure demand and thus indirectly gauges perceived value, appropriateness and quality

6. Number of instances that products are selected for inclusion in a library or online resource
   - Data captured: number and type of publications selected; number and type of library, information centre or online resource
   - Data source: administrative records/forms kept by the Knowledge Management Specialist
   - Significance: captures reach and also serves as a proxy measure of quality, since librarians will ask for what they believe is beneficial to their clients/users
Indicators—Reach & Quality

7. Number of events held (policy engagements organized, e.g. workshops and conferences, to share and/or discuss emerging policy issues or to disseminate research findings)
   - Data captured: number of participants by gender and type of sector, e.g. NGO, public, private, media or donors
   - Data source: participants' registration form
   - Significance: captures reach as well as demand for, and feedback on, events
Indicators—Usefulness

8. Percentage of users who are satisfied with a product or service
   - Data captured: qualitative data covering user likes and dislikes, and attitudes measured using scales, e.g. how strongly they agree or disagree with statements on frequency, subject categories and technical quality
   - Data source: feedback forms distributed with the product; the Knowledge Management Specialist receives feedback via email. The ideal would be other forms of surveys (online surveys, telephone surveys or interviews with users), but these require time to develop tools and analyse data.

9. Percentage of users who rate the content of a product or service as useful
   - Data captured: qualitative data recording the relevance and practical applicability of the content, e.g. "Was the topic covered in the product interesting and useful to you?"
   - Data source: feedback forms distributed with the product. Ideally user surveys would be most appropriate, but they are not done for the same reasons as above.
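As a minimal sketch of how the satisfaction percentage (indicator 8) might be computed once feedback-form ratings are entered: the responses and the 5-point scale below are illustrative assumptions, not EPRC's actual form.

```python
# Hypothetical feedback-form responses on a 5-point agreement scale
# (1 = strongly disagree ... 5 = strongly agree) for a statement such as
# "The technical quality of this product met my needs."
responses = [5, 4, 2, 5, 3, 4, 4, 1, 5, 4]

def percent_satisfied(scores, threshold=4):
    """Share of respondents rating at or above the 'agree' threshold."""
    if not scores:
        return 0.0
    satisfied = sum(1 for score in scores if score >= threshold)
    return 100.0 * satisfied / len(scores)

print(f"{percent_satisfied(responses):.0f}% of users satisfied")
# 7 of the 10 sample responses are 4 or above, so this prints 70%
```

The choice of threshold (here, "agree" or stronger) should be stated alongside the indicator, since it directly determines the reported percentage.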
Indicators—Use

10. Number of users using an information product or service to inform policy
   - Data captured: number of policy recommendations provided to clients; number of recommendations used by clients; evidence showing how the recommendations have been used
   - Data source: mainly informal (unsolicited) feedback on the various policy recommendations, which comes via email, phone or in person; review of copies of policies, guidelines or protocols referencing or incorporating information from products or services
   - Challenge: users may not recall which particular information source they used, and information may be used but not referenced
Challenges
- Harmonization: reconciling different data requirements from different projects and data sources.
- Consistency: capturing information systematically, in ways that are straightforward, and on a regular basis.
- Measurement: outcomes from different information products and services vary and lead to a wide range of impact and influence, most of which is intangible and very hard to measure.
- Relationship building: not everyone within the department or organization may be motivated to embed M&E within their work.
- Time: collecting and analysing data and information may span a long period before the impact of communication products and services is reflected.
- Change: tools and the data collected keep changing, e.g. the advent of social media and web-based resources has brought new ways of sharing information, and thus of monitoring it.
Lessons Learned
- No one tool suits all communication M&E requirements; various tools are needed to capture specific communication products and services.
- Tool design requires the full participation of all parties that will be involved in using it.
- Continuous capacity building in tool use is needed to facilitate regular capturing of data.
Podcast 3: Measuring Communications Impact at EPRC
Presenter: Elizabeth Birabwa