Measuring the impact of your communications, dissemination and exploitation in EU-funded projects


Authors: Nicole McNeilly, Nikolaos Partarakis, Xenophon Zabulis, Merel van der Vaart

Waag and the Mingei partners have been collaborating intensely to optimise the project’s impact for the participating organisations and for wider heritage, crafts and technical communities (and beyond). In our previous blog, we shared key barriers and strategies to delivering impactful communication and dissemination. Building on that, we now share ways for you to measure the impact of your communication, dissemination and exploitation (CDE), based on research conducted in the early months of Mingei and a more recent survey of professionals with experience in EU-funded projects.

  1. The impact of your communication 

To measure the success of online communication, the Mingei consortium followed the models and methods developed as part of the Let’s Get Real research project, in particular Let’s Get Real 1 (How to evaluate online success) and Let’s Get Real 2 (Measuring digital engagement); see the Bibliography below.

Figure 1. This image from our deliverable shows a number of indicators for measuring online communication, drawn from the Let’s Get Real project.

The list below summarises some of these tips, alongside suggestions shared in our survey.

  • Analyse growth and interaction on social media platforms, including LinkedIn, Facebook, Twitter, YouTube and Instagram. Figure 1 above shows some specific indicators for some of these platforms. 
  • Explore online analytics to get insight into who is engaging with your website content and how. Google Analytics can help you collect data about the geographic locations of your visitors; how people find your content (e.g. through search or social media); which content people explore most often and how long they stay; how your content is accessed (e.g. mobile or laptop); and the differences between the reach of organic and paid content, if you use this feature. 
  • Assess if you have been mentioned on or by platforms that are key to your project: e.g. for Mingei, have we been featured on any crafts channels? 
  • Assess to what extent your project hashtag(s) have been used across different channels (though it’s easier if this hashtag is only used in relation to your project, which might not always be the case). 
  • Assess your reach through your newsletters. Consider growth (and what encouraged it) and compare interaction (through open and click-through rates) against external benchmarks as well as your own past results (a simple calculation is sketched after this list).
  • Set a Google alert to track how often your project, activities or partners are discussed (and therefore contributing to sector or policy debates). Tools like Mention provide more extended versions of this functionality.
  • If you have a platform people can sign up to, assess how many users have signed up and when, and analyse how many of these users are active (e.g. regular users).
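
As a minimal illustration of the newsletter and sign-up points above, the Python sketch below turns exported newsletter counts and a basic activity log into the figures mentioned. The function names, data layout and example numbers are our own assumptions for illustration, not part of any Mingei tooling.

```python
from collections import defaultdict
from datetime import date

def newsletter_rates(delivered: int, opened: int, clicked: int) -> dict:
    """Open and click-through rates from the counts most email tools export."""
    return {
        "open_rate": opened / delivered if delivered else 0.0,
        "click_through_rate": clicked / delivered if delivered else 0.0,
    }

def monthly_active_users(activity_log):
    """Count distinct users active in each month from (user_id, date) events."""
    active = defaultdict(set)
    for user_id, day in activity_log:
        active[(day.year, day.month)].add(user_id)
    return {month: len(users) for month, users in sorted(active.items())}

# Example with made-up numbers: 1,200 newsletters delivered, 384 opened, 96 clicked
print(newsletter_rates(1200, 384, 96))
# -> {'open_rate': 0.32, 'click_through_rate': 0.08}

# Example platform activity log: (user_id, date of activity)
log = [("anna", date(2020, 5, 3)), ("piet", date(2020, 5, 20)), ("anna", date(2020, 6, 1))]
print(monthly_active_users(log))
# -> {(2020, 5): 2, (2020, 6): 1}
```

Whatever you calculate, comparing the result against a benchmark from an earlier reporting period is usually more telling than the absolute number.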

  2. The impact of your dissemination

In Mingei, the consortium endeavours to offer open access to its scientific results reported in publications, to the relevant scientific data and to data generated throughout the project lifetime. 

Open access publishing can have more impact in terms of citations than publishing in closed-access journals. Where possible, we aim for “gold” open access, meaning that the output is made openly available by the publisher, free of charge and immediately. Wherever “gold” is not possible, “green” open access is pursued: the initial publication may not be open, but the author provides free alternative access (e.g. via an institutional repository). The aim is to maximise scientific impact by publishing in open-access yet well-regarded journals, as well as in blogs and publicly available white papers. 

The impact of your publication relies to a great extent on the reputation of the journal in which you publish. Taylor and Francis suggest that you should consider the reputation of the editorial board in your community, the readership of the journal, and its traction with policymakers. 

Figure 2. This image from our deliverable shows a number of impact metrics for journals and publications. The pros and cons of these are discussed in this Taylor and Francis article.

When reporting on the impact of your dissemination activities, you might mention the impact factor of the journals in which you have published; the number of downloads or views of the research; and the number of citations. For the citation analysis, you might have to check several databases to review how often you have been cited. 
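
If you need to repeat this citation check regularly, a short script can query several open databases by DOI. A minimal sketch follows, using the public Crossref and OpenAlex REST APIs (endpoint paths and field names as documented by those services at the time of writing); the DOI shown is a placeholder to swap for one of your own publications.

```python
import requests

def citation_counts(doi: str) -> dict:
    """Look up citation counts for one DOI in two public databases.

    Counts usually differ between databases, which is why checking
    several sources gives a fairer picture of uptake.
    """
    counts = {}

    # Crossref: number of indexed works that reference this DOI
    r = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    if r.ok:
        counts["crossref"] = r.json()["message"].get("is-referenced-by-count")

    # OpenAlex: independent citation count for the same work
    r = requests.get(f"https://api.openalex.org/works/doi:{doi}", timeout=10)
    if r.ok:
        counts["openalex"] = r.json().get("cited_by_count")

    return counts

if __name__ == "__main__":
    print(citation_counts("10.1234/placeholder-doi"))  # replace with a real project DOI
```

Databases such as Scopus, Web of Science or Google Scholar report their own counts, but these typically have to be checked manually or through institutional access.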

Mingei’s measurable outputs include:

  1. Scientific impact:
    The consortium has contributed more than 35 scientific publications to prestigious journals and conferences in the area of cultural heritage. All papers are available in gold or green open access.
    All publications have been uploaded to Zenodo (OpenAIRE). Furthermore, community pages were created on Zenodo and ResearchGate for the Mingei project, and all publications are listed on these pages in addition to the Mingei project website.
  2. Tool adoptions by stakeholders: 10
  3. Digital assets integrated: 10,000
  4. New digitisations: 2,000
  5. Heritage crafts digitised: 7
  6. Adopted or curated content and digital assets in international repositories: 10 (with more in the backlog)
  7. Contribution to public knowledge (i.e. number of Wikipedia entries or edits): 3

You should also report how many conferences or events you are presenting at, whether online or in person. You should be able to track how many people were in the (digital or physical) room when you presented and how many more might now be aware of your results from reading the conference programme (that is, the total number of conference registrants). 

  3. The impact of exploitation: look at impact in terms of outcomes and not just outputs

‘All the usual metrics (likes, comments […]) can be quite meaningless. Tell a story using some of those, but focus on impact and outcomes.’ – survey respondent

A common theme in the responses to our survey was that the impact of CDE is not just about numbers (the outputs). Rather, impact can – where possible and where capacity exists – be measured more qualitatively, looking to quality metrics over quantity. It is generally thought that an impact narrative is strongest when you can ‘make a story out of it’ and when qualitative indicators are presented together with more quantitative results. 

Some more qualitative indicators that were suggested to us relating to exploitation include: 

  • Assessing for how long, by whom and for what purposes the project resources are being used after the project ends.
  • Counting the number and type of project or research collaborations that begin as a result of the project.
  • Measuring the understanding and satisfaction of those who use the project resources.

Survey respondents also suggested indicators that could help you think about the impact and exploitation of the project’s resources, including changes in practice as a result of using those resources and professional development amongst partners and participants. It is challenging to measure a project’s influence on policy change, as this sort of change takes time. Nonetheless, some examples were shared where materials published openly and for broad audiences were picked up and used by policymakers as well as in education. 

‘Was it useful for someone in the end? How many people have used it, quoted it, got inspired by it?’ – survey respondent

Interestingly, one survey respondent reported project outputs being politicised and misused, but more commonly we heard that it was difficult to discern usage and influence after the project ends – perhaps because people are already moving on to the next project and lack the capacity to reflect and measure change. 

Measuring the potential impact of exploitation in Mingei

‘Having [crafts] knowledge on the [Mingei] Platform is not the goal, but the goal is also to exploit it, meaning that the presentation modalities that we build can help automate the process of creating educational materials, creating demonstrations, and so on.’ – Xenophon Zabulis (FORTH) at the Mingei Day online seminar

In Mingei, we are focusing on measuring impact for the partners involved in the project. At the same time, we have also reported on the potential impact through exploitation in the following areas: 

  • Capacity building for heritage professionals (where 83 professionals have already benefited from a curriculum developed for the DigiTraining project)
  • Creative industries exploitation and impact (where we have seen how technology and crafts combine in this example of a digitally enhanced woven handbag)
  • Future research projects and closer research collaboration (including two Greek crafts projects, close working with Europeana and the CRAFTED project, and plans to further extend the technology)
  • The possibility to bring crafts to previously less catered-for audiences (including creating technology that can be reused for the benefit of those who are deaf or hard of hearing, or blind, e.g. through the sign-language narration in the mastic pilot in Chios)
  • Future heritage crafts education (where training materials were captured for apprentices in each of the Mingei crafts contexts)
  • Immersive learning experiences for museum visitors (with the open-source technology in terms of crafts representation and its gamification being made available for further development)
  • The reuse of the crafts content for any number of purposes (because the content is now published on several open platforms, including, for example, the Mingei Open Platform, Europeana and Zenodo). 

Such reuse has the potential to support sustainable tourism and destination management, and a stronger, more competitive Europe. 

Figure 3. Design pattern game (left) and punch card game (right), part of the silk pilot at HdS, Krefeld, Germany. 

Final impact measurement tips

We have learned that people are striving to tell more meaningful stories about the impact of their CDE activities and the project’s outputs and learnings. We finish this article with five top tips to help you measure and report on your project’s impact right from the start to the very end. 

  1. Start off with a benchmark – that is to say, know where you started from and when! Capture change on a regular basis (e.g. every six months) or as often as you are required to report on your progress. 
  2. Do your research right at the start. For example, which channels should you engage with? What hashtags can you use? What hashtag could you create for your project, so that you can later track its uptake?
  3. Use a tool like the Europeana Impact Playbook to help you think about who you will have impact for and how you can measure this. 
  4. Report to your funders in a standardised way – set out how you will report at the beginning of the project and follow through with this so that they can easily see the progress in your communications and dissemination. 
  5. Maintain a log of direct and indirect impact that all project partners can contribute to. This could be as simple as a Google document where you share examples of tweets, publications or citations related to your project outputs. 

Bibliography

Malde, S., Finnis, J., Kennedy, A., Ridge, M., Villaespesa, E. & Chan, S. (2013), Let’s Get Real 2: A Journey Towards Understanding and Measuring Digital Engagement. Report from the second Culture24 Action Research Project. https://www.keepandshare.com/doc/6593572/lets-get-real-2-colour-pdf-11-2-meg?da=y (accessed 18/05/2020)

Finnis, J., Chan, S. & Clements, R. (2011), Let’s Get Real: How to Evaluate Online Success? Report from the Culture24 Action Research Project. https://www.keepandshare.com/doc/3148918/culture24-howtoevaluateonlinesuccess-2-pdf-september-19-2011-11-15-am-2-5-meg?da=y&dnad=y (accessed 18/05/2020)

Finne, H., Day, A., Piccaluga, A., Spithoven, A., Walter, P. & Wellen, D. (2011), A Composite Indicator for Knowledge Transfer. Report from the European Commission’s Expert Group on Knowledge Transfer Indicators. European Commission. https://ec.europa.eu/research/innovation-union/pdf/kti-report-final.pdf (accessed 18/05/2020)