In 2020, FAIRsFAIR selected 10 repositories from an open call to receive support in CoreTrustSeal certification. At the end of this programme, repositories were asked to reflect on the experience through a questionnaire and interviews with the FAIRsFAIR support team. This blog series showcases their perspective.
In this post, we consider the experience of participating in the programme.
“At the beginning I wasn't so keen on the topic [digital preservation], but now I like it”
– Programme participant.
As of writing, nine of the participating repositories have submitted a CoreTrustSeal application. In the interviews, some mentioned FAIRsFAIR support as the only reason they were able to apply. So it is no surprise to find that, when rating on a scale of 1 to 5, repositories felt the programme lived up to their expectations.
Programme expectations: How did the programme live up to these expectations?
Sentiment analysis of participation experience
The questionnaire included free-text variables allowing participants to provide responses in their own words. Applying sentiment analysis summarises how positive or negative the replies are: in this case, positive, negative, and neutral words are counted to gauge the feeling in the text.
Using the sentimentr package in R (Rinker, 2019), which controls for intensifiers like "really" and negation words like "not", we can derive a value for each sentence of text. The presence of many positive words and few negative words gives a value above zero; the inverse gives a negative value, and neutral text scores zero.
Keep in mind this is applying a general lexicon, which itself is based on consensus ratings for words, so it is only indicative. Nonetheless, each dot represents a sentence in the repository's response to that variable's question and we see the contributions were mostly positive, reinforcing the participants' quantitative evaluations.
Sentence sentiment scores: positive/negative sentiment totals for sentences used in repository questionnaire free text variables
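The mechanism described above, counting polarity words while adjusting for nearby negators and intensifiers, can be sketched in a few lines. This is a simplified Python illustration of the general idea, not sentimentr's actual algorithm, and the tiny lexicon and weighting constants here are invented for the example:

```python
import re

# Toy lexicon for illustration only; real lexicons hold thousands of
# consensus-rated words.
POLARITY = {"like": 1.0, "helpful": 1.0, "positive": 1.0, "keen": 1.0,
            "negative": -1.0, "insufficient": -1.0}
NEGATORS = {"not", "never", "no"}
AMPLIFIERS = {"really", "very", "so"}

def sentence_sentiment(sentence: str) -> float:
    """Return a polarity score: > 0 positive, < 0 negative, 0 neutral."""
    words = re.findall(r"[a-z']+", sentence.lower())
    score = 0.0
    for i, word in enumerate(words):
        weight = POLARITY.get(word)
        if weight is None:
            continue
        # Look back two words for valence shifters.
        context = words[max(0, i - 2):i]
        if any(w in NEGATORS for w in context):
            weight = -weight      # "not helpful" flips the polarity
        elif any(w in AMPLIFIERS for w in context):
            weight *= 1.5         # "really like" intensifies it
        score += weight
    # Dampen by sentence length so long sentences aren't inflated.
    return score / max(len(words), 1) ** 0.5
```

For example, `sentence_sentiment("I really like it")` comes out positive, `sentence_sentiment("This was not helpful")` negative, and a sentence with no lexicon words scores zero, mirroring the above-zero/below-zero/zero reading of the dots in the figure.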
So how did the programme live up to expectations? Given the chance to select which areas of the programme were "most helpful", support workshops scored highest. Peer review of applications and one-to-one support were each mentioned by over half the repositories.
Helpful areas: Which areas of the programme did you find most helpful?
Reflecting on the workshops, participants felt the sessions reduced the time needed to prepare an application, thanks to learning from the questions asked and the experiences shared by others.
COVID-19 necessitated a switch to online meetings. The switch enabled contact to continue and reduced the burden on resources by removing the need to travel to events, but participants noted a downside in the lack of time for question-led discussion. They also mentioned that the switch compromised the ability to forge the strong connections that in-person events allow.
Among the other areas, one-to-one sessions with peers provided expert, tailored support, and repositories appreciated the quick responses from mentors. They also found the practice reviews informative, as these helped them understand what kind of input reviewers were looking for and what kind of feedback to expect.
Nine repositories had to address domain- or discipline-specific areas as part of their certification, yet even though they manage different types of preservation, their needs were consistent across the board. Only three felt that their preferred support routes had changed over time; when it came to support approaches, preferences were set and did not alter through the programme*.
The project provided repositories with €10,000 to help with CoreTrustSeal preparation. Only two felt this sum was insufficient, and there was little variation in what the money was spent on. Indeed, for some, the financial support was the only reason they could pursue CoreTrustSeal.
Financial support: What parts of the work did you spend the FAIRsFAIR financial support on?
In the next blog post, we will look at recommendations for repositories thinking about applying for CoreTrustSeal certification, and for programmes to help them.
This series complements FAIRsFAIR Deliverable 4.3 Report on the certification support and guidance for repositories and reviewers, which describes the support programme, challenges, lessons learned, and recommendations from the perspective of the FAIRsFAIR team. Learn about the Repository Support Programme on the FAIRsFAIR website.
Rinker, T. W. (2019). sentimentr: Calculate Text Polarity Sentiment (version 2.7.1). http://github.com/trinker/sentimentr
* Based on the CESSDA Trust Group overview of support approaches: https://doi.org/10.5281/zenodo.3621378