An edited version of the webinar is now available on the Australian National Data Service's YouTube channel. If you didn't get the chance to attend in person, we'd love to hear your feedback and any questions about our work.
While our project found the currently available tools and methods for data metrics to be immature, Griffith now has some of the necessary building blocks in place to take advantage of new developments as they evolve. These building blocks include prototype technical infrastructure (e.g. scripts for minting DOIs), a draft policy framework for managing DOIs, and suggestions for citation-related enhancements to our content repositories and discovery services.
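To illustrate, a DOI-minting script of the kind mentioned above typically assembles a metadata record conforming to the DataCite metadata schema and submits it to a minting service (such as the ANDS Cite My Data service or the DataCite API). The sketch below only builds the mandatory metadata fields; the DOI, creator, and title values are placeholders (10.5072 is DataCite's reserved test prefix), and the submission step is omitted. This is a minimal illustration under those assumptions, not Griffith's actual implementation.

```python
# Minimal sketch: build the mandatory fields of a DataCite metadata
# record for a dataset, as a DOI-minting script might do before
# submitting it to a minting service. All values are placeholders.
import xml.etree.ElementTree as ET

DATACITE_NS = "http://datacite.org/schema/kernel-4"

def build_datacite_xml(doi, creator, title, publisher, year):
    """Assemble a minimal DataCite metadata record as an XML string."""
    ET.register_namespace("", DATACITE_NS)  # serialize without a prefix
    resource = ET.Element(f"{{{DATACITE_NS}}}resource")

    ident = ET.SubElement(resource, f"{{{DATACITE_NS}}}identifier",
                          identifierType="DOI")
    ident.text = doi

    creators = ET.SubElement(resource, f"{{{DATACITE_NS}}}creators")
    c = ET.SubElement(creators, f"{{{DATACITE_NS}}}creator")
    name = ET.SubElement(c, f"{{{DATACITE_NS}}}creatorName")
    name.text = creator

    titles = ET.SubElement(resource, f"{{{DATACITE_NS}}}titles")
    t = ET.SubElement(titles, f"{{{DATACITE_NS}}}title")
    t.text = title

    pub = ET.SubElement(resource, f"{{{DATACITE_NS}}}publisher")
    pub.text = publisher

    y = ET.SubElement(resource, f"{{{DATACITE_NS}}}publicationYear")
    y.text = str(year)

    return ET.tostring(resource, encoding="unicode")

record = build_datacite_xml(
    doi="10.5072/example-dataset-001",  # 10.5072 = DataCite test prefix
    creator="Smith, Jane",
    title="Example Research Dataset",
    publisher="Griffith University",
    year=2013,
)
```

In practice the resulting record would be POSTed to the minting service's metadata endpoint along with the landing-page URL for the dataset; that step is service-specific and so left out here.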
We used this project to explore both formal and informal ways of measuring impact. Part of the project involved evaluating the new Thomson Reuters Data Citation Index (DCI). Some of our subject librarians assisted with a trial of this new product during April 2013. To summarise the findings:
- The Data Citation Index is a good fit for the suite of Web of Knowledge products that Thomson Reuters offers. This is a positive start, in that data can be seen as just another product in the citation databases.
- However, the DCI is still an immature product. Issues identified include the quality of the data (which depends on journal policies and disciplinary conventions that have not yet evolved to meet these needs), the limited coverage of disciplines, and the small number of Australian repositories that have been harvested.
- For Griffith, the cost of the DCI currently outweighs the benefits, but we should re-evaluate this regularly as costs change (e.g. if a national site licence were negotiated) and as the content improves and expands.
Our project aimed to raise awareness of data citation and impact among data collection owners. This was an important part of the project but also the most challenging. When communicating with researchers, we have found that our credibility improves when we are:
- aware of disciplinary differences in citation practices
- honest about the still small and partial evidence base for the citation advantages associated with open data, and
- realistic about the lack of rewards for researchers for sharing data and having it cited by others.
Promoting a culture of data citation will be a long-term ongoing process. Here at Griffith, we are now at the point where we have infrastructure in place for data to be deposited and for DOIs to be minted, and procedures that ensure these processes are understood. Griffith’s new best practice guidelines (to be released soon) will incorporate data citation as part of a holistic view of data management, and over time we would hope that information resources and training courses will reflect data citation practices better than they do now.
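One concrete practice the guidelines can promote is a consistent citation form for datasets. The DataCite-recommended pattern (Creator (PublicationYear): Title. Publisher. Identifier) can be generated mechanically from the same metadata used to mint the DOI. The function below is an illustrative sketch with placeholder values, not an excerpt from Griffith's guidelines, and it displays the DOI as a resolver URL per current convention.

```python
# Sketch: format a dataset citation in the DataCite-recommended form
#   Creator (Year): Title. Publisher. DOI-as-URL
# All field values below are placeholders for illustration.

def format_data_citation(creators, year, title, publisher, doi):
    """Return a citation string built from minimal dataset metadata."""
    names = "; ".join(creators)  # multiple creators separated by semicolons
    return f"{names} ({year}): {title}. {publisher}. https://doi.org/{doi}"

citation = format_data_citation(
    creators=["Smith, Jane", "Doe, John"],
    year=2013,
    title="Example Research Dataset",
    publisher="Griffith University",
    doi="10.5072/example-dataset-001",  # DataCite test prefix
)
```

Generating citations this way, rather than asking researchers to hand-assemble them, is one small way repository and discovery services can lower the barrier to correct data citation.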
There are still many external drivers that are just as important as, if not more important than, these institutional efforts. These factors include:
- the scope of current qualitative and quantitative assessments of research quality
- the policies and guidelines of scholarly journal publishers, and
- the poor support for data citation in commonly used style manuals and bibliographic management software.
In light of these external factors, one of our final lessons learned was about being realistic about what can be achieved by a single institution. If we want to realise the benefits of data citation fully, collective action will be needed on many fronts.
We'd like to thank ANDS for funding some of the work described here, and for providing information and opportunities for discussion with other institutions embarking on similar projects. While the Data Citation Infrastructure Establishment Program has now formally ended at Griffith, our work in the areas of infrastructure, impact and outreach will continue. We look forward to contributing more as the broader ANDS partner community works together to develop data citation solutions and services.