Universities

Digital Curation Grants in US Library/Information Academic Departments

The US Institute of Museum and Library Services (IMLS) has recently awarded 38 Laura Bush 21st Century Librarian Program grants totalling $22,623,984.

Amongst the list of awards, I was struck by the following:

University of Illinois at Urbana-Champaign – Champaign, IL : Project Title: “Data Curation Education in Research Centers (DCERC)”

Award Amount: $988,543; Matching: $179,822

The University of Illinois Urbana-Champaign Graduate School of Library and Information Science, the University of Tennessee School of Information Sciences, and the National Center for Atmospheric Research have partnered to establish Data Curation Education in Research Centers (DCERC). DCERC will develop a model, including a field experience in a data intensive scientific environment, for educating LIS master’s and doctoral students in data curation. It will implement a graduate research and education program to address the need for professionals with scientific expertise who can manage and curate large digital data collections. Six doctoral students will benefit from this project.

Purdue University – West Lafayette, IN: Project Title: “Understanding Curation through the use of Data Curation Profiles”

Award Amount: $187,242; Matching: $104,868

Purdue University will create a series of workshops to expand the expertise of academic librarians about data curation issues. The needs of researchers and data producers are changing radically because of the disruptive effects of technology on research and its dissemination. This continuing education program will teach an estimated 370 librarians to be more effective data curators.

University of North Carolina at Chapel Hill – Chapel Hill, NC: Project Title: “Workforce Issues in Library and Information Science 3 (WILIS 3): Sustaining the Career Tracking Model through Data sharing”

Award Amount: $298,385; Matching: $85,637

The School of Information and Library Science, the Institute on Aging, and the Howard Odum Institute for Research in Social Science at the University of North Carolina at Chapel Hill will collaborate to document the process of data archiving and sharing. The major aims of the WILIS 3 project are to create publicly accessible de-identified datasets; to develop an interactive program-specific data system to enable library and information science programs to explore their own data and benchmark with other programs; and to produce a data archiving toolkit for use by other researchers.

And in UK/European Library and Information schools we have…?

Knowledge Management Marketplace, University of Bath 17th June 2010

The University of Bath and the UK Council for Electronic Business (UKCeB) are hosting the second Knowledge Management Marketplace (KMM10), taking place at the University of Bath on 17th June 2010. It focuses on knowledge management lessons learned for SMEs. A number of larger companies, such as Airbus, BAE, BMT, Korteq, and IBM, will also attend.

KMM10 will be of interest to:

  • Those who face issues related to knowledge management in their working day;
  • Vendors, consultants and developers who can assist in addressing such issues;
  • Researchers with interests in this area.

The marketplace is preceded by scene-setting keynotes, and followed by a panel session where issues raised throughout the day may be debated in a group setting.

Economic Impact of Research Data Sharing

Zoe Locke, Lead Technologist at the UK Technology Strategy Board, has posted an interesting piece, Impact of Data, on their blog, requesting any information on the economic impact of research data sharing. An extract follows:

“I am currently in Manchester attending a JISC workshop on Managing Research Data…

Yesterday, there was an interesting keynote speech from the Director of the Digital Curation Centre (DCC).  However, I noted that ‘Impact’ was the 3rd reason for why researchers should care about data curation.  I asked about the meaning of impact.  In the context of the talk, impact was about whether or not the research for which the data was used got published (and had an effect on the researcher’s career).  The DCC focuses on transferring knowledge on curation into and around the higher education sector so this seems like an appropriate definition of impact.  However, given the potential socio-economic impact of research and resultant data, not to mention the business opportunities it could create (though we don’t really know where or what these are, let alone how big they might be), I can’t help feeling that we need to widen the definition to stimulate greater sharing and exploitation of data.  If businesses could generate wealth or increase the quality of life with this data then surely it would be easier for anyone to justify footing the bill for curation…

Does anyone out there have any specific case studies of money being made or saved through the exploitation of research data (specifically that data generated in a different organisation to the one exploiting it)?”

You will need to register with the Connect Network to post a reply to Zoe directly, but I am happy to forward any examples readers may add as comments to this posting on the Charles Beagrie blog.

Keeping Research Data Safe 2: Final Report Published

I am pleased to announce that the final report for Keeping Research Data Safe 2 (KRDS2) is now available from the JISC website. This KRDS2 study report presents the results of a survey of available cost information, validation and further development of the KRDS activity cost model, and a new taxonomy to help assess benefits alongside costs.

KRDS2 has delivered the following:

• A survey of cost information for digital preservation, collating and making available 13 survey responses for different cost datasets;

• The KRDS activity model has been reviewed and its presentation and usability enhanced;

• Cost information for four organisations (the Archaeology Data Service; National Digital Archive of Datasets; UK Data Archive; and University of Oxford) has been analysed in depth and presented in case studies;

• A benefits framework has been produced and illustrated with two benefit case studies from the National Crystallography Service at Southampton University and the UK Data Archive at the University of Essex.

One of the key findings on the long-term costs of digital preservation for research data was that the cost of archiving activities (archival storage and preservation planning and actions) is consistently a very small proportion of the overall costs, and significantly lower than the costs of acquisition/ingest or access activities, for all the case studies in KRDS2. As an example, the respective activity staff costs for the Archaeology Data Service are Access (c.31%), Outreach/Acquisition/Ingest (c.55%), and Archiving (c.15%). This confirms and supports a preliminary finding in KRDS1.
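By way of illustration, activity shares of this kind can be derived from raw staff costs in a few lines of Python. This is a minimal sketch only: the pound figures below are invented so that the resulting percentages match the approximate ADS shares quoted above; they are not taken from the KRDS2 report itself.

```python
# Illustrative staff costs per KRDS-style activity (hypothetical GBP figures,
# chosen to reproduce the approximate ADS proportions quoted in the post).
staff_costs = {
    "outreach_acquisition_ingest": 55_000,
    "access": 31_000,
    "archiving": 15_000,  # archival storage + preservation planning/actions
}

total = sum(staff_costs.values())

# Express each activity as a rounded percentage of total staff costs.
proportions = {activity: round(100 * cost / total)
               for activity, cost in staff_costs.items()}

print(proportions)
```

The point the calculation makes visible is that archiving proper is the smallest of the three shares, well below both ingest and access, which is the KRDS2 finding in question.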

A range of supplementary materials in support of this report have also been made available on the KRDS project website. This includes the ULCC Excel Cost Spreadsheet for the NDAD service together with a Guide to Interpreting and Using the NDAD Cost Spreadsheet. The NDAD Cost Spreadsheet has previously been used as an exercise in digital preservation training events and may be particularly useful in training covering digital preservation costs. The accompanying Guide provides guidance to those wishing to understand and experiment with the spreadsheet.

US National Science Foundation to mandate research data management plans

During the May meeting of the National Science Board, National Science Foundation (NSF) officials announced a change in the implementation of the existing policy on sharing research data. In particular, on or around October 2010, NSF plans to require that all proposals include a data management plan in the form of a two-page supplementary document. The research community will be informed of the specifics of the anticipated changes and the agency’s expectations.

The changes are designed to address trends and needs in the modern era of data-driven science. Ed Seidel, acting assistant director for NSF’s Mathematical and Physical Sciences directorate, acknowledged that each discipline has its own culture about data-sharing, and said that NSF wants to avoid a one-size-fits-all approach to the issue. But for all disciplines, the data management plans will be subject to peer review, and the new approach will allow flexibility at the directorate and division levels to tailor implementation as appropriate.

Full details can be found in the NSF press release.

Data Management Plans are also required by a growing number of research funders in the UK. The Digital Curation Centre provides a useful overview of current UK funder requirements for data management and sharing plans and a Data Management Plan Content Checklist.

Ensuring Perpetual Access – German National Hosting Strategy for electronic resources – Study now available

I am pleased to announce that our study Ensuring Perpetual Access: establishing a federated strategy on perpetual access and hosting of electronic resources for Germany is now available.

Concepts and Properties of Archives and Hosting in the Strategy and their Relationships ©Charles Beagrie Ltd 2009. Creative Commons Attribution-Share Alike 3.0. Key: solid colour represents core properties and fading colour represents weaker properties of archives and hosting services.

The study was commissioned by the Alliance of German Science Organisations to help develop a strategy to address the challenges of perpetual access and hosting of electronic resources. In undertaking the study we were requested to focus on commercial e-journals and retro-digitised material.

Although developed for Germany, the study contains substantial discussion of, and recommendations on, perpetual access, archiving, and the sustainability of hosting and access services for these materials, which will be of interest to an international audience.

Contents include:

  • Discussion, definition, and glossary of terms;
  • Review of relevant international activity;
  • Review of current and future desired position in Germany;
  • Gap analysis;
  • A series of use cases;
  • Scenarios, potential solutions, and recommendations.

Model used for discussion of the Federated Strategy on Perpetual Access and Hosting of Electronic Resources for Germany ©Charles Beagrie Ltd 2009. Creative Commons Attribution-Share Alike 3.0

The members of the Alliance of German Science Organisations are the Alexander von Humboldt Foundation, the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation), the Fraunhofer-Gesellschaft, the German Academic Exchange Service (DAAD), the German Rectors’ Conference (Hochschulrektorenkonferenz – HRK), the Helmholtz Association, the Leibniz Association, the Max Planck Society, and the Wissenschaftsrat (German Council of Science and Humanities). For further information on the Alliance Hosting Working Group that steered the study see:

English webpage:

http://www.allianzinitiative.de/en/core_activities/national_hosting_strategy/working_group/

Deutsch:

http://www.allianzinitiative.de/de/handlungsfelder/nationale_hosting_strategie/arbeitsgruppe/

Elsevier and PANGAEA Data Archive Linking Agreement

An interesting press release from last week, particularly when seen in the context of previous announcements on this blog: an emerging trend of journals and publishers linking to open-access data repositories?

Extract: Amsterdam, 24 February 2010 – Elsevier, a world-leading publisher of scientific, technical and medical information products and services, announced today that the data library PANGAEA – Publishing Network for Geoscientific & Environmental Data – and Elsevier have implemented reciprocal linking between their respective content in earth system research. Research data sets deposited at PANGAEA are now automatically linked to the corresponding articles in Elsevier journals on its electronic platform ScienceDirect and vice versa. This linking functionality also provides a credit mechanism for research data sets deposited in this data library.

Dr. Hannes Grobe, data librarian of PANGAEA at the Alfred Wegener Institute for Polar and Marine Research, commented, “Through this fruitful cooperation, science is better supported and the flow of data into trusted archives is promoted. The interaction of a publisher with an Open Access data repository is ideal to serve the requirements of modern research by diminishing the loss of research data. It also enables the reader of a publication to verify the scientific findings and to use the data in his own work. The Elsevier-PANGAEA cooperation consequently follows the most recent recommendations of funding bodies and international organizations, such as the OECD, about access to research data from public funding.”

“Our goal is to continuously improve user experiences, and this is one of the ways we make this happen,” added Dr. Christiane Barranguet, executive publisher at Elsevier. “This is the beginning of a new way of managing, preserving and sharing data from earth system research. It also highlights the value ScienceDirect can deliver on its platform by giving researchers the papers they need and helping them put those papers in context, delivering unique value to users.”

Working with the scientific community to preserve scientific research data is also an objective of the Elsevier Content Innovation programme. Through this agreement and development Elsevier supports long-term storage, wide availability and preservation of large research data sets.

Results of Digital Preservation Costs Survey now available

I am pleased to announce that the findings from the Keeping Research Data Safe 2 (KRDS2) survey of digital preservation cost information are now available on the KRDS2 project webpage.

One of the core aims of the KRDS2 project was to identify potential sources of cost information for the preservation of digital research data and to conduct a survey of them. Between September and November 2009 we issued an open invitation via email lists, the project blog, and the project webpage for others to contact us and contribute to the data survey if they had research datasets and associated cost information that they believed might be of interest to the study.

Thirteen survey responses were received: 11 of these were from UK-based collections, and two were from mainland Europe. Two further potential contributions from the USA were unfortunately not available in time to be included.

The responses covered a broad range of research, including the arts and humanities, social sciences, and physical and biological sciences, as well as research data archives and cultural heritage collections. Each survey response is approximately 6-8 pages in length.

A summary analysis, plus the individual completed responses to the data survey that provide more detail, is available.

We have also made the revised versions of the KRDS2 activity model available to download.

We aim to release the KRDS2 report via JISC in March following peer review and final editing. Further supplementary materials from KRDS2 will also be placed on the project webpage in March.

You will also notice that we have recently undertaken a major website re-design and made additions, should you wish to browse other information on the web site.

Scholarly Journals introduce Supplementary Data Archiving Policy

An important editorial has just appeared online in the February issue of The American Naturalist.

To promote the preservation and fuller use of data, The American Naturalist, Evolution, the Journal of Evolutionary Biology, Molecular Ecology, Heredity, and other key journals in evolution and ecology will soon introduce a new data archiving policy. The policy has been enacted by the Executive Councils of the societies owning or sponsoring the journals. For example, the policy of The American Naturalist will state:

This journal requires, as a condition for publication, that data supporting the results in the paper should be archived in an appropriate public archive, such as GenBank, TreeBASE, Dryad, or the Knowledge Network for Biocomplexity. Data are important products of the scientific enterprise, and they should be preserved and usable for decades in the future. Authors may elect to have the data publicly available at time of publication, or, if the technology of the archive allows, may opt to embargo access to the data for a period up to a year after publication. Exceptions may be granted at the discretion of the editor, especially for sensitive information such as human subject data or the location of endangered species.

This policy will be introduced approximately a year from now, after a period when authors are encouraged to place their data voluntarily in a public archive. Data with an established standard repository, such as DNA sequences, should continue to be archived in the appropriate repository, such as GenBank. More idiosyncratic data can be placed in a more flexible digital data library such as the National Science Foundation–sponsored Dryad Archive.

The authors of the editorial, Michael C. Whitlock, Mark A. McPeek, Mark D. Rausher, Loren Rieseberg, and Allen J. Moore, present the case for the importance of data archiving in science. This is the first of several coordinated editorials soon to appear in major journals.

US Scholarly Publishing Roundtable calls for Open Access and Digital Preservation

The Association of American Universities and the American Institute of Physics have issued the following press release:

WASHINGTON, D.C., January 12, 2010 — An expert panel of librarians, library scientists, publishers, and university academic leaders today called on federal agencies that fund research to develop and implement policies that ensure free public access to the results of the research they fund “as soon as possible after those results have been published in a peer-reviewed journal.”

The Scholarly Publishing Roundtable was convened last summer by the U.S. House Committee on Science and Technology, in collaboration with the White House Office of Science and Technology Policy (OSTP). Policymakers asked the group to examine the current state of scholarly publishing and seek consensus recommendations for expanding public access to scholarly journal articles.

The various communities represented in the Roundtable have been working to develop recommendations that would improve public access without curtailing the ability of the scientific publishing industry to publish peer-reviewed scientific articles.

The Roundtable’s recommendations, endorsed in full by the overwhelming majority of the panel (12 out of 14 members), “seek to balance the need for and potential of increased access to scholarly articles with the need to preserve the essential functions of the scholarly publishing enterprise,” according to the report.

“I want to commend the members of the Roundtable for reaching broad agreement on some very difficult issues,” said John Vaughn, executive vice president of the Association of American Universities, who chaired the group. “Our system of scientific publishing is an indispensable part of the scientific enterprise here and internationally. These recommendations ensure that we can maintain that system as it evolves and also ensure full and free public access to the results of research paid for by the American taxpayer.”

The Roundtable identified a set of principles viewed as essential to a robust scholarly publishing system, including the need to preserve peer review, the necessity of adaptable publishing business models, the benefits of broader public access, the importance of archiving, and the interoperability of online content.

In addition, the group affirmed the high value of the “version of record” for published articles and of all stakeholders’ contributions to sustaining the best possible system of scholarly publishing during a time of tremendous change and innovation.

To implement its core recommendation for public access, the Roundtable recommended the following:

  • Agencies should work in full and open consultation with all stakeholders, as well as with OSTP, to develop their public access policies.
  • Agencies should establish specific embargo periods between publication and public access.
  • Policies should be guided by the need to foster interoperability.
  • Every effort should be made to have the Version of Record as the version to which free access is provided.
  • Government agencies should extend the reach of their public access policies through voluntary collaborations with non-governmental stakeholders.
  • Policies should foster innovation in the research and educational use of scholarly publications.
  • Government public access policies should address the need to resolve the challenges of long-term digital preservation.
  • OSTP should establish a public access advisory committee to facilitate communication among government and nongovernment stakeholders.

In issuing its report, the Roundtable urged all interested parties to move forward, beyond “the too-often acrimonious” past debate over access issues, towards a collaborative framework wherein federal funding agencies can build “an interdependent system of scholarly publishing that expands public access and enhances the broad, intelligent use of the results of federally-funded research.”

The report, as well as a list of Roundtable members, member biographies, and the House Science and Technology Committee’s charge to the group, can be found here.
