ARMA Conference

I spent the first few days of this week in Glasgow attending the Association of Research Managers and Administrators (ARMA) UK conference. I presented a poster on some of the findings from our Chemists and Economists survey, and had a delightful time speaking with many Research Administrators and Managers, all of whom seemed quite knowledgeable about Open Access and more interested in the topic than I had expected.

I attended a variety of sessions and learned quite a bit about research management and administration, gaining new insight into the profession. Below are a few notes on what I saw as the highlights.

The opening plenary had two speakers: Professor Anton Muscatelli, Principal of the University of Glasgow, and Ehsan Masood, Editor of Research Fortnight and Research Europe. Both speakers gave engaging talks, and both, of course, identified that we are in challenging times when it comes to research funding. Professor Muscatelli identified a number of things that institutions could focus on in order to meet these challenges: 1) recognise the value of research (knowledge transfer, identifying and quantifying impact, etc.), 2) disseminate research imaginatively (changing approaches to IP), and 3) manage research efficiently and effectively. Mr. Masood discussed some of the other ongoing issues: funding cuts, concerns about using metrics, and using research assessment to allocate funding (which he noted encourages game-playing and concentration).

I attended a session on the REF Assessment Framework, presented by Chris Taylor, Deputy REF Manager. Although many of the details about the REF will not be released until later in the summer, this session gave me a good idea of what will be expected in the REF process. The conference delegates had many questions, of course, and the thing I found particularly interesting (which I hadn’t realised before) was that for the next REF, “impact” will be measured for the unit as a whole and not linked to submitting staff (this, I think, is an attempt to move away from Impact Factor measurements, which is good!).

I also attended an interesting session on choosing a Research Management system, with Jonathan Cant discussing Hull’s experience using AVEDAS CONVERIS and Jill Golightly describing Newcastle’s experience with a system built in-house. Ellie James, from Keele, gave a session describing her experiences as a Research Planning and Project Manager (responsible for Keele’s REF submission) setting up a repository. It was interesting to see repositories from a different perspective, and it reminded me how important it is that institutions have set goals and objectives when setting up repositories.

It was a really interesting conference – and most importantly I learned that research managers and administrators definitely know how to have a good time! 🙂

Successful Event: Research Management – Smoothing the Way

It proved to be a successful event this past Thursday as Research Managers and Senior Library and Information Services Managers came together for a full day of presentations and discussion. The conference, organised by the CRC (specifically those of us working on the JISC-funded RCS project) together with ARMA, RLUK and SCONUL, focused on the growing need for integration between research support and information services.

The morning started off with introductions from Bill Hubbard (CRC), David Prosser (RLUK) and Ian Carter (ARMA), who set the appropriate tone for the day.

Susan Ashworth (University of Glasgow), Jill Golightly (Newcastle University), and Jackie Proven (University of St Andrews) then each discussed the current research management situation at their universities. From all three it seemed clear that these systems need to:

  • Provide a single place for researchers to input data (with integration with other systems),
  • Have the ability to mass-import and check data,
  • Include ongoing advocacy to research staff,
  • Meet the needs of the different players / stakeholders (work for the REF, for OA, etc.)

Stephen Pinfield (University of Nottingham) then gave us an introduction to the work being done at Nottingham with their OA Publishing Fund, put in place to meet the needs set by funders’ mandates. Stephen went on to describe the cost of OA publishing (the Gold road) at the University of Nottingham, pointing to the Houghton Report and commenting that it is probably the most important report for those working in this area. Stephen also described how, under this modelling, OA publishing is generally cheaper for the University of Nottingham.

Robert Kiley (Wellcome Trust) and Gerry Lawson (NERC) each gave us a funders’ perspective. They described some of the things funders need; one point discussed in particular was the need for proper grant acknowledgement and attribution, with the grant number, in a standard form.
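To illustrate why a standard form matters, here is a minimal sketch of how grant numbers could be extracted automatically if acknowledgements followed a bracketed “[grant number …]” convention. The convention shown, the funder wording, and the grant numbers are all invented for illustration, not taken from any funder’s actual policy text.

```python
import re

# Hypothetical sketch: extract grant numbers from acknowledgement text that
# follows a bracketed "[grant number(s) ...]" convention.
GRANT_RE = re.compile(r"\[grant numbers?\s+([^\]]+)\]", re.IGNORECASE)

def extract_grant_numbers(acknowledgement: str) -> list[str]:
    """Return all grant numbers found in a standard-form acknowledgement."""
    numbers = []
    for match in GRANT_RE.finditer(acknowledgement):
        # Several numbers may be listed inside one bracket, comma-separated.
        numbers.extend(n.strip() for n in match.group(1).split(","))
    return numbers

text = ("This work was supported by the Wellcome Trust [grant number 089999] "
        "and NERC [grant numbers NE/X000001/1, NE/X000002/1].")
print(extract_grant_numbers(text))  # ['089999', 'NE/X000001/1', 'NE/X000002/1']
```

The point is that free-text acknowledgements without an agreed form make this kind of automated compliance-checking unreliable, which is exactly why funders were asking for standardisation.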

We finished off the day with small groups and then a panel discussion. Some interesting ideas arose that will hopefully be taken forward:

  • Using the REF as a potential driver for OA content
  • Extending the grant period so that funds can be used for OA publishing
  • Standardisation of terms within these systems – including standardisation of grant acknowledgement
  • Further sharing of good practice

Many key players were present and it was good to get them all in the same room and let them hear each other’s thoughts and concerns.

The full programme and some of the slides are available here.

We may try to repeat the event or do something similar in the future, so please do let us know if you are interested.

Cross-linking between repositories

A thread on JISC-Repositories this week has been discussing whether to delete repository records when an academic leaves.  This set me thinking about such policies in general and how the interaction of different policies between repositories may affect access or collections in the long run. It is an example, I think, of the way that institutional repositories work best when seen as a network of interdependent and collaborative nodes that can be driven by their own needs but produce a more general collective system.

Our policy at Nottingham is that we see our repository as a collection of material produced by our staff. Accordingly, when a member of staff leaves, we do not delete their items, as these are a record of their research production while they were here.

More than that, authors should not expect such deletion even upon their request, except in very unusual circumstances.  If repositories are to be used as trusted sources of information, the stability of the records they hold is very important.

If authors have put material into the repository which includes their “back-catalogue” produced at previous institutions, then that is fine too: we will accept it and keep it. Strictly, they did not produce this material while employed at Nottingham, but if it is not openly accessible elsewhere, why not take it? It might be slightly anomalous to hold this material, but if it opens access to research information, that is the basis of what it is all about.

I think there is a transition period here, while academics adopt the idea of depositing material. It’s likely that academics will put their back-catalogue to date into the first major repository that they use in earnest, if they have the right versions available. Thereafter, as this material should be kept safe and accessible, they can always link back to it. In other words, once they have deposited their back-catalogue, they are unlikely to want to do it again at every subsequent institution they move to, as long as they know it will be safe and that they can link to it. There is an advocacy theme here: helping researchers understand that repositories are linked and that the repository – and repository network – will serve them throughout their career.

For a newly-arrived member of staff with material in a previous institution’s repository, it all depends on the new institution’s collection policy whether the institution would prefer them to deposit only outputs they produce from that time on; deposit all their own material again; or create a virtual full record of outputs by copying the metadata and linking back to the full-text in the previous repository(ies). This will depend in turn on whether the previous repositories are trusted to match the new institution’s own terms for access and preservation.

If the material is held in a repository without long-term assurance of durability (perhaps a commercial service), and if the institution’s repository works at a level the external service cannot match, then there would be a rationale for holding a local copy of the full-text. This copy may be held and exposed, or held in reserve in case of external service failure. Otherwise, simply linking back to the full-text held in the previous repository seems most practical if a full record is required.

If the previous repository is trusted to provide the same level of service in access, preservation, and stability, then it does not really matter which URL or repository underlies the “click for full text” link.  Academics can compile their list of publications and draw from the different institutions at which they have worked: repositories can hold their own copy of metadata records and link to external trusted repositories; and as far as the reader-user is concerned it’s still “search for paper — find — click — get full-text paper”.

This kind of pragmatic approach may well mean that some duplicates (metadata record and/or full-text) get into the system by being held at more than one location. Duplication, or close-to-duplication, will have to become a non-issue. I cannot see that duplication can be completely avoided in future: it already happens. As such, handling close and exact duplicates is something we cannot avoid and must solve as it inevitably arises. That is not to say that the publisher’s version will automatically become the “official” record in the way that it tends to be used now. We do not know how versions, variants, and dynamic developments of papers will be used and regarded by researchers: we are just at the start of a period of change in research communications. Therefore, if a process offers solutions and benefits, the associated risks of duplication are not sufficient to dismiss the process as impractical.
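As a rough sketch of how close and exact duplicates might be recognised across repositories, one simple approach is to compare persistent identifiers when both records carry one, and fall back to a normalised title otherwise. The record fields (`doi`, `title`) and the example values are invented for illustration; a real system would compare more fields and handle near-matches more carefully.

```python
import re

def normalise_title(title: str) -> str:
    """Lower-case and collapse punctuation/whitespace so that cosmetic
    differences between repositories do not hide a duplicate."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

def records_match(a: dict, b: dict) -> bool:
    """Treat two metadata records as the same paper if their DOIs agree,
    or, when a DOI is missing, if their normalised titles agree."""
    if a.get("doi") and b.get("doi"):
        return a["doi"].lower() == b["doi"].lower()
    return normalise_title(a["title"]) == normalise_title(b["title"])

# The same paper as held by two repositories, with cosmetic differences:
r1 = {"title": "Open Access & Repositories: A Survey", "doi": "10.1000/xyz123"}
r2 = {"title": "Open access and repositories: a survey", "doi": "10.1000/XYZ123"}
print(records_match(r1, r2))  # True (the DOIs match, case-insensitively)
```

Even this crude matching suggests that duplicate detection is tractable in principle, which supports the argument above: duplication is a problem to be managed, not a reason to reject cross-linked deposit.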

After all, what is the alternative?  If as repository managers we start deleting records when folks leave and have to create/import/ask the academic for a complete set of their outputs when they arrive at a new institution, I think we, and significantly the users of open access research, will very quickly get into a situation where we lose track of what is where.

Even if we try to create policies or systems to replace an old link (to the now-deleted full-text), with a new link (to the full-text in the new repository), I cannot see this working seamlessly and things will get lost.  In addition I think that subsequent moves by the author would create daisy chains of onward references which would be very fragile.

While the use of repository references in citations relates to research practice, and so is a matter for resolution between researchers rather than between repository managers, I don’t think we should deliberately disrupt longer-term references to material. Rather, I would see the system building on existing stable records, with all institutional repositories able to play a part in the system-wide provision of information as stable sources.

Therefore, I would suggest that repositories should continue to hold staff items after the staff have left, as this helps fulfil their role as institutional assets and records. Repositories can accept an academic’s back-catalogue, even if it was not produced at the institution, as anomalous but in line with our joint overall aim of providing access to research information. Adopting standard practices will help reassure each institution that other repositories can be trusted with access and curation, and allow stable cross-linking. Once a repository has material openly accessible, then, given matching service levels, the whole system supports linking to that material, with nothing but possible local needs calling for additional, duplicate copies. Overall, repositories can follow their institutional self-interest and still create a robust networked system.

Bill

Wellcome Trust and research mandates

The Wellcome Trust held an event yesterday (24th Sept 2009) on Open Access and Funder Mandates at the Wellcome Collection Conference Centre on Euston Road. This drew together representatives of research funders, institutional research support offices and institutional repository managers to discuss compliance with funder mandates. This is a very useful mix of roles: it is this mix of people that will make funders’ mandates work in practice.

Robert Kiley, Head of Digital Services at the Wellcome Library, characterised the main issue when he reported that compliance with OA grant conditions for Wellcome Trust authors was only running at 36% or thereabouts (although the percentage was climbing). How could compliance with all funders’ mandates be raised, to achieve the OA benefits they are meant to bring?

One significant difficulty identified was a lack of clarity for academics about what, exactly, they must do in order to comply; another was the ready identification, by authors and by institutions, of funding for OA publication charges where these are necessary.

After presentations and discussions, the joint response was to develop support structures and information; improve workflows and clarity; and increase collaboration between funders, research support offices, and other institutional staff. All these efforts are vital if we are to move forward.

Teasing out some of the ideas and proposals:

*  authors need to be alerted to the need for compliance and its importance as a part of their funding requirements. This is a task for research support offices as much as libraries and their other open access work.

*  institutional mechanisms to check compliance have to be put in place.  Someone needs to know what outputs have come from a grant and check that publishers have fulfilled their contracts by making those outputs openly accessible, or that the author has complied in some other way. Who is that someone and where do they get their information?  There is an explicit responsibility on institutions from the Wellcome Trust to check compliance: OA support staff (typically repository managers), research support offices and funders can and should work together on solving this.

*  funding for OA charges is in place within grants from most funders (and all of the UKPMC funders except Cancer Research UK). Identifying or reserving this funding is a task for institutions – and then support has to be given to make sure that authors are given clear and step-by-step instructions for using this funding. Establishing a central OA fund is one option to assist this.

*  authors need a single smooth, supported workflow to allow them to comply with mandates, with relevant information on their responsibilities, the options they have and the funding available to them.

The presentations from this event are available from the UKPMC Blogspot.

I was invited to speak on the University of Nottingham’s experience in creating a central fund for OA publication charges and on using RoMEO and JULIET to support processes. Chad Pillinger spoke on administering such a fund at Cambridge University – a clear workflow; Robert Kiley spoke on funder policies, including a very concise summary of the common features of UKPMC funders’ policies; Alison Henning, Ernie Ong and Paul Davey previewed developments to UKPMC and Nicola Perrin led the discussion groups which developed our thinking during the day.

Bill