Summary of the survey

conducted by

TSIG RDA Training Needs Assessment Working Group

Survey designed: 2009/2010

Survey administered: April-June 2010

Survey summary: prepared by Chris Oliver; opinions and inferences in this summary do not represent conclusions reached by the Working Group.

Purpose of the survey:

(1) to assess the level of awareness about RDA and to provide an opportunity for Canadian library and information services staff to suggest content and priorities for RDA training

(2) to gather information from staff in Canadian libraries and information organizations about their experiences and preferences with different training methods, especially the level of familiarity with different types of web training; to discover whether there are training methods to which people react negatively.

Background

RDA: Resource Description and Access (RDA) is the new cataloguing standard that will replace the Anglo-American Cataloguing Rules (AACR)[1]. AACR has been the cataloguing standard in Australia, Canada, Great Britain, the United States and many other countries around the world for the past thirty years. Library and Archives Canada, the British Library, the Library of Congress and the National Library of Australia are working towards a coordinated implementation of RDA. Implementation is expected to take place in 2011.

The transition from AACR to RDA will require a fundamental re-orientation in the way library staff, especially cataloguing staff, approach the function of describing resources and creating access to them. The release of RDA is not the release of a revised standard. The changes go beyond a set of new guidelines. RDA represents a paradigm shift in the understanding of the cataloguing process. The key to understanding RDA is the fact that it is built upon a conceptual framework expressed in the models known as FRBR and FRAD, Functional Requirements for Bibliographic Records[2] and Functional Requirements for Authority Data[3]. This framework enables a more consistent and effective treatment of bibliographic data, whether of traditional resources such as books and journals or of new digital resources such as databases and web pages. Thus, implementation will not just be a question of imparting information about new instructions. Implementation must also communicate this deepened understanding of the structure of bibliographic data, and communicate a new approach to a traditional activity.

Training must cover both the new theoretical framework and the content of the new guidelines. The information collected through this survey is intended to help shape the Canadian implementation plan. The Technical Services Interest Group of the Canadian Library Association intends to play a role during RDA implementation by proposing training plans, providing and/or disseminating training documentation, and organizing training opportunities.

The chief purpose of the survey was to gather data to inform decisions related to RDA training in Canada. As training plans and documents are being developed, it will be useful to have a picture of Canadian training experiences and the methods of training preferred by Canadian cataloguers. The information collected from the Canadian cataloguing community is intended to influence decisions about the delivery of training, the number of training options available, and the order in which training modules are presented.

From the introduction to the survey:

Your feedback about training experiences and RDA training needs will provide important background information. There will be moments when decisions about alternatives must be made. While it may not be possible to honour everyone's preferences, the information we collect from the Canadian cataloguing community may influence decisions about the delivery and content of RDA training.

The survey also provided an opportunity to assess the level of awareness about RDA, and an opportunity for Canadian library staff to communicate their highest priorities in terms of RDA training.

Investigators

The survey was carried out by the TSIG RDA Training Needs Assessment Working Group. The members of the group were:

Investigators:

Alison Hitchens, University of Waterloo

Brian Rountree, Red River College

Chris Oliver, McGill University

Marianne Reid, Brandon University

Trina Grover, Ryerson University

Liaison with the Executive of the Technical Services Interest Group:

Marcia Salmon, York University

All investigators in the group received approval from the ethics review board at their respective institutions.

Methodology

The research was conducted using an online survey. Participants received an email invitation to participate in the survey. The email message gave a quick overview of the purpose of the survey and identified who was conducting it. It included a link to the cover letter and to the survey on the Survey Monkey website. The cover letter gave more details about the purpose and context of the research and underlined the measures taken to protect the anonymity of participants’ responses.

The survey questions were developed through a group decision process. Since Chris Oliver was on sabbatical from Sept. 1st, 2009, she had the time to develop the draft design and draft questions. The members of the group provided feedback, honed the questions, reviewed design and structure, revised wording, etc. The members of the group also tested the survey, acting as the preliminary pretest group. Then three additional librarians, who were familiar both with RDA and with the aims of the group, were asked to pretest the survey and give comments. Again, more changes were made in response to the feedback.

Two members of the group, M. Reid and C. Oliver, consulted with Prof. Glenn Cockerline from Brandon University about the design of survey questionnaires. His advice and his recommended readings influenced the final design of the survey.

When the group reached consensus on the design of the survey and the content of the questions, a small official group of pre-testers was identified and asked to take the survey. The survey was attached to a new pre-test data collector. The pre-testers were also invited to submit email comments about the survey questions, especially noting problems with the clarity of the questions, the appropriateness of categories, etc. Nine pre-testers were asked to participate. One declined due to heavy work commitments. The other eight responded by completing the survey; three also submitted comments. The pre-testers included librarians and technicians, represented academic, public and school libraries, and came from different areas of Canada.

While the survey was being developed, there were changes in the timeline for the release of RDA. When work began on the survey, the date of release was expected to be November 2009, but this date was changed to May/June 2010. To ensure a good response rate, participants needed to feel that RDA training was something important and urgent. RDA had to be on the horizon. The survey was put on hold for several months. Invitations to participate were sent in April 2010.

Privacy and informed consent

The data was collected using the professional version of Survey Monkey. The online survey did not collect any identifying data, so we have no link between the individual participant and their response. Access to the survey results is password protected. Only the principal investigator and the co-investigators have access to the raw data. The raw data is protected so that it cannot be inadvertently accessed by someone outside the investigating group.

Participation in the survey was voluntary. Respondents received an e-mail inviting them to complete the questionnaire, with links to the survey and to a cover letter. The cover letter explained the purpose and context of the research and underlined the commitment to maintain the anonymity of their answers.

From the cover letter:

The online survey tool does not collect any information that personally identifies you. Your answers will be completely anonymous. In addition, we will carefully review the findings before reporting them, to ensure that the findings do not identify any individual respondents.

Your participation is entirely voluntary. Submitting the completed survey will constitute your consent to participate. The questionnaire is only submitted when you click on the final “Done” button. If you decide to begin the questionnaire and then change your mind, you may simply close the tab or window. Your results will not be recorded. Your results are recorded only when you click the “Done” button on the last screen.

Selection of participants

Target population: Canadian library staff, and staff from other types of Canadian information organizations, who catalogue or create other types of metadata (such as metadata for institutional repositories). They may catalogue as a full-time or a part-time responsibility.

The survey was aimed at assessing the experiences and perceived needs within the Canadian context. Thus, there was limited recruitment using e-mail lists. The email lists that were used were specific lists with a targeted Canadian audience, such as provincial and regional library associations, consortia, the Technical Services Interest Group list, and the listserv of the Canadian Committee on Cataloguing (members of CCC represent the major cataloguing constituencies in Canada).

Response rates from email distribution of surveys are reported in the literature as usually being low. Thus, the group did not intend to use email distribution as the principal way to deliver the survey. The principal way was delivery of the survey to identified individuals. There were two strategies used: 1) compile a list of the names and emails of staff involved with metadata creation in Canada, monitoring for a balanced representation from different types of institutions, different sizes, urban and rural, as well as representing all the provinces and territories (we used a variety of sources of information to compile the list); 2) include in the email invitation a special request addressed to team leaders, managers or supervisors, asking them to forward the survey to members of their staff who would be interested in providing feedback.

To ensure a wide range of representation, the first page of the survey asked questions that identified the type of library, the broad geographical area, the respondent’s responsibilities, etc. During the survey, results were monitored to ensure that responses were received from a sufficiently wide range of types and places. If responses seemed to over-represent one group, such as academic libraries, the intention was to generate additional invitations in order to balance out the representation. However, there was no need to supplement with additional invitations. For example, respondents from university libraries were 23% of the sample, from public libraries 25%, from school libraries 12%, etc. In addition, the volume of participation was greater than expected, with 487 responses and 358 completed responses. The invitations said that the survey would end on May 7th, but participants continued to respond after that date. The survey was left open until the end of June, after responses had ceased.
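The monitoring step described above can be sketched as a simple tally. This is purely an illustration: the Working Group worked from Survey Monkey's own reporting, not code; the function name, the flagging threshold, and the 40% "other" share in the sample data are all assumptions (the summary reports only the 23%, 25% and 12% figures).

```python
from collections import Counter

def representation_report(responses, max_share=0.3):
    """Tally responses by library type and flag any group whose share of the
    total exceeds max_share (an illustrative threshold, not one the group used)."""
    counts = Counter(responses)
    total = sum(counts.values())
    return {library_type: (n / total, n / total > max_share)
            for library_type, n in counts.items()}

# Shares roughly matching those reported in the summary; "other" is a placeholder.
sample = (["university"] * 23 + ["public"] * 25
          + ["school"] * 12 + ["other"] * 40)
for lib, (share, flagged) in representation_report(sample).items():
    print(f"{lib}: {share:.0%}{' (over-represented)' if flagged else ''}")
```

Had any group crossed the threshold during data collection, the plan described above was to send additional invitations to the under-represented groups.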

The survey was completed at the participant’s convenience. There was no financial compensation. Since the results will contribute to shaping the Canadian RDA training plan, participants contributed to improving training for all affected library staff, a benefit for the Canadian cataloguing community and for themselves as members of that community.

Results of the survey

The Respondents

Responses have been filtered to include only the results from the completed surveys.

Total started: 487

Total completed: 358
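These counts imply a completion rate of roughly 73.5%. A trivial arithmetic check, using the figures above:

```python
started = 487    # total surveys started
completed = 358  # total surveys completed and submitted
completion_rate = completed / started
print(f"Completion rate: {completion_rate:.1%}")  # Completion rate: 73.5%
```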

About the respondents:

For 55.9% of the respondents, cataloguing or metadata creation is at least 75% of position responsibilities (including the supervision of cataloguing). For the remaining respondents, the other areas of responsibility were predominantly public services, acquisitions and collection development.

Librarians/Technicians

The survey should perhaps have used the term “technician/paraprofessional”, rather than just “paraprofessional”. The question included the possibility of an open-ended answer. Most of the responses in the open-ended answer section can be considered as part of the “paraprofessional” category.

                    Responses    Adjusted to remove technicians from “other”

Librarians          47.6%        47.6%

Paraprofessionals   32.7%        45.6%

Other               19.7%        6.8%

The responses remaining in the “other” category included directors, archivists, managers, etc.
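The adjustment above amounts to shifting a fixed share of responses from one category to another. A minimal sketch, assuming a 12.9-point shift (inferred from the before/after figures; the survey summary does not state the shift directly, and the function name is illustrative):

```python
def reclassify(shares, src, dst, points):
    """Move `points` percentage points from category `src` to `dst`,
    rounding to one decimal to match the reported figures."""
    adjusted = dict(shares)
    adjusted[src] = round(adjusted[src] - points, 1)
    adjusted[dst] = round(adjusted[dst] + points, 1)
    return adjusted

reported = {"librarians": 47.6, "paraprofessionals": 32.7, "other": 19.7}
adjusted = reclassify(reported, "other", "paraprofessionals", 12.9)
print(adjusted)
# {'librarians': 47.6, 'paraprofessionals': 45.6, 'other': 6.8}
```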

The survey also inquired about the use of manuals and documentation and whether respondents have access to print and online, online only or print only. For cataloguing standards, such as AACR2 and MARC 21, 56% use a combination of print and online sources. For cataloguing procedures, training documents and ILS documentation, respondents also mostly use a combination of print and online sources, but the percentages are lower, in the high 30s (39%, 36%, 35%). It is only when using documentation from their bibliographic utility that the largest group uses online documentation only (40%). The surprising finding was the degree to which documentation sources are not consulted. The exception was cataloguing standards, with only 2.8% reporting that they do not use documentation.

The “not used” category may also include responses from staff for whom the documentation is not available or not required. If one controls for type of institution, it becomes clear that specialized cataloguing manuals are mostly used in university libraries, but are rarely used in most school and public libraries. Likewise, online tools, such as Cataloger’s Desktop, are used predominantly by staff in university libraries.

When there is access to both print and online documentation, many do prefer online, or use a combination of online and print.

RDA

In terms of current knowledge of RDA, participants were asked how they acquired this knowledge. They were allowed to check off as many sources as applicable. Each percentage represents the proportion of the total pool of respondents who selected that source. The percentages in this list simply indicate a ranking of the most popular sources of information.