Surveys of Scientists and Engineers: Ensuring Reliable Research Evidence for Good Practice

Sponsored by: SIG STI, SIG USE

Cecelia Brown

Associate Professor, School of Library and Information Studies, The University of Oklahoma. Email:

Carol Tenopir

Professor of Information Sciences and Interim Director, School of Information Sciences, University of Tennessee. Email: .

Bradley M. Hemminger

Assistant Professor, School of Information and Library Sciences, University of North Carolina at Chapel Hill. Email: .

Jon Jablonski, Reactor

Biology, Chemistry, and Neuroscience Librarian, University of Oregon. Email: .

K.T.L. Vaughan, Moderator

Librarian for Bioinformatics & Pharmacy, Health Sciences Library, University of North Carolina at Chapel Hill. Email: .

Abstract

Information scientists around the globe have made tremendous progress in understanding how scientists and engineers find and use information through the use of a wide variety of survey instruments. Librarians and publishers increasingly turn to the data generated by these instruments in an effort to design and implement the information products and services that the scientific community needs and desires. Three researchers experienced in studying the information seeking behavior of academics will describe their current investigations, distinguishing between the survey mechanisms found to be effective and those found to be ineffective. Discussion will be led and provoked by the moderator and reactor, both of whom have experience providing library services to this community. The last third of the session will be an open discussion soliciting comments and questions about trends, survey design, and survey experiences.

Case Studies: Three Researchers Discuss Their Survey Experiences

Tenopir & King: Longitudinal Studies of Scientists Using the Critical Incident Model

Carol Tenopir and Donald W. King will present highlights of results of their longitudinal surveys of the reading patterns of over 20,000 scientists and social scientists in university and non-university settings in North America and Australia. (Earlier summaries can be found in Tenopir & King, 2000 and Tenopir & King, 2004.) They will discuss how the way questions are asked influences the results and the conclusions that can be drawn, specifically addressing the critical incident technique in user surveys. Critical incident questions focus the respondent on the last article reading and can be used to measure the purpose and outcomes of readings. Conducting such surveys of the same or similar groups over time allows changes in behavior and attitudes to be explored.

Brown et al.: Information Flow in Civil and Environmental Engineering Laboratories in Oklahoma and Thailand

Through the use of anonymous surveys of faculty and students, this project characterizes what Pettigrew (1999) describes as the “information grounds” of engineering graduate students and faculty at Chulalongkorn University in Bangkok, Thailand and at the University of Oklahoma in Norman, Oklahoma. Surveys were found to be a fast, simple, and effective method of gaining insight into the information seeking behavior and information usage of these microcosms of extremely hard-working and time-constrained science graduate students and faculty, who value the speed and efficiency of the Internet for retrieving information to support their research activities. However, several of the survey responses about social and behavioral aspects of information seeking raised further interesting questions that may be best explored in depth using focus groups or personal interviews, or rephrased to provide more meaningful data. In addition to briefly discussing the findings of the surveys, this paper will highlight the questions found to be best suited to a survey platform and propose methods of restructuring items to ensure valid results that accurately describe the information seeking behavior of academic engineers.

Hemminger: Recent Changes in the Information Seeking Behavior of Scientists

Brad Hemminger will present the preliminary results of two survey mechanisms. The first is a short survey focused on capturing the changing information seeking behavior of scientists, especially as brought about by electronic searching and retrieval of scholarly materials. The second consists of recording a period of scientists’ actual behavior through logging and diaries, followed by in-depth structured interviews similar to the grounded theory approach of Ellis (1993). We will contrast the two studies, which have different aims. The first survey is designed to be easily administered and universal so that it can be conducted at many institutions to achieve a large and comprehensive sample. Additionally, we have worked to design the survey instrument to be comparable with past studies, as well as potential future ones, so that meta-analysis can be done incorporating results from surveys based on this tool. We invite researchers interested in participating to contact Dr. Hemminger. The second survey mechanism has the long-range aim of producing a model of how scientists, and especially bioinformaticians, seek out and use information in their daily research tasks, similar to the initial work done by Bartlett (2005).

Response: Applying the Research

Librarians are increasingly applying the principles of evidence-based practice (EBP) to the development and maintenance of services, collections, and physical and virtual spaces for their constituencies (Booth & Brice, 2004). For librarians and other practitioners to apply EBP to their situations, they must have reliable evidence. Given the rapid changes in means of access to information over the last decade, this evidence must be not only credible and reasonable but also timely and applicable to local circumstances. One response by librarians has been to create and administer small, informal scans of local populations. This helps somewhat, but does not address questions of generalized information behavior. In the end, practitioners usually turn to the work of information science researchers for the data they need. Jon Jablonski will react to the research perspective of the above cases with commentary on how these surveys inform the practice of librarianship and “informationistship,” as well as where they fall short of true utility.

Creating a Question Pool

It can be difficult to design a survey that adequately captures useful information about a new population. All three researchers have made their survey instruments public, either electronically or as appendices to published work, at least partly in an effort to share successful tools. Example surveys will be made available online for participants to view before and after the session. It is hoped that these surveys may form the kernel of a pool of the most useful and interesting questions for researchers and practitioners to draw on in subsequent surveys. The ideal outcome for this panel is not just the edification of the local participants but also the creation of a valuable knowledge base that can be expanded over time and space for the use of both researchers and practitioners.

REFERENCES

Bartlett, J. 2005. Developing a Protocol for Bioinformatics Analysis: An Integrated Information Behavior and Task Analysis Approach. JASIST 56(5): 469-482.

Booth, A. and A. Brice. 2004. Evidence-based Practice for Information Professionals: A Handbook. London: Facet Publications.

Ellis, D. 1993. Modeling the Information Seeking Patterns of Academic Researchers: A Grounded Theory Approach. Library Quarterly 63(4): 469-486.

Pettigrew, K.E. 1999. Waiting for chiropody: Contextual Results From an Ethnographic Study of Information Behavior Among Attendees at Community Clinics. Information Processing & Management 35(6): 801-817.

Tenopir, C. and D. King. 2000. Towards Electronic Journals. Special Libraries Association.

Tenopir, C. and D. King. 2004. Communication Patterns of Engineers. IEEE/John Wiley & Sons.