The Impact of Free Access to Computers and the Internet in Public Libraries
Research participation guide for web survey libraries

A research project of the University of Washington Information School

with support from the Institute of Museum and Library Services and the Bill & Melinda Gates Foundation

Thank you for participating in the U.S. IMPACT public library web survey!

By agreeing to host the U.S. IMPACT web survey in your library, you are helping to provide evidence that public access computing makes a difference in the lives of library patrons, families, and communities. This information will help public libraries across the United States evaluate and improve the ways they provide access to computers and the Internet. You will receive valuable information about your patrons’ computing preferences, learn about trends across the country, and help to inform and persuade policy makers about the benefits of public access computing.

This field guide and the other documents we are sending you contain all of the information you need to take part in the study, including:

  • instructions for connecting your library to the web survey
  • sample news release and email
  • background information on the studies
  • directory of all the libraries participating in the web survey
  • sample flier

Following the administration of the web survey, the IMPACT research team will analyze your patrons’ responses and return them in a report to your library. We expect these reports to be ready for delivery in September 2009.

Thank you for your help with this important study. We look forward to working with you.

Sincerely,

U.S. IMPACT Research participation guide for web survey libraries

Karen E. Fisher, Ph.D.

Associate Professor

Michael Crandall, MLIS

Senior Lecturer & Chair, Master of Science in Information Management Program


University of Washington

The Information School
Box 352840
Mary Gates Hall, Ste 370
Seattle, WA 98195-2840
Phone: (206) 685-9937
Fax: (206) 616-3152

U.S. IMPACT public library web survey participation instructions


General Contacts:

Elizabeth Mitchell
Web Survey Coordinator

Samantha Becker

Project Manager

206-616-2841


7 easy steps to participate in the IMPACT web survey:

1. Receive your fielding period dates and unique URL from the IMPACT web survey coordinator. Staggering fielding periods is important for ensuring data quality, and the unique URLs will allow us to analyze results for just your library.

2. Inform library staff about survey procedures and purpose. Information about the research project is available in this guide or on the project website at:

3. When your fielding date comes up, install a link to the survey from your library’s website using the options available at the web survey code box. Linking to the survey will take about 15-30 minutes depending on your system. You will need your library’s unique URL assignment to connect to the survey.
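For illustration only — the actual embed code (button, float-in/pop-up script, or plain HTML) comes from the web survey code box, and the function name and URL below are made-up placeholders — the simplest form of a survey link is a plain HTML anchor, which a sketch like this could generate:

```python
# Illustrative sketch only. The real embed options are supplied by the
# IMPACT web survey code box; the URL here is a hypothetical placeholder,
# not a real survey address.
def make_survey_link_html(unique_url: str, label: str = "Take our survey!") -> str:
    """Build a minimal HTML anchor pointing at a library's unique survey URL."""
    return f'<a href="{unique_url}" target="_blank">{label}</a>'

snippet = make_survey_link_html("https://example.org/impact-survey/LIB123")
print(snippet)
```

A library’s web team would paste the generated anchor (or the code box’s own button or pop-up script) into the site template, then remove it again at the end of the fielding period.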

4. Call or email for technical support. Technical support is also available through our WebJunction discussion board at

5. Set the browser home page on all library computers directly to your survey URL. If this is not possible, the library’s website (with the link) should be set as the browser home page.

6. Use sample fliers and email/newsletter/press release text to provide information and promote the survey to your patrons. Encouraging participation will increase response rates and the quality of the analysis. As much as possible during your two-week fielding period:

  • Have fliers about the survey located at computers and PAC service points.
  • Announce the survey at the beginning of library programs and events.
  • Ask library staff to mention the survey in their interactions with patrons.

7. Take down the survey link at the end of your two-week fielding period by removing the code from your website.

That’s all there is to it! We will send you the results from your library in September 2009.

Sample text for email/newsletter/press release

From [insert start date] to [insert end date], [insert library name] is participating in a nationwide Internet survey to find out how people use the free computers and Internet connections in public libraries. The U.S. IMPACT web survey is being conducted by the University of Washington Information School with support from the Bill & Melinda Gates Foundation.

Until now, there has been no nationwide research about how library computing services fit into people’s lives. Some believe that library computers are used mostly for entertainment, but librarians report that people use them to find jobs, stay connected with family and friends, or get health information. The goal of the U.S. IMPACT studies is to collect evidence about the ways computers in public libraries help people and their communities across the United States. This information will be used to improve these services and to inform policy makers about how best to fund and support them. In these hard economic times, this information will be invaluable.

You can access the web survey from the [insert library name] website at [insert library URL] from [insert start date] to [insert end date]. The survey is completely anonymous and takes 10 to 15 minutes to fill out. The [insert library name] and researchers at the University of Washington encourage you to take a few minutes to help improve public library computing services across the United States. For more information, ask at the library information desk, or visit the IMPACT studies web site at


U.S. IMPACT studies information

A research team led by Mike Crandall and Karen Fisher of the University of Washington Information School, with support from the Institute of Museum and Library Services and the Bill & Melinda Gates Foundation, is examining the impact of free access to computers and the Internet on the well-being of individuals, families, and communities. Additional information about the U.S. IMPACT studies can be found at the project web site:

Purpose

Public libraries have provided free access to the Internet and computers since the 1990s. Libraries have also provided access to digital resources, databases, networked and virtual services, training, technical assistance, and technology-trained staff. Past decision-making regarding public access computer services has been based on such measures as number of users/sessions, length of time computers are in use, number of unfilled requests, and results of satisfaction surveys (e.g., Jaeger, Bertot, & McClure, 2007). However, little research has examined the relationship between free access to computers and outcomes that benefit individuals, families, and communities.

Working with libraries, users, and communities, and an expert committee of library leaders, researchers, and public policy organizations, the IMPACT research team is

  • documenting the positive and/or negative results from the presence or absence of public access computing resources and services in public libraries; and
  • developing robust and broadly applicable indicators for the impact of free access to computers, the Internet, and related services.

The researchers are specifically interested in outcomes and indicators related to seven domains: (1) civic engagement, (2) eCommerce, (3) education, (4) eGovernment, (5) health, (6) employment, and (7) social inclusion. These domains are relevant to broad policy goals and consistent with the public library mission. The ultimate aim is for these indicators to guide decision-making and generate public support for public access computing in public libraries. According to Hatry (2006), to be useful, indicators need to be:

  • specific (unique, unambiguous);
  • observable (practical, cost effective to collect, measurable);
  • understandable (comprehensible);
  • relevant (measures important dimensions, appropriate, related to the program, of significance, predictive, timely);
  • time bound (covers a specific period of time); and
  • valid (provides reliable, accurate, unbiased, consistent, and verifiable data).

The IMPACT studies will test and validate indicators to ensure their usefulness to libraries and policy makers and will work towards developing an outcome evaluation system that is cost effective and easy to use.

Research summary

To identify key areas of public access computing (PAC) impact and build outcome indicators, the IMPACT team is currently engaged in two projects:

  • A mixed methods analysis of PAC users funded by the Institute of Museum and Library Services (IMLS) consisting of a nationwide telephone survey to generate generalizable findings and four case studies to help us understand analytic findings and stimulate policy insights.
  • A nationwide web survey administered in public libraries. This initiative is funded by the Bill & Melinda Gates Foundation and will extend the value of the IMLS telephone survey by augmenting the data collection, allowing us to gain responses from individuals commonly missed by conventional survey methods (low income, youth, homeless), and linking user outcomes to library resources.

Together, these two efforts will establish and test candidate indicators and provide valuable information about users of public access computing.

IMLS mixed methods study

Mixed methods research has been used in research and program evaluation in many different fields, but has not been used extensively in library research (Fidel, 2008). Research involving both qualitative and quantitative methods provides researchers and policy advocates the opportunity to increase the validity of research, better understand conditionalities and context, and counteract biases inherent in any research method.

The IMLS mixed methods study will generate two rich sources of data. The telephone survey data will provide a representative picture of the prevalence of different types of people using public access computers and how it benefits them. The case studies will provide information about contextual influences on library outcomes, such as available resources or the policy environment, as well as testing the relevance of candidate indicators with library stakeholders. Analyzed together, they will capture a holistic picture of computer and Internet use in public libraries and test the validity of findings.

The telephone survey was developed through an iterative process with the research team and experts from the library, research, and public policy communities. The survey asks general questions about PAC use (frequency/alternative access), specific types of use across the seven domains, use on behalf of others, use of other library resources, and demographics. It has been extensively tested in field conditions and will be offered in English and Spanish.

The goal for the telephone survey is to complete 1,130 interviews with users of public access computing in libraries. This will allow us to estimate the number of people in the United States who use PAC resources at a +/- 3.5% margin of error at the 95% confidence interval. The survey will also provide a rich data source for examining how people use PAC and will allow us to evaluate candidate indicators.
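As a rough sanity check on these figures (a sketch assuming simple random sampling and the usual normal approximation; the study’s stated +/- 3.5% presumably also reflects its actual sampling design), the textbook margin-of-error formula for a proportion can be computed like this:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Margin of error for a proportion at the 95% confidence level,
    assuming simple random sampling (p = 0.5 is the conservative worst case)."""
    return z * math.sqrt(p * (1 - p) / n)

# Under these simplifying assumptions, 1,130 completed interviews give
# roughly a +/- 2.9% margin of error; the study's wider +/- 3.5% figure is
# consistent with a design effect from its sampling approach.
print(f"{margin_of_error(1130):.1%}")
```

The same formula shows why margins shrink slowly: quadrupling the sample size only halves the margin of error.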

Four case study visits will be conducted by teams of researchers from the University of Washington Information School. Teams will spend approximately one week at selected sites performing structured interviews and focus groups with PAC users, librarians, administrators, IT staff, persons from allied organizations (e.g., other community technology centers, city councils, senior centers, local schools and colleges) as well as mini-interviews with people at Internet cafes, tourist offices, hair salons, book stores, and other community focal points. In-depth library and community profiles gathered from additional data sources along with extensive research note-taking by members of the research team round out the case study methodology. These case study instruments and techniques were field tested with the Mount Vernon City Library in Washington State in August 2008.

Public library web survey

The web survey is essentially the same instrument as the telephone survey, with only minor variations to account for the different platform. The web survey has also been translated into Spanish. Web survey data will augment data gathered through the telephone survey and, coupled with data from the National Center for Education Statistics (NCES) on the selected libraries’ resources, will allow for the analysis of the relationship between available PAC resources and user outcomes. Because Internet surveys are an emerging method and may be a low-cost alternative for gathering patron-level data, the results from the web and telephone surveys will also be compared in order to gauge the effectiveness of web surveys for future library research.

Library systems selected for participation are being asked to link to the web survey through their websites during a designated two-week period. Libraries will be provided with a unique URL and a variety of methods for linking to the survey, including buttons, float-in/pop-up scripts, and HTML code. Libraries participating in the web survey will be provided a comprehensive report on the data collected through their public access system if enough respondents are obtained to ensure confidentiality and statistical validity of the results. If not, a comparative report of national averages of peer libraries will be provided instead.

The web survey includes 636 randomly selected library systems (of the country’s 9,198 administrative units across 50 states) and is expected to yield approximately 80,000 completed surveys with an overall response rate of approximately 12%. Once aligned with the telephone survey, data from the web survey are expected to be generalizable with a margin of error of approximately +/- 2.2% at the 95% confidence interval, offering substantial flexibility for the development of an insightful portrait of PAC users.

Outcomes and Dissemination

Through our replicable, transportable, and triangulated methodology, we will identify measurable indicators of the social, economic, personal, and/or professional impact of free access to computers, the Internet, and related services at public libraries, and of negative impact where service is weak or absent.

We will also provide new, reliable data on the benefits to individuals, families, and communities of these services and resources at public libraries.This will allow the public library community to document and use data collected about the impacts of PAC to assist with improvements in services, support local and national advocacy and funding efforts, and provide a solid basis for future research efforts in this domain.

Dissemination efforts will begin in June 2009, with the release of preliminary analysis from the IMLS mixed methods study. Results from the U.S. IMPACT Study web survey will be disseminated in August 2009. Along with library and information science researchers, our targeted audience includes librarians and policymakers who can use our research for evaluation and policy decision-making. Delivery of individualized library web survey reports is scheduled for September.

References & resources

References

Fidel, R. (2008). Are we there yet?: Mixed methods research in library and information science. Library & Information Science Research, 30(4), 265-272.

Hatry, H. P. (2006). Performance measurement: Getting results. Washington, D.C.: Urban Institute Press.

Jaeger, P. T., Bertot, J. C., & McClure, C. (2007). Public libraries and the Internet 2006: Issues, findings, and challenges. Public Libraries, 46(5), 71-78.

Indicators

Hatry, H. P., Lampkin, L., et al. (2003). Developing community-wide outcome indicators for specific services. Washington, D.C.: Urban Institute.

Lampkin, L., Winkler, M., Kerlin, J., Hatry, H., Natenshon, D., Saul, J., et al. (2006). Building a common outcome framework to measure nonprofit performance. Washington, D.C.: Urban Institute. Available at

Mixed method research

Greene, J. C., & Caracelli, V. J. (1997). Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms. San Francisco: Jossey-Bass Publishers.

Jick, T. D. (1979). Mixing qualitative and quantitative methods: Triangulation in action. Administrative Science Quarterly, 24(4), 602-611.

Web survey research

Sue, V. M., & Ritter, L. A. (2007). Conducting online surveys. Los Angeles: Sage Publications.

Thomas, S. J. (2004). Using web and paper questionnaires for data-based decision making: From design to interpretation of the results. Thousand Oaks, CA: Corwin Press.
