American Institutes for Research®
Report on Corporation for Public Broadcasting’s 2008 “Ready To Learn” Marketing Outreach Assessment
Prepared for
Corporation for Public Broadcasting
Prepared by
American Institutes for Research
Draft
December 15, 2008
“American Institutes for Research” is a registered trademark. All other brand, product, or company names are trademarks or registered trademarks of their respective owners.
10720 COLUMBIA PIKE, SUITE 500|SILVER SPRING, MD 20901-4400|TEL 301 592 8600|FAX 301 593 9433|WEBSITE WWW.AIR.ORG

Contents

Contents 2

Introduction 3

Methods 4

Findings 5

A. Survey Returns and Response Rates by Market, Cohort, and Survey Mode 5

Table 1. Surveys Received by Survey Mode and Target/Non-Target ZIP Codes 6

Table 2. Response Cards and Response Rates Within Target ZIP Codes by Market 7

B. Describing the Dataset: Demographic Characteristics of Respondents 8

C. Significant Demographic Differences by Survey Mode, Cohort, and Target Versus Non-Target ZIP Codes 9

Table 3. Contingency Table of Gender by Cohort 9

Table 4. Contingency Table of Federal Poverty Level Status by Survey Mode 10

Table 5. Contingency Table of Education by Survey Mode 10

Table 6. Contingency Table of Having a 2-Year-Old Child by Survey Mode 11

Table 7. Contingency Table of Having a 6- to 8-Year-Old Child by Survey Mode 11

Table 8. Contingency Table of Federal Poverty Level Status by Target/Non-Target ZIP Code 12

D. Viewership of the RTL-Supported PBS Television Shows 12

Figure 1. Past Week Viewing Behaviors for RTL-Supported PBS Television Shows 13

E. Parental Involvement 13

F. Ready To Learn and Raising Readers Awareness 14

Table 9. Contingency Table of Campaign Awareness by Cohort 14

G. Parent/Caregiver Attitudes About Early Literacy Learning 15

Discussion 16

Introduction

Under contract with the Corporation for Public Broadcasting (CPB) and as part of a larger effort funded by the U.S. Department of Education’s Ready To Learn (RTL) initiative, the American Institutes for Research (AIR) designed a 26-question survey to identify the reach of the 2008 RTL marketing outreach activities and assess several factors related to the messaging of the RTL project and the PBS KIDS Raising Readers campaign it encompasses.

PBS KIDS Raising Readers is a national literacy campaign that uses educational media to help parents and other caregivers help young children get ready to read. It is managed by CPB, the Public Broadcasting Service (PBS), and the Ready To Learn Partnership. This innovative 5-year project consists of two programming awards (CPB/PBS and WTTW/Ready To Learn Partnership) and one outreach award (CPB/PBS) designed to locate and connect with children from low-income families, their parents, and caregivers in 20 target markets. Markets were selected in two cohorts—Cohort 1 in 2007 and Cohort 2 in 2008—after careful review of factors including children’s reading scores, ethnic and geographic diversity, and concentration of children from low-income families. Local public television stations in these markets are important partners in the outreach effort. Each public television station was actively involved in selecting one or more target ZIP codes within its viewing area. The target ZIP codes are the intended focus of heightened RTL media events and outreach activity in each market.

It was the goal of the 2008 marketing outreach assessment to obtain information from the intended RTL audience about the following parent/caregiver attitudes, perceptions, and behaviors that could be impacted by the messaging of Raising Readers media and outreach activities:

o  Parent/caregiver attitudes about caregiver role in early literacy learning,

o  Parent/caregiver perceptions about the kinds of activities that encourage and foster literacy skills in young children, and

o  Parent/caregiver behaviors suggested by the “Any Time is Learning Time” (ATLT) messaging and marketing outreach activities (e.g., playing word games in everyday activities); as well as adherence to more “traditional”/formal literacy behaviors (e.g., reading books with children).

In August 2008, the survey was provided to prospective respondents from both Cohort 1 and Cohort 2 RTL markets to examine “point-in-time” attitudes, perceptions, and behaviors for:

o  The set of RTL markets with 1 year or more of campaign exposure and marketing outreach to date (i.e., Cohort 1 stations), including one cycle of paid media activities, and

o  The set of RTL markets with little or no campaign exposure or marketing outreach to date (i.e., Cohort 2 stations).

This approach allows for comparing markets with varying degrees of exposure to the Raising Readers messaging and RTL-paid media and outreach support, as well as provides a baseline for Cohort 2 markets upon which to assess change over the course of the RTL grant. Two additional rounds of marketing outreach assessment are planned for Cohort 2 markets in 2009, one prior to and one following the 2009 paid media activities being spearheaded by Ampersand.

Methods

AIR developed an initial set of questions for the marketing outreach assessment that included the following types of questions:

o  Demographic questions:

o  Gender

o  Date of birth

o  Education

o  ZIP code

o  Number of children/children’s ages

o  Income range

o  Attitude/perception measures:

o  Public television’s role in reading readiness

o  Role of literacy-related activities in reading readiness

o  Parental role in preparing children for reading

o  Parental confidence/ability to fulfill this role

o  Behavior measures:

o  Viewing of public television and the four RTL shows

o  New viewership of the four RTL shows

o  Parents’ literacy-related activities with children

o  Visiting ReadyToLearnReading.org Web site

o  Exposure measures:

o  Received RTL promotional DVD

o  Familiarity with Raising Readers campaign

AIR’s original survey (included in Appendix A) was pretested in March 2008 on a small convenience sample of 24 parents in the Baltimore target ZIP codes, drawing on Maryland Public Television’s (MPT) partnership with its Judy Center childcare partners. Cognitive interviews were also conducted with six Judy Center parents who did not take the written survey, to ensure that questions were being understood as intended by the target audience. The cognitive interviews suggested several enhancements to the wording of the original items. The pretest revealed potential ceiling effects for several of the attitude questions (i.e., nearly all parents responding with the most positive response option), resulting in further revisions to the original question set. The final set of 26 items (included in Appendix B) was then translated into Spanish (included in Appendix C); both the final English version and the final Spanish translation were included in each survey packet provided to prospective parent or caregiver respondents.

A dual-mode distribution method was used to get the surveys into the hands of prospective RTL households. The primary survey administration was via U.S. mail, using a mailing list obtained from a marketing firm/mail house. The list company was asked to identify all households in the Cohort 1 and Cohort 2 RTL stations’ target ZIP codes with a household income under $35,000 per year and children ages 8 and under in the home. The mail house identified 5,647 such households in Cohort 1 target ZIP codes and 3,221 in Cohort 2 target ZIP codes. Altogether, 8,868 surveys were prepared and sent to prospective respondents via U.S. mail in the primary survey administration mode. Surveys were mailed in July and August 2008. To attract interest in the mail survey, AIR obtained permission to include colorful licensed character artwork on the outside of the survey mailing envelope (i.e., the character Dog from PBS KIDS WordWorld). Additionally, AIR worked with CPB and the station partners on an “advance” message informing parents/caregivers in the target ZIP codes that they might receive an RTL survey in the mail at their home address.

To help ensure a satisfactory response rate, a secondary survey administration mode enlisted the RTL stations to recruit partners to distribute 75 surveys per station to parents in the RTL target audience. Thus, an additional 1,500 survey packets were prepared (75 for each of the 20 stations) and distributed to stations by AIR in August 2008. The cutoff date for data collection, originally set at October 1, was later extended to October 31, 2008, to give stations additional time to follow up with their partners regarding their survey distribution efforts. Surveys were returned via postage-paid envelope to the survey firm that fielded the survey for AIR. Data from the surveys were entered via optical scanning, and the final file was cleaned and prepared by the survey firm into an analytic data file. AIR further worked with the data to classify, where possible, various non-target ZIP codes as belonging to a particular market. Appendix D outlines the procedures followed for classifying non-target ZIP codes.

A total of 792 surveys were received by the October 31, 2008, cutoff date. After all cleaning and preparation of the electronic data file, the data were analyzed using SAS for Windows. In addition to descriptive analyses, the data were tested for significant relationships between several variables of interest using t-tests for continuous variables and chi-square tests of independence for relationships among categorical variables. Chi-square tests were accompanied by measures of effect size to determine the strength and meaningfulness of any statistically significant relationships. In most cases, the measure of effect size used was Cramér’s V, since the chi-square tests were typically applied to contingency tables larger than 2×2. For 2×2 contingency tables, the phi coefficient was used as the measure of effect size. Effect sizes as measured by phi or Cramér’s V can range in value from .00 to 1.00. Throughout this report, we generally interpret effect sizes less than .20 as signifying no real relationship of note. Effect sizes of .20 to .40 are considered to demonstrate “weak” relationships. Effect sizes ranging from .41 to .60 are considered to be “moderate” in strength, while effect sizes of .61 or greater indicate strong relationships. Demographic variables and variables measuring key attitudes, perceptions, and behaviors were analyzed and tested for significant differences by RTL cohort, target/non-target ZIP codes, and survey administration mode (i.e., mail versus partner distribution). The number of Spanish-language returns was too small to permit analysis by language of respondent. Where appropriate, multivariate analyses such as multiple regression and logistic regression also were utilized.
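To illustrate how these effect sizes are computed and mapped onto the interpretation bands above, the following sketch computes Cramér’s V for a contingency table of counts. The table used in the example is hypothetical, not data from this survey; the interpretation bands mirror those stated in the text.

```python
import math

def cramers_v(table):
    """Cramér's V effect size for an r x c contingency table of counts.
    For a 2x2 table, this value equals the (absolute) phi coefficient."""
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    # Pearson chi-square statistic (no continuity correction)
    chi2 = sum(
        (obs - row_tot[i] * col_tot[j] / n) ** 2 / (row_tot[i] * col_tot[j] / n)
        for i, row in enumerate(table)
        for j, obs in enumerate(row)
    )
    k = min(len(table), len(table[0])) - 1  # min(rows, cols) - 1
    return math.sqrt(chi2 / (n * k))

def interpret(v):
    """Strength bands as used in this report."""
    if v < 0.20:
        return "no relationship of note"
    if v <= 0.40:
        return "weak"
    if v <= 0.60:
        return "moderate"
    return "strong"

# Hypothetical 2x2 table of counts (not from the report)
v = cramers_v([[40, 60], [70, 30]])
print(round(v, 3), interpret(v))  # prints "0.302 weak"
```

In practice a statistical package such as SAS (used for this report) reports the chi-square p-value alongside these statistics; the sketch isolates only the effect-size calculation.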

Findings

A. Survey Returns and Response Rates by Market, Cohort, and Survey Mode

Table 1 provides the breakdown of the 792 survey responses received by both survey mode (i.e., mail survey versus “supplemental” data collection) and ZIP code. As shown in the table, the majority of responses received were from the mail survey:

o  569 of the 792 surveys (71.8%) were returns from the mail survey.

o  223 of the 792 surveys (28.2%) were returns from the supplemental data collection through the stations’ childcare partners.

Table 1. Surveys Received by Survey Mode and Target/Non-Target ZIP Codes

Survey Mode                  | Not RTL target ZIP (Col %) | RTL target ZIP (Col %) | Total
Mail survey                  | 89 (56.0)                  | 480 (75.8)             | 569
  Cohort 1                   | 67 (42.1)                  | 295 (46.6)             | 362
  Cohort 2                   | 22 (13.8)                  | 185 (29.2)             | 207
Supplemental data collection | 70 (44.0)                  | 153 (24.2)             | 223
Total surveys                | 159 (100)                  | 633 (100)              | 792

After examining all study variables for any significant differences by survey mode, the respondent’s ZIP code and other survey data were used to classify all “supplemental” surveys as belonging to either Cohort 1 or Cohort 2. Because the number of returns was already lower than desired by AIR, this method permitted us to maximize the number of usable surveys for inclusion in the analyses by retaining responses from both target and non-target ZIP codes, regardless of survey administration mode. As shown in Table 2, after recoding the supplemental survey responses, the revised number of survey responses by cohort was as follows:

o  481 of the 792 surveys (61%) were classified as being from Cohort 1 respondents, and

o  311 of the 792 surveys (39%) were classified as being from Cohort 2 respondents.

The number of returned surveys and response rates, by market and RTL cohort, is provided in Table 2 for both the U.S. mail survey and the supplemental data collection method. Mail survey response rates were comparable across the two cohorts, ranging from 3.7% to 11.8% for individual markets. The average mail survey response rate in each cohort was 6.4%, typical of direct mail replies. Although negligible in about half of the markets, response rates for the 75 surveys per market distributed through the “supplemental” data collection method reached roughly 50% in three of the 20 markets and at least 20% in seven markets. The overall response rate for the supplemental data collection method was 15.9% in Cohort 1 markets and 13.6% in Cohort 2 markets.
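The response rates above follow directly from surveys returned divided by surveys distributed. A minimal check, using the Cohort 1 mail-survey totals reported in Table 2:

```python
# Response rate = surveys returned / surveys distributed.
# Figures are the Cohort 1 mail-survey subtotals from Table 2.
sent, returned = 5647, 361
rate = 100 * returned / sent
print(f"{rate:.1f}%")  # prints "6.4%"
```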


Table 2. Response Cards and Response Rates Within Target ZIP Codes by Market

Cohort | Market            | Mail surveys sent (target ZIP codes) | Mail surveys returned | Mail response rate (%) | Supplemental packages sent | Supplemental surveys returned | Supplemental response rate (%) | Row total
1      | Baltimore, MD     |   327 |  25 |  7.6 |  75 |   1 |  1.3 |  26
1      | Jackson, MS       | 1,065 |  62 |  5.8 |  75 |   6 |  8.0 |  68
1      | San Antonio, TX   |   893 |  50 |  5.6 |  75 |  23 | 30.7 |  73
1      | SF/Oakland, CA    |   243 |  10 |  4.1 |  75 |   0 |  0.0 |  10
1      | Toledo, OH        | 1,110 |  73 |  6.6 |  75 |  37 | 49.3 | 110
1      | Birmingham, AL    |   530 |  44 |  8.3 |  75 |   0 |  0.0 |  44
1      | Buffalo, NY       |   367 |  29 |  7.9 |  75 |   1 |  1.3 |  30
1      | Carbondale, IL    |   259 |  26 | 10.0 |  75 |  40 | 53.3 |  66
1      | San Diego, CA     |   729 |  36 |  4.9 |  75 |   1 |  1.3 |  37
1      | State College, PA |   124 |   6 |  4.8 |  75 |  10 | 13.3 |  16
Cohort 1 subtotal          | 5,647 | 361 |  6.4 | 750 | 119 | 15.9 | 480
2      | Phoenix, AZ       |   365 |  17 |  4.7 |  75 |   0 |  0.0 |  17
2      | Washington, DC    |   401 |  18 |  4.5 |  75 |   0 |  0.0 |  18
2      | Pensacola, FL     |   502 |  30 |  6.0 |  75 |   0 |  0.0 |  30
2      | Tallahassee, FL   |   413 |  26 |  6.3 |  75 |  35 | 46.7 |  61
2      | Johnston, IA      |    81 |   3 |  3.7 |  75 |  17 | 22.7 |  20
2      | Baton Rouge, LA   |   312 |  22 |  7.1 |  75 |  25 | 33.3 |  47
2      | Nashville, TN     |   429 |  32 |  7.5 |  75 |   0 |  0.0 |  32
2      | West Tennessee    |   510 |  44 |  8.6 |  75 |  18 | 24.0 |  62
2      | Norfolk, VA       |   191 |  12 |  6.3 |  75 |   5 |  6.7 |  17
2      | Charleston, WV    |    17 |   2 | 11.8 |  75 |   2 |  2.7 |   4
Cohort 2 subtotal          | 3,221 | 206 |  6.4 | 750 | 102 | 13.6 | 308
Total from mailed surveys: 567 | Total from supplemental surveys: 221 | Grand total: 788*

* Full ZIP code could not be determined for two “mail” surveys and two “supplemental” surveys; total N = 792