Notice

2007 American Community Survey Methods Panel Testing

This document has been published in the Federal Register. The PDF is the official electronic format.


ACTION:

Proposed collection; comment request.

SUMMARY:

The Department of Commerce, as part of its continuing effort to reduce paperwork and respondent burden, invites the general public and other Federal agencies to take this opportunity to comment on the proposed and/or continuing information collections, as required by the Paperwork Reduction Act of 1995, Public Law 104-13 (44 U.S.C. 3506(c)(2)(A)).

DATES:

Written comments must be submitted on or before July 17, 2006.

ADDRESSES:

Direct all written comments to Diana Hynek, Departmental Paperwork Clearance Officer, Department of Commerce, Room 6625, 14th and Constitution Avenue, NW., Washington, DC 20230 (or via the Internet at DHynek@doc.gov).


FOR FURTHER INFORMATION CONTACT:

Requests for additional information or copies of the information collection instrument(s) and instructions should be directed to Wendy D. Hicks, U.S. Census Bureau, Room 2027, SFC 2, Washington, DC 20233, (301) 763-2431 (or via the Internet at Wendy.Davis.Hicks@census.gov).


SUPPLEMENTARY INFORMATION:

I. Abstract

Given the rapid demographic changes experienced in recent years and the strong expectation that such changes will continue and accelerate, the once-a-decade data collection approach of a decennial census is no longer acceptable. To meet the needs and expectations of the country, the Census Bureau developed the American Community Survey (ACS). The ACS collects detailed socioeconomic data every month and provides tabulations of these data on a yearly basis. In the past, these sample data were collected only at the time of each decennial census. The ACS allows the Census Bureau to focus only on the basic demographic content in the 2010 Census, thus reducing operational risks in the decennial census as well as improving the accuracy and timeliness of the detailed housing and demographic items by collecting those data as part of the ongoing ACS.

The ACS includes an annual sample of approximately three million residential addresses in the 50 states and the District of Columbia, plus another 36,000 residential addresses in Puerto Rico. This large sample of addresses permits annual production of single-year estimates for areas with a population of 65,000 or more. Producing estimates at lower levels of geography requires aggregating data over three- and five-year periods. The ability to produce estimates at low levels of geography makes the ACS a highly useful source of data for Federal agencies for monitoring progress, administering programs, and so forth. However, collecting data from such a large sample of addresses also requires that the Census Bureau continue to review and test methods for containing data collection costs. The 2007 ACS Methods Panel will include two tracks of research, one addressing content and another addressing cost containment strategies.

The first track of the 2007 Methods Panel will test a new question that collects information about a person's primary field of study for his or her bachelor's degree. Additionally, this track of the Methods Panel will include modifications to the basic demographic questions in all three modes of data collection: mail, Computer Assisted Telephone Interviewing (CATI), and Computer Assisted Personal Interviewing (CAPI). In the mail operation, the test will compare two different layouts of the basic demographic questions, a sequential person design and a matrix design. The sequential person design repeats each question and answer category for each person. The matrix layout lists people down the left side of the form and questions across the top. The modifications to the CATI and CAPI basic demographic questions reflect the first test implementation of the draft Decennial Census guidelines for improving the consistency of the basic demographic questions across modes of collection (i.e., mail, CATI, CAPI). The modifications to the CATI and CAPI instruments will include a comparison of a topic-based approach versus a person-based approach to collecting the basic demographic questions. A topic-based implementation asks a question for everyone in the household before moving to the next question. For example, the interviewer would ask the gender of the first person, then the second person, and so on for everyone in the household. Once that question is answered for everyone, the interviewer moves to the next question and asks it for each person in the household. In contrast, a person-based implementation asks all the basic demographic questions for one person, then proceeds to the next person, repeating all of the basic demographic questions.
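The contrast between the two interviewing approaches is essentially a difference in loop order. A minimal sketch follows; the question list and household roster are hypothetical illustrations, not the actual CATI/CAPI instrument content:

```python
# Topic-based vs. person-based question ordering, sketched as two loop orders.
# The questions and household roster below are hypothetical examples.

QUESTIONS = ["name", "sex", "age", "relationship"]
HOUSEHOLD = ["Person 1", "Person 2", "Person 3"]

def topic_based(questions, household):
    """Ask each question for every person before moving to the next question."""
    return [(q, p) for q in questions for p in household]

def person_based(questions, household):
    """Ask every question for one person before moving to the next person."""
    return [(q, p) for p in household for q in questions]

# Both orderings cover exactly the same question-person pairs;
# only the sequence in which the interviewer asks them differs.
assert set(topic_based(QUESTIONS, HOUSEHOLD)) == set(person_based(QUESTIONS, HOUSEHOLD))
```

The topic-based order opens with the "name" question for every household member in turn, while the person-based order walks Person 1 through all four questions before turning to Person 2.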

The second track of the 2007 Methods Panel will include two components, both of which test different methods for increasing mail response in the ACS, the least expensive mode of data collection. The first component tests whether the ACS can increase mail response by sending an additional mailing piece to mail nonrespondents for whom we do not have a phone number and whom we therefore cannot include in the CATI operation. The second component tests whether we can increase mail response in Puerto Rico, or in targeted areas of the United States with the lowest levels of mail cooperation, by mailing a brochure or other mailing piece that incorporates motivational messages and other promotional or outreach techniques.

First Track

As noted, in this first track, the ACS will test one new content item in all three modes of collection, as well as modifications to the basic demographic questions in the CATI and CAPI instruments. Testing of the new content item reflects the recent ACS Content Policy developed jointly by the Census Bureau and the Office of Management and Budget (OMB). As stated in that policy (available upon request), OMB works with the Census Bureau to determine whether new content proposed by the Federal agencies will be considered for inclusion in the ACS. If the OMB and the Census Bureau determine the ACS may be an appropriate vehicle for collecting the information, then the Census Bureau will design and implement a testing program to assess the quality of the data collected by the proposed question. OMB will consider the results of that testing in deciding whether the ACS should include the proposed content, and when the ACS should add the new content, if accepted.

In 2007, the ACS Methods Panel will test a question designed to identify the field of study in which a person received his or her bachelor's degree. The National Science Foundation proposed the addition of this content for the purpose of creating a sampling frame for the National Survey of College Graduates (NSCG), which historically used educational attainment and industry and occupation data from the decennial long form to build its sampling frame. The ACS would facilitate more timely updates to the sampling frame for the NSCG. Additionally, the inclusion of a ‘field of degree’ question on the ACS would reduce some of the noise in the subsequent sampling frame that resulted from using a proxy measure, occupation type, from the decennial census. Lastly, including a ‘field of degree’ question on the ACS would allow the Department of Education, specifically the National Center for Education Statistics (NCES), to create direct estimates of specific fields of study useful to NCES programs.

As noted, this test will also compare a sequential person design for the basic demographic questions on the mail form, which is comparable to the person-based approach in the CATI/CAPI modes, with a matrix layout on the mail form, which is comparable to the topic-based approach to collecting the basic demographic questions in the CATI/CAPI operations. (The ‘field of degree’ question falls in the detailed demographic section of the instrument and thus is not affected by the topic-based versus person-based comparison.) Testing both a topic- and a person-based instrument for the basic demographic questions reflects alternative implementations of the draft Census Bureau guidelines for writing questions in a manner that should facilitate consistent responses regardless of the mode in which a person participates. This test will also include a few other slight modifications to the CATI and CAPI versions of the questions. For example, the CATI and CAPI questions will also manipulate how examples and long lists of response categories are presented in interviewer-administered modes of collection.

Testing in this track includes four stages: (1) question proposal; (2) question development and pretesting; (3) field test implementation; and (4) recommendation for final content. The first stage represents the proposal from the National Science Foundation, accepted by the Census Bureau and OMB, to include a ‘field of degree’ question for testing on the ACS. The second stage reflects a series of cognitive laboratory pretesting studies conducted by the Statistical Research Division within the Census Bureau as well as through NSF contracts with outside experts. These pretesting studies will identify two versions of the ‘field of degree’ question and the topic-based and person-based versions of the CATI/CAPI implementation of the basic demographic questions.

In the third stage, the field test will include a national sample field test (excluding Hawaii, Alaska and Puerto Rico) of approximately 30,000 residential addresses. (The test will not include Group Quarters.) Half of these addresses will receive one version of the ‘field of degree’ question and the other half will receive a second version of the question. Within each of those treatments, half the sample will receive a matrix layout in the mail mode or the topic-based implementation of the basic demographic questions in the CATI/CAPI modes. The other half will receive the sequential person design in the mail mode or the person-based implementation in the CATI/CAPI modes.
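The successive half-splits described above produce a crossed design with four equal treatment cells. A back-of-the-envelope check (the cell labels are ours, chosen for illustration, not the Census Bureau's):

```python
# Allocation of the 30,000-address field test into its four crossed
# treatment cells: field-of-degree version x questionnaire design.
# Cell labels are illustrative only.

TOTAL_ADDRESSES = 30_000

cells = {}
for degree_version in ("field-of-degree v1", "field-of-degree v2"):
    for design in ("matrix/topic-based", "sequential/person-based"):
        # Half the sample per question version, half of that per design.
        cells[(degree_version, design)] = TOTAL_ADDRESSES // 2 // 2

# Four cells of 7,500 addresses each, exhausting the full sample.
assert sum(cells.values()) == TOTAL_ADDRESSES
```

Each of the four cells therefore holds 7,500 addresses, which is the effective sample available for any comparison that crosses both factors.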

The data collection methodology for this test will very closely replicate the current ACS data collection methodology. This test will use the same mailing strategy (advance letter, first questionnaire mailing package, reminder postcard, replacement questionnaire mailing package and availability of Telephone Questionnaire Assistance (TQA)), the same CATI data collection operational methods and the same CAPI data collection operational methods as the current ACS. Mail data collection will occur in March of 2007, followed by CATI in April and CAPI in May, using the same data collection schedules as the March ACS panel. The automated instruments will include both English and Spanish language versions.

However, unlike the ACS, the test will not include the Telephone Edit Follow-Up (TEFU) operation used to follow up with mail respondents who did not fully complete their form or whose households have six or more people. For evaluation purposes, we will follow up with all respondents to complete a CATI Content Follow-Up (CFU) interview; if we also conducted a TEFU operation, we could potentially contact the same household three times for one survey. Thus, since the CFU better serves the analytical needs of the project, we will drop the TEFU and conduct only the CFU operation. The CFU will re-ask the same version of the basic demographic questions as asked in the initial collection (topic- or person-based), as well as the same ACS education questions, including the field of degree question, and some additional probing questions regarding the reported field of degree for each person with a bachelor's degree or higher.

The final stage in this track of the 2007 Methods Panel research includes data analysis and recommendations to OMB regarding whether the tested content has sufficient data quality for inclusion in the ACS. While OMB will make the final decision on whether to include the proposed content on the ACS, the results of this research will help inform that decision.

Second Track

As noted above, the second track of the 2007 Methods Panel will include two components, both of which test different methods for increasing mail response in the ACS, the least expensive mode of data collection. The first component tests whether the ACS can increase mail response by sending an additional mailing piece to mail nonrespondents for whom we do not have a phone number and whom we therefore cannot include in the CATI operation. Since we do not have a phone number for these sample cases, the ACS can only collect data from them via CAPI, the most expensive mode of data collection. This study will test three different types of mailing pieces and measure which type yields the highest increase in response for the non-CATI-eligible universe, given the cost of the additional mailing piece. We will mail to approximately 18,000 sample housing-unit addresses, 6,000 in each treatment, sampling only from addresses for which our frame does not include a phone number. This study will not include the CATI or CAPI data collection operations. Rather, the test will assess whether we get enough response to offset the costs of the additional mailing. The timing of this test will coincide with the May 2007 ACS panel.

The second component of this track tests whether we can increase mail response in Puerto Rico, or in targeted areas of the United States with the lowest levels of mail cooperation, by including in the questionnaire mailing package a brochure or other mailing piece that incorporates motivational messages and other promotional or outreach techniques. The test will manipulate the content of the motivational messages (for both Puerto Rico and the U.S.). We will test the motivational messages in all of Puerto Rico, but for the stateside component we will apply targeting criteria that consider characteristics such as the proportion of city-style addresses, population size, the proportion of linguistically isolated persons (i.e., persons who do not speak English well), and vacancy rates. We anticipate selecting three to four targeted areas for inclusion in the stateside component of the test.

In terms of the motivational messages, we will include one of two versions of an insert in the questionnaire mailing packages that provides information about how information from the ACS will benefit, or has already benefited, the recipient's community. For the U.S., one version will reflect wording tailored specifically to the targeted geographic area. The second version may use slightly more general language that could apply to a larger geographic area, or may focus on different benefits for the targeted geographic area. For Puerto Rico, we will test two versions relevant to the entire island. Staff from the Census Bureau will work with state and local data users to identify how information from the survey has benefited or will benefit the targeted area in order to develop the insert. Additionally, we will conduct focus groups to help identify the most meaningful content for the messages.

Like the previous test in this track, this test aims to increase mail response as a way to help contain data collection costs. Thus, this test will only collect data in the mail phase. We will first implement the test in targeted areas of the U.S., coinciding with the July ACS panel, using the same timing for each of the mailing pieces. Implementation in Puerto Rico will coincide with the September Puerto Rico Community Survey (PRCS) panel, again using the same timing for each of the mailing pieces. For both Puerto Rico and the targeted U.S. locations, the comparison group will come from the production ACS/PRCS in the same geographic area.

We anticipate mailing to about 6,000 addresses in Puerto Rico with 3,000 in each of the different treatment groups for the motivational message. (The monthly sample in Puerto Rico is about 3,000.) While the difference in response rate, if any, will likely not reach significance with a sample of only 3,000 housing units, we did not want to test this with a sample larger than the current monthly sample of 3,000 for the production PRCS. Rather, we will estimate the impact on the annual PRCS response and associated costs, based on what we observe in this single panel test.

In the U.S., we will identify several areas for implementing the test based on our targeting criteria. The exact number of areas included in the test will depend on the population size of each area fitting our targeting criteria. We anticipate needing about 10,000 sampled addresses for each of the treatment conditions (i.e., types of motivational messages). However, 10,000 sampled addresses in any one area for a single panel month would likely affect eligibility for production ACS sampling in that area. Thus, we anticipate selecting several areas that meet the targeting criteria, selecting a sample close in size to the ACS sample for each area, and then combining the analysis across the selected areas to reach a sample of about 10,000 for each treatment condition. Since we will combine the analysis across several selected areas meeting the targeting criteria, the motivational message treatments will reflect the same general type of message across the areas, but we will tailor the specifics of the message to each area. In other words, if we identify four different areas for inclusion in the test, all four areas will receive an insert in their questionnaire mailing packages that identifies how the ACS has benefited their specific community. The other treatment group in those areas will receive an insert in their questionnaire mailing packages reflecting any alternative message content suggested by the focus group pretesting (e.g., how the ACS benefits the state in general).

II. Method of Collection

As noted above, the testing in the first track will include all three modes of data collection (mail, CATI, and CAPI) as well as a Content Follow-Up (CFU) reinterview. Respondents in any of the three modes of data collection for whom we have a telephone number will go to the CFU approximately two weeks after we receive their initial response. The start and duration of the mail, CATI, and CAPI data collection stages will mirror the production ACS. The CFU reinterview will start approximately two weeks after receipt of the first mail returns and continue for approximately two weeks after the closeout of the CAPI operations.

In the second track, both tests are mail-only tests, excluding the CATI and CAPI data collection operations. The test of an additional contact for mail nonrespondents for whom we do not have a phone number will differ from the production mailing strategy in that we will mail one of three different additional pieces to the test universe. The test of the motivational messages will use the same timing and number of mail contacts as the production ACS, but will include one of two different motivational inserts sent as part of both the initial and replacement questionnaire mailing packages.

III. Data

OMB Number: Not available.

Form Number: The first track will use ACS-1(X)C1(2007) and ACS-1(X)C2(2007). The second track's additional contact test will use ACS-1(X)M1(2007) for the questionnaire, ACS-0018(L)M1(2007) for a letter, and ACS-0019(P)M1(2007) for a postcard. The second track's motivational messages test will use ACS-1(X)M2(2007) for the mail questionnaire and ACS-0091(L)M2(2007) for one type of insert, with 0091 replaced by the numbers 0092-0099 for each of the other treatments.

Type of Review: Regular.

Affected Public: Individuals and households.

Estimated Number of Respondents: In the first track, during the period March 1 through May 31, 2007, we plan to contact 30,000 residential addresses, and approximately 20,000 responding addresses will be contacted for Content Follow-Up. In the second track, we plan to mail to 18,000 households in the U.S. in May 2007, to 20,000 households in targeted areas of the U.S. in July 2007, and to 6,000 households in Puerto Rico in September 2007.

Estimated Time per Response: 38 minutes per residential address; 12 minutes per residential address for Content Follow-Up.

Estimated Total Annual Burden Hours: 50,867.
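The burden total can be reconstructed from the figures above: 30,000 first-track addresses plus 44,000 second-track addresses at 38 minutes each, plus 20,000 Content Follow-Up interviews at 12 minutes each. This sketch assumes the 38-minute per-address estimate applies to the mail-only second-track tests as well:

```python
# Reconstructing the estimated total annual burden hours from the
# respondent counts and per-response times stated in this notice.
# Assumption: the 38-minute estimate covers the second-track mail tests too.

MINUTES_PER_RESPONSE = 38
MINUTES_PER_CFU = 12

first_track = 30_000                      # March-May 2007 field test addresses
cfu_interviews = 20_000                   # Content Follow-Up reinterviews
second_track = 18_000 + 20_000 + 6_000    # mail-only tests (U.S. and Puerto Rico)

total_minutes = ((first_track + second_track) * MINUTES_PER_RESPONSE
                 + cfu_interviews * MINUTES_PER_CFU)
total_hours = round(total_minutes / 60)

print(total_hours)  # 50867, matching the stated burden estimate
```

The result, 50,867 hours, matches the figure stated above, which suggests the second-track mailings are indeed carried at the full 38-minute rate.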

Estimated Total Annual Cost: Except for their time, there is no cost to respondents.

Respondent Obligation: Mandatory.


Authority: 13 U.S.C. 141 and 193.


IV. Request for Comments

Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden (including hours and cost) of the proposed collection of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques or other forms of information technology.

Comments submitted in response to this notice will be summarized and/or included in the request for OMB approval of this information collection; they also will become a matter of public record.


Dated: May 11, 2006.

Madeleine Clayton,

Management Analyst, Office of the Chief Information Officer.


[FR Doc. E6-7423 Filed 5-15-06; 8:45 am]

BILLING CODE 3510-07-P