
Notice

Submission for OMB Review; Comment Request



DOC will submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection of information under the provisions of the Paperwork Reduction Act (44 U.S.C. chapter 35).

Agency: U.S. Census Bureau.

Title: American Community Survey, 2007 Methods Panel.

Form Number(s): ACS-1(2005), ACS-1(X)Seq, ACS-1(X)Pro.

Agency Approval Number: None.

Type of Request: New collection.

Burden: 46,000 hours.

Number of Respondents: Postage Test—20,000; Grid vs. Sequential Test—40,000; Degree Test Reinterview—32,000.

Avg. Hours per Response: Questionnaires—38 minutes; Reinterview—15 minutes.

Needs and Uses: The U.S. Census Bureau requests authorization from the Office of Management and Budget (OMB) to conduct the American Community Survey 2007 Methods Panel tests.

Given the rapid demographic changes experienced in recent years and the strong expectation that such changes will continue and accelerate, the once-a-decade data collection approach of a census is no longer acceptable as a source for the housing and socioeconomic data collected on the census long-form. To meet the needs and expectations of the country, the Census Bureau developed the American Community Survey (ACS). This survey collects detailed socioeconomic data every month and provides tabulations of these data on a yearly basis. The ACS allows the Census Bureau to provide more timely and relevant housing and socioeconomic data while also reducing operational risks in the census by eliminating the long-form historically given to one in every six addresses.

Full implementation of the ACS includes an annual sample of approximately three million residential addresses in the 50 states and the District of Columbia, and another 36,000 addresses in Puerto Rico. A sample this large allows for annual production and release of single-year estimates for areas with a population of 65,000 or more. Lower levels of geography require aggregates of three and five years' worth of data in order to produce estimates of reliability comparable to the census long-form. However, an ongoing data collection effort with an annual sample of this magnitude requires that the ACS continue to research possible methods for maintaining, if not reducing, data collection costs. If costs increase, the ACS would have to consider reductions in sample size, thus reducing the reliability of the data as compared to the reliability of the census long-form, especially at lower levels of geography.

One of the tests included in the 2007 Methods Panel addresses a method for potentially reducing data collection costs. In this test, we will implement the same mailing strategy as ACS production: each sampled address receives a prenotice letter, an initial questionnaire (ACS-1(2005)) packet, and a reminder postcard; those who have not responded by a certain date receive a second questionnaire packet. For this test, however, we will send the prenotice letter using standard postage. Current ACS production procedures send all mail pieces at the first-class postage rate. Using standard rather than first-class postage for this mail piece could save the ACS approximately $230,000 in data collection costs each year. The test will evaluate whether the use of standard mailing for the prenotice letter impacts mail response rates.

A second test included in the 2007 Methods Panel addresses another aspect of ACS data collection relative to the census. Both the ACS and the census collect a core set of basic demographic questions (age and date of birth, gender, relationship, Hispanic origin, and race). However, the 2010 Census will use a different format (similar to the format for the 2000 Census) from the format used by the ACS for collecting this information on the mail questionnaire. The census format, referred to as a sequential person design, creates a column for each person that includes each question and its associated response categories. The ACS format, referred to as the grid design, lists the names of all persons down the left side of the form and the questions across the top of the page; the response categories fall in the "cells" created by crossing the person names with the questions.

This second test will compare the sequential person (ACS-1(X)Seq) and grid (ACS-1(X)Pro) formats for collecting the basic demographic information to measure the impact on data quality, specifically unit and item nonresponse rates, response distributions, and within-household coverage. The outcome of the test will determine whether the different formats might contribute to differences in the estimates for the basic demographic questions. If the format does influence how people respond to these basic demographic questions, the Census Bureau will decide whether the ACS should alter its format for collecting these data items to more closely reflect the census-style format prior to the 2010 Census.

The 2007 Methods Panel may also include a third test, contingent on the funding allocations in the President's budget for 2007. This third test will measure and compare the data quality of two versions of new content proposed by the National Science Foundation for inclusion on the ACS. The proposed content asks about the major field in which a person received his or her bachelor's degree. In this test, half the sample will answer an open-ended question reporting the actual degree received. The other half of the sample will provide their field-of-degree information by answering a series of yes/no questions. The test will assess which version, if either, results in data of sufficient quality for inclusion on the ACS.

Given that the ACS collects data every day of the year in every county in the U.S. and in every municipio in Puerto Rico, the ACS provides an opportunity to produce data not available from any other source or survey at the same low levels of geography. The Census Bureau, in conjunction with the Office of Management and Budget, has a policy for determining whether new content or questions will be added to the ACS. As part of the content determination process, the Census Bureau must test the proposed content to determine whether the ACS can produce data of sufficiently high quality for the proposed topic. In all likelihood, this test will fold into the grid versus sequential form design test noted above in an effort to reduce cost and burden. The test would, however, include a Content Follow-Up Reinterview of approximately 80 percent of the sample. The Census Bureau and OMB will consider these results in deciding whether to include the new content, per the Census Bureau's Policy on New Content for the ACS.

In order to provide data of reliability comparable to the census long-form at low levels of geography (e.g., the census tract level) or for characteristics of special, small populations, the ACS must collect data on a continual basis and aggregate three to five years' worth of data. Essentially, the ACS collects data every day of the year, by mail, telephone interview, or personal-visit interview, in order to obtain an adequate number of interviews to achieve estimates of reliability comparable to the census long-form at low levels of geography. Federal agencies use ACS data to determine appropriate funding for state and local governments through block grants. State and local governments use ACS data for program planning, administration, and evaluation. Thus, the reliability and quality of the data must remain high in order for users to rely on the data for funding decisions.

Similarly, the federal government and state and local governments use the core basic demographics collected as part of the census for funding and programmatic decisions. With full implementation of the ACS, those same data are available every year. From a data user's perspective, large differences in the estimates for those core data items between the ACS and the census can be problematic in terms of funding and program decisions. Since the ACS is a sample survey rather than a census, we expect some differences in results between the two. However, many other factors also contribute to different results, such as differences in the interviewing staff, the social relevance of the census versus a current survey, and even form design.

Thus, the 2007 Methods Panel will investigate ways to reduce, or at least maintain, data collection costs so the Census Bureau can continue to provide data of reliability comparable to that of the census long-form. Additionally, the 2007 Methods Panel will test whether differences in form design between the census and the ACS may contribute to differences in results for the basic demographic items used by federal, state, and local governments for funding and programmatic decisions. Lastly, funding permitting, the Methods Panel will test proposed content regarding the major field of study for a person's bachelor's degree in order to provide the National Science Foundation and the National Center for Education Statistics with current estimates of the fields in which people receive bachelor's degrees.

Affected Public: Individuals or households.

Frequency: One time.

Respondent's Obligation: Mandatory.

Legal Authority: Title 13, United States Code, Sections 141, 193, and 221.

OMB Desk Officer: Brian Harris-Kojetin, (202) 395-7314.

Copies of the above information collection proposal can be obtained by calling or writing Diana Hynek, Departmental Paperwork Clearance Officer, (202) 482-0266, Department of Commerce, room 6625, 14th and Constitution Avenue, NW., Washington, DC 20230 (or via the Internet at dhynek@doc.gov).

Written comments and recommendations for the proposed information collection should be sent within 30 days of publication of this notice to Brian Harris-Kojetin, OMB Desk Officer either by fax (202-395-7245) or e-mail (bharrisk@omb.eop.gov).


Dated: October 3, 2006.

Madeleine Clayton,

Management Analyst, Office of the Chief Information Officer.


[FR Doc. E6-16728 Filed 10-10-06; 8:45 am]

BILLING CODE 3510-07-P