

Proposed Information Collection; Comment Request; 2016 Census Test

This document has been published in the Federal Register.



AGENCY: U.S. Census Bureau, Commerce.

ACTION: Notice.




SUMMARY: The Department of Commerce, as part of its continuing effort to reduce paperwork and respondent burden, invites the general public and other Federal agencies to take this opportunity to comment on proposed and/or continuing information collections, as required by the Paperwork Reduction Act of 1995.


DATES: To ensure consideration, written comments must be submitted on or before October 5, 2015.



ADDRESSES: Direct all written comments to Jennifer Jessup, Departmental Paperwork Clearance Officer, Department of Commerce, Room 6616, 14th and Constitution Avenue NW, Washington, DC 20230 (or via the Internet at



FOR FURTHER INFORMATION CONTACT: Requests for additional information or copies of the information collection instrument(s) and instructions should be directed to Robin Pennington, Census Bureau, HQ-4K065, Washington, DC 20233; (301) 763-8132 (or via email at

SUPPLEMENTARY INFORMATION:


I. Abstract

During the years preceding the 2020 Census, the Census Bureau will pursue its commitment to reduce the costs of conducting a decennial census while maintaining its commitment to quality. A primary decennial census cost driver is collecting data from members of the public who do not reply via the initially offered response options. Improving our methods for increasing the number of people who take advantage of self-response options (“Optimizing Self-Response”) and further refining the questionnaire content will help increase the efficiency and effectiveness of census operations and substantially reduce costs. Additionally, making our methods for enumerating households that do not initially respond (“Nonresponse Followup”) more efficient can contribute to a less costly census while maintaining high-quality results.

The Census Bureau will conduct a 2016 Census Test, with components designed to test new approaches or validate existing approaches and systems integration related to (1) Optimizing Self-Response, including contact strategies, language support, and questionnaire content; and (2) Nonresponse Followup, including administrative records usage and technological and operational improvements.

Optimizing Self-Response

The 2016 Census Test is designed to evaluate several strategies to optimize the rate at which the public self-responds to the census. A higher rate of self-response will mean fewer cases for the Nonresponse Followup operation, saving taxpayer money. Significant areas of continued testing are:

  • Evaluation and refinement of our “Internet push” strategy, where we do not initially send paper questionnaires to households, but rather invite them to complete the questionnaire online. We will evaluate the number of online invitations necessary before sending a full paper questionnaire to an address.
  • Updated and modernized household contact strategies to encourage self-response, including text/short message service (SMS) communication and postcard reminders.
  • Refinement of our non-English support for respondents with limited English proficiency, including non-English letters and/or brochures in mailings and web response addresses (Uniform Resource Locators, or URLs) in various languages on the incoming envelope.
  • Further evaluation of questionnaire content:

○ We will include testing of a combined race and Hispanic origin question similar to the one the Census Bureau is using in the 2015 National Content Test. Building on results from the 2010 Race and Hispanic Origin Alternative Questionnaire Experiment (Compton et al., 2012), the 2016 Census Test provides an opportunity to further test this combined question.

○ On the Internet instrument only, we will test a terminology change in the race and ethnicity question specific to the “Black and African American” category, comparing the use of “American” to the abbreviated “Am.” This addresses a problem with the abbreviation in software providing Section 508 compliance.[1] We are testing this on the Internet first because test paths and screens on the Internet are more easily deployed than paper versions. We will continue and expand testing of this terminology change with paper (self-response) questionnaires in future testing.

○ For the relationship question, the 2016 Census Test will include testing new response categories for opposite-sex and same-sex husband/wife/spouse and unmarried partner. In addition, the Internet data collection instrument will provide two versions of the relationship question, with one version eliminating the response categories associated with unrelated household members (“roomer or boarder” and “housemate or roommate”).

○ The Internet data collection instrument will also include various ways to collect and confirm the number of persons residing at an address. Respondents will see one of three screens about the existence of people on the roster: one that displays the residence rule and asks for the number of people in the household, one that asks for the number of people who live in the household but puts the residence rule in the help text, and one that asks if any other people live at the household, with the residence rule in the help text.

After the names of the roster members are collected, the respondent will see one of two series of undercount questions: one series asks for additional people on two separate screens; the other asks for additional people on only one screen.

After the demographic items are collected, the respondent will see overcount questions in one of three forms, depending on test panel. Some respondents will see seven topic-based questions that ask if anyone in the household stayed at a particular type of place. Some respondents in small households (that is, households with three or fewer people) will see one person-based question that asks if a specific person stayed in any of a number of places. Other respondents will see two household-level questions that ask if anyone in the household stayed in another housing unit or in a group quarters.

The quality of the final household roster created from these experimentally applied questions will be evaluated through a coverage reinterview conducted by telephone, with extensive probes about missed roster members or other places that people sometimes stay.
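The roster, undercount, and overcount variations described above amount to a three-dimensional panel assignment. As an illustration only (this notice does not describe the Bureau's actual assignment logic, and all names below are hypothetical), a sketch in Python:

```python
import random

# Hypothetical labels for the three experimental dimensions described above.
ROSTER_SCREENS = [
    "rule-on-screen+count",      # residence rule displayed, asks for household count
    "count+rule-in-help",        # asks for count, residence rule moved to help text
    "anyone-else+rule-in-help",  # asks whether any other people live there
]
UNDERCOUNT_SERIES = ["two-screens", "one-screen"]
OVERCOUNT_FORMS = ["topic-based", "person-based", "household-level"]

def assign_panel(household_size, rng=random):
    """Randomly assign one variant per dimension.

    Per the test description, the person-based overcount form is shown
    only to small households (three or fewer people).
    """
    overcount_pool = (OVERCOUNT_FORMS if household_size <= 3
                      else [f for f in OVERCOUNT_FORMS if f != "person-based"])
    return {
        "roster": rng.choice(ROSTER_SCREENS),
        "undercount": rng.choice(UNDERCOUNT_SERIES),
        "overcount": rng.choice(overcount_pool),
    }
```

Under this sketch, a five-person household can never draw the person-based overcount form, reflecting the small-household restriction in the test description.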

Nonresponse Followup (NRFU)

The 2016 Census Test will be instrumental to the Census Bureau in testing new implementation and management processes, the use of automated data collection tools, and approaches such as using administrative records and third party data to reduce the NRFU workload. This test allows us to refine our use of administrative records, technologies to support field data collection and management, and operational procedures.

  • Administrative Records:
  • Continued evaluation of our plan to use administrative records and other third party data (such as from the Internal Revenue Service, Centers for Medicare and Medicaid Services, United States Postal Service, etc.) to identify vacant housing units that do not require a field visit during the nonresponse follow-up operation. Historically, the costs to verify and follow up with these types of units have been significant.
  • Continued evaluation of our plan to use the “occupancy” status of administrative records sources to enumerate housing units after a certain number of NRFU contact attempts. This includes quality evaluations of the sources of the administrative records, and reviews of the procedures by which those administrative records are produced (working with the source agencies, etc.). This will help us to decide in what scenarios the use of administrative records is most appropriate, the ideal number of personal visits to attempt before enumerating with these records, and several other research questions.
  • Supplemental mailings to housing units that have been removed from the NRFU operation, giving respondents an additional chance to respond to the 2016 Census Test before final disposition using administrative record source data.
  • Technological Improvements:
  • Evaluation of our refined operational control system and case assignment processes, including identifying efficiencies for field data collection, as well as automated assignments that are based on enumerator availability and other criteria.
  • Continued testing of software to record housing unit status, interviews, and enumerations at nonresponding housing units for operational readiness, as well as the ability to deploy the software on mobile devices that are Census-owned, personally owned, or provided as a service.
  • Continued evaluation of automated training for field employees.
  • The inclusion of additional language translations to our enumeration software. Previous versions of this software provided translations in Spanish only.
  • Operational Procedures:
  • Comparison of the effectiveness of data collection modes (in-field enumeration vs. centralized telephone contact) for conducting telephone follow-up activities.
  • Use of innovative survey methodologies for NRFU cases, including the continued testing of different stopping rules for enumerators (maximum visits before stopping work, etc.); further evaluation of in-person vs. phone contacts; and continued research on when and how to attempt to obtain proxy responses for a housing unit.
  • Implementation of a refined field management structure, designed to lessen the number of supervisors required in the field for conducting the NRFU operation.
  • Testing our reinterview operation, including the rules by which cases are selected for reinterview, the use of a handheld device to input reinterview data, and a redesigned approach in which call center staff make the first attempt at reinterviewing each case, where appropriate. We will also test the use of paradata collected by our automated data collection instrument, such as the recorded Global Positioning System (GPS) location of field interviews and the length of time interviews take, to help detect and deter falsification by enumerators.
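The paradata checks in the last bullet, GPS location and interview duration, lend themselves to simple screening rules. A minimal sketch, assuming hypothetical thresholds and field names (the notice does not specify the Bureau's actual rules):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def flag_for_reinterview(interview, case_address,
                         min_minutes=2.0, max_distance_miles=0.5):
    """Return reasons an interview's paradata looks suspicious.

    `interview` carries recorded GPS coordinates and duration; the
    thresholds are illustrative, not the Census Bureau's actual rules.
    """
    reasons = []
    if interview["duration_minutes"] < min_minutes:
        reasons.append("implausibly short interview")
    d = haversine_miles(interview["lat"], interview["lon"],
                        case_address["lat"], case_address["lon"])
    if d > max_distance_miles:
        reasons.append(f"recorded {d:.1f} mi from case address")
    return reasons
```

A case flagged by either rule would be a candidate for the reinterview sample; unflagged cases could still be selected under the usual random-sampling rules.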

II. Method of Collection

Test Sites

The Census Bureau will conduct the 2016 Census Test concurrently in portions of Harris County, TX and Los Angeles County, CA. These locations offer particular characteristics that support the Census Bureau's research goals. Conducting the 2016 Census Test in urban areas will allow us to test our assignment routing strategies in densely populated areas and understand challenges to field enumeration. Both sites have populations that are linguistically diverse and provide an appropriate context to test our language and translation services. Lastly, both areas contain “hard to count” populations and areas with high vacancy rates, which will allow us to test our follow-up activities with these populations.


The housing units in the selected areas included in the 2016 Census Test will be contacted by mail and invited to complete their questionnaire via the Internet. Internet self-response contact methods include a letter, a postcard, and a text message (either as invitations or as reminders), as well as a multilingual brochure (either with a letter or in the envelope with the URL). We will also test optimal strategies for delivering mail materials, including paper questionnaires, to households that do not or cannot respond online.

We will continue to test our Non-ID processing methodology as another strategy for optimizing self-response. Non-ID processing refers to address matching and geocoding for Census responses that lack a preassigned Census identification code. In the 2016 Census Test, we will continue to develop our capability to conduct real-time Non-ID processing. This test will allow us to interactively prompt a respondent (while they are still online filling out the form) for additional address and location information if the respondent's address cannot be matched to a Census ID or geocoded. A Non-ID respondent whose address cannot be matched to our address database will be prompted during their Internet self-response session to confirm the address information they provided, or to indicate the location of their address on an on-screen map. This test will also allow us to better understand requirements related to scalability of planned systems and determine metrics for ongoing monitoring and evaluation. If the address match is not resolved during automated processing, Census staff will attempt to manually match or geocode the address. We estimate that about one percent of overall Non-ID respondents will be contacted as part of the manual matching process. Additionally, we plan to test a mechanism for validating all Non-ID respondents through the use of administrative records. To further explore our methodology for validating Non-ID responses, a sample of Non-ID responses will be selected for re-contact, which is intended to validate and re-collect information from a respondent to confirm the existence of the address and the persons enumerated at that address. The re-contact may occur through centralized phone contacts or in-field enumeration.
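The escalation path described above (automated matching, then geocoding, then an in-session prompt, and finally manual resolution by Census staff) can be sketched as follows; the helper functions passed in are stand-ins for systems the notice does not specify:

```python
def resolve_non_id_response(address, matcher, geocoder, prompt_respondent):
    """Escalation path for a response lacking a preassigned Census ID.

    `matcher`, `geocoder`, and `prompt_respondent` are hypothetical
    callables standing in for unspecified systems; each returns a
    result or None. Returns a (status, result) pair.
    """
    # Step 1: try to match the address to an existing Census ID.
    census_id = matcher(address)
    if census_id is not None:
        return ("matched", census_id)
    # Step 2: try to geocode the address directly.
    coords = geocoder(address)
    if coords is not None:
        return ("geocoded", coords)
    # Step 3: still unresolved, so ask the respondent during the online
    # session to confirm the address or indicate it on an on-screen map.
    clarified = prompt_respondent(address)
    if clarified is not None:
        return ("respondent-resolved", clarified)
    # Step 4: fall through to manual matching by Census staff
    # (estimated at about one percent of Non-ID responses).
    return ("manual-queue", address)
```

For example, a response whose address matches immediately short-circuits at step 1, while a response that fails every stage lands in the manual queue.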

Nonresponse Followup (NRFU)

If a household does not ultimately respond to the self-response portion of the test by a specified date, it is included in the universe for the NRFU portion of the test, during which enumerators will attempt to follow up with nonresponding households to collect data. The Census Bureau will test centralized phone contacts to nonresponding cases prior to sending cases to an enumerator in the field. In advance of the full deployment of enumerators following up with nonresponding households, a small number of the nonresponding cases may be subject to early followup to allow for live testing of systems, data collection applications, and field procedures.

The Census Bureau will continue to test our use of administrative records for the removal of vacant housing units from the NRFU universe and to determine rules for when we can stop making visit attempts to households and refer to administrative record data instead. For each of these cases, we will test a supplemental mailout to any household that is removed from the NRFU workload in this way, as a final attempt to generate a self-response.

The Census Bureau will conduct NRFU with a combination of Census-owned devices, enumerator-owned devices, and mobile devices provided as a service, using the Census-developed enumeration software. The use of employee-owned equipment/services is commonly referred to as “Bring Your Own Device,” or BYOD. It is important to note that, whether on Census-owned devices, BYOD devices, or devices provided as a service, the data collection application collects and securely transmits respondents' data. The use of mobile devices that are Census-owned, enumerator-owned, or provided as a service will enable the Census Bureau to assess options for a secure and cost-effective approach to the NRFU data collection.

Nonresponse Followup Quality Control Reinterview (NRFU-RI)

A sample of cases that have been enumerated via Nonresponse Followup will be selected for reinterview. This operation is intended to help us pinpoint possible cases of enumerator falsification. Like the NRFU operation before it, NRFU-RI will use the Census Bureau's enumeration software on mobile devices (Census-owned, BYOD, and devices provided as a service). We will also test centralized phone contacts of reinterview cases before sending them to an enumerator in the field, providing potential cost savings.

Additional Followup Operations

Understanding the accuracy of using administrative records to identify vacant addresses and to determine the household composition of occupied housing units will inform decisions associated with the design of the 2020 Census. The Census Bureau may conduct additional followup with cases to obtain the most accurate Census Day status of each housing unit. The intent is to revisit addresses where information collected during NRFU conflicts with the administrative records information we have for that address.

Language Services

Telephone questionnaire assistance will be available in languages other than English.

Focus Groups

To evaluate the use of new contact strategies, enumeration methods, and efforts to reduce burden, the Census Bureau will conduct focus groups composed of various categories of respondents and non-respondents. These focus groups are intended to gather information about respondent perspectives. Participants will be asked about their experiences with the 2016 Census Test, including but not limited to: their reactions and thoughts about being contacted by the Census Bureau by alternative methods; the perceived legitimacy of these contacts; opinions about Bring Your Own Device; and their opinions on the use of administrative records by the Census Bureau. Participants will also be asked about their general concerns with government data collection, cyber security, and protection of confidential data. At the end of the focus groups, we may ask participants for whom we have acquired additional data from a commercial third party to verify whether this information is accurate.

III. Data

OMB Control Number: 0607-XXXX.

Form Number(s): Paper and electronic questionnaires; numbers to be determined.

Type of Review: Regular submission.

Affected Public: Households/Individuals.

Estimated Number of Respondents: Self responders [Internet/Telephone/Paper]: 250,000 respondents.

Nonresponse Followup Cases: 120,000 respondents.

Nonresponse Followup Quality Control Re-Interview Cases: 12,000 respondents.

Manual Non-ID Processing Cases requiring a phone call to the respondent: 400.

Validation of Non-ID responses: 5,000.

Administrative Records Followup: 5,000.

Focus Groups:

Focus Group Selection Contact: 288.

Focus Groups: 160 participants.

Total: 392,848 respondents.

Estimated Time Per Response:

Paper/Internet Responders: 10 minutes per response.

Nonresponse Followup Cases: 10 minutes per response.

Nonresponse Followup Quality Control Re-Interview Cases: 10 minutes per response.

Non-ID Manual Processing Cases: 5 minutes.

Non-ID Respondent Validation: 10 minutes per response.

Administrative Records Followup: 10 minutes per response.

Focus Groups:

Focus Group Selection Contact: 3 minutes per response.

Focus Groups: 120 minutes per response.

Estimated Total Annual Burden Hours:

Self responders [Internet/Paper/Telephone]: 41,667 hours.

Nonresponse Followup Cases: 20,000 hours.

Nonresponse Followup Quality Control Re-Interview Cases: 2,000 hours.

Non-ID Manual Processing Cases: 33 hours.

Non-ID Respondent Validation: 834 hours.

Administrative Records Followup: 834 hours.

Focus Groups:

Focus Group Selection Contact: 16 hours.

Focus Groups: 320 hours.

Total: 65,704 hours.
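As a quick arithmetic check, the stated totals agree with the per-operation figures listed above:

```python
# Per-operation figures as stated in this notice.
respondents = {
    "Self responders": 250_000,
    "NRFU cases": 120_000,
    "NRFU quality control reinterview": 12_000,
    "Manual Non-ID processing": 400,
    "Non-ID validation": 5_000,
    "Administrative records followup": 5_000,
    "Focus group selection contact": 288,
    "Focus group participants": 160,
}
burden_hours = {
    "Self responders": 41_667,
    "NRFU cases": 20_000,
    "NRFU quality control reinterview": 2_000,
    "Manual Non-ID processing": 33,
    "Non-ID validation": 834,
    "Administrative records followup": 834,
    "Focus group selection contact": 16,
    "Focus group participants": 320,
}
assert sum(respondents.values()) == 392_848   # stated respondent total
assert sum(burden_hours.values()) == 65_704   # stated burden-hour total
```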

Estimated Total Annual Cost to Public: For the 2016 Census Test, respondents who are contacted by text message may incur charges depending on their plans with their service providers. The Census Bureau estimates that the total cost to respondents will be no more than $20,000. There are no costs to respondents other than their time to participate in this data collection.

Respondent's Obligation: Mandatory.

Legal Authority: Title 13 U.S.C. Sections 141 and 193.

IV. Request for Comments

Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden (including hours and cost) of the proposed collection of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques or other forms of information technology.

Comments submitted in response to this notice will be summarized and/or included in the request for OMB approval of this information collection; they also will become a matter of public record.


Dated: July 29, 2015.

Glenna Mickelson,

Management Analyst, Office of the Chief Information Officer.



1. Section 508 of the Rehabilitation Act (29 U.S.C. § 794d), as amended by the Workforce Investment Act of 1998 (Pub. L. 105-220).


[FR Doc. 2015-19005 Filed 8-3-15; 8:45 am]