

Agency Forms Undergoing Paperwork Reduction Act Review

This document has been published in the Federal Register.


The Centers for Disease Control and Prevention (CDC) has submitted the following information collection request to the Office of Management and Budget (OMB) for review and approval in accordance with the Paperwork Reduction Act of 1995. The notice for the proposed information collection is published to obtain comments from the public and affected agencies.

Written comments and suggestions from the public and affected agencies concerning the proposed collection of information are encouraged. Your comments should address any of the following: (a) Evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility; (b) Evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information, including the validity of the methodology and assumptions used; (c) Enhance the quality, utility, and clarity of the information to be collected; (d) Minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses; and (e) Assess information collection costs.

To request additional information on the proposed project or to obtain a copy of the information collection plan and instruments, call (404) 639-7570 or send an email. Written comments and/or suggestions regarding the items contained in this notice should be directed to: Attention: CDC Desk Officer, Office of Management and Budget, Washington, DC 20503, or by fax to (202) 395-5806. Written comments should be received within 30 days of this notice.

Proposed Project

Questionnaire Design Research Laboratory (QDRL)—(OMB No. 0920-0222, expires 6/30/2015)—Revision—National Center for Health Statistics (NCHS), Centers for Disease Control and Prevention (CDC).

Background and Brief Description

The Questionnaire Design Research Laboratory (QDRL) is the focal point within NCHS for questionnaire development, pre-testing, and evaluation activities for CDC surveys (such as the NCHS National Health Interview Survey, OMB No. 0920-0214) and other federally sponsored surveys; however, question development and evaluation activities are conducted throughout NCHS. NCHS is requesting three years of OMB clearance for this generic submission. This revision is a request for additional burden hours due to an anticipated increase in the number and size of projects being undertaken in the next three years.

The QDRL and other NCHS programs conduct cognitive interviews, focus groups, in-depth or ethnographic interviews, usability tests, field tests/pilot interviews, and experimental research in laboratory and field settings, both for applied questionnaire development and evaluation as well as more basic research on response errors in surveys.

Various techniques are used to evaluate interviewer-administered, self-administered, telephone, Computer Assisted Personal Interviewing (CAPI), Computer Assisted Self-Interviewing (CASI), Audio Computer-Assisted Self-Interviewing (ACASI), and web-based questionnaires.

The most common questionnaire evaluation method is the cognitive interview. These evaluations are conducted by the QDRL. The interview structure consists of respondents first answering a draft survey question and then providing textual information to reveal the processes involved in answering the test question. Specifically, cognitive interview respondents are asked to describe how and why they answered the question as they did. Through the interviewing process, various types of question-response problems that would not normally be identified in a traditional survey interview, such as interpretive errors and recall accuracy, are uncovered. By conducting a comparative analysis of cognitive interviews, it is also possible to determine whether particular interpretive patterns occur within particular sub-groups of the population. Interviews are generally conducted in small rounds of 20-30 interviews; ideally, the questionnaire is re-worked between rounds, and revisions are tested iteratively until interviews yield relatively few new insights.

Cognitive interviewing is inexpensive and provides useful data on questionnaire performance while minimizing respondent burden. Cognitive interviewing offers a detailed depiction of the meanings and processes used by respondents to answer questions—processes that ultimately produce the survey data. As such, the method offers an insight that can transform understanding of question validity and response error. Documented findings from these studies represent tangible evidence of how the question performs. Such documentation also serves CDC data users, allowing them to take a critical, informed approach to interpreting and applying the data.

In addition to cognitive interviewing, a number of other qualitative and quantitative methods are used to investigate and research survey response errors and the survey response process. These methods include conducting focus groups, usability tests, in-depth or ethnographic interviews, and the administration and analysis of questions in both representative and non-representative field tests. Focus groups are conducted by the NCHS QDRL. They are group interviews whose primary purpose is to elicit the basic sociocultural understandings and terminology that form the basis of questionnaire design. Each group typically consists of one moderator and 4 to 10 participants, depending on the research question. In-depth or ethnographic interviews are one-on-one interviews designed to elicit the understandings or terminology that are necessary for question design, as well as to gather detailed information that can contribute to the analysis of both qualitative and quantitative data. Usability tests are typically one-on-one interviews that are used to determine how a given survey or information collection tool functions in the field, and how the mode and layout of the instrument itself may contribute to survey response error and the survey response process.

In addition to these qualitative methods, NCHS also uses various tools to obtain quantitative data, which can be analyzed alone or analyzed alongside qualitative data to give a much fuller accounting of the survey response process. For instance, phone, internet, mail, and in-person follow-up interviews of previous NCHS survey respondents may be used to test the validity of survey questions and questionnaires and to obtain more detailed information that cannot be gathered on the original survey. Additionally, field or pilot tests may be conducted on both representative and non-representative samples, including those obtained from commercial survey and web panel vendors. Beyond looking at traditional measures of survey errors (such as missing rates, item non-response, and don't know rates), these pilot tests can be used to run experimental designs in order to capture how different questions function in a field setting.

There are no costs to respondents other than their time. The total estimated annual burden hours are 4,383.

Estimated Annualized Burden Hours

Type of respondents         Form name                      Number of respondents   Number of responses per respondent   Average burden per response (in hrs.)
Individuals or households   Eligibility Screeners          4,000                   1                                    5/60
Individuals or households   Developmental Questionnaires   3,900                   1                                    1
Individuals or households   Focus group documents          100                     1                                    1.5

Leroy A. Richardson,

Chief, Information Collection Review Office, Office of Scientific Integrity, Office of the Associate Director for Science, Office of the Director, Centers for Disease Control and Prevention.


[FR Doc. 2015-14786 Filed 6-16-15; 8:45 am]