
Notice

Proposed Collection; Comment Request

This document has been published in the Federal Register.


ACTION:

Notice and request for comments.

SUMMARY:

The Department of Labor (DOL), as part of its continuing effort to reduce paperwork and respondent burden, conducts a preclearance consultation program to provide the general public and Federal agencies with an opportunity to comment on proposed and/or continuing collections of information in accordance with the Paperwork Reduction Act of 1995 (PRA95) (44 U.S.C. 3506(c)(2)(A)). This program helps to ensure that the requested data can be provided in the desired format, the reporting burden (time and financial resources) is minimized, the collection instruments are clearly understood, and the impact of collection requirements on respondents can be properly assessed. Currently, the Employment and Training Administration (ETA) is soliciting comments about the proposed new collection of information on the validity or correctness of certain Unemployment Insurance (UI) data that States now provide to ETA in monthly, quarterly, or annual reports. Some of these data are used to calculate performance measures or to allocate the funds used for program administration. ETA is seeking Office of Management and Budget (OMB) approval under the PRA95 to establish a UI Data Validation (UIDV) program to replace the existing Workload Validation (WV) program. The WV program, for which authority expired on 12/31/2000, validated—checked the accuracy of—a small number of reported data elements that are used to determine the allocation of funds appropriated for UI program administration. Under the more comprehensive UIDV program, States would validate about half the data they now report, including all the workload items. The UIDV system would increase the validation reporting burden. A copy of the proposed information collection request (ICR) can be obtained by contacting the office listed below in the ADDRESSES section of this notice.

DATES:

Written comments must be submitted to the office listed in the ADDRESSES section below on or before May 29, 2001.

ADDRESSES:

All comments about this proposed collection of information should be addressed to: Burman Skrable, Office of Workforce Security, Employment and Training Administration, U.S. Department of Labor, Room S-4231, 200 Constitution Avenue, NW., Washington, DC 20210. Telephone: 202-693-3197 (this is not a toll-free number); fax: 202-693-3229; e-mail: bskrable@doleta.gov.


SUPPLEMENTARY INFORMATION:

I. Background

Section 303(a)(6) of the Social Security Act specifies that the Secretary of Labor will not certify State UI programs to receive administrative grants unless the State's law includes provisions for—

Making of such reports. * * * as the Secretary of Labor may from time to time require, and compliance with such provisions as the Secretary may from time to time find necessary to assure the correctness and verification of such reports.

Since the mid-1970s, all State Employment Security Agencies have been required to check the validity of certain data elements they submit on four required UI reports. The Department uses these data in a formula for determining each State's share of funds appropriated for the administration of the State's UI program. These elements are all aggregate counts of the number of times the State performs certain activities, or counts of such items as employers subject to UI taxes.

Validation and the UI System. Validity means that the counts the State submits on its reports are correct accumulations of elements which conform to the Federal reporting definitions. State staff, following the instructions in ET Handbook No. 361, perform this WV process; Department of Labor Regional staff, assisted by a technical support contractor, audit the State's validations. The validation has two dimensions: quantity and quality. The quantity validation consists of comparing a reported count for a selected period with a reconstructed validation count; it passes if there is no more than a 2% difference between the two. In the quality validation, samples of each element are checked against primary agency records to ensure that the proper activities are being counted according to Federal reporting definitions. To pass, a sample may contain no more than 5% invalid elements. The WV process is repeated every three years if all validations pass; any failure requires a revalidation of failed elements the following year.
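The two pass/fail tests just described can be sketched in a few lines of Python; the function and parameter names are illustrative and do not appear in the UIDV or WV handbooks:

```python
def passes_quantity_validation(reported_count, reconstructed_count,
                               tolerance=0.02):
    """Quantity test: the reported count must be within 2% of the
    independently reconstructed validation count."""
    if reconstructed_count == 0:
        return reported_count == 0
    difference = abs(reported_count - reconstructed_count)
    return difference / reconstructed_count <= tolerance

def passes_quality_validation(sample_results, max_invalid_rate=0.05):
    """Quality test: a sample checked against primary agency records
    may contain no more than 5% invalid elements.
    `sample_results` is a list of booleans (True = element valid)."""
    invalid = sum(1 for ok in sample_results if not ok)
    return invalid / len(sample_results) <= max_invalid_rate
```

Under these rules, a reported count of 1,010 against a reconstructed count of 1,000 passes (1% difference), while 1,030 fails (3%); a sample with exactly 5% invalid elements still passes.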

Starting in the 1980s and continuing through the 1990s, the General Accounting Office and the Department's Office of Inspector General criticized ETA for not validating all the elements it requires States to report, because program managers and policy officials at all levels rely on those elements in making decisions affecting program design, funding, and operations. More recently, the Government Performance and Results Act (GPRA) has emphasized that agencies need to ensure the validity of all data on which they base their strategic planning decisions and performance determinations. Commonly, agencies' GPRA displays indicate how they validate, or propose to validate, their performance data.

In the 1990s DOL asked Mathematica Policy Research, Inc., to develop a more automated validation approach in conjunction with its management of the field test of new benefits timeliness and quality measures. When the field test showed the methodology to be sound, it was extended to key UI tax performance data.

The new UIDV system has one feature in common with the WV system, but also some important differences:

  • In common with WV, UIDV does quantitative validation by independently reconstructing reported counts, and qualitative validation by checking samples against primary agency records;
  • The major differences are:

—WV starts with workload items, identifies each item the report elements comprise, and validates the report elements. In contrast, UIDV starts with the report elements to be validated. It first identifies the broad groups (“populations”) of underlying elementary transactions on which those report elements are based (e.g., initial claims), then devises mutually exclusive subgroups (“subpopulations”) which relate to the report elements.

—UIDV uses State-specific handbooks (one for benefits, another for tax) instead of one generic handbook. The UIDV handbooks' instructions for programmers and validators are specific to a State's own management information system. Thus, Federal reporting requirements are mapped to the related data element on each individual State's data system.

—UIDV is more highly automated, and efforts are being made to automate its operations further to increase efficiency;

—UIDV's scope of validation is more extensive. It validates approximately half of the elements on the 47 required UI reports, versus WV's validation of only 29 data elements on four reports. UIDV validates all workload elements, including most of the data used to construct the Tier I UI performance measures (See Unemployment Insurance Program Letter 37-99, July 1, 1999, published as Federal Register Notice 64 FRN 38088 (July 14, 1999)).

UIDV Pilot Test. Three States pilot tested the UIDV system between November 1997 and October 1998. Two States undertook validation of all benefit and tax report elements in the UIDV handbooks; the other State validated all benefits elements but only validated one (Field Audit) of the five tax populations. Pilot States and associated ETA Regional Office staff received preparatory training before starting and technical assistance throughout the pilot from a support contractor.

In brief, the pilot test showed:

  • States could generally implement the UIDV system with a reasonable but sustained level of effort.
  • The UIDV system worked as designed to discover reporting errors.
  • States do make reporting errors which need detecting and fixing.
  • The reporting problems can be fixed.
  • The average staff requirements from the pilot test were about 2200 hours to complete Benefits Validation and about 2300 hours for Tax Validation, or 2.2-2.5 staff years for both, of which programming time was about 77% or 1.8 staff years. The contractor's evaluation report estimated that the continuing validation cost will be about 35% of initial, or about 0.8 staff years for tax and benefits validation combined. Very little of this is programmer time.
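The staffing figures above can be checked with simple arithmetic; note that the hours-per-staff-year divisor (roughly 1,800 to 2,000 hours) is an assumption here, since the notice does not state the conversion it used:

```python
# Pilot-test staffing arithmetic. The 1,800-2,000 hours-per-staff-year
# range is assumed; the notice reports only the resulting staff years.
benefits_hours = 2200
tax_hours = 2300
total_hours = benefits_hours + tax_hours        # 4,500 hours combined

staff_years_low = total_hours / 2000            # 2.25 staff years
staff_years_high = total_hours / 1800           # 2.5 staff years

programming_hours = 0.77 * total_hours          # ~3,465 hours (~1.8 staff years)
continuing_hours = 0.35 * total_hours           # ~1,575 hours (~0.8 staff years)
```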

Although DOL has based the burden estimates below on the pilot program experience, it believes the estimates represent an upper limit for the true burden. The pilot was conducted while States were addressing Y2K concerns, which caused turnover among programmer staff and a lack of availability or intermittent availability of senior programmers for the pilot. The Department is also working to develop additional automation for the UIDV processes which will reduce initial programming time below the pilot test estimate.

II. Review Focus

DOL is particularly interested in comments which:

  • Evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the agency, especially whether the information will have practical utility;
  • Evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information, including the validity of the methodology and assumptions used;
  • Discuss how to enhance the quality, utility, and clarity of the information to be collected; and
  • Suggest how to minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology (e.g., permitting electronic submission of responses).

III. Current Actions

The Department proposes the following plan for implementing and operating the UIDV system:

  • Mandatory implementation will begin around July 2001; States have been encouraged by Unemployment Insurance Program Letter No. 03-01 to implement the UIDV program voluntarily before then.
  • States that are not ready to begin implementation in 2001 will be required to validate all or some of the 11 workload items using the WV methodology if WV procedures would have called for validation.
  • UIDV will initially retain the 3-year cycle for validation and the validation standards applied under WV (±2% for quantity, 5% for quality). The following criteria, taken from WV, will also be used to determine when deviation from the cycle will be required: (1) A change in Federal reporting requirements; or (2) failure of the previous validation test; or (3) a major change in the State's computerized data system. In each of these cases, validation would be required the following fiscal year. Once into the continuing cycle, States decide when to conduct validation during a year.
  • Beginning with the FY 2004 State Quality Service Plan (SQSP) cycle, States will be required to include validation findings in the SQSP. They will be required to develop a corrective action plan for failure to complete a validation or if the same report element repeatedly fails validation.
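The cycle and deviation criteria above amount to a simple decision rule, sketched here in Python with illustrative names (none of these identifiers come from the UIDV materials):

```python
def must_validate_this_year(years_since_full_validation,
                            reporting_requirements_changed,
                            previous_validation_failed,
                            major_data_system_change):
    """A State validates this year if the 3-year cycle is due, or if any
    of the three deviation criteria carried over from WV applied in the
    prior year (each triggers validation the following fiscal year)."""
    return (years_since_full_validation >= 3
            or reporting_requirements_changed
            or previous_validation_failed
            or major_data_system_change)
```

Within a year in which validation is required, the State itself decides when to conduct it.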

Resources: States are expected to provide resources for UIDV from their UI administrative grant. Since the WV program was begun in the late 1970s, each State's grant has included one staff year for WV activities. The estimates below, based on estimates provided by the pilot evaluation contractor, indicate that average UIDV staffing requirements for continuing operations will be less than one staff year.

ADP Support: To reduce programming costs, the Department is developing additional software intended to limit State programming requirements to preparing the extract programs for the data elements to be validated. The additional software provided by the Department should cut the programming demand on States during implementation, which averaged 1.8 staff years in the pilot test, in half.

Data Recording and Reports: States will record the results of their investigations on spreadsheet software prepared as an accompaniment to their handbooks. Initially, the spreadsheets can be transmitted by e-mail or regular mail to the Department. Eventually, the results will be submitted in the same manner as other reports. The results will be stored in a database in the National Office in Washington, D.C., and compiled in an annual validation accuracy report.

Training: DOL will begin conducting UIDV training for State staff in the Summer of 2001. Several sessions, perhaps on a regional basis, are envisioned. Experience to date suggests that small training sessions are most effective. States that elect to implement UIDV voluntarily may receive individual training. The Department's technical support contractor, Sparhawk Group, Inc., assisted by staff from Mathematica Policy Research, will conduct the training along with Department staff, and will provide continuing technical assistance during implementation. DOL will issue a directive containing details on the times, locations, and content of the training in advance of the sessions.

Type of Review: New.

Agency: Employment and Training Administration.

Title: Unemployment Insurance Data Validation Program.

OMB Number: 1205-0NEW.

Recordkeeping: States are required to follow their State laws regarding public record retention in retaining validation results.

Affected Public: State Governmental entities.

Reference: Handbook 361.

Total Respondents: 53.

Frequency: Complete validation every third year; annually to revalidate failed data, when there are changes in Federal reporting requirements, or when State data systems undergo major changes. The table below assumes that one third of States must validate 10% of elements in each of two “off years.”

Total Responses: 53 (Average in a year: 29.7).

Estimated Time Per Response: 1,600 hours for a full validation, conducted every third year (based on the pilot program); off-year burden will depend on the number of elements needing revalidation.

Total Burden Hours: 30,187 Hours.

Total Burden Cost (capital/startup): 121,792 hours, $3,524,660 (2,768 hours, $80,106 for each of 44 States).

Total Burden Cost (operating/maintaining): $873,612 ($29,414 per State).

Calculation of Annual Burden and Capital/Startup Cost

Calculation of Annual Burden

| Activity           | Frequency      | Respondents | Hours per response | Total hours | Rate ($/hr) | Total $   | Average per State ($) |
|--------------------|----------------|-------------|--------------------|-------------|-------------|-----------|-----------------------|
| Full Validation    | Every 3rd year | 53          | 1,600              | 84,800      | 28.94       | 2,454,112 | 46,304                |
| Partial Validation | 2 off years    | 36          | 160                | 5,760       | 28.94       | 166,694   | 4,630                 |
| 3-Year Total       | NA             | NA          | NA                 | 90,560      | 28.94       | 2,620,806 |                       |
| Annual Average     |                | 29.7        | 1,016              | 30,187      | 28.94       | 873,602   | 29,414                |

Calculation of Capital/Startup Cost

| Activity         | Frequency | Respondents | Hours per response | Total hours | Rate ($/hr) | Total $   | Average per State ($) |
|------------------|-----------|-------------|--------------------|-------------|-------------|-----------|-----------------------|
| States Implement | One Time  | 44          | 2,768              | 121,792     | 28.94       | 3,524,660 | 80,106                |
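The annual-burden and startup figures can be reproduced arithmetically from the table's row inputs; this Python sketch simply re-derives them:

```python
RATE = 28.94  # $/hr, from the burden table

# Annual burden: full validation by all 53 respondents every 3rd year,
# plus partial validation (10% of elements, 160 hours) by about 36
# respondents across the two off years.
full_hours = 53 * 1600                        # 84,800
partial_hours = 36 * 160                      # 5,760
three_year_hours = full_hours + partial_hours # 90,560

annual_hours = three_year_hours / 3           # ~30,187
annual_cost = annual_hours * RATE             # ~$873,602
annual_respondents = (53 + 36) / 3            # ~29.7

# Capital/startup: 44 implementing States at 2,768 hours each.
startup_hours = 44 * 2768                     # 121,792
startup_cost = startup_hours * RATE           # ~$3,524,660
```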

Comments submitted in response to this request will be summarized and/or included in the request for OMB approval of the information collection request; they will also become a matter of public record.


Signed in Washington, DC on March 16, 2001.

Grace A. Kilbane,

Administrator, Office of Workforce Security.


[FR Doc. 01-7909 Filed 3-29-01; 8:45 am]

BILLING CODE 4510-30-P