Buyer's guide · WIOA Sec. 116 · 2026 cycle
The six Sec. 116 indicators, state and local target negotiation, UI wage record limits, credential and MSG documentation, and where assessment platforms reduce upstream risk.
This guide explains the six WIOA Sec. 116 Common Performance Measures: employment rate Q2 after exit, employment rate Q4 after exit, median earnings Q2 after exit, credential attainment within four quarters after exit, measurable skill gains during the program year, and effectiveness in serving employers. It walks through how state and local performance levels are negotiated under Sec. 116(b)(3), the regression-based statistical adjustment for economic conditions and participant characteristics, and the Sec. 116(f) sanction triggers under 20 CFR §677.165. It explains the role of state UI wage records as the primary post-exit employment data source, the substantial categories of employment they do not capture (federal, military, self-employment, contract / 1099, off-the-books), and the typical six-to-nine-month UI reporting lag that keeps performance reports perpetually behind the program. It walks through the credential-attainment indicator, the WIOA Sec. 3(52) definition of recognized postsecondary credential, the TEGL 10-16 and TEGL 23-19 clarifications, and the implications for ITA program selection. It covers the five MSG types under TEGL 10-16 Change 2 and their documentation requirements, and the effectiveness-in-serving-employers indicator with its three measurement approaches. It positions career-assessment platforms as upstream risk-reduction tools that affect indicators 1-5 indirectly through better participant-program fit, with no direct contribution to indicator 6.
A reading map for workforce-board quality and operations staff.
Upstream alignment that reduces risk on indicators 1-5.
For a local board serving 8,000 customers per year
This guide is one of twenty in the JobCannon for Business reading library; performance-and-quality managers reading the Sec. 116 indicator detail here also read the WIOA Individual Training Account guide for the upstream ITA-decision discipline that drives Indicators 1, 4, and 6, and the apprenticeship RAPIDS guide for the closely aligned registered-apprenticeship reporting framework that LWDBs increasingly co-fund.
For the operational landing of WIOA-aligned reporting, see our government programmes vertical, where local workforce development boards, state workforce agencies, and federal sponsors deploy the same primitives end-to-end.
Customer-facing assessments and Career Guide stay free for WIOA-eligible customers. Performance-aligned reporting and case-management exports run on the Business tier from $199/mo flat, or under a state-WDB partnership for multi-board deployments.
All plans currently activated manually via the contact form — we review each request within 24 hours and provision access the same day. Self-serve checkout coming once we've heard from the first wave of teams.
Tell us your role (board director, performance and quality manager, monitoring lead) and your indicator focus. We respond within one business day.
WIOA Sec. 116 (29 USC §3141) establishes the Common Performance Measures across the six core programs administered by the Department of Labor and the Department of Education: WIOA Title I adult, dislocated worker, and youth programs (DOL); WIOA Title II adult education and family literacy (ED); WIOA Title III Wagner-Peyser employment service (DOL); and WIOA Title IV vocational rehabilitation (ED). The six measures are:
(1) Employment rate in the second quarter after exit — the percentage of program participants in unsubsidized employment in the second quarter after their exit from program services.
(2) Employment rate in the fourth quarter after exit — the same population measured in the fourth quarter after exit, which captures employment retention.
(3) Median earnings in the second quarter after exit — the median quarterly earnings of program participants who are employed in the second quarter after exit.
(4) Credential attainment within four quarters after exit — the percentage of program participants enrolled in education or training (excluding on-the-job training and customized training) who attained a recognized postsecondary credential or a secondary school diploma or its equivalent during participation or within one year after exit.
(5) Measurable skill gains during the program year — the percentage of participants in education or training that leads to a recognized postsecondary credential or employment who, during a program year, achieve documented academic, technical, occupational, or other forms of progress toward such a credential or employment.
(6) Effectiveness in serving employers — a state-pilot indicator that has been measured inconsistently across states using one or more of three approaches: retention of participants by employers, repeat business customers, and employer penetration in the relevant local area.
The implementing regulations sit at 20 CFR §677.155 and 34 CFR §361.155.
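As a minimal sketch of how indicators 1-3 fall out of matched wage records, the arithmetic looks like this. The record structure and field names below are illustrative assumptions, not the PIRL schema:

```python
from statistics import median

# Hypothetical exit records with post-exit UI wage-match results.
# Field names are illustrative, not a DOL reporting layout.
exits = [
    {"employed_q2": True,  "employed_q4": True,  "earnings_q2": 8400},
    {"employed_q2": True,  "employed_q4": False, "earnings_q2": 6200},
    {"employed_q2": False, "employed_q4": True,  "earnings_q2": None},
    {"employed_q2": True,  "employed_q4": True,  "earnings_q2": 9100},
]

def employment_rate(records, quarter_field):
    """Share of exiters in unsubsidized employment in the given quarter."""
    return sum(r[quarter_field] for r in records) / len(records)

def median_earnings_q2(records):
    """Median Q2 earnings, computed only over participants employed in Q2."""
    employed = [r["earnings_q2"] for r in records if r["employed_q2"]]
    return median(employed)

print(employment_rate(exits, "employed_q2"))  # 0.75
print(median_earnings_q2(exits))              # 8400
```

Note that the median-earnings denominator is the Q2-employed subset, not all exiters, which is why a falling Q2 employment rate can coincide with rising median earnings.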
WIOA Sec. 116(b)(3) establishes the negotiation framework. Each state submits expected performance levels for each of the first six indicators in its Unified or Combined State Plan, and DOL ETA and ED OCTAE (for Title II) negotiate adjusted state performance levels with the state. The adjustment factors include differences among states in economic conditions (typically using regression-based adjustment models that account for unemployment rate, industry mix, and demographic composition) and the characteristics of participants served (which adjusts for the difficulty of the population the state is serving). Local performance levels are then negotiated between the state and each local workforce board, again subject to adjustment factors. The state-level sanction trigger sits at Sec. 116(f) and is implemented in 20 CFR §677.165: a state that fails to meet its adjusted performance level on the same indicator for the same program for two consecutive program years receives technical assistance the first year, and after the second year of failure may have its allotment reduced by up to five percent. Local-level sanctions sit in state-board policy and typically include performance-improvement-plan requirements, technical assistance, and possible designation as not meeting performance levels. The 90-percent rule for state targets on some indicators creates additional pressure: a state below 90 percent of its negotiated target on any of the first five indicators in any year is considered to have failed for that year, no matter how small the shortfall. Local boards face similar 90-percent rules in many state plans. The practical consequence is that local boards have limited discretion to set targets meaningfully below state targets, and underperformance triggers cascade through the system.
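The 90-percent rule and the two-consecutive-year trigger combine into a simple decision procedure, sketched here under the assumption that state policy applies the 90-percent threshold to every indicator-program pair being checked (the function names are ours):

```python
def meets_level(actual, negotiated):
    """A year counts as met only if actual performance reaches at least
    90 percent of the negotiated (adjusted) level -- the 90-percent rule."""
    return actual >= 0.9 * negotiated

def sanction_status(history):
    """history: list of (actual, negotiated) pairs, oldest first, for one
    indicator/program pair. Two consecutive failed years open the
    Sec. 116(f) path: technical assistance after the first failed year,
    possible allotment reduction (up to five percent) after the second."""
    failures = [not meets_level(a, n) for a, n in history]
    for prev, curr in zip(failures, failures[1:]):
        if prev and curr:
            return "sanctionable"
    return "technical assistance" if failures and failures[-1] else "meeting"

# 68 against a target of 70 still meets: 68 >= 0.9 * 70.
print(sanction_status([(72, 70), (68, 70)]))  # meeting
print(sanction_status([(70, 70), (60, 70)]))  # technical assistance
print(sanction_status([(58, 70), (60, 70)]))  # sanctionable
```

The middle case shows why the 90-percent rule matters operationally: 60 against 70 is only a ten-point miss, but it falls below 63 and therefore counts as a failed year.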
State Unemployment Insurance (UI) wage records are the primary source of post-exit employment and earnings data for WIOA performance measures. Every employer covered by state UI law reports quarterly wages to the state UI agency, which produces a wage record per employee per quarter. WIOA performance reporting uses these records to identify whether program participants were in covered employment in the second and fourth quarters after exit and to compute median quarterly earnings. The match is conducted through state-level data-sharing agreements between workforce agencies and UI agencies, typically using the Wage Record Interchange System (WRIS) or its successor, the State Wage Interchange System (SWIS), for cross-state matching. The limits of UI wage records are significant. First, federal employment, military service, self-employment, employment by certain religious organizations, agricultural employment by small employers, and most contract or 1099 work are not covered employment, which means substantial portions of the labor market do not appear in UI records. The Federal Employment Data Exchange System (FEDES) has historically provided federal-employee matching but with substantial lag. Military employment requires separate matching with DMDC. Self-employment requires self-report or supplemental data sources. Second, undocumented workers and workers paid in cash off the books are not in UI records, which depresses the measured employment rate for some immigrant-heavy populations. Third, the UI lag is operationally meaningful: Q2 wages for a participant who exited in Q1 are reported approximately six to nine months after the exit, depending on state UI processing. WIOA performance reports therefore inevitably operate on lagged data, and program decisions made today will not show up in performance reports for at least two to four quarters.
Workforce boards typically supplement UI data with self-report follow-up and employer-supplied data for populations and types of employment that UI does not capture.
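The lag arithmetic is worth making concrete. A sketch of the quarter math, assuming an illustrative two-quarter UI processing lag (actual lag runs roughly six to nine months and varies by state):

```python
from datetime import date

def quarter(d):
    """Calendar quarter of a date, as a (year, quarter) pair."""
    return d.year, (d.month - 1) // 3 + 1

def add_quarters(yq, n):
    """Advance a (year, quarter) pair by n quarters."""
    year, q = yq
    total = (q - 1) + n
    return year + total // 4, total % 4 + 1

def earliest_report_quarter(exit_date, ui_lag_quarters=2):
    """Earliest quarter in which Q2-after-exit wages are plausibly
    matchable. The two-quarter lag default is an assumption for
    illustration, not a regulatory figure."""
    q2_after_exit = add_quarters(quarter(exit_date), 2)
    return add_quarters(q2_after_exit, ui_lag_quarters)

# A participant exiting in 2026 Q1 has Q2-after-exit wages in 2026 Q3,
# which under a two-quarter lag surface around 2027 Q1.
print(earliest_report_quarter(date(2026, 2, 15)))  # (2027, 1)
```

This is the mechanism behind the two-to-four-quarter blind spot described above: the measurement quarter itself is two quarters out, and the wage record for it arrives quarters later still.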
The credential-attainment indicator under WIOA Sec. 116(b)(2)(A)(i)(IV) measures the percentage of participants who, during participation or within one year after exit, attained a recognized postsecondary credential or a secondary school diploma or its equivalent. The denominator is participants enrolled in education or training services, excluding on-the-job training and customized training. The numerator is participants who attained the credential. Recognized postsecondary credentials are defined in WIOA Sec. 3(52) and include: industry-recognized certificates and certifications; certificates of completion of registered apprenticeship programs; a license recognized by the state involved or the federal government; and an associate or baccalaureate degree. DOL Training and Employment Guidance Letter (TEGL) 10-16 and subsequent updates clarify which specific credentials count, with state determinations often required for industry-recognized credentials that do not appear on a state-approved list. TEGL 23-19 expanded the list of secondary credentials to include high-school equivalency credentials issued under state procedures (HiSET, GED, TASC). The indicator has produced operational issues for boards because credential attainment depends on what training the participant entered and what credential is associated with that training. A participant in a short-term occupational-skills training that does not lead to a recognized credential cannot satisfy the credential-attainment indicator regardless of completion. Boards directing participants toward training programs without credential paths create indicator risk. Career-assessment platforms reduce the upstream risk by supporting alignment between participant interest and credential-bearing programs on the Eligible Training Provider List, but the credential itself is awarded by the training provider — the platform does not generate credentials.
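The denominator exclusion is the part boards most often get wrong, so a minimal sketch of the rate computation may help. The record structure and service-type labels are illustrative assumptions:

```python
# Illustrative participant records; field names and service-type labels
# are assumptions for this sketch, not a reporting schema.
participants = [
    {"service": "occupational_skills",       "credential": "CNA license"},
    {"service": "occupational_skills",       "credential": None},
    {"service": "on_the_job_training",       "credential": None},  # excluded from denominator
    {"service": "registered_apprenticeship", "credential": "completion certificate"},
]

# OJT and customized training are excluded from the indicator's denominator.
EXCLUDED = {"on_the_job_training", "customized_training"}

def credential_attainment_rate(records):
    """Denominator: participants in education/training excluding OJT and
    customized training. Numerator: those attaining a recognized
    postsecondary credential or secondary diploma/equivalent."""
    denom = [r for r in records if r["service"] not in EXCLUDED]
    num = [r for r in denom if r["credential"]]
    return len(num) / len(denom)

print(round(credential_attainment_rate(participants), 3))  # 0.667
```

Note the OJT participant drops out of both numerator and denominator, so directing a non-credential-seeking customer into OJT carries no indicator risk, while the same customer in non-credential occupational-skills training does.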
Measurable Skill Gains (MSG) under Sec. 116(b)(2)(A)(i)(V) measures interim progress for participants in education or training, recognizing that some participants are still in services during the relevant program year and are not yet at exit. MSG can be documented through any of five types: (1) educational functioning level (EFL) gains for adult education participants, measured through pre/post testing on a state-approved instrument (CASAS, TABE, BEST Plus); (2) attainment of a secondary school diploma or its equivalent during participation; (3) a secondary or postsecondary transcript or report card showing the participant achieved at least one academic year of progress (defined as twelve credit hours or equivalent); (4) progress toward established milestones from the participant's training plan, such as completion of an OJT, completion of one year of a registered apprenticeship program, or another quantifiable milestone; (5) successful passage of an exam required for an occupation, or progress in attaining technical or occupational skills as evidenced by trade-related benchmarks. Documentation requirements are detailed in TEGL 10-16 Change 2 (June 2018) and subsequent updates. Each MSG type has specific documentation requirements; for type (4), progress toward milestones, the documentation must include a written description of the milestone in the IEP / ISS and evidence the milestone was met. The MSG indicator has been a source of considerable interpretation difficulty for state and local boards because the definition is intentionally broad to accommodate program diversity but narrow enough to require documented evidence rather than self-report. Career-assessment platforms can support MSG type (4) when the milestone is defined in the participant's plan as completion of a specific assessment or skill-development sequence, with the platform export serving as documentation. The platform alone does not satisfy MSG; the milestone must be in the plan, and the platform output is the evidence the milestone was met.
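The two-part type-(4) test, milestone written into the plan plus evidence it was met, can be sketched as a simple intersection check. The function and field names are ours, for illustration only:

```python
def msg_type4_documented(plan_milestones, evidence):
    """Type-(4) MSG check: a milestone counts only if it is both written
    into the IEP/ISS (plan_milestones) and backed by evidence it was met
    (evidence maps milestone -> documentation reference). An export for a
    milestone absent from the plan does not count. Illustrative names."""
    return [m for m in plan_milestones if m in evidence]

plan = ["complete career assessment sequence", "finish OJT month 3"]
exports = {"complete career assessment sequence": "platform export 2026-04-02"}
print(msg_type4_documented(plan, exports))
# ['complete career assessment sequence']
```

The asymmetry is the point: an export without a plan entry yields nothing, and a plan entry without evidence yields nothing.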
Effectiveness in serving employers under Sec. 116(b)(2)(A)(i)(VI) was introduced as a state-pilot indicator in WIOA, with three approved measurement approaches under the joint DOL / ED guidance: (1) employer retention, measured as participants remaining with the same employer in the second and fourth quarters after entry into employment, expressed as a rate; (2) repeat business customers, measured as the percentage of employers who use core program services more than once during the previous three years; (3) employer penetration, measured as the percentage of employers in the local area using program services. States may choose any combination of the three approaches, and reporting consistency has been weak. The Department of Labor has, in TEGL 14-18 and subsequent guidance, signaled an intention to standardize the indicator, but the variation persists in 2026 because the underlying data infrastructure (employer identifiers, NAICS coding, longitudinal employer-establishment tracking) is not consistently mature across states. The effectiveness-in-serving-employers indicator is the only WIOA performance measure where assessment platforms have minimal direct contribution. Employer effectiveness is driven by the quality of employer engagement, the alignment between participant skills and employer needs, business-services staff capacity at the workforce board, and the local labor market. Career-assessment platforms can support upstream alignment by helping participants and case managers identify in-demand-occupation matches, but the employer-side measurement is conducted on employer data that the platform does not touch. Boards should not look to assessment platforms to solve effectiveness-in-serving-employers; they should look to business-services capacity and employer-engagement strategy.
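Approach (2) is the most commonly piloted of the three, and its arithmetic is simple once employer identifiers are clean. A sketch, assuming an illustrative mapping from employer identifier to the program years in which that employer received a core service:

```python
def repeat_business_rate(service_events, current_year, window=3):
    """Approach (2): share of employers using core program services more
    than once within the previous `window` program years. service_events
    maps an employer identifier to a list of service years; the structure
    is an illustrative assumption, not a reporting schema."""
    cutoff = current_year - window
    recent = {e: [y for y in yrs if y > cutoff]
              for e, yrs in service_events.items()}
    active = [e for e, yrs in recent.items() if yrs]        # served in window
    repeat = [e for e in active if len(recent[e]) > 1]      # served twice+
    return len(repeat) / len(active)

events = {"acme": [2024, 2025], "globex": [2025], "initech": [2021]}
print(repeat_business_rate(events, current_year=2026))  # 0.5
```

Even this toy version shows where the measure breaks in practice: if "acme" appears under two un-deduplicated employer identifiers, it silently counts as two single-service employers, which is exactly the employer-identifier infrastructure problem noted above.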
Author
Founder & Lead Researcher, JobCannon
Peter is the founder of JobCannon and leads assessment validation, the knowledge graph, and B2B partnerships. He has 10+ years of experience working with NGO and educational career programmes globally.