
Buyer's guide · Perkins V · 2026 cycle

Guide to Perkins V 2026 CTE Reporting for State and Local Eligible Recipients.

Indicators 2S1, 2S2, 3S1, and the concentrator math behind them. Where assessment platforms legitimately fit, and where they do not.

In Brief

This guide covers the Perkins V Sec. 113(b) core accountability indicators for the 2026 reporting cycle: 1S1 / 1S2 (graduation), 2S1 / 2S2 / 2S3 (academic proficiency), 3S1 (placement at second quarter after exit), 4S1 (non-traditional participation), and the 5S program-quality indicators including the state-determined 5S4. It explains the Sec. 3(12) concentrator definition (two CTE courses in a program of study, down from three under Perkins IV), why concentrator math is the load-bearing variable for almost every indicator, and where the EDFacts FS231 / FS232 specification has tightened for 2026. It walks through the Sec. 113(b)(3) ninety-percent rule that pushes state targets down onto local recipients via the local application under Sec. 134, and the Program Improvement Plan triggers under Sec. 123. It maps career-readiness assessment platforms to three legitimate Perkins V use cases (the local needs assessment under Sec. 134(c), the individualized learning plan requirements that several states impose, and a possible 5S4 program-quality measure where the state plan allows it) and is honest about what assessment platforms cannot satisfy: the academic-proficiency indicators 2S1 / 2S2, the credential indicator 5S1, and direct placement reporting for 3S1. It closes with the documentation set monitors look for and the Uniform Guidance 2 CFR §200 fiscal requirements that determine whether Perkins funds can be used to license the platform at all.

Chapters in this guide

A reading map for state and local CTE staff.

Indicator 2S1 / 2S2 — academic proficiency
How concentrator-level reading and mathematics proficiency is computed, why the denominator is concentrators-who-took-the-state-assessment, and how EDFacts FS231 / FS232 has tightened for 2026.
Indicator 3S1 — second-quarter placement
The most common Program Improvement Plan trigger. Where placement data actually comes from (state UI wage records, NSC StudentTracker, military / service program lists) and why assessment platforms are upstream, not source-of-record.
Indicator 5S4 — the state-determined slot
Where career-readiness assessment completion can legitimately count toward the program-quality measure, and where it cannot. State plans differ — confirm with your SEA before assuming.
Local application and CLNA
How Sec. 134(b) local applications and Sec. 134(c) comprehensive local needs assessments use assessment-platform exports as evidence for program-of-study decisions and stakeholder consultation.

Assessments commonly cited in CLNA evidence files

What state monitors typically accept as documented career-interest signals for CTE concentrators.

Career-orientation core
CLNA Sec. 134(c) evidence
Aptitude profile
ILP / personalized plan evidence
Workplace-readiness signals
WBL allocation support (5S3)

Compared to other CTE-reporting-adjacent platforms

For a state-level eligible recipient covering 50,000 CTE concentrators per year

Naviance + state CTE module: $120-250K/yr (per-student licensing plus implementation)
YouScience CTE bundle: $80-180K/yr (aptitude-first; thin career database)
Xello districtwide: $60-140K/yr (per-school licensing plus integration)
JobCannon: $0 (unlimited, forever)

What this guide covers

Sec. 113(b) core indicators for the 2026 cycle (1S, 2S, 3S, 4S, 5S)
Sec. 3(12) concentrator definition and why it is load-bearing
EDFacts FS231 / FS232 changes for 2026
Sec. 113(b)(3) ninety-percent rule and local target negotiation
Sec. 123 Program Improvement Plan triggers
Sec. 134(c) Comprehensive Local Needs Assessment evidence
State-determined 5S4 measures and platform fit
2 CFR §200 fiscal documentation for Perkins-funded platform purchases

Related on JobCannon

This guide is one of twenty in the JobCannon for Business reading library; for the related state accountability framework, see the ESSA career-readiness indicator guide, which covers how state Title IV CCR indicators interact with Perkins V Sec. 113(b) reporting at the school level.

CTE programs deploying these workflows typically read this alongside our vocational and trade-schools landing page, which covers the operational layer: program-fit screening, concentrator tracking, and 5S3 work-based-learning allocation.

Pricing for CTE recipients

Student-facing assessments and Career Guide stay free for CTE concentrators. Cohort and indicator-aligned exports run on the Business tier from $199/mo flat, or under a state-level partnership for SEA-wide deployments.

Starter

Try it with a micro-team

$0
  • 5 invites (one-time, not recurring)
  • All 50+ assessments
  • Basic individual reports
  • Share link via email or Slack
  • No credit card required
Request free access

Coach

For independent coaches and therapists

$29/mo
or $290/yr (save 17%)
  • 30 invites per month
  • All 50+ assessments
  • Detailed individual reports
  • Coach notes per client
  • PDF export (client-ready)
  • Session prep recommendations
Get Coach access
Most Popular

Team

For startups, teams and HR

$79/mo
or $790/yr (save 17%)
  • 100 invites per month
  • Everything in Coach
  • Team DNA dashboard
  • Compatibility matrix
  • Conflict-pattern detection
  • Compare 2-3 team members
Get Team access
Recommended

Business

For agencies, L&D and scale-ups

$199/mo
or $1990/yr (save 17%)
  • 500 invites per month
  • Everything in Team
  • White-label PDF reports (your logo)
  • API access (read-only results)
  • Custom assessment builder (beta)
  • Bulk CSV import/export
Get Business access

Enterprise

For 200+ person companies

From $5k/yr
  • Unlimited invites
  • Everything in Business
  • SSO (SAML, Google Workspace)
  • SLA (99.9% uptime)
  • Data residency options (EU/US)
  • Dedicated Customer Success
Talk to us

All plans currently activated manually via the contact form — we review each request within 24 hours and provision access the same day. Self-serve checkout coming once we've heard from the first wave of teams.

Talk to a CTE specialist

Tell us your role (state director, local CTE coordinator, SEA monitoring lead) and your indicator focus. We respond within one business day.

We reply within 24 hours. No spam, no per-seat pitches.

FAQ

What does the Perkins V Sec. 113(b) accountability framework actually require for the 2026 reporting cycle?

The Strengthening Career and Technical Education for the 21st Century Act (Perkins V, Public Law 115-224, codified at 20 USC §2301 et seq.) requires every state and every local eligible recipient to report on a fixed list of core indicators of performance, defined in Section 113(b)(2). For secondary recipients there are five: 1S1 (four-year graduation rate), 1S2 (extended-year graduation rate where the state uses one), 2S1 (academic proficiency in reading/language arts), 2S2 (academic proficiency in mathematics), and 2S3 (academic proficiency in science). On top of those, 3S1 measures post-program placement — the percentage of CTE concentrators who, in the second quarter after exiting, are in postsecondary education, advanced training, military service, a service program (Peace Corps / AmeriCorps), or competitive integrated employment. 4S1 measures non-traditional program participation, and 5S1-5S4 measure program-quality indicators (industry-recognized credentials, dual / concurrent enrollment, work-based learning, plus a state-defined optional measure).

For postsecondary recipients the list shifts: 1P1 (postsecondary placement at second quarter after exit), 2P1 (earned a recognized postsecondary credential), 3P1 (non-traditional concentration).

The 2026 cycle differs from earlier cycles in two ways. First, the OCTAE EDFacts FS231 / FS232 specification has tightened how concentrator status is computed at the cohort level — states that previously aggregated to district must now provide per-program-of-study counts. Second, several states have adopted optional 5S4 measures that include career-readiness assessment outcomes; if your state did, an assessment platform that exports per-student completion records becomes part of the data file rather than a side tool. The federal portion is reported via the Consolidated Annual Report (CAR) by December 31 for the prior program year.
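The 3S1 arithmetic described above is simple but easy to get wrong at the cohort boundary. A minimal sketch, with hypothetical field names (this is not an EDFacts file layout):

```python
from dataclasses import dataclass

# Hypothetical per-student exit record; field names are illustrative.
@dataclass
class ExitRecord:
    student_id: str
    is_concentrator: bool
    q2_status: str  # status observed in the second quarter after exit

# Statuses that count as "placed" for 3S1 under Sec. 113(b)(2)(A)(iii).
PLACED = {"postsecondary", "advanced_training", "military",
          "service_program", "employment"}

def s3_1_rate(records):
    """3S1 = placed concentrators / all exiting concentrators."""
    cohort = [r for r in records if r.is_concentrator]
    if not cohort:
        return None
    placed = sum(1 for r in cohort if r.q2_status in PLACED)
    return placed / len(cohort)

records = [
    ExitRecord("A1", True, "postsecondary"),
    ExitRecord("A2", True, "unknown"),      # lost to follow-up counts against you
    ExitRecord("A3", True, "employment"),
    ExitRecord("A4", False, "employment"),  # participant, not concentrator: excluded
]
print(s3_1_rate(records))  # 2 placed out of 3 concentrators
```

The two lines that matter are the filter to concentrators (participants never enter the denominator) and the treatment of "unknown": a concentrator you cannot follow up on stays in the denominator and drags the rate down.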

How is a CTE concentrator defined for 2S1 and 2S2 reporting, and why does the definition matter so much?

Under Perkins V Sec. 3(12), a secondary CTE concentrator is a student who has completed at least two courses in a single CTE program of study. (Perkins IV used a three-course threshold, which is one of the most consequential changes between the two laws.) For postsecondary, a concentrator is a student who has earned at least 12 credits in a single program of study, or has completed the program if it is shorter than 12 credits.

The definition matters because every accountability indicator in Sec. 113(b)(2) is computed only over concentrators — not over all CTE participants, not over all enrolled students. If your district codes program-of-study membership inconsistently across courses, your concentrator denominator wobbles, and 2S1 / 2S2 / 3S1 numerators move around for reasons that have nothing to do with student outcomes.

Three operational consequences follow. First, the unit of measurement for 2S1 and 2S2 is the concentrator who took the state assessment in reading or math during the cohort year — not every concentrator. Districts that lose track of which concentrators sat which assessments end up with denominators that don't reconcile to the SEA file. Second, 3S1 placement requires second-quarter-after-exit data. A student counted as a concentrator in one program but exited as a concentrator in another (because they switched POS senior year) creates a coding ambiguity that OCTAE has clarified in non-regulatory guidance: count the student in the program in which they exited as a concentrator. Third, several states layer additional sub-population reporting on top of the federal indicators — special populations under Sec. 3(48) — and the disaggregation only works if concentrator status is clean.

Career-assessment platforms can help by providing a per-student record of which assessments were completed, when, and the result — but only if the platform exports reconcilable IDs that match your SIS.
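The denominator logic for 2S1 reduces to set intersections, which is also the cleanest way to surface reconciliation gaps. A sketch with made-up student IDs:

```python
# Illustrative: the 2S1 denominator is concentrators who sat the state
# reading assessment, not all concentrators. IDs are hypothetical.
concentrators = {"S01", "S02", "S03", "S04"}   # from program-of-study coding
took_reading  = {"S01", "S03", "S05"}          # from the state assessment file
proficient    = {"S01"}                        # scored proficient

denominator = concentrators & took_reading     # concentrators who tested
numerator   = denominator & proficient
rate = len(numerator) / len(denominator)

# Assessment-file IDs with no matching concentrator record ("S05" here)
# are exactly the reconciliation gap monitors ask about.
unmatched = took_reading - concentrators
```

Note that S02 and S04 vanish from the 2S1 calculation entirely (concentrators who did not test), while S05 is flagged rather than silently dropped.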

Where do career-readiness assessments fit in the Perkins V data file, and is JobCannon a fit?

Career-readiness assessments fit Perkins V reporting in three specific places. First, several states use career-readiness assessment completion as their state-determined 5S4 program-quality indicator under Sec. 113(b)(2)(A)(v). The list of approved 5S4 measures sits in each state's Perkins V plan — check your SEA's plan, not federal guidance, because 5S4 is state-determined. Second, assessment outcomes inform the local needs assessment (CLNA) required under Sec. 134(c) every two years; CLNA is where you justify which programs of study to fund, expand, or sunset. Third, individualized learning plans (ILPs) and personalized graduation plans — where state law requires them, e.g., Indiana, Tennessee, Florida — typically include a documented career-interest inventory or aptitude assessment.

JobCannon's fit is reasonable for the second and third use cases and partial for the first. The platform offers RIASEC (Holland code), Big Five, Multiple Intelligences, Skills Audit, and 49 other validated assessments, and exports per-student completion records via CSV or API. The knowledge graph maps to 2,536 careers and 1,533 skills with 64,317 weighted edges, which means the result page links the student to specific O*NET-aligned destinations rather than a generic interest profile.

What JobCannon does not do: it is not a state-approved test of academic proficiency, so it cannot satisfy 2S1 or 2S2. It is not a credentialing test, so it cannot satisfy 5S1. It can satisfy a 5S4 measure if your state's plan permits a career-interest or skills-readiness completion as an optional indicator — confirm with your SEA before assuming. For the local needs assessment and the ILP requirement, the per-student export is the relevant artifact.
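If your state does allow an assessment-completion 5S4 measure, the join from platform export to SIS roster is the step that makes or breaks the file. A minimal sketch — the CSV columns here are assumptions for illustration, not JobCannon's actual export schema:

```python
import csv
import io

# Hypothetical platform export; column names are assumed, not an actual schema.
export_csv = """student_id,assessment,completed_on
S01,RIASEC,2026-01-15
S02,RIASEC,2026-02-03
S09,RIASEC,2026-02-10
"""

# Concentrator roster pulled from the SIS (hypothetical IDs).
sis_concentrators = {"S01", "S02", "S03"}

completed = {row["student_id"]
             for row in csv.DictReader(io.StringIO(export_csv))}

# Only rows that reconcile to a concentrator ID can enter a 5S4 file.
s5_4_numerator = completed & sis_concentrators
# Orphan export rows (here "S09") must be resolved before filing,
# not silently counted or dropped.
orphans = completed - sis_concentrators
```

The design point: compute the orphan set explicitly. An export that "mostly matches" the SIS is the classic source of a denominator that fails SEA reconciliation.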

How does the 90 percent state-level performance target under Sec. 113(b)(3)(A)(i) affect local recipients?

Sec. 113(b)(3)(A)(i) sets the state-level performance baseline rule: a state that fails to meet 90 percent of its negotiated state-determined performance level on any indicator in two consecutive years must enter a federally required improvement plan with OCTAE. That state-level rule pushes downstream onto local eligible recipients in two ways. First, every local recipient negotiates a local-level performance target with the SEA in its annual local application under Sec. 134. The SEA cannot accept local targets so low that they make the state target unreachable, so local targets are usually set at or near the state target with limited room for negotiation. Second, Sec. 123 requires the SEA to identify under-performing local recipients — those falling below 90 percent of a negotiated local target on any indicator in any year — and require a Program Improvement Plan (PIP). PIP requirements are SEA-defined but typically include a root-cause analysis, a corrective-action plan, and quarterly progress reports for two years. Repeated failure can result in funding restrictions or, in extreme cases, recipient ineligibility.

For local CTE directors the practical implication is that you cannot treat performance reporting as a back-office finance task. Underperformance on 3S1 (placement) is the most common trigger for PIP because it depends on second-quarter-after-exit follow-up that many districts do poorly. Career-assessment platforms reduce 3S1 risk indirectly: a student who enters a program of study aligned to their interest and aptitude profile is more likely to complete and place. They do not directly produce 3S1 data — that comes from state wage-record matches, postsecondary enrollment matches (typically via NSC StudentTracker), and military / service-program records. Treat the platform as a leading indicator of placement, not a reporting source for it.
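The 90-percent trigger is one multiplication, but it is worth seeing with numbers, because "met 90 percent of the target" is not the same as "hit 90 percent". A sketch with hypothetical figures:

```python
# Sec. 123 trigger arithmetic, sketched: a local recipient is flagged when
# actual performance falls below 90% of the negotiated local target.
# The target and actual values below are hypothetical examples.
def meets_90_percent(actual: float, negotiated_target: float) -> bool:
    """True if the recipient cleared the 90%-of-target floor."""
    return actual >= 0.9 * negotiated_target

# Negotiated 3S1 target of 72%: the floor is 0.9 * 72% = 64.8%.
print(meets_90_percent(0.63, 0.72))  # 63% placement -> below the floor
print(meets_90_percent(0.65, 0.72))  # 65% placement -> clears the floor
```

So a district with a 72 percent placement target can report 65 percent and still avoid a PIP; the same 65 percent against a 75 percent target (floor 67.5 percent) would trigger one.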

What documentation should local recipients keep to survive a Perkins V monitoring visit?

Federal monitoring under Perkins V is conducted by OCTAE on a state cycle (states are visited roughly every 4-6 years on a risk basis), and SEAs in turn monitor local recipients on their own cycle (typically annual or biennial). The documentation set you need to keep is broadly consistent across states, with three layers.

First, programmatic documentation: your CLNA report (Sec. 134(c)), your local application (Sec. 134(b)), program-of-study sequence documentation including course-to-CIP mapping, and evidence of stakeholder consultation (Sec. 134(d)). Second, accountability documentation: per-student records for every concentrator in the cohort year, including program of study, program completion status, exit reason, and placement evidence at second quarter after exit. SEAs increasingly require electronic file submissions reconciled to SIS, so paper-only records are no longer sufficient. Third, fiscal documentation: the time-and-effort certifications under Uniform Guidance 2 CFR §200.430 for any staff partially funded by Perkins, equipment inventories for items over the federal capitalization threshold (typically $5,000) under 2 CFR §200.313, and procurement records under 2 CFR §200.318-326.

The single most common monitoring finding is inadequate documentation of how Perkins funds were spent on items also used for non-CTE purposes — the supplant-not-supplement test under Sec. 211(a). For assessment-platform purchases specifically, monitors will ask whether the platform was used by CTE concentrators (Perkins-allowable) or general-population students (Perkins-disallowable absent a clear allocation). Keep an allocation worksheet showing CTE-vs-non-CTE usage if you fund the platform from Perkins.
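The allocation worksheet mentioned above is arithmetic, not paperwork mystique. A minimal sketch using a usage-based method — one common approach, with made-up figures; confirm the allocation method your SEA actually accepts:

```python
# Hypothetical allocation worksheet for a platform license shared between
# CTE concentrators and general-population students. All numbers are
# illustrative; the usage-based method is an assumption, not a mandate.
license_cost = 12_000.00     # annual platform license
cte_active_users = 800       # documented CTE-concentrator users
total_active_users = 2_000   # all active users on the license

cte_share = cte_active_users / total_active_users        # 0.40
perkins_allowable = round(license_cost * cte_share, 2)   # chargeable to Perkins
other_funds = round(license_cost - perkins_allowable, 2) # local/other sources

print(cte_share, perkins_allowable, other_funds)
```

The worksheet's evidentiary value is the usage counts themselves: keep the export or log that shows how 800 and 2,000 were derived, because that derivation is what a monitor will test.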

How does the program-quality indicator 5S3 (work-based learning) interact with assessment platforms?

5S3 measures the percentage of CTE concentrators who participate in work-based learning (WBL) experiences during their program of study. The federal definition of WBL is in Perkins V Sec. 3(55) and is broad: sustained interactions with industry or community professionals in real workplace settings, or simulated environments, that foster in-depth, firsthand engagement with the tasks required in a given career field. The list typically includes internships, apprenticeships, school-based enterprises that meet specific criteria, clinical placements, supervised agricultural experiences, and certain types of project-based learning with industry partners. State plans further refine the definition; California for instance requires a minimum of 25 hours of supervised work experience to count, while Tennessee allows shorter experiences but with stricter documentation.

Assessment platforms interact with 5S3 in two ways. First, several states expect WBL placements to follow a documented career-interest inventory or aptitude assessment — the rationale being that work-based learning slots are scarce and should be allocated based on student-program fit rather than alphabetical assignment. JobCannon's RIASEC, Skills Audit, and Career Match outputs are commonly used as the upstream signal to a WBL placement decision, particularly in CTE programs that work with multiple industry partners. Second, the platform's knowledge graph (2,536 careers, 1,533 skills) lets a counselor identify which O*NET occupations align to a student's profile and which industry partners offer WBL slots in those occupations.

The platform does not provide WBL placement-tracking; you still need a placement-tracking system or a CTE student information system (e.g., CTSO Manager, Edthena, eDoctrina, or a custom SIS module) to produce the 5S3 numerator. Treat the assessment platform as the upstream allocation decision support, not the downstream placement record.
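The 5S3 computation itself is a one-liner once the placement-tracking data exists. A sketch with hypothetical tracking data, using an hours threshold as a stand-in for a state-specific rule (such as a 25-hour minimum):

```python
# Illustrative 5S3 computation: share of concentrators with at least one
# qualifying WBL experience. MIN_HOURS stands in for a state rule; the
# student IDs and hours are hypothetical.
MIN_HOURS = 25

wbl_hours = {"S01": 40, "S02": 10, "S03": 0}   # from the WBL tracking system
concentrators = ["S01", "S02", "S03", "S04"]   # S04 has no WBL record at all

qualifying = [s for s in concentrators if wbl_hours.get(s, 0) >= MIN_HOURS]
s5_3_rate = len(qualifying) / len(concentrators)
```

Note the `.get(s, 0)` default: a concentrator missing from the tracking system counts as zero hours rather than crashing the computation — which is correct for the indicator, but also means silent tracking gaps quietly depress your 5S3 rate.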

Author

Peter Kolomiets

Founder & Lead Researcher, JobCannon

Peter is the founder of JobCannon and leads the assessment validation, knowledge graph, and B2B partnerships. He has 10+ years working with NGO and educational career programmes globally.