Buyer’s guide · Perkins V · concentrators
Sec. 3(12) two-course concentrator definition, EDFacts FS231 / FS232 file specs, Sec. 3(48) special populations, and the placement-indicator infrastructure for 3S1.
This guide focuses on the operational details of CTE concentrator reporting under Perkins V for the 2026 cycle. It walks through the Sec. 3(12) definition (two CTE courses in a single program of study at secondary, twelve credits at postsecondary), the state-level rules that determine what counts as a single program of study, and the implementation challenges around foundational courses, course completion definitions, and credit-equivalent conversion. It explains the EDFacts FS231 (CTE participants) and FS232 (CTE concentrators) file specifications, the 2026 tightening on program-of-study disaggregation, special-populations disaggregation under Sec. 3(48), and the cohort-year accounting that determines indicator placement. It then details the nine special-populations categories and how districts identify them from administrative records, and the equity-of-access provisions that make this disaggregation more than a reporting exercise. It covers the cross-cohort tracking required for the 3S1 second-quarter-after-exit placement indicator, including NSC StudentTracker for High Schools, military and service-program data exchanges, and UI wage record matching. It positions career-assessment platforms as upstream tools that influence concentrator outcomes through pathway placement, CLNA evidence, and ILP support, without substituting for the SIS-to-EDFacts data flow. It closes with the six-layer monitoring documentation set that state monitors look for.
A reading map for district CTE coordinators and SEA monitoring staff.
Career-orientation core. Aptitude and trait profiles support pathway-fit decisions.
For a district covering 10,000 CTE concentrators per year
This guide is one of twenty in the JobCannon for Business reading library; CTE coordinators reading the concentrator math here usually read the full Perkins V 2026 reporting guide next for the Sec. 113(b) indicator detail, and the trade-school drop-out reduction guide for retention tactics that move 3S1 placement numerators in the right direction.
For the operational landing of CTE concentrator tracking, see our vocational and trade-schools vertical, where programme-fit screening, programme-of-study selection, and concentrator follow-through are configured for each pathway.
Student-facing assessments and Career Guide stay free for CTE concentrators. Cohort and indicator-aligned exports run on the Business tier from $199/mo flat, or under a state-level partnership for SEA-wide deployments.
Try it with a micro-team
For independent coaches and therapists
For startups, teams and HR
For agencies, L&D and scale-ups
For 200+ person companies
All plans are currently activated manually via the contact form — we review each request within 24 hours and provision access the same day. Self-serve checkout is coming once we've heard from the first wave of teams.
Tell us your role (district CTE coordinator, SEA monitoring lead) and your indicator focus. We respond within one business day.
Perkins V Sec. 3(12) defines a CTE concentrator differently for secondary and postsecondary contexts. At the secondary level, a concentrator is a student served by an eligible recipient who has completed at least two courses in a single career and technical education program or program of study. The two-course threshold replaces the three-course threshold that applied under Perkins IV, which is why districts that have not adjusted their concentrator math since 2018-2019 may be undercounting concentrators. At the postsecondary level, a concentrator is a student enrolled in a CTE program who has earned at least 12 credits in a single CTE program of study or has completed a CTE program if it encompasses fewer than 12 credits or its equivalent in total. The two-course rule looks simple but produces nontrivial implementation challenges. First, what counts as a single program of study? States define their CTE programs of study at the state plan level using a state-determined CIP-code-based crosswalk. A district’s CTE department must map every CTE course to one or more programs of study; courses that count toward multiple programs (foundational courses common across pathways) need clear rules about whether they count toward concentrator status in each program or only one. Second, how is course completion defined? Most states require a passing grade for the course to count toward concentrator status; some allow audited or pass-only grading. Third, how are credit-equivalent measures handled in postsecondary? Programs that use clock hours rather than credit hours need a state-approved conversion to determine when 12-credit-equivalent has been reached. State plan documents are the authoritative source for these state-level decisions, and districts often discover they have been counting concentrators incorrectly only when state monitors reconcile district submissions against state-level rules.
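A minimal sketch of how the two-course check might run against SIS records, assuming a transcript extract of course-grade pairs and a district course-to-POS mapping; the passing-grade set and the decision to count foundational courses toward every mapped program are placeholders for the state-plan rules described above.

```python
# Minimal sketch of the Sec. 3(12) two-course check at the secondary level.
# Field names and the passing-grade set are assumptions; the authoritative
# rules (what counts as completion, whether foundational courses count toward
# every mapped program) live in the state plan.
from collections import defaultdict

PASSING_GRADES = {"A", "B", "C", "D", "P"}  # assumption: state counts D and Pass grades

def concentrator_programs(completed_courses, course_to_pos):
    """Return the programs of study in which a student is a concentrator.

    completed_courses: list of (course_id, grade) tuples from the SIS transcript extract
    course_to_pos:     dict mapping course_id -> set of POS ids; a foundational
                       course mapped to several programs is assumed to count
                       toward each (a state-level decision)
    """
    courses_per_pos = defaultdict(set)
    for course_id, grade in completed_courses:
        if grade not in PASSING_GRADES:
            continue  # only passed courses count toward concentrator status
        for pos_id in course_to_pos.get(course_id, set()):
            courses_per_pos[pos_id].add(course_id)
    # two distinct completed courses in a single POS means concentrator in that POS
    return {pos for pos, courses in courses_per_pos.items() if len(courses) >= 2}
```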
EDFacts is the U.S. Department of Education’s data submission system for state education agencies, and the OCTAE-administered file specifications FS231 (CTE participants by gender, race, ethnicity, special populations, and disability status) and FS232 (CTE concentrators by the same dimensions plus program of study) are the core Perkins V data files. The 2026 cycle has tightened from earlier cycles in three ways. First, program-of-study disaggregation is now required at the FS232 level rather than the aggregate-secondary or aggregate-postsecondary level; states that previously aggregated must produce per-program-of-study counts traceable to the state-approved POS list. Second, special-populations disaggregation under Sec. 3(48) (individuals with disabilities, individuals from economically disadvantaged families including low-income youth and adults, individuals preparing for nontraditional fields, single parents including single pregnant women, out-of-workforce individuals, English learners, homeless individuals, youth who are in or have aged out of the foster care system, youth with a parent on active duty in the armed forces) is required at finer cell granularity, with cell-size suppression for privacy under PTAC guidance. Third, the cohort-year accounting for concentrator status has been clarified — a student who became a concentrator in a prior year continues to count as a concentrator in the current year if still enrolled, but the program-of-study attribution at exit determines indicator placement (academic proficiency, placement, etc.). The file submission timeline for the 2024-25 program year is approximately spring 2026, with the Consolidated Annual Report due by 31 December for the prior program year. State-level data quality checks reconcile EDFacts submissions against state SLDS records and against district SIS records via state file specifications.
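A sketch of the cell counting and small-cell suppression that FS232-style disaggregation implies; the suppression threshold and record shape are assumptions, since the actual disclosure-avoidance policy is state-determined under PTAC guidance.

```python
# Sketch of per-program-of-study, per-special-population cell counting for an
# FS232-style extract, with small-cell suppression. The threshold and the
# record shape are assumptions, not any state's actual rule.
from collections import Counter

SUPPRESSION_THRESHOLD = 10  # assumption; set per state disclosure-avoidance policy

def fs232_cells(concentrator_rows):
    """concentrator_rows: iterable of dicts with 'pos_id' and 'special_pop' keys,
    one row per concentrator-by-special-population combination."""
    counts = Counter((row["pos_id"], row["special_pop"]) for row in concentrator_rows)
    # None marks a suppressed cell in the published output
    return {cell: (n if n >= SUPPRESSION_THRESHOLD else None) for cell, n in counts.items()}
```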
Special populations under Perkins V Sec. 3(48) drive both the federal disaggregated reporting requirements and the equity-of-access provisions of the law. The nine categories are listed in the previous section. Operationalizing them in district reporting requires identifying each concentrator’s special-populations status from existing district data sources, since Perkins V does not require a separate intake mechanism. Individuals with disabilities are identified from special-education records (IEP / 504 status). Economically disadvantaged status is identified from free / reduced-price lunch eligibility under the National School Lunch Program in K-12 and from Pell-eligible status in postsecondary. Single parents are identified through self-report on enrollment forms, often with a Title IX-aligned approach to gender-neutral reporting. Out-of-workforce individuals (postsecondary) typically self-identify at intake. English learners are identified from ESL / EL program records. Homeless youth under McKinney-Vento Act definitions are identified through the district’s homeless liaison records. Youth in or aged out of foster care are identified through state-level child-welfare data exchanges where authorized. Military-family youth are identified through self-report or DoD impact data where available. Nontraditional-field preparation depends on whether the student is enrolled in a program where their gender represents fewer than 25 percent of those employed in the related occupational area; this requires a state-determined nontraditional-occupation list that districts apply to their POS enrollment. The disaggregation reveals equity gaps that local recipients are required to address through their CLNA (Sec. 134(c)) and local plan (Sec. 134(b)). Career-assessment platforms can flag potential mismatches between special-populations students and their POS enrollment, but the special-populations identification itself is sourced from district administrative records, not from the platform.
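A sketch of how the nine flags might be assembled from administrative-record fields; every field name is a placeholder for the district's actual sources, and the nontraditional check assumes the state nontraditional-occupation list has already been reduced to a POS-to-underrepresented-gender lookup.

```python
# Sketch of assembling the Sec. 3(48) flags from administrative records.
# Every field name below is a placeholder for the district's actual SIS,
# special-education, homeless-liaison, and state-exchange sources.

def nontraditional_flag(student_gender, pos_id, nontrad_lookup):
    """nontrad_lookup: state nontraditional-occupation list reduced to a mapping
    of POS id -> genders representing fewer than 25 percent of related workers."""
    return student_gender in nontrad_lookup.get(pos_id, set())

def special_population_flags(student, pos_id, nontrad_lookup):
    return {
        "disability": bool(student.get("iep") or student.get("plan_504")),
        "econ_disadvantaged": bool(student.get("frl_eligible")),  # NSLP in K-12; Pell in postsecondary
        "nontraditional": nontraditional_flag(student.get("gender"), pos_id, nontrad_lookup),
        "single_parent": bool(student.get("single_parent_selfreport")),
        "out_of_workforce": bool(student.get("out_of_workforce_selfreport")),
        "english_learner": bool(student.get("el_program")),
        "homeless": bool(student.get("mckinney_vento")),
        "foster_care": bool(student.get("foster_care_status")),
        "military_connected": bool(student.get("parent_active_duty")),
    }
```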
The 3S1 placement indicator under Perkins V Sec. 113(b)(2)(A)(iii) measures the percentage of CTE concentrators who, in the second quarter after exiting from secondary education, are in postsecondary education or advanced training, military service, a service program (Peace Corps / AmeriCorps), or competitive integrated employment. The indicator is computed at the cohort level (the cohort of students who exited in the relevant program year) and reported in the Consolidated Annual Report two years after exit, because the second-quarter measurement window plus the state data lag adds roughly eighteen months to the exit date and pushes reporting into the following CAR cycle. District tracking has four required data sources. First, postsecondary enrollment matching, typically via the National Student Clearinghouse StudentTracker for High Schools service, which matches by name and date of birth against the Clearinghouse’s coverage of approximately 97 percent of US postsecondary enrollment. The match has known limitations — students enrolling in colleges not in the Clearinghouse, students with name or date-of-birth ambiguity, and students enrolling in advanced training outside Clearinghouse coverage are not matched. Second, military matching, typically through state-level Department of Defense data-sharing agreements that provide enlistment confirmation. Third, service-program matching, typically through state-level agreements with AmeriCorps and Peace Corps state offices. Fourth, employment matching through state UI wage records, with the same six-to-nine-month lag and same coverage gaps as the WIOA Sec. 116 employment indicator. Districts that lose track of which exited students were concentrators in which program of study cannot reconcile placement results to the indicator denominator. The pre-exit data quality is therefore the binding constraint on placement reporting; downstream matching is mechanical given clean concentrator records.
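A sketch of the numerator and denominator arithmetic once the four sources have been matched; the lookup sets are assumptions standing in for the NSC, military, service-program, and UI wage matches described above.

```python
# Sketch of the 3S1 numerator and denominator for one exit cohort, assuming the
# four placement sources have already been matched into lookup sets keyed by
# student id. Real matching runs on name/DOB (NSC) and SSN (UI wage records)
# under the applicable agreements.

def indicator_3s1(exit_cohort, nsc_matches, enlistments, service_matches, wage_hits):
    """exit_cohort: ids of students who exited secondary education in the program
    year and had reached concentrator status in any POS before exit."""
    denominator = len(exit_cohort)
    numerator = sum(
        1 for sid in exit_cohort
        if sid in nsc_matches        # postsecondary education or advanced training
        or sid in enlistments        # military service
        or sid in service_matches    # Peace Corps / AmeriCorps
        or sid in wage_hits          # second-quarter-after-exit UI wages
    )
    return numerator / denominator if denominator else None
```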
Career-assessment platforms fit the concentrator-reporting workflow in three ways, each of them upstream of the federal data submission. First, they support the program-of-study placement decision at the point a student selects pathway courses. A documented career-interest profile and aptitude profile, paired with the district’s POS catalog, helps students select pathways aligned to their interests and likely-to-complete programs. This shows up later as improved completion-to-concentrator conversion (more enrolled students reach concentrator status), better academic proficiency on 2S1 / 2S2 (concentrators in fit-aligned programs perform better academically), and better 3S1 placement (concentrators in fit-aligned programs are more likely to place in related postsecondary or employment). The platform does not produce concentrator records — those are produced by SIS course-completion records mapped to POS — but it influences the upstream placement decision. Second, they support the comprehensive local needs assessment under Sec. 134(c). The CLNA requires evidence on student career interest, on labor-market demand, on equity gaps, and on program quality; aggregate platform data provides usable evidence on student career interest at the cohort and special-populations level. Third, they support individualized learning plans where state law requires them. Platforms produce per-student profiles suitable for ILP attachment as evidence the plan was informed by the student’s interests and aptitudes. JobCannon’s production posture supports all three uses through per-student exports, cohort-level admin reporting with special-populations disaggregation where the district provides the disaggregation fields, and ILP-attachment-friendly PDF exports. The platform is not a substitute for the SIS-to-EDFacts data flow that produces the federal submission.
Perkins V monitoring at the local recipient level varies by SEA but typically includes annual or biennial reviews covering programmatic implementation, accountability data, and fiscal compliance. The documentation set has six layers. First, the local plan under Sec. 134(b) and the CLNA under Sec. 134(c), with evidence of stakeholder consultation under Sec. 134(d). Second, program-of-study sequence documentation including state-approved POS list, district course-to-POS mapping, and prerequisite / sequencing evidence. Third, concentrator-level data per cohort, including the SIS extract showing which students reached concentrator status in which POS in which year, reconciled to the state submission. Fourth, special-populations disaggregation evidence sourced from district administrative records, with appropriate cell-size suppression in published outputs. Fifth, placement evidence at second quarter after exit, including the data sources used (NSC StudentTracker contracts, military data exchange agreements, UI wage record matching) and the reconciliation file. Sixth, fiscal documentation under 2 CFR §200 Uniform Guidance for any items charged to Perkins funds, including time-and-effort certifications, equipment inventories, and procurement records. The most common monitoring findings are inadequate concentrator-level data quality and weak placement-data infrastructure. Districts that have invested in clean POS-mapping and reliable placement-data sources tend to clear monitoring with technical-assistance findings rather than corrective-action findings. Career-assessment platform documentation — evidence of platform use in CLNA, in ILP development, and in pathway placement — supports programmatic implementation review but does not substitute for the data quality work on the core indicators.
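A sketch of the SIS-to-submission reconciliation that monitors walk through in the third and fifth layers; the input shapes are assumptions rather than any state's actual file layout.

```python
# Sketch of the concentrator-count reconciliation behind the third and fifth
# documentation layers: district SIS extract versus state submission, per
# program of study. The real reconciliation file follows the state file spec.

def reconcile(sis_counts, submitted_counts):
    """Both arguments: dict of pos_id -> concentrator count for the cohort year.
    Returns the programs of study where the two sources disagree."""
    mismatches = {}
    for pos in set(sis_counts) | set(submitted_counts):
        sis_n, sub_n = sis_counts.get(pos, 0), submitted_counts.get(pos, 0)
        if sis_n != sub_n:
            mismatches[pos] = {"sis": sis_n, "submitted": sub_n}
    return mismatches
```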
Author
Founder & Lead Researcher, JobCannon
Peter is the founder of JobCannon and leads the assessment validation, knowledge graph, and B2B partnerships. He has 10+ years working with NGO and educational career programmes globally.