Buyer's guide · NACE FDS · 2026
Knowledge-rate thresholds, methodology disclosures, multi-source data combination, and where assessment platforms fit in the upstream career-services workflow.
This guide explains the NACE first-destination survey framework for the 2026 reporting cycle. It covers the standard outcome categories (employed full / part-time, continuing education, military, volunteer, fellowship, seeking, not seeking), the typical six-month window for bachelor's graduates, and the methodological refinements adopted between 2022 and 2025. It walks through the knowledge-rate threshold (65 percent for NACE compliance, 80 percent as a quality benchmark), why low knowledge rates produce non-response bias, and how multi-source data combination (self-report, LinkedIn / Lightcast, faculty knowledge, NSC matching, state wage records) reaches 75-90 percent rates. It positions career-assessment platforms in three upstream workflows — undecided-major support, career-services intake, internship and experiential-learning matching — that affect FDS outcomes indirectly through better student-program fit, completion, and engagement. It then turns to the academic-program-review implications of FDS outcomes, the state-level higher-education accountability frameworks that incorporate them, and the federal Gainful Employment regulation under 34 CFR §668.401-499 as it interacts with institutional FDS reporting. It surveys the methodology disclosures NACE requires (cohort, window, sources, knowledge rate, categorization rules, suppression) and closes with a six-component career-services workflow evaluation.
A reading map for university career-services and institutional-research staff.
Pre-graduation career-readiness building blocks.
For a university with 25,000 undergraduates
This guide is one of twenty in the JobCannon for Business reading library; career-services directors reading the FDS methodology here also read the corporate internal-mobility design guide for the employer-side picture of where graduates land, and the bootcamp ISA underwriting data guide for related outcome-attribution discipline used by alternative-credentials programmes.
For the operational landing in higher education, see our universities and educators vertical, where the same primitives support placement-office workflows, undecided-major support, and major-selection counselling.
Student-facing assessments stay free under a university partnership. Career-services dashboard, undecided-major reporting, and cohort-level outcome tracking run on the Business tier from $199/mo flat, or under a partnership for system-wide deployments.
Try it with a micro-team
For independent coaches and therapists
For startups, teams and HR
For agencies, L&D and scale-ups
For 200+ person companies
All plans currently activated manually via the contact form — we review each request within 24 hours and provision access the same day. Self-serve checkout coming once we've heard from the first wave of teams.
Tell us your role (career-services director, IR analyst, undergraduate dean) and your reporting cycle. We respond within one business day.
The First-Destination Survey (FDS) is the standard mechanism by which US colleges and universities measure the post-graduation outcomes of their bachelor's and graduate-degree recipients. The methodological standards are maintained by the National Association of Colleges and Employers (NACE), which published its first standards in 2014, expanded them with master's-level coverage in 2017, and updates them periodically; the current standards revision in effect for 2026 reporting cycles incorporates several methodological refinements adopted between 2022 and 2025. The survey collects, for each graduate, an outcome category (employed full-time, employed part-time, continuing education, military service, volunteer service, postgraduate fellowship, still seeking, not seeking) within a specified window post-graduation (typically six months for bachelor's degrees, varying for graduate programs). For employed graduates the survey collects employer name, job title, employment location, salary where the graduate is willing to share it, and, at some institutions, a degree-related-employment classification. For continuing-education graduates the survey collects institution name, program of study, and degree level. The survey is conducted by the institution's career-services office or its institutional-research office, and the resulting data feeds into university rankings (US News, WSJ / THE, Princeton Review), accreditation reports, public-facing program-outcome disclosures, and increasingly state-mandated higher-education reporting. The NACE standards specify methodology requirements that affect data comparability across institutions: how outcomes are categorized, how missing data is handled, how knowledge-rate thresholds are computed and disclosed, and how the data is presented in public-facing reports.
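The eight outcome categories above form a fixed taxonomy, which is convenient to encode explicitly when building FDS tooling. The sketch below is one illustrative way to do that in Python; the identifier names are our own, not an official NACE schema.

```python
from enum import Enum, unique

# Illustrative encoding of the NACE first-destination outcome categories
# described in the text. Names are hypothetical, not an official schema.
@unique
class Outcome(Enum):
    EMPLOYED_FULL_TIME = "employed_full_time"
    EMPLOYED_PART_TIME = "employed_part_time"
    CONTINUING_EDUCATION = "continuing_education"
    MILITARY_SERVICE = "military_service"
    VOLUNTEER_SERVICE = "volunteer_service"
    POSTGRADUATE_FELLOWSHIP = "postgraduate_fellowship"
    STILL_SEEKING = "still_seeking"
    NOT_SEEKING = "not_seeking"

# "Positive" outcomes as commonly grouped in placement-rate reporting;
# the exact grouping varies by institution and should follow NACE guidance.
PLACED = {
    Outcome.EMPLOYED_FULL_TIME,
    Outcome.EMPLOYED_PART_TIME,
    Outcome.CONTINUING_EDUCATION,
    Outcome.MILITARY_SERVICE,
    Outcome.VOLUNTEER_SERVICE,
    Outcome.POSTGRADUATE_FELLOWSHIP,
}
```

Pinning the taxonomy in an `@unique` enum means a miscategorized record fails loudly at load time rather than silently inflating a category.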
The knowledge rate is the percentage of graduates for whom the institution has determined post-graduation status from any source, expressed as a fraction of the total graduate population in the cohort. NACE standards require institutions to report a knowledge rate of at least 65 percent for bachelor’s-degree FDS data to be considered compliant for public reporting; some institutions and accrediting bodies look for 80 percent or higher as a quality threshold. The knowledge rate matters for two reasons. First, low knowledge rates introduce non-response bias: graduates who respond may differ systematically from those who do not, and outcome rates computed from low-knowledge-rate samples can be misleadingly high or low. Second, the knowledge rate is a quality signal that ranking organizations and accreditors look at; a program reporting 95 percent employment but with a 35 percent knowledge rate is essentially reporting outcomes for the most engaged third of its graduates. Achieving high knowledge rates requires multiple data sources combined. Self-report through email and SMS outreach to graduates within the post-graduation window typically achieves 30-50 percent response. LinkedIn data scraping or licensed access (through services like LinkedIn Career Insights or Lightcast Career Pathways) adds another 20-30 percent for graduates with public profiles. Faculty and advisor knowledge — outcomes communicated to faculty in the normal course of academic mentorship — adds a few percent. Administrative record matching with state-level wage records (where state-university data-sharing agreements exist) and with the National Student Clearinghouse for continuing-education tracking adds more. The combined approach typically reaches 75-90 percent knowledge rate with sustained operational effort. JobCannon does not produce post-graduation outcome data; it operates upstream of the FDS in the career-services workflow.
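The multi-source combination above is a set union over the cohort: a graduate counts as "known" if any source determined their status, and overlap between sources must not be double-counted. A minimal sketch, with invented toy numbers:

```python
def knowledge_rate(cohort_ids, *source_ids):
    """Knowledge rate = graduates with a known status from ANY source,
    divided by the full cohort. Sources may overlap; the union dedupes."""
    known = set().union(*source_ids) & set(cohort_ids)
    return len(known) / len(cohort_ids)

# Toy cohort of 10 graduates; IDs and response sets are illustrative.
cohort = range(10)
self_report = {0, 1, 2, 3}       # ~40 percent respond to outreach
linkedin = {2, 3, 4, 5, 6}       # partly overlaps with self-report
print(knowledge_rate(cohort, self_report, linkedin))  # 0.7
```

The intersection with the cohort set also guards against a third-party source matching records for people outside the reporting cohort, which would otherwise inflate the rate.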
Universities use career-assessment platforms in three career-services workflows that feed into FDS outcomes. First, undecided-major support — students who arrive at the university without a declared major (typically 20-40 percent of incoming first-years at many universities) are routed through career-orientation programming that includes interest, values, and aptitude assessments. The platform output supports academic-advising conversations and major-selection decisions. Universities that systematically deploy this support typically see modest improvements in major persistence and time-to-degree, both of which correlate with first-destination employment outcomes because students who persist in well-matched majors are more likely to engage with career services and complete on time. Second, career-services intake assessment — students engaging with career services typically complete an intake assessment that includes career-interest profile, work-readiness traits, and career-direction clarity. The career-services counselor uses the output to focus the engagement on appropriate services (resume support, interview preparation, employer connections, internship placement, graduate-school planning). Third, internship and experiential-learning matching — employers, faculty research labs, and community organizations with internship slots typically have role-specific candidate profiles in mind; the platform supports matching between student profiles and available roles, with the internship experience itself being a strong predictor of first-destination employment. The combined contribution of platform-supported career services to FDS outcomes is multidimensional and indirect; outcome measurement at the student-cohort level requires a careful program evaluation rather than a simple before-and-after comparison.
FDS outcomes feed into academic-program review at most universities. The annual or biennial program-review process typically considers enrollment trends, completion rates, faculty productivity, and post-graduation outcomes among other factors. Programs with weak first-destination outcomes — low employment rates, low salary outcomes relative to comparable programs, low rates of degree-related employment — face increased scrutiny in program review. The implications vary by institutional context. At enrollment-pressured institutions, weak FDS outcomes contribute to program-discontinuation decisions when combined with weak enrollment. At better-resourced institutions, weak FDS outcomes contribute to program-improvement initiatives — curriculum updates, employer-engagement strategy, internship requirements, dedicated career-services staffing for the program. The pattern across institutions over the 2018-2025 period has been increasing weight on FDS outcomes, reflecting both ranking-driven incentives and state-level higher-education accountability. Several states have introduced gainful-employment-style accountability frameworks for public institutions, with FDS outcomes feeding into the calculations. The federal-level Gainful Employment regulation (34 CFR §668.401-499 in its 2024 form) applies primarily to non-degree programs at degree-granting institutions and to all programs at proprietary institutions, and uses Department-of-Education-supplied earnings data rather than institutional FDS, but the institutional FDS data informs program-level conversations even where it does not drive federal accountability. Career-assessment platforms are not directly part of program review, but the platform-supported career-services infrastructure that improves outcomes for graduates of struggling programs is part of the program-improvement response.
NACE standards require methodology disclosures that allow readers to interpret the data correctly. The required disclosures include: the cohort definition (which graduates are included — typically all bachelor’s graduates from a specific academic year or graduating term); the survey window (the period during which post-graduation status was collected, typically six months from graduation); the data sources used (self-report, LinkedIn, faculty knowledge, administrative records); the knowledge rate (overall and by program if reported at the program level); the response rate for self-report data sources; the categorization rules for outcome types and any institution-specific deviations from NACE definitions; and any data-suppression or rounding rules applied. Some institutions go further and disclose which categories of graduates are excluded from specific analyses (international graduates returning to home countries, graduates entering family business, graduates entering military service) and how missing or partial responses are handled. The 2024-2025 NACE standards revision tightened expectations for several disclosures including the knowledge-rate computation and the treatment of graduates whose status is determined from third-party data versus self-report. Universities preparing FDS reports for the 2026 reporting cycle should review their disclosures against current NACE guidance. The methodology disclosure protects the institution against criticism that the data is selectively presented and supports comparison across institutions where the disclosure conventions match. Career-assessment platforms do not affect the FDS methodology directly; they affect upstream student outcomes that the FDS later measures.
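The disclosure list above is effectively a checklist, and it is straightforward to keep it machine-checkable alongside the report. The sketch below is one hypothetical structure, with the two knowledge-rate thresholds from the text wired in; field names are ours, not a NACE-mandated schema.

```python
from dataclasses import dataclass

# Hypothetical container for the NACE-required methodology disclosures
# listed in the text. Illustrative only; not an official schema.
@dataclass
class MethodologyDisclosure:
    cohort_definition: str            # which graduates are included
    survey_window_months: int         # typically 6 for bachelor's degrees
    data_sources: list                # self-report, LinkedIn, records, ...
    knowledge_rate: float             # 0.0-1.0, overall
    self_report_response_rate: float  # 0.0-1.0
    categorization_notes: str = ""    # deviations from NACE definitions
    suppression_rules: str = ""       # small-cell suppression / rounding

    def compliance_flags(self):
        """Apply the thresholds discussed above: 65 percent for NACE
        compliance, 80 percent as a common quality benchmark."""
        return {
            "nace_compliant": self.knowledge_rate >= 0.65,
            "quality_benchmark": self.knowledge_rate >= 0.80,
        }
```

A disclosure object like this can be rendered into the public-facing methodology note and simultaneously validated before the report ships.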
A career-services-team evaluation of its FDS-outcome-supporting workflow has six components. First, undecided-major support coverage — what percentage of undecided-major students complete career-orientation programming, what percentage convert to a declared major within their first or second year, and how the timing compares to time-to-degree benchmarks. Second, career-services intake coverage — what percentage of graduating cohort engages with career services at any point, what percentage engages substantively (multiple visits, completed intake assessment, ongoing relationship with a counselor), and how engagement varies by major. Third, experiential-learning placement — what percentage of graduates completed at least one internship, research experience, or substantive part-time job aligned to their field, and how placement varies by major and demographic. Fourth, employer-engagement — the count of employer relationships, the diversity of employers across industry and geography, the number of on-campus and virtual recruiting events, and the conversion rate from event attendance to interview to offer. Fifth, FDS data quality — knowledge rate trend, response rate trend, the share of outcomes determined from each data source. Sixth, FDS outcome trend — employment rate, salary, degree-related-employment percentage, by program and across the institution, with appropriate context about comparison institutions or state benchmarks. JobCannon supports the first two components directly through assessment-platform integration with career-services workflow, and supports the third through experiential-learning matching. The fourth, fifth, and sixth components require career-services-management infrastructure (CRM, recruiting platform, FDS survey tooling) outside the assessment platform.
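The first three components above reduce to cohort-coverage ratios, which makes them easy to track on one dashboard. A minimal sketch with invented counts for illustration:

```python
def coverage(reached: int, cohort_size: int) -> float:
    """Share of the graduating cohort reached by one workflow component."""
    return round(reached / cohort_size, 3)

# Hypothetical cohort; all counts below are invented for illustration.
cohort_size = 5000
metrics = {
    "undecided_major_support": coverage(1100, cohort_size),  # completed orientation
    "career_services_intake": coverage(3200, cohort_size),   # engaged at any point
    "experiential_learning": coverage(2750, cohort_size),    # >= 1 aligned internship
}
print(metrics)
```

Components four through six need richer inputs (employer CRM events, FDS source attribution, outcome trends), so they do not collapse to a single ratio the same way.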
Author
Founder & Lead Researcher, JobCannon
Peter is the founder of JobCannon and leads the assessment validation, knowledge graph, and B2B partnerships. He has 10+ years working with NGO and educational career programmes globally.