Table of Contents
- What Is IPEDS, and Why Does Adding ACTS Matter?
- Why the Department of Education Proposed ACTS
- What ACTS Would Add to IPEDS
- Which Institutions Would Be Affected?
- Why Colleges and Associations Pushed Back
- Why Some People Still Support the Proposal
- What Happened After the Proposal
- What This Means for Students and Families
- The Bigger Policy Takeaway
- Experiences From the Campus Front Line
- Conclusion
If you have ever wished federal higher education reporting came with fewer acronyms and more plain English, welcome to the club. Unfortunately, Washington had other plans. In 2025, the U.S. Department of Education proposed adding a new survey component to IPEDS called ACTS, short for the Admissions and Consumer Transparency Supplement. The name sounds tidy. The implications are not.
At first glance, this may look like another routine federal paperwork update. It is not. The proposal would expand how colleges report admissions, financial aid, academic performance, and student outcomes, especially in ways that break data out by race, sex, income, first-generation status, test scores, GPA ranges, and more. In plain English, this was not a small form tweak. It was a major redesign of what the federal government wants to know about how colleges admit students and what happens after they enroll.
That is why the ACTS proposal quickly became one of the most talked-about higher education data developments of the year. Supporters framed it as a transparency move. Critics saw a fast-moving reporting mandate with giant compliance costs, confusing definitions, privacy concerns, and a policy goal that could easily outrun the quality of the data itself.
This article explains what ACTS is, why the Department of Education proposed adding it to IPEDS, what kind of information schools may be required to report, why campuses are uneasy, and what the whole thing could mean for students, families, and college leaders. Because while “IPEDS modernization” may not sound like thrilling dinner conversation, it absolutely matters to how institutions are judged and how policy fights get framed.
What Is IPEDS, and Why Does Adding ACTS Matter?
IPEDS, the Integrated Postsecondary Education Data System, is the federal government’s core higher education reporting system. Colleges that participate in federal student aid programs already provide data through IPEDS on topics such as enrollment, completions, student aid, graduation rates, staffing, tuition, and finances. It is the backbone of a huge share of public information about colleges in the United States.
Because IPEDS is already mandatory for institutions tied to Title IV federal aid, adding a new component to it is a serious step. It does not merely create another optional dashboard or a voluntary transparency initiative. It effectively turns a new reporting idea into a compliance obligation for covered schools.
That is one reason the ACTS proposal drew so much attention. The federal government was not asking colleges to send in a casual memo about admissions trends. It was proposing to insert a new admissions-focused collection into an already mandatory national data system. Once that happens, the conversation shifts from “interesting policy idea” to “who has to report what, by when, and with what penalties if they do it wrong?”
Why the Department of Education Proposed ACTS
The proposal did not appear out of nowhere. It followed an August 2025 White House memorandum on higher education admissions transparency and a same-day directive from the Secretary of Education to the National Center for Education Statistics. The administration’s stated argument was that existing public data were not detailed enough to show whether institutions were using race-based preferences or proxy variables in admissions after the Supreme Court’s Students for Fair Admissions decisions.
That goal shaped the entire design of ACTS. The proposal was built around the idea that disaggregated admissions data could reveal patterns that broad institutional averages might hide. In that framing, colleges would not simply report how many students applied and enrolled. They would report far more nuanced slices of information that could be compared across applicant, admitted, and enrolled cohorts.
Supporters of the move argue that greater visibility into admissions and aid practices helps families, policymakers, and watchdogs understand whether access is fair and whether institutional behavior matches public promises. That is the cleanest version of the case for ACTS: more data, more transparency, better oversight.
But even people who support better admissions data have warned that the proposal raises a separate question: Can the data actually do what the Department wants them to do? A spreadsheet can show patterns. It cannot always explain the decision-making logic behind those patterns. And that gap matters a lot when the underlying policy debate is already politically explosive.
What ACTS Would Add to IPEDS
The proposed ACTS component goes well beyond traditional admissions reporting. Instead of collecting only basic counts or standard survey responses, it expands the scope to include much more detailed breakdowns tied to admissions, academic indicators, aid, and student outcomes.
At the undergraduate level, the proposal focuses on:
- Applicant, admitted, and enrolled groups broken out by race and sex.
- Academic measures such as GPA and standardized test scores.
- Indicators tied to economic background, like family income range and Pell Grant eligibility.
- Student background markers such as parental education or first-generation status.
- Admissions pathway details, such as whether a student came in through regular decision, early action, or early decision.
At the graduate and professional level, the reporting gets even more layered:
Institutions may need to report comparable admissions and enrollment data by field of study, often using CIP-based program categories. That means the federal government is not just asking broad university-level questions. It is asking for more granular program-level visibility in areas where data collection practices vary widely across campuses.
And it does not stop with admissions:
The broader ACTS concept also reaches into enrolled student outcomes and aid. That includes graduation rates, final GPA, and information connected to aid offered or aid received. So while the proposal is framed as an admissions transparency supplement, it is really an admissions-plus-outcomes-plus-aid reporting expansion.
That is a big reason why the proposal triggered alarm. This is not just “tell us how selective you are.” It is “assemble a detailed, highly disaggregated picture of the admissions funnel and what follows it.”
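To make "highly disaggregated" concrete, here is a minimal, entirely hypothetical sketch of the kind of funnel tabulation the proposal describes. The stage labels, race codes, and records are invented for illustration; they are not the official ACTS specification or file layout.

```python
# Hypothetical sketch of the admissions-funnel disaggregation ACTS
# describes. Stage labels, category codes, and records are invented
# for illustration only -- not the official specification.
from collections import Counter

students = [
    # (stage, race_category, sex)
    ("applied",  "A", "F"),
    ("applied",  "B", "M"),
    ("applied",  "B", "M"),
    ("admitted", "A", "F"),
    ("admitted", "B", "M"),
    ("enrolled", "A", "F"),
]

# Count each applicant/admitted/enrolled cohort broken out by race
# and sex -- the core "slices" compared across the funnel.
funnel = Counter(students)
for (stage, race, sex), n in sorted(funnel.items()):
    print(f"{stage:8s} race={race} sex={sex} count={n}")
```

The point of the sketch is that each extra breakout variable (income range, first-generation status, admissions pathway) adds another dimension to this table, which is exactly why the cell counts multiply so quickly.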
Which Institutions Would Be Affected?
One of the most important questions was scope. Early discussion centered on institutions with competitive or selective admissions. Legal and association summaries noted that the proposal was aimed at four-year selective institutions, while open-admission schools appeared to be outside the main target. Later implementation materials described required institutions more specifically as four-year degree-granting public, private nonprofit, and private for-profit institutions that primarily award bachelor’s degrees or higher, including institutions that offer only graduate degrees.
There was also a notable exemption concept: institutions could be excused in a survey year if they admitted virtually all applicants or operated as open admission and did not award non-need-based aid in that year. That caveat matters because it shows ACTS was not simply about selectivity in the popular sense. It also tied into the government’s interest in whether merit-style or non-need-based aid might interact with admissions patterns.
Translation: some colleges might avoid the reporting requirement in certain years, but many four-year institutions would still land squarely inside the collection’s orbit.
Why Colleges and Associations Pushed Back
The resistance to ACTS was not just political theater. A lot of it centered on operational reality.
1. The timeline looked brutally fast
Higher education groups argued that the Department moved from memo to proposal to implementation at breakneck speed. Institutions said they were being asked to prepare a major new reporting regime in a matter of weeks or months, not years. For a collection this complex, that is like announcing a kitchen remodel and expecting dinner to be ready on time anyway.
2. The historical backfill raised immediate red flags
One of the most criticized parts of the plan was the requirement to report not only current-year information but also multiple prior years. Later materials indicated that the first ACTS cycle would pull in admissions data going back through 2020-21 and graduation-rate information going back even further. Trade groups argued that many institutions simply do not retain admissions-related inputs in a clean, standardized, easily retrievable way across that many years.
3. The burden estimate was enormous
By the Department’s own later estimate, institutions would spend an average of about 200 additional hours on the initial ACTS submission and about 40 extra hours in later years. Sector groups argued that even those numbers might be optimistic. When a one-person or very small institutional research office hears “200 hours,” it does not hear “manageable update.” It hears “cancel other work and start digging through old systems.”
4. The reporting architecture is unusually detailed
Associations reviewing the proposal warned that the number of new categories and fields could be staggering. ACE and others described the expansion as potentially the largest in IPEDS history, with more than 100 new questions and thousands upon thousands of additional reporting fields once all required disaggregations are counted.
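A quick back-of-the-envelope calculation shows why the field count balloons. Every category count below is hypothetical, chosen only to illustrate the combinatorics, not to match the actual ACTS specification:

```python
# Illustrative arithmetic: disaggregation multiplies reporting fields.
# All category counts here are hypothetical, not the ACTS spec.
cohorts = 3          # applicant, admitted, enrolled
race_categories = 8  # illustrative number of race/ethnicity groups
sexes = 2
measures = 10        # e.g., GPA bands, test-score bands, income ranges

fields_per_measure = cohorts * race_categories * sexes   # cells per measure
total_fields = fields_per_measure * measures             # cells overall

print(f"one measure  -> {fields_per_measure} cells")
print(f"{measures} measures -> {total_fields} cells")
```

Even with these modest made-up numbers, one measure already generates 48 cells and ten measures generate 480, so it is easy to see how a real collection with many more measures and breakouts reaches into the thousands.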
5. Privacy concerns are not academic
Highly disaggregated data are useful precisely because they get specific. But that same specificity can create small-cell problems, especially at smaller institutions or in niche graduate programs. If a data table becomes too granular, users may be able to infer information about real individuals, even when names are not attached. That tension between transparency and privacy sits right at the center of the ACTS debate.
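The standard defense against small-cell disclosure is suppression: cells below a minimum count are masked before publication. The sketch below shows only the simplest form (primary suppression) with an invented threshold and invented data; real disclosure-avoidance rules are more involved, since totals can otherwise be used to back out a masked cell (so-called complementary suppression):

```python
# Minimal sketch of primary small-cell suppression. The threshold,
# program names, and counts are illustrative; real disclosure rules
# also require complementary suppression so marginal totals cannot
# reveal the masked cells.
THRESHOLD = 5

cells = {
    ("ProgramX", "GroupA"): 2,   # small cell: disclosure risk
    ("ProgramX", "GroupB"): 41,
    ("ProgramY", "GroupA"): 17,
}

published = {
    key: (n if n >= THRESHOLD else None)  # None = suppressed
    for key, n in cells.items()
}
for key, n in published.items():
    print(key, "<suppressed>" if n is None else n)
```

Notice the tradeoff baked into even this toy version: the cells most at risk of identifying a student are precisely the ones a transparency-minded reader would most want to see.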
6. Data quality depends on definitions, and definitions were a fight
Institutions and associations repeatedly warned that some elements in the proposal were not collected consistently across campuses. Family income ranges, parental education, unweighted high school GPA, and standardized test use are not always stored the same way. If the federal government wants comparable data, it has to define the terms in a way campuses can apply consistently. Critics worried that ACTS moved faster than that groundwork allowed.
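To see why definitions matter so much, consider a hypothetical normalization step. The field names, scales, and the crude percent-to-4.0 conversion below are all invented for illustration; the point is only that comparable federal data require an explicit mapping from each campus's local storage conventions:

```python
# Hypothetical illustration of the definitional problem: two campuses
# store "GPA" differently, so a shared definition needs an explicit
# mapping step. Field names, scales, and the deliberately crude
# percent-to-4.0 conversion are invented for illustration.

def to_unweighted_4pt(record):
    """Normalize a campus record to an unweighted 4.0-scale GPA."""
    if "gpa_unweighted" in record:      # campus already stores it
        return record["gpa_unweighted"]
    if "gpa_percent" in record:         # campus stores a 0-100 scale
        return round(record["gpa_percent"] / 100 * 4.0, 2)
    return None                         # not recoverable: report missing

print(to_unweighted_4pt({"gpa_unweighted": 3.4}))
print(to_unweighted_4pt({"gpa_percent": 90}))
print(to_unweighted_4pt({"gpa_weighted": 4.3}))   # weighted only: missing
```

Multiply that one small judgment call (is a percent scale convertible at all, and how?) across income ranges, parental education codes, and test-optional policies, and the consistency problem critics raised comes into focus.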
Why Some People Still Support the Proposal
For all the criticism, it would be unfair to pretend the push for better admissions data has no logic behind it. Many students and families genuinely want more transparency. They want to know who gets admitted, how aid intersects with access, whether first-generation and low-income students are entering and graduating, and whether institutional policies produce opportunity or merely talk a good game.
In that sense, ACTS taps into a very real demand: more meaningful public information about admissions and outcomes. The federal government is already the central collector of college data. So from a policy design standpoint, IPEDS is the obvious place to try to build a more robust admissions picture.
The catch is that a good question does not automatically produce a good reporting instrument. That is where the debate gets serious. Plenty of observers agree that students deserve better data while still believing ACTS, as proposed, asks for too much, too fast, with too much ambiguity.
What Happened After the Proposal
The ACTS story did not stop at the proposal stage. Public comments poured in. Associations, institutional research professionals, access advocates, attorneys, and campus administrators weighed in on feasibility, intent, privacy, methodology, and timing. AIR’s community survey of more than 580 professionals found deep concern about immediate rollout, five-year backfilling, staffing limits, data reliability, and privacy risk.
Later federal materials showed the proposal continuing to advance. Through late 2025, the Department kept moving the collection through the Paperwork Reduction Act review process, and in December 2025 NCES announced that OMB had approved the new ACTS component and opened the collection window, with a March 2026 deadline for keyholders.
That later development is worth mentioning because it revealed something important: the proposal was never just symbolic rhetoric. It became an operational compliance issue. Once that happened, the abstract policy debate turned into a practical scramble across campuses.
What This Means for Students and Families
For students and parents, ACTS could eventually produce more detailed public information about how colleges admit, enroll, support, and graduate different groups of students. That is the optimistic version, and it is not a silly one. Better data can improve college choice, sharpen accountability, and expose gaps that broad averages gloss over.
But consumers also need to be careful. More data do not always mean better interpretation. If a future dataset is incomplete, inconsistently defined, or built from institutions rushing to reconstruct years of historical information, it could mislead just as easily as it could inform. A flashy public table can look authoritative while still resting on shaky foundations.
That is why the larger lesson is not simply “more transparency good” or “more reporting bad.” The lesson is that transparency only works when the reporting system is feasible, clearly defined, privacy-conscious, and methodologically sound.
The Bigger Policy Takeaway
The Department of Education’s proposal to add ACTS to IPEDS is really a story about the collision of three forces: federal oversight, public demand for clearer admissions information, and the limits of institutional data infrastructure. Each force is real. Each force matters.
The federal government wants more visibility into admissions practices. Families want clearer signals about access and fairness. Colleges, meanwhile, know that data systems are messy, historical records are incomplete, and the difference between “collectable in theory” and “accurate in practice” can be enormous.
So the ACTS debate is not just about one supplement. It is about how higher education will be measured in the years ahead. Will federal reporting move toward more student-level and highly disaggregated collections? Will privacy protections evolve at the same pace? Will technical review and field testing remain important, or will politics keep outrunning process?
Those are the real stakes. ACTS may sound like one more acronym in the alphabet soup of higher ed policy. In reality, it is a test of whether the government can demand sharper admissions transparency without breaking trust in the data it collects.
Experiences From the Campus Front Line
To understand why this proposal landed with such force, it helps to picture how it feels inside an institution. For an institutional research director at a small university, ACTS is not a policy white paper. It is a blinking cursor at 9:40 p.m., a spreadsheet with missing fields, and a half-forgotten admissions database that nobody has touched since before the current phone system was installed. It is the practical experience of being told that years of decisions now need to be reconstructed, categorized, cleaned, cross-checked, and delivered in a format precise enough for federal reporting.
For admissions teams, the experience is different but just as intense. They may understand the broad categories being requested, yet still run into problems the moment definitions get specific. Which GPA counts if the system holds weighted and unweighted versions? What happens when a student applies test-optional but later submits a score? How should application rounds be coded when institutional practice changed over time? These are not dramatic questions, but they are exactly the kind that decide whether a dataset becomes reliable or turns into a very expensive mess.
Financial aid offices face their own version of the scramble. ACTS crosses into areas that sit between admissions, aid, registrar functions, and institutional research. That means offices that usually work in parallel suddenly have to stitch together records from multiple systems that were not always designed to speak the same language. One team may track Pell eligibility one way, another may use a different historical field for family income, and a third may hold award data in a format built for packaging rather than federal disaggregation. The lived experience here is less “new insight” and more “everyone open your laptops, we are going system by system.”
Then there is the privacy side. Campus data professionals do not just worry about whether they can produce the tables. They worry about what happens after they do. At smaller colleges, one highly specific cut of the data can point uncomfortably close to a real student, especially in specialized graduate programs or among very small demographic groups. That creates a quiet but serious tension. The same people tasked with increasing transparency are often the ones who feel most responsible for protecting student confidentiality.
And yet, there is another experience running alongside the stress: a genuine belief among many professionals that better public information could be useful if done well. That is what makes the ACTS debate so complicated. Many people in higher education are not rejecting transparency. They are rejecting rushed transparency, unclear transparency, and transparency that risks turning fragile data into confident headlines. Their experience is not simple resistance. It is the frustration of being asked to build a very complicated bridge while traffic is already crossing it.
Conclusion
The proposal to add ACTS to IPEDS is one of those policy moves that looks bureaucratic until you examine the details. Then it becomes obvious that it reaches into admissions strategy, student privacy, financial aid operations, data governance, institutional workload, and the national debate over fairness in higher education. Whether you view ACTS as overdue transparency or overbuilt federal oversight, one thing is clear: it is a consequential attempt to redefine how admissions data are gathered and interpreted in the United States.
If the Department wants ACTS to succeed in the long run, the challenge is not simply collecting more numbers. The challenge is making sure those numbers are clear, comparable, accurate, and safe to use. Otherwise, the government may get a mountain of data and still end up short on trust.