OIG Report: FSA Lacks Evidence to Back Up Verification Selection Process

By Megan Walter, NASFAA Policy & Federal Relations Staff

The Department of Education's (ED) Office of Inspector General (OIG) publicly released a report on Thursday that found a host of issues regarding the Office of Federal Student Aid's (FSA) oversight of verification during the 2015-16 and 2016-17 award years.

OIG highlighted two major findings in the report: FSA did not evaluate its process for selecting FAFSA data elements for verification, and it did not effectively evaluate and monitor its processes for selecting students for verification.

Beginning with the 2012-13 award year, FSA selected the verification items that institutions would need to verify for applicants. These items, selected by FSA in collaboration with the Office of Postsecondary Education (OPE), were said to have the greatest impact on the Expected Family Contribution (EFC) and to be the most likely to be misreported. However, through its audit, OIG found that FSA and OPE did not evaluate any data when determining these verification items and had no data to justify using them in any of the following award years.

In November 2018, NASFAA conducted research showing that verification does not impact most financial aid awards. NASFAA found that verification selection rates were too high and/or that selection algorithms may not have been well targeted: on average, 84% of verified applications resulted in either no EFC change or a change so small that it did not alter the student's Pell Grant award. NASFAA has also repeatedly criticized, and included in its recommendations to Congress, ED's failure to regularly publish annual verification data, which makes it impossible to assess verification's true impact on students or its value to taxpayers.

Over the last decade, ED has made attempts to make the verification process more efficient. But despite the attempts to ease the burden, a complex web of promised solutions and new requirements has resulted in little to no change in verification rates, creating a troubling situation for both students and schools.

The new report revealed that FSA uses five selection groups to determine how applicants are selected and assigned to a verification tracking group, each with its own methodology:

  • Targeted: Uses FAFSA criteria that are statistically found to have the highest likelihood of errors and could affect financial aid awarding.

  • Automatic: Uses specific criteria that FSA decided upon through professional judgment to automatically select students who reported an unexpected amount or change to that specific item on the FAFSA.

  • Identity: Selects students who FSA suspects through FAFSA responses may be fraudulently attempting to receive federal aid.

  • Discretionary: Targets FAFSA responses that are of specific concern to FSA.

  • Random: 2.5% of all applicants are selected at random.

OIG found that only the discretionary selection group was properly evaluated. The random selection group was excluded from the audit, as there is no way to evaluate random selection.

The report found that FSA did not evaluate data from applicants in the targeted selection group to determine whether their award amounts changed after verification was completed. The automatic selection process's two criteria, chosen through FSA's professional judgment, were not evaluated. Lastly, OIG found that FSA also did not effectively evaluate the identity selection process: because FSA did not collect disbursement data for students selected for identity verification, it could not determine whether funds were ever disbursed to those students.

To conclude its report, OIG recommended that FSA:

  • Establish procedures to periodically evaluate data to identify those FAFSA data elements that have the greatest impact on the EFC and are most likely to be misreported on the FAFSA.

  • Select for verification the FAFSA data elements identified based on the evaluation performed as a result of the previous recommendation.

  • Establish and implement procedures to evaluate the effectiveness of the targeted, automatic, and identity selection processes.

  • Ensure staff members have training and experience with statistical modeling to evaluate the work General Dynamics performs to develop the statistical model for identifying the targeted selection criteria before implementing the processing cycle.

  • Evaluate whether the 30% limitation for selecting students for verification is appropriate to prevent an undue burden on students and institutions and maximizes the selection of applicants whose FAFSA information is most prone to error.

  • Establish and implement procedures to monitor the results of its selection processes to ensure the verification processes are performing as expected and significant differences are addressed.

In response to the recommendations, FSA stated that as of March 31, 2019, it had completed analyses of the criteria used to select applicants for verification, as well as of the data elements to be verified. Regarding the recommendations on evaluating the selection groups, FSA said that in the fall of 2018 it had implemented a process for annual testing of the selection criteria used in each group. OIG did not receive any documentation of these changes and therefore could not include them in the report.


Publication Date: 5/17/2019

Thomas V | 5/20/2019 2:03:51 PM

The most recent audit information I could get from the IRS lists a rate of .5% in 2017. If the Department dropped all but the random selection of 2.5%, they would be five times higher than the general population. See

David S | 5/20/2019 11:13:04 AM

Joel...I would say that most of us have noticed/suspected all of this before Duncan and King served as Secretaries of Education, while they held those roles, and since they left. This is not unique to any Secretary of Ed nor to any one party. Verification hurts the students who need the most help, and it always has. And there are a number of ED processes and tools that receive little if any evaluation or assessment.

As far as a class action suit, I'm not a lawyer, but I doubt it. ED can say they're protecting the taxpayer investment and making sure the correct amount of funds are going to those who are eligible. What we need to be asking is that in light of the US Treasury losing an estimated $400B+ each year in tax fraud, why is an IRS audit rate of 1% considered sufficient but ED must make a target of 30%, which as we all know is drastically exceeded at many schools with large Pell populations.

Joshua N | 5/20/2019 9:28:51 AM

Very surprising news. It's nice to get confirmation of what we all felt was true but were unable to prove. I think the revealing of the selection criteria is welcome news also.

Michelle, I wouldn't be surprised if a class action suit was filed. Not sure how far it'd go, but it may be worth a shot.

Joel T | 5/20/2019 9:24:21 AM

Interesting to note that these findings occurred under the years that Arne Duncan and John King Jr. were in leadership roles...

Michelle C | 5/17/2019 5:12:25 PM

just WOW! anyone want to bet someone somewhere is thinking class action suit against DOE?


