Report: University Faculty Can Use Small Data to Make Big Change Now

By Joelle Fredman, NASFAA Staff Reporter

Amid the focus on accountability in higher education, there is a strong push for institution-wide data to judge how well colleges and universities are preparing students for the labor market. A new report argues, however, that it is actually small data that is needed to motivate faculty and staff to take action to help students now.

“Accumulating and pushing out more and more data is not necessarily a winning strategy. Savvy data use involves putting the right data in the hands of the right people at the right time,” the authors of the American Council on Education (ACE) report wrote.  

In the report, the authors distinguished between "administrative data," or small data collected through one-on-one interactions such as financial aid counseling, and "accountability data," which includes larger sets of numbers such as an institution's cohort default rates. They argued that the latter can be just as instrumental as the former in motivating staff to help students if it is broken down into smaller subsets. According to the authors, this enables faculty and staff to find a number that mobilizes them to action and makes an issue appear manageable, which the authors call the "actionable N."

The authors placed a strong emphasis on the importance of "often-overlooked" small data, writing that the "actionable N" can be as small as a single student.

“An ‘N’ of one brings data close-to-practice when staff, faculty, and administrators ask themselves, ‘What is the best course of action we can take to fulfill our responsibilities in this case?’” the authors wrote.

The authors warned that as institutions invest more money into technological systems to collect data, they need to also recognize the importance of this empirical data gathered by faculty and staff.

“With greater investments in computer systems, investments in the human systems of data use must keep pace,” the authors wrote. “Danger lies in investing in the realm of high tech without simultaneously investing in the student-centered workforce that has the nuts-and-bolts, DIY know-how concerning the quintessential work of the university that student information systems must support.”

And while the authors argued that this close-to-practice data is a good motivator for faculty and staff to help students “because the need and possibility of having a positive impact is clear,” institutions are often asked to provide a bigger picture of their performance using larger data sets such as graduation rates over a number of years.

The authors wrote that this data can “come to life” when staff and faculty “interpret the data on a human scale, at a nuts-and-bolts level where they know they can make a difference through their own work,” by breaking the data down into smaller subsets.

“When the large and amorphous cohorts of students represented in accountability data were broken down into smaller cohorts of students at particular places in their academic journeys, staff, faculty, and administrators saw that policies and practices within their control were barriers to student success,” the authors wrote.

For example, in a case study the authors found that one institution discovered in its larger dataset that two-thirds of students who enrolled with undeclared majors dropped out. By breaking the data down further by race and ethnicity, the school found that African-American students disproportionately did not receive academic advising because they most often came in with undeclared majors. This led the school to revise its curriculum to help this population of students remain in school and to update other student policies to help close the gap.

“The ideals of accountability become actionable when staff, faculty, and administrators make an active connection between the numbers in the data and the students they advise, teach, and shepherd through higher education—whether that work takes place in classrooms, advising centers, or administrative and program offices,” the authors wrote.

The authors recognized that not every institution has the tools and resources to interpret and break down larger datasets, and that the institutional researchers (IRs) at institutions that do are often overworked. They suggested that institutions create "hybrid information technology-student advising positions and place them in things such as enrollment and student retention offices, and offer training opportunities for staff with data analytic providers." Additionally, they suggested that institutions create space for IRs to teach interested staff how to analyze data, especially since those staff are the ones capable of making changes for their students.

“[E]ducational leaders should seek out ways to move data out of the hands of their often overworked and inaccessible data gurus in the IR offices and into the hands of faculty, staff, and administrators who are equipped to ask and answer meaningful questions of institutional data,” the authors wrote.

Publication Date: 6/13/2018

