Sitting down with senior staff
The following is the third installment in a series by The Bachelor focusing on artificial intelligence at Wabash College.
Wabash College is at a historic crossroads. With total surrender to artificial intelligence (AI) down one road and stubborn refusal to adapt to AI down another, Wabash must navigate a treacherous and complicated path to not only survive, but thrive in the AI age.
In light of this critical moment, The Bachelor sat down for a conversation with President Scott Feller to paint the big picture of how Wabash will continue to grapple with AI in the future. As part of the discussion, Feller analyzed the results of the student survey on AI published by The Bachelor last month. The survey took the pulse of the student body, asking undergraduate Wabash men not only how they used AI, but how they felt about its implications for their futures.
“I was surprised at the level of anxiety,” said Feller. “I had assumed that young people had more tolerance for technological change than someone like me. I think they’re not that different from me in terms of seeing both opportunities and worries.”
Dean of the College Todd McDorman also commented on the wide range of responses the survey yielded. In particular, McDorman was struck by the sheer diversity of opinions students held on the development of AI technology.
“The students are thinking about AI like the faculty are, and what I mean by that is that you see a diversity of perspectives from the students,” said McDorman. “I can’t put a percentage on the faculty, but it felt to me like it did roughly correspond with what I’m hearing from the faculty.”
As an authority on academic policy and administration, McDorman provided a unique perspective on students’ suggestions for integrating AI into the curriculum. A recurring suggestion from many students was a dedicated course on AI. While McDorman was open to this possibility, he suggested that such a class would not be a core requirement of the curriculum.
“I’m confident that people will have options,” said McDorman. “Learning about AI will be accessible more than I would say we’re going to have an AI-specific requirement. Are students in the fall going to come back and see a course that everyone has to take? I would say no.”
Another major theme from the student survey was students' predictions, both positive and negative, about AI's effect on society at large. While Feller expressed optimism and concern alike, he offered thoughtful reasons why Wabash should have hope in the AI age.
“My hope is that we don’t see that AI is some kind of replacement for liberal arts learning, but in fact makes the case for it,” said Feller. “Perhaps we’ll find that the jobs that go away are ones that don’t require very much critical thinking, creativity or human judgment. In my utopia, the folks who take their education seriously are immune to this AI displacement.”
While Feller maintains grounded optimism for the role of the liberal arts in a world increasingly dominated by AI, he did not shy away from the very real difficulties that AI presents not just to Wabash, but all colleges and universities. A multitude of factors — the emergence of AI, the aftereffects of COVID-19 and an impending college enrollment cliff — all contribute to what Feller described as a perfect storm that higher education must navigate.
“If AI came on fast and surprised us, the enrollment cliff is the opposite,” said Feller. “We know that next year, the number of high school graduates is going to decline. We knew this 12 years ago when they enrolled in kindergarten. If this had hit at a moment when colleges knew that there’d be more students wanting to come to college, you could probably weather that storm, but we know that we’re going to have fewer students coming to college.”
The simultaneous emergence of all of these obstacles means that sustaining success and fiscal security is an increasingly uphill battle for colleges and universities across America. However, accessible AI tools are also aiding Wabash in its race to stay ahead of the curve.
Feller acknowledged that AI has been a powerful tool to optimize his own work in Center Hall. While he described himself as a moderate AI user, he noted how AI can complement his data analysis skills honed by years of experience as a professor of chemistry.
“I try to look at a lot of trends in higher education, a lot of trends in the economy,” said Feller. “I’ve found it has helped me become more efficient at accumulating information. I can do all that stuff. I have a lot of Excel spreadsheets that do these things, but it just speeds things up.”
In adapting the liberal arts to the modern world, Wabash’s administration intends to continue engaging with AI. A potential path forward would likely include institutional access to powerful AI tools, which students and faculty alike could take advantage of.
“One approach is to use an AI platform that connects to multiple AI models and charges based on usage,” said Director of Information Technology Services Brad Weaver. “These platforms can be more cost-effective and avoid locking us into a single vendor.”
Weaver noted that the College is exploring a test run of one of these platforms, namely LibreChat. This multifaceted tool could have various applications in teaching, learning and administration.
However, such a strategy is neither simple nor easy. An easily overlooked factor is the technological infrastructure required for AI integration. Developing that infrastructure would require overhauling login systems, monitoring costs and establishing safeguards to protect users.
“Behind the scenes, there are infrastructural considerations that aren’t always visible,” said Weaver. “Providing institutional AI access isn’t just about turning on a tool. These pieces take time and coordination across multiple areas of the College to ensure we can support faculty, staff and students in using these tools effectively.”
One of the more serious concerns about institutional AI raised by senior staff is the protection of user data. Every school in America deals with sensitive data, which is protected under federal law by the Family Educational Rights and Privacy Act (FERPA). Thus, any institutional AI platform would need to comply with this law in order to keep users safe. For example, Wabash would need to develop policies that prevent students’ data from being used to train public AI models.
“Brad Weaver and I probably worry about your data a lot more than you do,” said Feller. “I think we’re going to get to a place very soon where you’re going to worry about your data as much as I have been.”
Offering safe and effective institutional AI will certainly be complicated. Thankfully, solutions don’t need to be executed overnight. Furthermore, Wabash has access to external resources that could provide critical help to this project.
“This is one reason I’m grateful to the Lilly Endowment, because I think all of these things are going to take time and money,” said Feller.
This year, the Lilly Endowment is offering a grant intended to help Indiana schools navigate the evolving landscape of AI. If secured, the grant could net Wabash as much as $5 million for tech infrastructure, as well as faculty training and co-curricular opportunities for students.
The future of AI at Wabash and beyond is not set in stone. Continuous developments in technology and policy create a landscape that demands flexibility and a constant willingness to adapt. Thus, the Wabash administration remains aware that the only guarantee is uncertainty, regardless of how ideal or grim the future may seem.
“The end is not written,” said McDorman. “Drawing grand conclusions is both difficult and probably misguided.”
Despite the uncertain future created by AI, Wabash remains committed to the same principles it was founded upon. In its nearly 200-year history, the College has seen technological revolutions irrevocably change the world, from the Industrial Revolution to the invention of the personal computer. Through it all, Wabash has always striven to teach its students to be not only knowledgeable, but virtuous. Even though large language models may seem like powerful and lifelike tools, at present they remain just that: tools. Tools may be destructive if used improperly, but that must not prevent Wabash from teaching young men how to use even the most powerful tools responsibly.
“My hope is that AI becomes a technology that amplifies those durable skills that one could learn at Wabash College, the ones that will set you up to use these new tools to amplify your work just as humans have used tools to amplify their work for millennia,” said Feller.
