Dalhousie faculty raise concerns over university AI integration
Bots have replaced at least three full-time jobs at the university since 2023
Dalhousie University is seeking to employ AI tools “more robustly,” Cathie O’Toole, the university’s vice-president of finance and administration, told the board of governors at a recent meeting.
Mike Fleury, the senior manager of strategic communications at Dalhousie, said in a statement to the Dalhousie Gazette that the university has been exploring new technology to automate routine tasks within its financial services department since 2023.
The university uses AI-powered bots to run scheduled tasks in its enterprise resource planning system — software that unites administrative tasks in a central database — and to transfer data from the DalBuy system into it.
DalBuy is a purchasing tool powered by Jaggaer AI that connects Dalhousie faculty and staff with suppliers so they can order goods and services for university-related purposes.
“Dalhousie continues to pursue operational efficiency,” said Fleury.
The use of bots for these services has replaced roughly three full-time positions in financial services by automating tasks within the DalBuy system and reducing the need for human involvement.
These positions, which Fleury said became vacant through natural attrition, have not been refilled; their work is now handled by bots such as the Jaggaer AI behind DalBuy.
Jason Haslam, McCulloch professor of English at Dalhousie, questions this decision. He says it leads to a lack of critical thinking and accountability across departments.
“I’m always concerned when thoughtful individuals are replaced with unthinking machines,” he says.
Frank Rudzicz, a Dalhousie computer science professor, says AI models can be helpful tools, but that they shouldn’t be created to replace human workers.
“I’m worried that we’re handing over too much of our own cognitive ability to the bots,” he says.
AI integration interferes with academic research
In addition to resource management and financial services, Dalhousie is also integrating AI within academic and research tools.
Haslam has had difficulty with Microsoft Copilot, an LLM-based AI assistant incorporated into the Microsoft software purchased by Dalhousie.
“I reached out to ask if [Copilot] could be turned off because it’s distracting and constitutes an academic offence within my discipline, and I was told by Dalhousie IT that it couldn’t be turned off.”
He says tools like Copilot violate the academic integrity standards of his discipline.
“Some of these tools are inherently violating academic integrity.”
Rudzicz says professors across the computer science department use AI constantly.
A question of conflicting standards
For Haslam, the integration of AI into essential research tools raises practical and ethical concerns for both instructors and students.
“Most of these programs hallucinate sources, hallucinate material and make mistakes,” Haslam says.
In addition to raising ethical concerns, Haslam says AI use weakens administrative capabilities.
“If administrators are sidestepping that [learning] process themselves, it’s an ethical quandary when we’re telling our students not to [use AI], and it means that the administrators aren’t also doing that organic work of training their own brains and learning new things.”
Rudzicz, however, worries less about the example being set and more about the specific implementation of AI tools. He sees the expansion of AI use as an opportunity to educate students about the ethical use of technology.
“This is an opportunity for us to teach responsible AI [use] more deeply,” he says. “If [computer science] is not just programming, then what is it? It also means making sure that models are fair across demographic groups. That’s not as simple as it sounds.”