An open letter to students in SHSS
Generative artificial intelligence (GenAI) has stormed higher education over the past year. GenAI apps and sites claim they can find materials for you, summarise literature for you, answer your questions, edit your writing, give you ideas, structure your thinking, maybe even write for you. GenAI is increasingly being sold (literally as well as figuratively) as a fantastic tool for saving time and effort.
Saving time and effort – efficiency – is all well and good. But efficiency is not the core norm driving scholarship. In some scientific departments and disciplines, automating basic language-based tasks may seem benign and unproblematic. But in the disciplines of the social sciences and humanities, language is our key tool for thinking, and a great part of your education is learning how to use it well. In your learning, it is most often spending the time that matters. Using GenAI to summarise readings for you, to suggest answers to questions, to draft structures or to write for you will rob you of the opportunity to learn to do these things yourself.
We often hear that universities should prepare students to live and work in a world full of AI, and that is surely right. But at least part of that means learning to think and write without AI, so you can assess and judge what the machine produces. Graduating in the social sciences and humanities shows that you can distinguish a convincing argument from a superficial one, a striking phrase from a humdrum one, a good summary from one that is biased and incomplete. To learn these skills you will need to practise finding relevant literature, reading, summarising, arguing, drafting and writing – there are no shortcuts. Learning-by-doing is often slow, discombobulating, full of mistakes and inefficiencies, and that is in fact what makes it fun. That is why so much of your term-time supervision work in Cambridge is deliberately not assessed: to give you the chance to make those mistakes yourself and learn from them. Don't let GenAI rob you of that opportunity.
Aside from its threats to your learning, GenAI also raises profound ethical issues:
- Many large language models have been trained, without permission or credit, on the creative and expressive works of multiple knowledge sectors, from art to literature to journalism to the academy. This is a labour question as much as it is a question about intellectual creativity.
- Engaging generative AI means calling on cloud data centres, which use scarce freshwater and release carbon dioxide. A typical conversation with ChatGPT, of ten to fifty exchanges, requires half a litre of water to cool the servers, while asking a large generative AI model to create an image requires as much energy as fully charging your smartphone's battery. It is difficult to un-know these environmental consequences, and they should give us pause before using generative AI for tasks we can do ourselves.
Furthermore, we have to recognise that generative AI may be trained on elite datasets, and may thus exclude minoritised ideas, reproduce hierarchies of knowledge and perpetuate the biases inherent in that data – raising questions about the harms arising from its use.
By Professor Matei Candea, Academic Project Director for Technology and Teaching, and Dr Ella McPherson, Deputy Head of School and Director of Education. This piece builds on our 'Manifesto and Principles for AI in Scholarship'.