University charters for the use of generative AI
As the use of generative AI expands, some universities have realized that, rather than treating the phenomenon as a dubious deviation, they need to establish rules for its use. Hence the writing and publishing of ‘charters’ (whatever form they take).
This movement is widespread in Asia and North America, and more scattered in Europe (notably in France).
The general principle observed is that students may use AI tools unless the professor indicates otherwise, with the pedagogical conditions for using AI tools detailed in the course syllabus (teachers, for their part, are encouraged by the university to be very precise on this point). Example: UQAC (uqac.ca/ressourcespedago/iag-declaration).
These texts offer advice on vigilance and encourage critical thinking. For instance, ask Perplexity about the ‘content of university charters regarding the use of AI,’ and it will give a clear answer. Overall, these charters indicate the types of usage that are accepted or even recommended (exploration, documentation, correction, translation) and those that are discouraged (original creation, complete writing, etc.).
Specifically, for the production of text or images, the other general principle is the obligation to indicate precisely any use of generative AI. This indication can take several forms; in general, a reference to generated text must follow precise formal rules, like a bibliographic reference (see the APA standard, 7th edition, which now covers AI):
• for a specific response, a footnote may be used, such as: “To the question XXX, the Mistral chatbot answered: AAA”;
• some universities provide a standard declaration text (for example, a mandatory text in the appendix of any doctoral thesis in Zurich) or recommend the mentions to be included in the dissertation (in footnotes, or in the ‘methodology’ section). Some regulations are quite detailed: the relevant passage, the type of AI usage, the model used and its version, reproduction of the ‘prompts’ in an appendix (see Cambridge, Swiss universities…).
For France, an ongoing series of web conferences on ‘Digital Transition: Focus on AI in Higher Education and Research’ (www.amue.fr) presents the interesting experience of developing the charter at the University of Orléans.
There is no doubt that the practice of writing ‘charters’ on this subject will only grow, given the increasing reliance on AI. Every university will need to be clear about what is and is not acceptable, and about the guidance to give both for the research work of its faculty and for the preparation and delivery of courses and the submission of texts by its students. The balance will need to be struck again and again as the technology advances.
The ‘AI’ question also fits into a broader need for rules on scientific integrity in the current troubled climate.