Why artificial intelligence is particularly challenging for universities

2024-03-06 23:00:00

“The science system can get even more into trouble as a result of AI.”

Martin Unger

Institute for Advanced Studies (IHS)

But there are also concerns. “It is clear that the science system as a whole could get into even more trouble as a result of this technology,” warns Martin Unger. The pandemic already showed how much mischief can be done with simple technical means, and how findings that were only supposedly scientific reached millions of people and misled them. “AI offers potential here in a whole new league. That is what you really have to be afraid of when you think about this year’s super election year. There is a danger that society will tip over.”

Teacher training is a central point for AI skills

The general problem behind this is a widespread lack of AI competence, and even at universities it is questionable how systematically this is already being addressed. Comprehensive skills are needed in all areas, since practically everyone will be affected. Universities in particular must immediately begin training their teachers so that they can pass on suitably adapted content. “This is needed in practically all disciplines – and there is a lack of people,” confirms the IHS researcher.

“That is why an important first step would be a massive further training offensive among teachers.” General AI training alone would not be enough; subject-specific offerings are also needed as quickly as possible. “This is a mega-task for universities,” warns Unger. “That is why you should work in cooperation – for example via a common platform – instead of setting up many silos.”

A door opener for this “mega-task” could be the text-based nature of many scientific disciplines, which provides a good bridge to so-called Large Language Models (LLMs), such as ChatGPT.

Unger emphasizes that teacher training is also a central point: “There is a great responsibility here. If we do not teach these things to prospective teachers – ideally at all school levels – then entire generations of students will miss out on relevant content and skills.” Unger does not see the universities as bearing the main responsibility, but rather a “reason-based policy”. Universities should, however, be involved and take their key role seriously.

In general, it is not easy for them: “When it comes to AI, the universities are lagging behind ‘Big Tech’ from the USA. These developments do not come from the universities but from industry, and that is where they are driven.”

This aspect of external control and dependence is increasingly becoming a focus for experts and critics. In a guest article in the “Frankfurter Allgemeine Zeitung”, professors from two universities recently described a “creeping loss of autonomy” and the “digital immaturity” of universities. Universities, they argue, have long been heavily influenced by various software applications – from plagiarism checks to online meeting tools – but never so comprehensively and profoundly: they train the AI software, become increasingly dependent on its use, and are at the mercy of mostly opaque algorithms. The authors instead advocate alternative, open solutions.

Martin Unger also wants this for university research: “There is so much that we could analyze with a good AI application!”
