Generative AI: like the onset of cloud computing, but much faster

Since ChatGPT was unveiled to the general public, communicating with computers in natural language seems like child's play. Just connect to the site and start chatting with the smartest bot ever, or open any of the Office apps and use Copilot to create documents, presentations, spreadsheets… But is that all? Obviously not. The role AI can play in companies goes far beyond this, but squeezing out its true potential is a considerable technical and mental challenge.
At KubeCon + CloudNativeCon 2024, the event dedicated to Kubernetes and the Cloud Native Computing Foundation (a foundation under the umbrella of the Linux Foundation), more than 12,000 technologists gathered to study and experiment with artificial intelligence, in an effort to bring generative AI into business processes in a useful and profitable way. A complex task awaits them.
"Experimenting with AI," says Priyanka Sharma, Executive Director of the Cloud Native Computing Foundation, "is very easy: all you need is a notebook and a grasp of the basic principles of the platforms. But bringing it into use by an entire company is a different kettle of fish." From a technical standpoint alone, an infrastructure that must serve tens or hundreds of employees (if not thousands) has to be designed very carefully, and each case has peculiarities that must be resolved after thorough analysis.
And then there is the time factor. According to Sharma, the difficulties closely resemble those already seen when the cloud arrived: companies must understand what the technology is and what they can do with it, start to trust it, and then, finally, fold it into everyday operations. With one difference: the speed at which all this must happen will be ten times that which moved companies towards the cloud.
Unlike with the cloud, when companies were very suspicious at first and a fairly long period of mistrust allowed technicians to study it calmly and arrive at a more or less reasoned adoption, today everyone wants AI in their company, but few know how to get there and even fewer have everything they need at their disposal. Buying GPUs is very difficult today unless you are a large data center, and skills in the field are still rare. Not to mention the problem of acceptance within the company.
"Companies cannot keep up with the pace of innovation we now see in AI," said the head of information systems of a large German company met here in Paris. "We are still dealing with the cycle opened by cloud migration, with workload optimization and the definition of cloud-native software, with staff who have not yet fully embraced the features offered by the new technology, and now a new cycle opens, one that promises to be very fast. We will run, but we will always be left behind."
The answer to the many problems surrounding the tidal wave that AI is becoming, and which threatens to overwhelm companies, lies in standardization. "We must try to create," says Sharma, "an ecosystem that is as standardized as possible, in order to speed up learning and adoption by all the entities affected by the change. We must take advantage of past experience and build on it, focusing on open and well-supported technologies such as Kubernetes."

Standardization has always been the aim of the Cloud Native Computing Foundation, which for years has promoted the development of open-source projects that serve as a basis for the open and free development of cloud technologies. Kubernetes is one of these, and in its tenth year it stands as the most popular platform for AI projects.

The road, however, is still long, and in the near future companies will adopt artificial intelligence mainly to support existing operations. Solutions like Microsoft's Copilot will make their way easily because they are simple to integrate and nearly turnkey. Sharma preaches resistance to the sirens of simplicity, but it is not a message destined to win many converts: whatever brings a benefit at reduced cost and on a tight timeline will be adopted quickly. The real question is what happens next, and we will know within a couple of years at most because, as noted, the speed of AI is ten times that of previous technologies and will continue to grow.


2024-03-28 17:25:34
