SANTA CLARA, Calif., Dec. 5, 2023 – cnvrg.io, an Intel company and provider of artificial intelligence (AI) and large language model (LLM) platforms, today released the results of its 2023 ML Insider survey. While every industry appears to be racing toward AI, the annual survey revealed that despite interest, a majority of organizations are not yet leveraging generative AI (GenAI) technology.
Released for the third year, cnvrg.io's ML Insider survey provides an analysis of the machine learning industry, highlighting key trends, points of interest and challenges that AI professionals experience every day. This year’s report offers insights from a global panel of 430 technology professionals on how they are developing AI solutions and their approaches to applying generative AI to their businesses.
“While still in early development, generative AI has been one of the most talked-about technologies of 2023. The survey suggests organizations may be hesitant to adopt GenAI due to the barriers they face when implementing LLMs,” said Markus Flierl, corporate vice president and general manager of Intel Cloud Services. “With greater access to cost-effective infrastructure and services, such as those provided by cnvrg.io and the Intel Developer Cloud, we expect greater adoption in the next year as it will be easier to fine-tune, customize and deploy existing LLMs without requiring AI talent to manage the complexity.”
GenAI Adoption Trends
Despite the rise in awareness of GenAI technology in 2023, it is only a slice of the overall AI landscape. The survey reveals that adoption of large language models (the models underlying generative AI applications and solutions) within organizations remains low.
Three-quarters of respondents report their organizations have yet to deploy GenAI models to production, while 10% of respondents report their organizations have launched GenAI solutions to production in the past year. The survey also shows that U.S.-based respondents (40%) are significantly more likely than those outside the U.S. (22%) to deploy GenAI models.
While adoption may not have taken off, organizations that have deployed GenAI models in the past year are experiencing benefits. About half of respondents say they have improved customer experiences (58%), improved efficiency (53%), enhanced product capabilities (52%) and benefited from cost savings (47%).
Adoption Challenges
The study indicates a majority of organizations approach GenAI by building their own LLM solutions and customizing them to their use cases, yet nearly half of respondents (46%) see infrastructure as the greatest barrier to developing LLMs into products.
The survey highlights other challenges that might be causing a slow adoption of LLM technology in businesses, such as lack of knowledge, cost and compliance. Of the respondents, 84% admit that their skills need to improve due to increasing interest in LLM adoption, while only 19% say they have a strong understanding of the mechanisms of how LLMs generate responses.
This points to a knowledge gap as one potential barrier to GenAI adoption, reflected in respondents citing complexity and a lack of AI talent as the biggest barriers to AI adoption and acceptance. Additionally, respondents rank compliance and privacy (28%), reliability (23%), high cost of implementation (19%) and a lack of technical skills (17%) as the greatest concerns with implementing LLMs in their businesses. When considering the biggest challenge to bringing LLMs into production, nearly half of respondents point to infrastructure.
There is no doubt GenAI is having an impact on the industry. Compared with 2022, the use of chatbots/virtual agents as an AI use case spiked 26% in 2023, and translation/text generation rose 12%. This could be due to the rise of LLMs and advances in GenAI technology over the year. Organizations that have successfully deployed GenAI in the past year see benefits from the application of LLMs, such as a better customer experience (27%), improved efficiency (25%), enhanced product capabilities (25%) and cost savings (22%).
Intel’s hardware and software portfolio, including cnvrg.io, gives customers flexibility and choice when architecting an optimal AI solution based on respective performance, efficiency and cost targets. cnvrg.io helps organizations enhance their products with GenAI and LLMs by making it more cost-effective and easier to deploy large language models on Intel's purpose-built hardware. Intel is the only company with the full spectrum of hardware and software platforms, offering open and modular solutions for competitive total cost of ownership and time-to-value that organizations need to win in this era of exponential growth and AI everywhere.
For the full ML Insider 2023 report, visit the cnvrg.io website.
About cnvrg.io
cnvrg.io, an Intel company, is a full-stack machine learning operating system with everything an AI developer needs to build and deploy AI on any infrastructure. cnvrg.io was built by data scientists to help data scientists and developers automate the training and deployment of machine learning pipelines at scale, and to help organizations enhance their products with GenAI and LLMs by making it more cost-effective and easier to deploy large language models on Intel's purpose-built hardware provided by Intel Developer Cloud. cnvrg.io helps organizations accelerate value from data science and leverage the power of generative AI technology faster.