In the middle of 2023, the world of computing is abuzz with excitement over the emergence of large language models (LLMs) such as GPT-4/ChatGPT Plus and StarChat. Understanding what these models can and cannot do, and how to apply them successfully for business advantage, is not simple. It is important to analyse the emerging research on the properties of the current generation of LLMs and to outline the strategies that will need to be adopted if they are to be applied successfully.
LLMs are special because they emit language in response to language: if the model is stimulated with some text, it produces relevant text as a response. This means that it is easy for anyone to interact with any LLM they have an interface to, and many LLMs have been made publicly available through chat interfaces. Because of this, the development of LLMs as an AI technology has had a sudden and significant impact on the public perception of the capabilities of AI.
Understanding LLMs
The only thing that LLMs do is consume text and produce text, but because the text generation is so good, the models appear to reason about and understand the text they are manipulating. Many people working in natural language and AI research have been working hard to understand and probe LLM capabilities, and a growing literature is identifying the limitations of the current generation of models, demonstrating that the initial excitement that greeted them should perhaps be tempered. It is important to round up the current list of limitations of state-of-the-art LLMs and to evaluate both their significance and the likelihood of them proving to be fundamental flaws of LLMs as an approach to AI. Other work has surveyed some of the technical limitations of LLMs.
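A minimal sketch may make this text-in, text-out framing concrete. The interface below is purely illustrative (the `TextCompleter` protocol, `complete` method and `EchoModel` stand-in are hypothetical, not any particular vendor's API); it simply emphasises that, from an application's point of view, an LLM is a function from a prompt string to a response string, however convincing that response may appear.

```python
from typing import Protocol


class TextCompleter(Protocol):
    """Illustrative interface: everything an application sees of an LLM."""

    def complete(self, prompt: str) -> str:
        """Consume text, produce text."""
        ...


class EchoModel:
    """Stand-in 'model' used only to show the shape of the interaction."""

    def complete(self, prompt: str) -> str:
        # A real LLM would generate fluent, relevant text here; this
        # placeholder just demonstrates that the interface is text in, text out.
        return f"[model response to: {prompt!r}]"


def ask(model: TextCompleter, question: str) -> str:
    # The application never observes reasoning or understanding, only the
    # returned string.
    return model.complete(question)


if __name__ == "__main__":
    print(ask(EchoModel(), "Summarise the risks of deploying LLMs."))
```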
Here, I have instead looked at validated limitations, illustrated with some simple examples of current LLM behaviour, and analysed non-technical constraints such as security and intellectual property issues. Having reviewed the limitations of the technology, you can then examine how it can be applied successfully and what enterprises should focus on to generate maximum value from the opportunity that the LLM revolution creates.
A pathway to success
A path to success can be defined for organisations wishing to access the undoubted value of the new generation of LLMs whilst managing the risk from their identified weaknesses. This pathway lies in constraining LLMs' use to components that deliver well-specified and controlled functionality, and in embedding those components in appropriate infrastructures of control and accountability.
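A minimal sketch of what such a constrained component might look like, assuming the same hypothetical `complete(prompt)` text interface as above; the `classify_sentiment` wrapper is an invented example, not a recommendation of any specific product. The point is that the LLM's free-form output is checked against a narrow, well-specified contract before anything downstream depends on it, and every exchange is logged for accountability.

```python
import logging
from typing import Protocol

logger = logging.getLogger("llm_component")

# The component's entire output contract: three permitted labels.
ALLOWED_LABELS = {"positive", "negative", "neutral"}


class TextCompleter(Protocol):
    def complete(self, prompt: str) -> str: ...


def classify_sentiment(model: TextCompleter, customer_message: str) -> str:
    """Hypothetical constrained component: the model may only answer with one label."""
    prompt = (
        "Classify the sentiment of the following customer message as exactly one of "
        "'positive', 'negative' or 'neutral'. Reply with the single word only.\n\n"
        f"Message: {customer_message}"
    )
    raw = model.complete(prompt).strip().lower()

    # Control: reject anything outside the specified contract rather than
    # passing free-form model text downstream.
    if raw not in ALLOWED_LABELS:
        logger.warning("Out-of-contract model output rejected: %r", raw)
        return "neutral"  # safe, documented fallback

    # Accountability: record the prompt/response pair for audit and review.
    logger.info("prompt=%r response=%r", prompt, raw)
    return raw
```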
It is possible that future LLMs may resolve the issues that currently prevent the unconstrained use of this new generation of models. For example, LLMs may well be reengineered (beyond current transformers) to plan effectively in the relatively near future. Technically, there does not seem to be a fundamental reason why this cannot be done, although it will certainly require another astonishing investment in compute power.
Other limitations, such as dealing with compositional reasoning, parroting and security seem more intractable. Regardless of continuing advances, it is worth considering that far simpler, mature and predictable technologies such as email, databases and web browsers all still require sophisticated application patterns and management controls. It seems unlikely that LLMs will prove to be any different.
The natural language interface demonstrated by the latest generation of LLMs has awakened a much wider population to the power of LLMs in particular, and of AI more generally. This article has identified some of the main limitations of these models and, at the same time, made recommendations for implementation patterns that can mitigate some of those issues and ultimately enable the successful adoption of LLMs. None of this, however, removes the need for vision, investment and a skilled team to implement such solutions.