Four Things Businesses Need to Know about AI Ahead of the Autumn Budget

18 September 2024

Simon Bain, AI expert and CEO at OmniIndex, warns of the potential dangers of unchecked AI adoption.

According to reports from Reuters, Labour’s autumn budget is set to prioritise public sector adoption of technologies like AI over direct investment into the industry. After scrapping the previous government’s £1.3 billion plan, which included an £800 million supercomputer at the University of Edinburgh, the Labour Party is planning to announce a new strategy aimed at delivering efficiencies and cost savings within the public sector.

Ahead of AI’s further integration into UK public services, Simon Bain, CEO at OmniIndex, argues that several factors must be weighed carefully to ensure its adoption is safe and effective. Below, Bain outlines what businesses need to consider before investing heavily in AI.

1. The problem with LLMs

Bain: “The rise of ChatGPT has seen acronyms like LLM enter the vocabulary of the UK public. But to be frank, the novelty around LLMs has worn off and enthusiasm for them has significantly waned. The continued inaccuracies they throw up, not limited to six-fingered humans, have certainly eroded some of the public’s trust in their capabilities.

“Instead, we’re seeing a rise in the popularity of bespoke smaller language models, built on carefully curated data sets, that do one job well rather than many jobs in a mediocre fashion. For organisations and their teams, investing time and resources into personalised SLMs is far more likely to deliver valuable information that can be relied on to be correct and appropriate, and that actually supports employees in doing their jobs.”

2. AI and sustainability don’t often go hand in hand

Bain: “It is well-reported that the extended use of some AI tools has extraordinary implications for the planet. ChatGPT’s daily power usage is nearly equal to that of 180,000 U.S. households, and a single conversation on ChatGPT uses around half a litre of water.

“It’s vital that public sector organisations take great care in the choices they make around which technologies to use and invest in, as well as who to partner with. The best solutions or partners will address these concerns before you ask, proving that they have considered the impact of their products.

“More usage means more impact, and we ought to reach a stage quite soon where AI can do what it’s best at, without setting the nearest forest alight.”

3. Not all data is good data

Bain: “The internet holds an entire planet’s worth of information. Unfortunately, not all of it is true and hardly any of it is useful. To be fit for purpose, any AI chatbot, assistant or solution must provide transparency around the information it returns to end-users, so that it can be trusted to support them in their vital work.

“Not only can some information be wildly inaccurate, but you need to be able to confidently state that it is free to use. Avoid breaking copyright law and facing expensive lawsuits by making sure you know exactly where something has come from before it is used.”
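
As an illustration of the kind of source transparency Bain describes, the sketch below shows one simple way an assistant could record where its material comes from, filter out anything without a known licence, and cite its sources back to end-users. It is a minimal, hypothetical Python example: the field names, licence list and URLs are illustrative assumptions, not any particular product’s design.

    from dataclasses import dataclass
    from datetime import date

    # Hypothetical record for illustration: each piece of source material an
    # assistant draws on carries its origin and licence, so answers can be
    # traced and checked before the material is used.
    @dataclass
    class SourceDocument:
        text: str
        source_url: str   # where the content came from
        licence: str      # e.g. "CC-BY-4.0", "proprietary", "unknown"
        retrieved_on: date

    # Licences the organisation has decided it is safe to build on (assumed list).
    APPROVED_LICENCES = {"CC0-1.0", "CC-BY-4.0", "public-domain", "internal"}

    def usable_sources(documents: list) -> list:
        """Keep only material whose origin and licence are known and approved."""
        return [d for d in documents if d.licence in APPROVED_LICENCES]

    def answer_with_citations(answer: str, documents: list) -> str:
        """Append the URLs of the sources actually used, so end-users can verify."""
        citations = "\n".join(f"- {d.source_url}" for d in documents)
        return f"{answer}\n\nSources:\n{citations}"

    if __name__ == "__main__":
        docs = [
            SourceDocument("Guidance text...", "https://example.gov.uk/guidance",
                           "public-domain", date(2024, 9, 1)),
            SourceDocument("Scraped article...", "https://example.com/article",
                           "unknown", date(2024, 9, 1)),
        ]
        approved = usable_sources(docs)   # the "unknown" licence is filtered out
        print(answer_with_citations("Summary of the guidance.", approved))

The point of the structure is that nothing reaches an answer without a known origin and an approved licence, and every answer carries the URLs it drew on.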

4. Not all solutions are secure

Bain: “Feeding swathes of personal and sensitive data into an AI solution should strike fear into the hearts of cybersecurity professionals up and down the country. Should data not be adequately protected and fall into the wrong hands, organisations can face heavy fines for their failures. The government often relies on outdated security measures and protocols to protect our data, as do many of the solution vendors that have entered the market in recent years.

“Any model that handles sensitive data should ensure that it remains protected and preferably encrypted. The latest technologies make it possible for analytics to be performed on encrypted data, meaning it never has to be seen or read by anyone at all. In order for AI to be a viable addition to the public sector’s tech stack, we need to ensure that we can trust it with our data, and trust any of the partners it works with.”
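
As an illustration of the encrypted analytics Bain refers to, the sketch below uses the open-source python-paillier library (phe), whose Paillier scheme lets encrypted values be added together and scaled by plaintext constants without decryption. The records and figures are hypothetical, and this is a minimal sketch under those assumptions rather than a production design; fully homomorphic schemes go further still.

    # Minimal sketch of analytics over encrypted values, assuming the
    # open-source python-paillier library (pip install phe). The records and
    # figures here are hypothetical.
    from functools import reduce
    from phe import paillier

    # The data owner generates the keypair and encrypts the sensitive values.
    public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)
    monthly_costs = [1250.0, 980.5, 1430.25]
    encrypted_costs = [public_key.encrypt(v) for v in monthly_costs]

    # The analyst works only on ciphertexts: Paillier supports adding encrypted
    # values together and multiplying them by plaintext constants.
    encrypted_total = reduce(lambda a, b: a + b, encrypted_costs)
    encrypted_projection = encrypted_total * 1.03   # e.g. a 3% uplift

    # Only the key holder decrypts, and only the aggregate results.
    print(private_key.decrypt(encrypted_total))        # approximately 3660.75
    print(private_key.decrypt(encrypted_projection))   # approximately 3770.57

Here the analyst never sees an individual figure in the clear; the decryption key stays with the data owner, who decrypts nothing but the aggregates.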