Tremendous opportunities as well as challenges need to be navigated when bringing generative artificial intelligence (GenAI) into the workforce.
Much excitement stems from the promise of increased productivity, but the hardest part for many organisations is getting started.
They also need to overcome a high level of wariness towards GenAI, with polls showing Australians are among the most concerned globally about the technology.
At the Australian Industry Group webinar "Generative AI: from idea to pilot and pilot to scale – how do you get started?", Louise McGrath, Ai Group’s Head of Industry Development and Policy, spoke to NRI Australia’s Sean Chamberlin and GPTStrategic's Dave Phung about the benefits of GenAI at work.
“Generative AI, with its constant levels of discovery, is transformative and can be a useful tool at the enterprise level,” Ms McGrath said.
“However, concerns over whether it is safe, can be trusted and able to be used responsibly and whether it will leak data into the world wide web still need to be addressed.”
“This time last year, there was a lot of sensationalism and probably unrealistic expectations as to what GenAI could and couldn’t do,” Mr Phung said.
“As it turns out, we're all still working and not sitting on the beach with robots and autonomous agents doing everything for us.
“It’s been a fascinating evolution of the dialogue in terms of how people think about their relationship with AI tools.”
It was at the end of 2022 when ChatGPT “trained the whole world on what generative AI was”, Mr Phung said.
“Since then, it's been in the foreground rather than the background like it is when we use platforms such as Spotify or Netflix.
“Tools like ChatGPT and other AI consumer tools have brought the technology into the centre of the stage, and there's a very bright spotlight shining on them right now.
“There are tremendous opportunities as well as challenges that need to be navigated in bringing them into the workforce.”
The sentiment regarding GenAI is very much cautious optimism, Mr Phung said.
“The average information worker has a high enough level of familiarisation with the tools — what they can do, what they’re good at and what they’re not so good at.
“They can look at the tasks that they perform at work with a new lens on, and we're starting to see generative AI being deeply integrated into different parts of businesses and business functions.
“I don't think anyone's going to disagree that generative AI is a once-in-a-generation change in terms of what can be achieved with software and automation.
“A lot of that excitement stems from this promise of productivity, but the hardest part for many organisations is getting started.”
Businesses and organisations at large will bring generative AI into the workplace in three main ways, Mr Phung said.
“The hard part is knowing what to do first, what's worth pursuing and what to avoid.”
Think about the parts of your role that should be easy and fast but are slow and difficult, Mr Phung said.
“They might be the tasks that are ideal for generative AI integration.”
Mr Chamberlin said identifying the strengths of GenAI was useful when establishing use cases for the technology.
“Think about how AI is relatively strong compared to humans,” he said.
“Its power goes far beyond rapidly producing content. There is speedy and accurate image recognition, an ability to retain huge volumes of information and an ability to enable rapid self-improvement and learning.
“Then, consider the areas where humans have the upper hand – traits like common sense, advanced mathematical reasoning and establishing human trust.
“Try to understand where the use cases are that leverage the former and not the latter, and try to identify where you've got bottlenecks.
“That provides a rough framework for identifying those best use cases.”
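That rough framework can be expressed as a simple scoring heuristic. The sketch below is purely illustrative: the trait names, scores and example tasks are assumptions for demonstration, not anything specified in the webinar.

```python
# Illustrative sketch of the rough use-case framework described above:
# favour tasks that lean on AI strengths, penalise tasks that depend on
# distinctly human traits, and prioritise known bottlenecks.
# All trait names and example tasks here are hypothetical.

AI_STRENGTHS = {"content_generation", "image_recognition",
                "large_memory", "rapid_learning"}
HUMAN_STRENGTHS = {"common_sense", "mathematical_reasoning", "human_trust"}

def score_use_case(traits_needed: set, is_bottleneck: bool) -> int:
    """Higher score = stronger GenAI candidate under this rough heuristic."""
    score = (len(traits_needed & AI_STRENGTHS)
             - len(traits_needed & HUMAN_STRENGTHS))
    if is_bottleneck:  # bottlenecks get a small priority boost
        score += 1
    return score

# Hypothetical candidate tasks: (traits needed, is it a bottleneck?)
candidates = {
    "summarise support tickets": ({"content_generation", "large_memory"}, True),
    "negotiate key contracts": ({"human_trust", "common_sense"}, False),
}

ranked = sorted(candidates,
                key=lambda c: score_use_case(*candidates[c]),
                reverse=True)
```

Under this heuristic, a content-heavy bottleneck task ranks well ahead of one that rests on human trust and judgement, which matches the spirit of the framework even though the weightings themselves are invented.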
A global AI Incident Database lists known issues already experienced with AI.
These include recruitment algorithms, which are well known to have a gender bias towards males, and educational grading algorithms, which have been shown to favour students from higher-performing schools, Mr Chamberlin said.
“Racial-based discrimination is also built into a lot of the algorithms,” he said.
“Even robotic surgery tools have been shown to have errors.
“So, absolutely, the risks are out there, but in the past few months, a huge proliferation of attention has been put on this, and subsequently, more maturity has gone into a lot of the frameworks.”
Ai Group has been working with the Federal Government’s National Artificial Intelligence Centre on a responsible-use-of-AI safety framework, due to be released soon.
The NIST cybersecurity framework also offers AI-specific risk management, while the International Association of Privacy Professionals (IAPP) has been offering accreditation under its global framework for AI governance since the start of this year.
“My advice is: follow the frameworks,” Mr Chamberlin said.
“They're reasonably mature and provide some guidance on how to manage transparency, fairness, equity and all those sorts of key risks.
“The tricky part is being able to visualise what those risks are in the use case.
"That’s where you may need assistance from people who have ‘been there, done that’.”
Different standards apply within different jurisdictions around the world and even within Australia, Mr Chamberlin said.
When it comes to AI regulation, Ai Group says it’s behaviour that needs to be regulated, rather than technology.
“Things like discrimination and lying to customers — there are already regulations around those things,” Ms McGrath said.
“It doesn't matter if you do it on a fax machine or with AI, the regulations apply to all forms of technology.”
The barriers to adopting AI depend on when the technology is being used, Mr Phung said.
“The barriers to starting compared to the barriers to scaling are very different,” he said.
“Figuring out the right application of use, or use case, to pursue at the onset is probably the biggest barrier at the start.
“You need to envisage what the future looks like and explain to your CFO or the business owner how you got to that picture of what good looks like.”
Scaling a solution to a broader set of users requires consideration of the operating model, technology underpinnings, data integrations and, potentially, data stewardship.
“You need to have a clear view of how that deployed capital will deliver a positive ROI,” Mr Phung said.
“In that respect, you’re relying on first principles. There is a lot of technology for technology's sake, and you'll start to see more of it over the coming months, but often there is no value in it.
“It’s essential to consider the business application, rather than being swept up by the ‘latest and greatest’.
“It’s like if you have a headache, taking a multivitamin is not going to help.
“A multivitamin is akin to a general-purpose consumer AI tool.
“When considering AI tools, you need to find the pain point and address it in a targeted and concerted way.
“A specific pain relief for your workplace symptoms is going to be a custom-built AI tool that addresses a specific problem with an awareness of the ROI.
“So, while many off-the-shelf, general-purpose AI tools will deliver wonderful productivity benefits, they won’t deliver a material business impact.”
Rethinking the way work is done will enable organisations to maximise the productivity benefits of GenAI.
“I think the next frontier for generative AI in the enterprise is utilising AI agents and assistants to perform a collection of narrowly focused, deeply specialised tasks that essentially work in concert,” Mr Chamberlin said.
“You start to transform the workforce in that regard. We've got projects where the AI assistant appears on the organisational chart because it’s a role that's a collection of tasks.
“The real value going forward is keeping the human at the centre of the interaction with the customer, with an AI agent informing them.
“The AI assistant can profile the person you're talking to on the phone, based on their demographic and the tone and content of what they’re saying.
“That’s the direction we’re heading in — an AI agent advising you in real time, to enhance and personalise customer experiences in a way that differentiates you from your competitors.”
Many organisations are reluctant to hire a consultant who will “come in and tell us what to do and then walk away and say good luck”, Mr Chamberlin said.
“And, they’re right — don't think you're just going to be able to hire the skill set externally and set and forget.
“I suggest identifying people whose behaviour and style of thinking are consistent with what you need and train them up: analytical, data-oriented people who enjoy structure and might have a good testing background.
“Set up a cross-functional centre of excellence that includes both technical and business people.
“It’s then the job of that centre of excellence to identify the skills and training needed, the use cases and the governance necessary so they can help drive adoption in the organisation.
“However, if someone is coming in to give you services, ensure they are ISO compliant (or on the path to compliance) and have a vested interest in handing over skills as part of their service offering.
“For example, retain part of the payment until a week or two after they've finished their work.
“That payment would be dependent upon handing over knowledge and staff training and mentoring, so they have a vested interest in ensuring they leave you better off and empowered.”
Mr Phung agrees.
“The opposite of that centralised coordination is having 100 people buying their own tools — $20 a month here, $50 a month there,” he said.
“It's all over the place, and not only does it get costly, but it's very hard to point to any material return on investment.
“The closer a centralised effort gets to the ‘pain’, the better it will be at gaining a deeper, more intimate understanding of what the organisation needs: the underpinning processes, data, tasks and people involved.
“That detail is critical to getting a good foundational set of requirements to build AI with.”
What AI-related topics would you like to know more about?
Ai Group is planning a series of AI events in partnership with the Department of Industry, Science and Resources later this year and welcomes your ideas for consideration.
Email Ai Group’s Industry Development and Policy team to share your input.
Wendy Larter is Communications Manager at the Australian Industry Group. She has more than 20 years’ experience as a reporter, features writer, contributor and sub-editor for newspapers and magazines including The Courier-Mail in Brisbane and Metro, the News of the World, The Times and Elle in the UK.