A chasm is growing in the workplace between high-frequency users of generative AI and those reluctant or slow to get on board.
The divide is affecting job opportunities, pay expectations and levels of optimism, with proficiency dependent on frequent usage, future-of-work experts said at an Ai Group webinar last week.
Webinar host Megan Lilly, Executive Director of the Ai Group Centre for Education and Training, said the disruptive potential of generative AI had this year become a reality.
“Some sections of industry have already found tangible benefits in leveraging large language models (LLMs) to solve day-to-day problems and complete tasks in the workplace,” Ms Lilly said.
“These models are expected to spur new ways of working, increasingly requiring both well-developed human skills and the capability to work purposefully with technology.”
Panelists were asked to give a high-level view of how they see the world of work unfolding over the coming years, given the acceleration of technologies such as generative AI.
Sean Gallagher, Director, Centre for the New Workforce, Swinburne University, likened the release of ChatGPT in November last year to the Cambrian explosion.
“It was the fastest ‘to 100 million users’ of any technology application,” he said.
“It's been extraordinary. Already there are forecasts that we could see productivity benefits of trillions of dollars from these tools, and we are just at the beginning.
“GPT-4 is 10 times better than ChatGPT was when it was released, and technology is already being worked on that is going to be 1000 times more capable.”
Fellow panelist Simon Eassom, Chief Futurist, Australian Council of Professions, said generative AI tools were unrivalled in their ability to make it easier for users to manage, keep up with and learn from the plethora of digital content generated every day.
“In terms of volume, this unstructured data (which includes Word documents, PDFs and PowerPoint presentations) is thousands of times bigger than the volume of structured data (data held in databases),” he said.
“(Generative AI) is a necessary evolution to help us access all the information available to us in the way we work. No single person can keep up with the amount of research generated in their professional areas.
“So, when we talk about the world of work in the future, what we're really talking about is how much a job requires us to be able to access, use, manipulate and synthesise information and then have an output based on understanding that information.
“In the past, if that meant you went to a library and looked up information in a book, that dictated the way you worked. Looking something up in a database is another way of working.
“Now, if you can ask a question (of a tool such as ChatGPT) with a prompt to search documentation, you've saved an enormous amount of time, and you've possibly accessed information you wouldn't otherwise have found,” Dr Eassom said.
Students have taken up generative AI in huge numbers to generate their essays, Dr Gallagher said.
“They’re getting on top of this technology straight away as they recognise the productivity potential — well ahead of workers.
“That, in many ways, demonstrates the latent potential that we're going to see in the workforce. The education system has to keep up.”
Like students, some professionals are embracing generative AI tools more quickly than others as they recognise the potential to increase efficiency.
People leaders and marketing and communications specialists are leading the way.
“HR leaders, for example, recognise this tool is going to be incredibly powerful and that they will need to drive the strategy for the uptake of these technologies within organisations, supported by IT,” Dr Gallagher said.
“This isn't traditional software, which is why HR is involved. They want to understand this technology and how to use it within their role, with the view to taking it out more broadly.”
Many workers remain distrustful of the information AI tools produce, owing to the prevalence of AI-generated fake news and video, Dr Eassom said.
“Initially, most conscientious use of generative AI within organisations will be as a valuable tool to interrogate closed or firewalled systems of data and information within that organisation,” he said.
“It's a different matter when you're using generative AI to go beyond those closed systems — trawling through everything the technology can search, remembering the output is only as good as the input.
“It’s why we go to professionals in the first place, because we believe individual people have the knowledge and experience to give us the information we need.
“However, generative AI tools are getting better at interrogating the quality of the information they're finding.”
“Part of the reason generative AI is so powerful is that, unlike other technologies, the same input delivers a variety of outputs every single time,” Dr Gallagher said.
“This makes it quite challenging for many people because it works best by having a conversation with the tool and it’s critical to build up frequent use.”
In a national survey on the prevalence of generative AI usage in workplaces, a team of researchers from Swinburne and Deloitte found about a quarter of Australian workers are high-frequency users, using the tools at least weekly.
However, three in five workers aren’t using generative AI tools at all.
“There is a huge split in the Australian workforce,” Dr Gallagher said.
“There's a chasm between high-frequency users and everyone else, and the reason that's significant is that high-frequency users are performing higher-value activities at work.
“They're not only doing more basic activities, such as summarising documents and drafting emails; they're also doing more advanced work activities that generate new ideas and content.”
High-frequency users are also more optimistic about their careers and foresee more opportunities than those who aren't using the tools, the survey found.
“They see a much more significant increase in their pay from using these tools and have high expectations of better job prospects,” Dr Gallagher said.
Many people reluctant to use generative AI fear the technology will put them out of a job, Dr Gallagher said.
However, having guidelines or a policy around usage within the workplace can help allay this fear, his team found.
“Workers everywhere have embraced this — it’s a ground-up, employee-driven revolution — and companies are continuing the uptake and ensuring these tools are more widely democratised.
“It's incumbent on them to have policies and guidelines in place.”
There are three pillars to acceptable use of generative AI at work, Dr Gallagher said.
“You have to ‘own’ every single word that comes out of generative AI,” he said.
“These are fantastically powerful tools, and they're going to continue to grow in their sophistication.
“The human has to be in the loop and ultimately make the final call.”
“The remaining preserve or domain of being human seems to be getting smaller, week by week,” Dr Gallagher said.
“However, there are four areas in which we think humans are incredibly capable, especially when those areas are combined.
“From an educational perspective, these four areas, particularly when integrated, are fundamental to the highest value-creating activities of any organisation: coming up with new products or services, solving complex problems or dealing with a crisis.
“So, it's perhaps a positive takeaway in terms of the future being more human and less ‘worker’.”
The Department of Industry, Science and Resources has released a discussion paper titled Safe and Responsible AI in Australia.
Save the date — the Centre for Education and Training's next webinar on the Employment White Paper will be held on Tuesday, October 24, 11-11.45am.
Wendy Larter is Communications Manager at the Australian Industry Group. She has more than 20 years’ experience as a reporter, features writer, contributor and sub-editor for newspapers and magazines including The Courier-Mail in Brisbane and Metro, the News of the World, The Times and Elle in the UK.