Across Australia, much of the focus to date has been on understanding artificial intelligence’s role in each sector and taking the first steps towards adoption, making 2023 a year of experimentation and exploration. But where 2023 was the year AI exploded into mainstream consciousness, 2024 looks set to be a more reflective one, bringing regulation from the Australian government as pressure from consumers forces companies and IT teams to step back and take a proper look at the ethical implications of adopting AI.
That means a renewed look at the way data feeds AI models, and at how that data is collected and managed by the scientists building them. For AI to continue its positive trajectory, 2024 also needs to be the year data governance catches up with the speed of innovation.
How CSIRO, Australia’s national science agency, views AI
Australia is determined to be a leader in AI development. CSIRO has established a National AI Centre (NAIC), which ran the nation’s first AI Month from Nov. 15 to Dec. 15, 2023.
SEE: The continued rise of AI in 2024 will require Australian IT to overcome significant challenges.
One of NAIC’s key predictions for AI in 2024 is that “responsible AI and good governance will take centre stage.” To achieve that, NAIC recommends that organisations focus on four key priorities:
- Having a better understanding of the role data has in AI.
- Understanding the vulnerabilities AI exposes data to.
- Developing better processes for how data is collected.
- Building better ways to leverage data in models.
As Jade Haar, head of privacy and data ethics at National Australia Bank, noted in NAIC’s predictions, there needs to be a balance between the desire to build models from big data lakes simply filled with as much data as possible and the governance and ethical obligations to collect only the data that is necessary.
“If ChatGPT has taught us anything, it is that what is commonly referred to as ‘public’ information, is not the same as ‘free and open’ information,” Haar was quoted as saying by NAIC. “Buyers of AI solutions must continue to ask about provenance and rights to data or simply accept the unknowns. The latter being less appealing to public or regulated entities.”
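To make the provenance point concrete, the sketch below shows one way a data team might gate datasets before they enter a model pipeline. It is a minimal illustration only: the DatasetRecord fields and the approve_for_training check are hypothetical assumptions, not drawn from NAIC or NAB guidance.

```python
from dataclasses import dataclass

# Hypothetical record describing where a dataset came from and what rights
# the organisation holds over it. Field names are illustrative only.
@dataclass
class DatasetRecord:
    name: str
    source: str              # e.g. "customer CRM export", "public web scrape"
    consent_obtained: bool    # was the data collected with explicit consent?
    licence_confirmed: bool   # are rights to use it for training documented?
    contains_pii: bool        # does it include personally identifiable information?

def approve_for_training(record: DatasetRecord) -> bool:
    """Return True only if the dataset clears basic provenance checks."""
    if not record.licence_confirmed:
        return False  # "public" does not mean "free and open" to reuse
    if record.contains_pii and not record.consent_obtained:
        return False  # personal data without consent stays out of the model
    return True

if __name__ == "__main__":
    scraped = DatasetRecord(
        name="forum_posts_2023",
        source="public web scrape",
        consent_obtained=False,
        licence_confirmed=False,
        contains_pii=True,
    )
    print(approve_for_training(scraped))  # False: unknown rights, PII, no consent
```

In practice such checks would sit inside whatever data catalogue or pipeline tooling an organisation already uses; the point is simply that provenance and rights questions are asked before data reaches a model, not after.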
Why this is a pressing concern
This call for a renewed focus on data governance on the journey to AI adoption is timely. One consequence of AI’s breakout year is that every sector now wants to use the technology, but many organisations may be exposing themselves to significant risk by doing so without realising it.
SEE: Australia’s six-shield cyber security strategy could depend on how well the nation manages the vast pools of data.
For example, in its own predictions article for 2024, Australian Property Investor noted:
“AI is driving even greater data acquisition, crunching the never-ending data stream that is produced as we design, build, buy, sell and live in our homes … Big data is also driving the transformation of industries that have been laggards at embracing digitisation, including councils and local government, construction, building, and logistics.”
Meanwhile, although small and midsize businesses have been slower to adopt AI, more than 40% of SMBs in e-commerce are using it in some way. As a Forbes Australia feature highlights, AI is being pushed as a way for these businesses to reach Gen Z and other “emerging” consumers. For these sectors, combining customer data with AI allows them to better target their marketing and sales efforts.
What is worrying is that data governance too often remains a secondary consideration. Australian organisations understand that data is a risk; the depth and breadth of data breaches in recent years have driven that home. They’re also aware that the government is working to increase regulation of data governance as part of its broader cyber security strategy.
Yet despite this, the Governance Institute of Australia recently found that almost 60% of organisations have boards that lack an understanding of their organisation’s data governance challenges. Furthermore, emerging technologies and AI are two of the three greatest risks around data governance, with the third being direct cyber attacks.
The promise of AI is leading boards and other senior executives to push their organisations to embrace the technology without first considering the underlying data governance requirements.
SEE: Australia has been embracing generative AI while trying to stay ahead of the risks.
“We cannot underestimate the role of governance as we move towards safe, responsible and ethical creation and usage of AI and the protection of vital data,” said Governance Institute of Australia President and Chair Pauline Vamos in the report.
IT’s role in promoting data governance
The reality is that for AI models and applications to be successful, organisations need to leverage their data heavily, whether by collecting vast lakes of big data indiscriminately or by being more targeted about the data they collect and include in models. Either way, this exposes that data to new levels of risk, and boards and executives are not necessarily equipped to grapple with the challenge.
This means it will fall to IT teams to champion data governance within the organisation. As the excitement around AI development cools in 2024, there will be a renewed focus on how organisations are ethically handling and managing data. For both regulatory and reputational reasons, IT professionals have an opportunity to demonstrate leadership and protect their organisations while still delivering the benefits of AI.