Australian technology leaders are losing sleep over rising regulatory complexity, growing damage from ransomware, and the challenges of near-ubiquitous artificial intelligence (AI) and deepfakes, a new survey by cybersecurity body ISACA has found.
Generative AI (genAI) and large language models will drive the agenda in 2026, with 64% of Oceania respondents to ISACA’s 2026 Tech Trends & Priorities Pulse Poll – which surveyed nearly 3,000 security professionals worldwide – naming them as key.
As a transformative force, that places genAI ahead of AI and machine learning (60%), data privacy and sovereignty (34%), and supply chain risk (34%).
Yet for all its promise, genAI has these risk and security professionals worried – with 67% saying that AI-driven cyber threats and deepfakes will keep them up at night in 2026, and only 8% saying they are very prepared to manage its risks.
Some 45% worry most about the “irreparable harm” if they fail to detect or respond to a major breach, while 41% worry about supply chain vulnerabilities like those that hit the likes of Qantas, Dymocks, and British Airways.
Technical issues such as cloud misconfigurations and shadow IT (named by 38% of respondents) are also causing security executives to toss and turn, as are fears that regulatory complexity (36%) will put growing pressure on security practices.
To address this, respondents named regulatory compliance (58%), business continuity and resilience (52%), and cloud migration and security (48%) as top focus areas – with three-quarters expecting cyber regulations to boost digital trust.
Security leaders “are dealing with constant AI-driven threats, tighter regulation and rising expectations from executives, all while struggling to find and keep the right people,” ISACA board vice chair Jamie Norton said.
“It’s a perfect storm that demands stronger leadership focus on capability, wellbeing and risk management.”
The monster under the bed
For an industry that was already highly demanding, the new threats posed by genAI have only made things worse – ratcheting up the pressure on chief information security officers (CISOs) who were already feeling the strain long before it emerged.
ISACA’s findings corroborate recent surveys such as Proofpoint’s 2025 Voice of the CISO survey of 1,600 CISOs, which found 76% of Australian CISOs have dealt with the material loss of sensitive information over the past 12 months.
With 80% of Australian CISOs feeling they are held personally accountable when a cybersecurity incident happens – well above the global average of 67% – genAI is only exacerbating what was already a significant source of stress.
It “adds to the pressure on CISOs to secure their organisations in the face of a rapidly changing threat and technological landscape,” Proofpoint found, with “expectations high and growing numbers feeling the strain and experiencing burnout.”
Accounts of the heart attack suffered by former SolarWinds CISO Tim Brown – who not only struggled to clean up the major 2020 SolarWinds breach but was charged with fraud by the US SEC – have highlighted just how heavy a human toll the stress is taking.
The 2026 agenda
AI services and infrastructure are driving a surge in worldwide ICT spending, Gartner recently said, predicting spending will grow 9.8% next year and pass $9 trillion ($US6 trillion) for the first time – much of it driven by genAI technologies.
ISACA Oceania ambassador and ACS Fellow Jo Stewart-Rattray said only 8% of tech leaders feel prepared for the risks of genAI. Image: Supplied/Information Age
Yet for all their companies’ spending plans, executives see managing these technologies as a major part of the challenge, with many respondents to the ISACA survey worried they won’t be able to find the staff to help them do so properly.
Some 37% of Australian organisations expect to grow their hiring next year compared to this year, ISACA found – but the third who plan to hire audit, risk and cybersecurity professionals next year expect to have trouble finding the right people.
This disconnect was identified in the recent major OECD Science, Technology and Innovation Outlook report, which noted that the security and resilience aspects of Australia’s science, technology and innovation policies “are comparatively less apparent”.
This, then, is the crux of the fears that ISACA survey respondents carry with them – with compliance seen as critical, but 30% not very or not at all prepared to actually deliver the oversight they need.
With only 8% saying they feel very prepared for generative AI’s risks, ISACA Oceania ambassador and ACS Fellow Jo Stewart-Rattray said, “there’s an urgent need to balance experimentation and usage with robust oversight.”
- This story first appeared on Information Age. You can read the original here.
