As 2025 draws to a close, the cybersecurity world reflects on a conflict that's defined the year. Despite a colossal investment in AI, automation, and policy frameworks, it's still people, not technology, who make or break resilience.
In this piece, Alexei Hnatiw of Solvd Together [1] examines the tension between rapidly advancing AI threats and slowly evolving human behaviours, and outlines potential challenges as we head into 2026.
In January of this year, the World Economic Forum's (WEF) 2025 Global Cybersecurity Outlook [2] revealed that nearly half of organisations (47%) see generative AI-driven threats as their top concern. Yet the same research showed that human vulnerability remains the leading cause of breaches. KPMG's Cybersecurity Considerations 2025 [3] placed "the power of the people" at the centre of its findings, arguing that organisations with strong cyber cultures outperform those relying solely on tools and technology.
Despite all these advances in automation and machine learning, the biggest risks still begin and end with people.
The paradox of progress
We've seen organisations rolling out powerful new tools and policies, but behaviour and culture haven't kept up. AI is spreading through workplaces, with employees using unsanctioned tools to get work done faster, often unaware of the risks.
At the same time, attackers have learned to exploit our most human qualities of curiosity, trust, and helpfulness through AI-generated phishing emails and more.
In October of this year, the Cyber Monitoring Centre (CMC) reminded organisations to identify the networks that matter most and plan how they'd cope if those networks were disrupted. This followed the JLR cyber attack, described as "the costliest cyber attack in UK history" [4], with costs stretching to approximately £2 billion.
But what about people?
The uncomfortable truth is that cybersecurity is as much about understanding people as it is about understanding systems, technology, and what attackers might do next.
The WEF's report is clear: organisations are struggling not because they lack tools, but because they lack capability - the behaviours that help people respond intelligently under pressure. This aligns with KPMG's "power of the people" finding that culture, collaboration and communication are now as critical to resilience as firewalls and encryption.
And that's a shift we've been championing for some time. Our work in learning and behavioural design, particularly in cyber awareness and resilience, is based on a simple premise: people don't change because they've been told to, or because they've completed an online task. They change behaviours because they understand why something matters, and because they've been involved and consulted in the learning process.
What's interesting about 2025 is that we've finally started to understand that behaviour is a security control in itself: the most advanced intrusion detection system in the world can't stop someone from clicking a malicious link.
That change in behaviour doesn't happen through fear-based messaging or endless slide decks. It happens through designed learning experiences that make people feel empowered and responsible.
We need to borrow more from behavioural science and create programmes that actually change how people think. Our work with Unilever recognised that standard cybersecurity training often fails to make cyber risk feel personal or relevant, especially across a large, diverse organisation.
Through research across high-risk groups, we set out to understand why people weren't following processes and policies, and how to shift their behaviour in a meaningful way.
At the heart of the campaign was "Cards Against Cyber Crime" [5], a card game that brought people off their screens and into real conversations about their own habits and mindsets around cyber threats. It helped Unilever's cyber teams engage high-risk groups on the issues that genuinely matter, delivering a 9% increase in confidence in identifying cyber threats and an 8% rise in understanding how to report those threats.
What 2025 taught us
If I had to sum up the year in a single sentence, it would be that technology accelerated, but humans had to catch up fast.
The pace of change with AI has moved far beyond what traditional training models can deliver. Raising awareness isn't enough; only adaptation can deliver the resilience organisations need.
Ultimately, every organisation, no matter how digital or advanced, is only as strong as the collective mindset of its people.
Looking ahead to 2026, I believe the next generation of cybersecurity programmes won't just help people spot phishing emails; they'll also help us navigate new technologies and threats as they emerge.
As generative AI continues to evolve, training won't keep up unless it's designed for continuous learning. This is where leaders need to focus: on helping people become comfortable with uncertainty, and confident enough to make good decisions in real time.
Because the truth is, we don't need every employee to be a cybersecurity expert; we just need them to be security-minded: to see risk, understand the consequences, and act with awareness. That, as it happens, is a learning challenge, not a technology one.