Dimitriadis is the Chief Global Strategy Officer of ISACA, an international professional association focused on IT governance, and Chair of the Board of Directors for the organisation’s European arm.
When we last interviewed Chris, the main themes of the event and our chat were digital trust and data security. This year, AI was on everyone’s lips.
Tech.eu: Now that the ISACA conference in Dublin has concluded, can you summarise what you were expecting to learn, and the main points that you tried to get across as an organisation?
Chris Dimitriadis (ISACA): One of our core objectives was to bring together our European and global communities to facilitate networking and encourage collaboration. We wanted to bring together people from around the world to exchange opinions and ideas, especially in light of the change and disruption caused by emerging technologies.
The AI space is rapidly evolving – and will fundamentally change the way we operate, do business, and define innovation, even in the near term.
This is particularly interesting for us at ISACA, as all of those new technologies and developments have a trust implication. First and foremost, we need to ensure that all new technologies and frameworks are introduced in a safe and secure manner for everyone.
The other aim of the Conference — as is always the case with ISACA events — was for everyone to learn from the industry-leading speakers and to discuss the importance of digital trust.
Tech.eu: Everybody’s talking about emerging technologies and AI. It seems to be the most discussed topic at the moment, especially as AI has gained traction before regulation – or even much thought about it – has caught up.
ISACA has been quite forward-thinking about the security implications of AI and other new technologies, but going back to basics, is that something you still need to get across in enterprises? For example, training staff properly on the next big technology. Even with something as shiny and new as AI, how do you keep getting that security message across?
Chris: It's a very good point. At ISACA, we recognise the need to fix the basics first. A lot of the cybersecurity incidents that we witness are due to basic omissions and errors.
One of the main reasons why that is happening is complexity. Digital ecosystems are becoming more complex by the day, and supply chains are becoming longer and more interconnected themselves. In today’s world, nobody is working in isolation.
To address that complexity, we need to see better collaboration and integration within organisations. And we need privacy professionals who understand the new and emerging threats.
IT teams should be working across the whole business to embed security and privacy audits within the lifecycle of any new project. We need cybersecurity professionals to speak in business language, understand the financial dynamics within specific projects and ecosystems, and to be able to recommend cybersecurity solutions that are feasible to implement. Too often, cybersecurity is seen as a bolt-on, something that is cumbersome and costly, and that view needs to change.
But the real crux of the issue is the need to close the cyber skills gap.
We believe that everything starts with people and skills. In both Europe and across the globe, we are facing a large skills gap with a distinct lack of cybersecurity and digital trust professionals. We simply don’t have enough personnel – whether in cybersecurity, audit, or privacy – with the relevant skills and vertical expertise to adequately protect organisations.
Basic omissions, from missing two-factor authentication to unpatched routers, are things we have been discussing for decades now. Getting that basic infrastructure in place is a prerequisite for companies; we simply need them to recognise the need and ensure they have the right people in place to implement it.
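To make one of those "basics" concrete: two-factor authentication commonly relies on time-based one-time passwords (TOTP), standardised in RFC 6238. The sketch below is purely illustrative – it is not ISACA guidance or any vendor's implementation – and uses only the Python standard library; the base32 secret and parameters are example values.

```python
import base64
import hmac
import struct
import time


def totp(secret_b32, digits=6, period=30, at=None):
    """Compute an RFC 6238 time-based one-time password.

    secret_b32 -- base32-encoded shared secret (as shown in QR-code setup)
    digits     -- length of the generated code (commonly 6)
    period     -- time step in seconds (commonly 30)
    at         -- Unix timestamp to evaluate at (defaults to now)
    """
    key = base64.b32decode(secret_b32, casefold=True)
    # Number of time steps since the Unix epoch.
    counter = int((time.time() if at is None else at) // period)
    # HMAC-SHA1 over the counter, packed as a big-endian 64-bit integer.
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    # Dynamic truncation (RFC 4226): low nibble of the last byte picks
    # a 4-byte window, masked to a 31-bit positive integer.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because both the server and the authenticator app derive the same code from the shared secret and the current time step, a stolen password alone is no longer enough to log in.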
Tech.eu: How do companies avoid the kind of breaches that can disrupt whole supply chains? Of course, when it comes to people and data there’s this major element of trust, but surely it needs to go beyond that, right?
Chris: Trust does have a key role to play, but as you say, it goes beyond that. Ultimately, a cyber incident can happen at any time, to anyone. No organisation can protect itself 100 percent of the time, but it can take steps to reduce uncertainty across its whole supply chain. Reducing uncertainty is key to limiting the gaps and errors that lead to cyberattacks.
In practice, organisations need to take measures to ensure they are continuously auditing their whole ecosystem or using frameworks like CMMI (Capability Maturity Model Integration). CMMI was born in the US as a project of the Department of Defense, designed to measure the maturity of software development processes so that organisations can trust their vendors.
Organisations need to ensure they are putting processes in place, such as a contractual requirement for auditing or rigorous frameworks to assess the maturity of software, hardware, and even whole products. This will help to ensure that they have confidence in their whole supply chain.
Tech.eu: Apart from AI, the human element and maturity assessments, what other factors do you think need to be addressed when it comes to emerging technologies — take for instance the metaverse, if it lasts?
Chris: Looking at it from a digital trust perspective, I think that all of these emerging models – and new worlds, in the case of the metaverse – can be seen as opportunities.
We should never try to stop evolution because, simply, we won’t be able to. If the metaverse is a value generator, it will be adopted, and if there are trust concerns they will need to be addressed to accelerate its safe proliferation.
As we all know, in the virtual world, new cyber threats and techniques to perform attacks will always continue to emerge. Instead of trying to stop the development of new technologies, we need to embrace change, like the metaverse, and build the right trust framework around it. This will enable us to address the new risks that are introduced and support value creation.
In a nutshell, what we need to do now is look ahead at the emerging technologies and their potential risks and threats, then wrap strong frameworks around them. Within the digital trust community, there is always an opportunity to think more critically, more innovatively, and to focus on problem-solving.
Tech.eu: Is digital trust different for things like the metaverse, whether it becomes successful or not, and for things like AI because of how little or much understanding we have of the technology as end users or customers?
Chris: It’s an interesting point. In fact, a 2022 study we conducted at ISACA revealed that nearly one in three consumers stopped doing business with a company whose cybersecurity was known to have been compromised.
Fundamentally, digital trust directly translates into consumer trust. This is the same truth for the metaverse, AI, and all other emerging technologies and fields that have cybersecurity, assurance, privacy, and ethical parameters to consider.
However, the context and actions behind the creation of digital trust differ for each emerging industry or technology. Different risks exist for the metaverse and AI, for example, so it’s imperative that we work to understand the risks and create the bespoke controls needed to protect us from them. This is another reason why learning and education are key for digital trust professionals who need to expand their knowledge — vertically in their domain and horizontally as far as new technologies are concerned.
Lead image: Chris Dimitriadis, Chief Global Strategy Officer, ISACA.