Interviews

Four predictions for 2024

by Mark Rowe

Andrew Hollister, CISO & VP Labs R&D at LogRhythm, offers four predictions for CISOs in 2024.

Throughout 2023, Chief Information Security Officers (CISOs) have grappled with the increased adoption of generative artificial intelligence (generative AI) and what this could mean for the future of the cybersecurity industry. Generative AI-powered technologies and platforms are becoming faster and more capable, presenting new challenges and opportunities for organizations.

For security teams, generative AI holds great promise to maximize resources by augmenting analyst expertise. More organizations are now exploring its uses: a survey from McKinsey found that 40 percent of respondents plan to increase their investment in AI because of advances in generative AI. At the same time, it is important to consider the dark side of the technology. Generative AI can also give threat actors powerful tools for optimizing attacks, and over time this could significantly change the threat landscape. For example, we are already seeing threat actors use generative AI to create more believable phishing content.

Reaping the rewards of generative AI will depend on an organization’s understanding of how it can be most effectively used and its awareness of the risks it brings. Here are my top four predictions for how generative AI will shape the cybersecurity industry in 2024:

  1. Uncertainties around Generative AI integration will continue

As we look ahead, the C-Suite will face continued uncertainty over how to effectively implement generative AI in their organizations. We are still in the early phase of adoption, and organizations don't know what they don't know. Much as organizations had to learn the shared responsibility model in cloud computing, they will need to understand the opportunities and limitations of generative AI.

Continued uncertainty surrounding best practices for adopting generative AI poses a notable risk to organizations, potentially leading to breaches of confidential information. A clear understanding and strategic deployment of generative AI are therefore paramount to mitigating these risks effectively.

  2. Balance vital to Generative AI’s impact

Ensuring effective threat detection relies on striking a fine balance between harnessing the capabilities of generative AI tools and human decision-making. Distinguishing genuine generative AI contributions from marketing hype will continue to be challenging for organizations. The ongoing debate in 2024 will center on whether to invest in additional technology, such as generative AI, or to hire more Security Operations Center (SOC) analysts.

It’s crucial to recognize generative AI as a tool that works to assist human analysts as opposed to being a replacement for the analyst in the SOC. Success in cybersecurity ultimately depends on aligning these tools seamlessly with analyst workflows and prioritizing the human factor.

  3. Generative AI’s power in augmenting, not replacing, security analysts

In the ever-changing cybersecurity landscape, generative AI’s integration into SOCs is marked by augmentation of human analysts rather than their substitution. Its primary role is to deliver enhanced insight and assist SOC staff, giving valuable support to mid-level analysts and enabling them to apply their broader expertise and decision-making.

During a time when the cybersecurity skills gap continues to widen, generative AI emerges as a valuable ally, empowering stretched teams to achieve heightened efficiency. When evaluating the deployment of generative AI, organizations should consider the substantial potential it offers to support analyst teams, particularly those with limited resources.

  4. Safeguarding confidential data against Generative AI risks

Many organizations are starting to explore how generative AI platforms can help them boost efficiency by augmenting manual processes. The challenge is that these platforms lack established regulations for user data protection and may offer limited guarantees for the protection or privacy of entered data. It is crucial to carefully evaluate generative AI platform licensing agreements to avoid inputting critical business information into platforms that have no responsibility to protect the data.

Organizations must use these tools with care, balancing the benefits of technological advancement against the risks they may pose to sensitive data. If a business suffers a breach via this route, it could severely impair its ability to operate, damaging revenue streams and customer trust.

Getting ready for new trends

Generative AI will only continue to evolve and increase in sophistication. To stay ahead in 2024, organizations need to focus their efforts on maximizing the advantages of generative AI and taking the necessary steps to effectively protect their critical data and operations.

There is no silver bullet for the security challenges surrounding generative AI, but planning, strategy, and sustained awareness go a long way. Successfully navigating this technology in the cybersecurity landscape in the year ahead will depend on an organization’s ability to integrate its capabilities while defending data against emerging risks.

About the author

Andrew Hollister, Chief Information Security Officer (CISO) at the cyber firm LogRhythm, has over 25 years’ experience in software, infrastructure, and security roles in both the private and public sectors. He joined LogRhythm in 2012 with a keen interest in using machine-based analytics to solve cybersecurity problems. He maintains a close interest in this area, contributing content, expertise, and vision to the development of the company’s roadmap and platform offerings. Andrew also has field experience leading professional services and customer care, as well as sales engineering.


© 2024 Professional Security Magazine. All rights reserved.
