Perry Carpenter is Chief Evangelist for KnowBe4 Inc., provider of the popular Security Awareness Training & Simulated Phishing platform.
Cyber threats continue to grow in volume and severity, while the human element remains responsible for the majority of security breaches. The time has come for organizations to prioritize security training to help ferment (yes, as in “fermentation”) a security culture that seriously reduces the chances of security incidents. Yet security programs are often unengaging, even when organizations aim to deliver personalized, up-to-date training to employees.
Interestingly, readily available large language models (LLMs, or generative AI) such as Google’s Bard and ChatGPT hold exciting new possibilities for security training. Let’s look at five ways security teams can harness LLMs to supercharge their security training efforts.
1. Improving Personalization
Employees have varying levels of security maturity, and therefore varying training needs. Unfortunately, most training programs are standardized and do not offer an effective or engaging educational experience.
By leveraging LLMs, organizations can deploy chatbots and virtual assistants that provide individualized, interactive learning experiences. LLMs can analyze an employee’s job role, risk exposure and security knowledge, then deliver tailored content that is relevant and coherent for that employee. With this level of personalization, employees can better understand training concepts and feel more connected to and invested in the learning process.
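For teams curious about prototyping this, here is a minimal sketch in Python of what role-based tailoring might look like. It assumes the OpenAI chat API, and the role, risk-area and knowledge-level fields are invented for illustration rather than drawn from any particular training platform.

# Minimal sketch: tailor a lesson to an employee's role, risk exposure and maturity.
# Assumes the OpenAI Python SDK with OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

def personalized_lesson(role: str, risk_areas: list[str], knowledge_level: str) -> str:
    prompt = (
        f"Write a short security-awareness lesson for a {role} with "
        f"{knowledge_level}-level security knowledge. Focus on: {', '.join(risk_areas)}. "
        "Use plain language and give one realistic workplace example per topic."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any capable chat model works here
        messages=[
            {"role": "system", "content": "You are a friendly security-awareness coach."},
            {"role": "user", "content": prompt},
        ],
    )
    return response.choices[0].message.content

# Example: a lesson for a finance analyst who approves wire transfers.
print(personalized_lesson("finance analyst", ["invoice fraud", "credential phishing"], "beginner"))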
2. Crafting Social Engineering Scenarios
As an insidious form of social engineering, phishing is a top initial access vector that threat actors use to exploit users, bypass security defenses and gain a foothold in the victim’s environment. To mitigate this risk, organizations must run regular phishing tests to train employees to recognize scams and social engineering threats.
Using LLMs, security teams can craft more persuasive phishing exercises based on trending topics (in sports, pop culture, politics, etc.)—as a way of anticipating what kinds of social engineering themes users will likely encounter given the news cycle. LLM chatbots can also be set up to analyze individual employee responses in real time, give them hints or nudges along the way and provide tailored feedback based on the user’s performance.
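To illustrate the feedback piece, the sketch below asks an LLM to coach an employee after a simulated phishing exercise. The result fields (lure topic, whether the user clicked or reported) and the prompt wording are hypothetical assumptions, not any vendor’s actual schema.

# Sketch: generate tailored coaching after a simulated phishing test.
from openai import OpenAI

client = OpenAI()

def coaching_feedback(employee_name: str, result: dict) -> str:
    prompt = (
        f"{employee_name} just completed a simulated phishing exercise.\n"
        f"Lure topic: {result['lure_topic']}\n"
        f"Clicked the link: {result['clicked']}\n"
        f"Reported the email: {result['reported']}\n"
        "Write two or three sentences of encouraging, specific feedback and one tip "
        "for spotting a similar lure next time."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Example: the employee clicked a topical 'playoff tickets' lure but did not report it.
print(coaching_feedback("Jordan", {"lure_topic": "playoff tickets", "clicked": True, "reported": False}))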
3. Developing And Updating Multilingual Content
I think one of the most remarkable things about LLMs is that their language skills are extremely robust. Security teams can harness LLMs to develop content, research examples and explain security metaphors and analogies in a way that is more digestible to users.
Traditionally, it has been difficult to translate and maintain training content in multiple languages. Using AI, security teams can also get a jumpstart on translating their training courses into a range of languages. For now, it helps to have someone validate the translation to ensure it feels authentic and localized.
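A jumpstart on translation could look something like the sketch below, which drafts a translation and flags it for a human reviewer to validate tone and localization. The workflow and function name are illustrative assumptions.

# Sketch: machine-assisted translation with a human validation step left in.
from openai import OpenAI

client = OpenAI()

def draft_translation(course_text: str, target_language: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": (
                f"Translate this security training text into {target_language}. "
                "Keep the meaning exact and the tone conversational:\n\n" + course_text
            ),
        }],
    )
    # Flag the draft so a native speaker reviews it before it is published.
    return "[DRAFT - NEEDS HUMAN REVIEW]\n" + response.choices[0].message.content

print(draft_translation("Never reuse your work password on personal websites.", "Spanish"))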
4. Making Training More Interactive And Collaborative
Many organizations make the mistake of designing training programs as a one-way street. They know what skills they want employees to develop, but they do not take the time to understand the employees’ abilities or situations. As a result, the training is often ineffective, and the employees do not benefit from it.
LLMs can be programmed to simulate conversations and guide users through completing a task, which makes the overall training experience more interactive. If an employee does not understand a piece of content or a concept, they can ask the LLM to rephrase it and provide more contextual examples. Employees can also access training sessions on demand (for example, a session on password best practices), allowing them to learn at their own pace and convenience.
5. Tracking And Reporting Training Effectiveness
LLMs can be integrated with existing infrastructure, such as email gateways, network monitoring tools, phishing simulation systems and learning management systems. They can then help transform raw data into actionable insights by identifying trends and patterns: an executive summary on training progress and completion, the overall security maturity of employees, how phish-prone the business is, which users need more training and how training has reduced phishing incidents over time.
Such functionality is especially valuable for business teams that are looking to make data-driven decisions and security teams that are looking to demonstrate training progress and effectiveness to leadership teams.
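As a simple illustration, the sketch below turns aggregated metrics, pulled from whatever learning management or phishing simulation systems an organization already runs, into a short executive summary. The metric names are invented for the example.

# Sketch: summarize aggregated training metrics for leadership.
import json
from openai import OpenAI

client = OpenAI()

def executive_summary(metrics: dict) -> str:
    prompt = (
        "Write a one-paragraph executive summary for leadership covering training "
        "progress, phish-prone percentage and trends in this data:\n"
        + json.dumps(metrics, indent=2)
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(executive_summary({
    "training_completion_rate": 0.87,
    "phish_prone_percentage_q1": 0.21,
    "phish_prone_percentage_q2": 0.14,
    "users_flagged_for_refresher": 12,
}))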
Understanding The Drawbacks Of AI
It’s important to note that using LLMs for cybersecurity training can have some drawbacks. For starters, they are prone to “hallucinations”: fabricating answers and presenting them with convincing confidence.
In addition to occasionally producing inaccurate answers, these AI models are trained on data scraped from the internet, and there’s no telling whether confidential or sensitive information entered into them may be exposed. The potential is certainly there for intellectual property and copyright risks, cyber fraud risks and consumer protection risks.
Legal and compliance leaders should assess their organization’s exposure to these risks and put appropriate controls in place to mitigate them. Failure to do so could expose organizations to legal, reputational and financial consequences.
Taking A Thoughtful Approach To AI
The above list of uses for AI is by no means exhaustive. For example, LLM tools can be purpose-built to regularly communicate cybersecurity risks to employees and stakeholders, or to continuously test employees using social engineering tactics. AI can also provide ongoing support to employees after training by offering refresher courses and answering their questions.
Such a personalized, interactive and collaborative approach to training can increase security knowledge and instill a feeling of accountability and responsibility in your organization. This can help boost the overall security culture, making the organization more resilient to security threats and breaches in the long run.