OBJECTIVE 5.6 Given a scenario (PBQ-likely)

Implement security awareness practices

This is a PBQ objective. Expect to design awareness programs, evaluate phishing scenarios, and select appropriate training responses for described situations.

Phishing Campaigns

Internal Phishing Simulations

Controlled phishing emails sent to your own employees to test awareness.

Campaign elements:

  • Pretexting: Crafting a believable scenario (package delivery, password reset, CEO request)
  • Indicators to test for: Typosquatted domains, urgency language, suspicious sender, mismatched URLs, unusual requests
  • Metrics: Click rate, credential submission rate, reporting rate
  • Follow-up: Immediate training notification for users who click. Not punitive — educational.
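
The campaign metrics above are simple ratios over the delivered-email population. A minimal sketch of how they might be computed (the dataclass and field names are illustrative, not any vendor's schema):

```python
from dataclasses import dataclass

@dataclass
class CampaignResults:
    emails_sent: int            # total simulation emails delivered
    clicked: int                # users who clicked the link
    submitted_credentials: int  # users who entered credentials on the landing page
    reported: int               # users who reported the email

def metrics(r: CampaignResults) -> dict:
    # Each rate is expressed as a fraction of emails delivered.
    return {
        "click_rate": r.clicked / r.emails_sent,
        "submission_rate": r.submitted_credentials / r.emails_sent,
        "reporting_rate": r.reported / r.emails_sent,
    }

# Hypothetical campaign: 500 emails, 60 clicks, 12 submissions, 150 reports
print(metrics(CampaignResults(500, 60, 12, 150)))
# → {'click_rate': 0.12, 'submission_rate': 0.024, 'reporting_rate': 0.3}
```

A healthy program drives click and submission rates down while reporting rate rises.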

Recognizing Phishing

Users should be trained to check:

  • Sender address: Does it match the organization? Look for subtle misspellings.
  • Links: Hover before clicking. Does the URL match the claimed destination?
  • Urgency and pressure: “Your account will be locked in 30 minutes” is a manipulation tactic
  • Unexpected attachments: Especially executables, macro-enabled documents, or compressed files
  • Requests for credentials: Legitimate organizations don’t ask for passwords via email
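
Two of these checks — typosquatted sender domains and link text that doesn't match the real destination — can be approximated in code. A minimal sketch using Python's standard library, assuming a hypothetical trusted-domain allow-list and similarity threshold (real mail-security tools use far more sophisticated detection):

```python
from difflib import SequenceMatcher
from urllib.parse import urlparse

# Hypothetical allow-list of the organization's legitimate domains
TRUSTED_DOMAINS = {"example.com", "payroll.example.com"}

def looks_typosquatted(domain: str, threshold: float = 0.85) -> bool:
    """Flag domains that are close to, but not exactly, a trusted domain."""
    for trusted in TRUSTED_DOMAINS:
        similarity = SequenceMatcher(None, domain.lower(), trusted).ratio()
        if domain.lower() != trusted and similarity >= threshold:
            return True
    return False

def link_mismatch(display_text: str, href: str) -> bool:
    """Flag links whose visible text claims one domain but point elsewhere."""
    shown = urlparse(display_text if "//" in display_text else "//" + display_text).hostname
    actual = urlparse(href).hostname
    return bool(shown and actual and shown.lower() != actual.lower())

print(looks_typosquatted("examp1e.com"))                              # → True
print(link_mismatch("example.com/login", "https://evil.test/login"))  # → True
```

The same hover-and-compare logic is exactly what users are being trained to do manually.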

Reporting Mechanisms

  • Dedicated phishing report button in email client (integrates with security tools)
  • Clear escalation path — users must know who to contact and that reporting is encouraged
  • Recognition for reporting — reward the behavior you want to see

Phishing Response Procedures

What users should do after clicking a suspicious link or submitting credentials:

  1. Disconnect from the network (WiFi off, unplug Ethernet) — limit potential lateral movement
  2. Do not power off the machine — preserves forensic evidence in memory
  3. Report immediately to security/IT — use the established reporting channel, not the potentially compromised email
  4. Change credentials from a different, known-clean device — not from the potentially compromised one
  5. Document what was clicked, what was entered, what happened after
  6. Do not attempt to “fix it yourself” — deleting the email or clearing browser history destroys evidence

Exam context: CompTIA tests the correct order of these steps. Reporting comes before self-remediation.

Situational Awareness

Environmental Awareness

  • Be aware of who can see your screen in public spaces (shoulder surfing)
  • Lock workstations when leaving your desk, even briefly (Win+L / Ctrl+Cmd+Q)
  • Secure sensitive conversations — don’t discuss confidential information in elevators, cafes, or open-plan areas
  • Be aware of recording devices (smart speakers, phones on speakerphone in shared spaces)

Travel and Conference Security

  • Use a VPN (virtual private network) on all public WiFi — or avoid public WiFi entirely and tether to mobile data
  • Disable Bluetooth and NFC when not in use
  • Leave unnecessary devices at home — travel with a clean/burner device for high-risk destinations
  • Be wary of free USB charging stations (juice jacking) — use AC adapters or charge-only cables
  • Hotel safes are not secure storage for classified materials — they’re deterrents for opportunistic theft only
  • Customs/border agents in many countries can compel device unlock — travel with encrypted, minimal data

Contextual Awareness

  • Social engineering at conferences: “I’m from IT support, we’re updating badges” is a classic pretext
  • Fake WiFi networks at hotels and conferences (evil twin attacks)
  • Physical document security — don’t leave printouts on conference room tables or hotel printers

Anomalous Behavior Recognition

Training users to recognize and report behavior that deviates from normal:

User Behavior

  • Colleague accessing systems they don’t normally use
  • After-hours access from unusual locations
  • Large data transfers or downloads
  • Requests to bypass security controls
  • Changes in behavior pattern (disgruntled indicators)

System Behavior

  • Unexpected pop-ups, slowdowns, or crashes
  • Applications launching without user action
  • New/unknown programs installed
  • Disabled security software
  • Unusual network activity (lights flashing on hardware, unexpected connections)

Risky Behavior

  • Sharing passwords or credentials
  • Using personal devices for work without authorization
  • Connecting to unsecured WiFi for work
  • Tailgating (allowing unauthorized people through secured doors)
  • Leaving screens unlocked and unattended

User Guidance and Training

Security Training Program

  • Onboarding: Security fundamentals during new employee orientation. AUP (Acceptable Use Policy) acknowledgment.
  • Annual refresher: Updated content reflecting current threats. Mandatory for all employees.
  • Role-based: Additional training for high-risk roles (finance, IT admins, executives)
  • Event-driven: Training triggered by security incidents, new threats, or policy changes

Social Engineering Awareness

  • How pretexting works — attackers build rapport and trust before making requests
  • Authority bias — people comply with perceived authority figures without questioning
  • Urgency manipulation — artificial deadlines pressure people into skipping verification
  • Quid pro quo — offering something in exchange for information (“free tech support” calls)
  • Physical social engineering — tailgating, dumpster diving, shoulder surfing

Insider Threat Awareness

Behavioral Indicators

Technical indicators:

  • Accessing systems or data outside normal job function
  • Large or unusual data transfers, especially to personal devices or external storage
  • Accessing systems at unusual hours without business justification
  • Attempting to bypass security controls or access restricted areas
  • Installing unauthorized software or tools

Behavioral indicators:

  • Expressed dissatisfaction with employer, coworkers, or management
  • Financial difficulties or sudden unexplained wealth
  • Working unusual hours without clear need
  • Reluctance to take vacation (may fear someone will discover their activity)
  • Discussing departure or job searching while having access to sensitive data

Reporting Channels

  • Anonymous tip lines (phone or web-based)
  • Direct reporting to security team or manager
  • HR involvement for behavioral concerns
  • Legal/compliance for suspected fraud or regulatory violations
  • Key point: Reporting must be accessible, confidential, and free from retaliation — or people won’t use it

Policy-Specific User Guidance

Users need to understand not just that policies exist, but what they require in practice:

  • Acceptable Use Policy (AUP): What’s allowed on company devices and networks. No personal torrenting, no unapproved cloud storage, no crypto mining. Users acknowledge during onboarding and annually.
  • Clean desk policy: Lock documents in drawers when leaving. No sticky notes with passwords. Shred sensitive printouts. Whiteboards with sensitive info erased after meetings.
  • Screen lock policy: Auto-lock after 5 minutes (or less). Manual lock when leaving desk. Win+L / Ctrl+Cmd+Q muscle memory.
  • Device security: Full-disk encryption required. No unapproved USB devices. Report lost/stolen devices immediately — minutes matter for remote wipe.
  • Removable media: Company-issued only. No personal USB drives. Auto-run disabled on all systems.
  • BYOD (bring your own device) boundaries: Work data in managed container only. MDM (mobile device management) agent required. Company reserves the right to remote wipe the container.

Operational Security (OPSEC)

OPSEC is the process of protecting information that could be pieced together to give an adversary useful intelligence.

The OPSEC Process (5 Steps)

  1. Identify critical information: What information, if known by an adversary, would be damaging? (merger plans, security architecture, incident details)
  2. Analyze threats: Who would want this information? (competitors, nation-states, hacktivists)
  3. Analyze vulnerabilities: How could the information be exposed? (social media, conference talks, public filings, trash)
  4. Assess risk: Likelihood × impact of exposure
  5. Apply countermeasures: Controls to prevent exposure (classification labels, training, secure communications)
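
Step 4's likelihood × impact calculation is often done on simple ordinal scales. A sketch assuming hypothetical 1–5 scales and band cutoffs (real programs define their own scales and thresholds):

```python
def opsec_risk(likelihood: int, impact: int) -> str:
    """Score risk as likelihood x impact on 1-5 scales, banded into
    low/medium/high. The cutoffs (6 and 15) are illustrative only."""
    score = likelihood * impact
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# Merger plans exposed via social media: likely (4) and severe (5)
print(opsec_risk(4, 5))  # → high
```

High-band items are where countermeasures (step 5) get applied first.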

OPSEC in Practice

  • Information classification: Users must know the classification of what they’re handling and the handling rules for each level
  • Secure communications: Sensitive discussions over encrypted channels. No discussing incidents on Slack public channels. No emailing credentials.
  • Social media discipline: Don’t post about internal systems, security tools, org charts, building layouts, or travel schedules
  • Conference/public speaking: Review presentations for accidental disclosure of internal architecture, IP ranges, vendor names, tool configurations
  • Metadata awareness: Documents, photos, and files contain metadata (author, GPS coordinates, revision history) that may leak sensitive info

Awareness Program Development

Methods

  • Computer-based training (CBT): Online modules with quizzes. Scalable, trackable.
  • Simulated phishing: Practical testing of email awareness. Most impactful training method.
  • Tabletop exercises: Discussion-based scenarios for incident response teams
  • Lunch-and-learns: Informal security awareness sessions
  • Posters, newsletters, intranet: Ambient awareness reinforcement
  • Gamification: Leaderboards, badges, competitions to drive engagement

Program Execution and Rollout

Audience Segmentation

Not all users need the same training:

  • General users: Phishing recognition, password hygiene, physical security, reporting procedures
  • IT staff: Secure configuration, incident response roles, privilege management
  • Developers: Secure coding, OWASP Top 10, code review practices
  • Executives: Business email compromise (BEC), social engineering targeting leadership, data classification decisions
  • New hires: Intensive onboarding security module within first week
  • Third-party/contractors: Scoped training covering systems they access and data they handle

Rollout Strategy

  1. Baseline assessment: Measure current awareness level (phishing simulation, survey)
  2. Initial training: Deploy foundational content to all users
  3. Targeted follow-up: Additional training for users who failed simulations or are in high-risk roles
  4. Continuous reinforcement: Monthly tips, quarterly simulations, annual refresher
  5. Measure and adjust: Track metrics, adjust content based on what’s not working

Metrics and Effectiveness

  • Phishing simulation click rates (should trend downward over time)
  • Training completion rates
  • Incident reporting rates (should trend upward — more awareness = more reporting)
  • Time to report (faster = better awareness)
  • Reduction in security incidents caused by user error
  • Comparison over time is what matters — a single metric snapshot is useless without trend data
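
The trend point can be made concrete: compare the latest simulation click rate against the baseline rather than reading any single number in isolation. A sketch with hypothetical quarterly data:

```python
def awareness_trend(click_rates: list[float]) -> dict:
    """Compare the latest phishing-simulation click rate to the baseline.
    One snapshot says nothing; the direction over time is the metric."""
    first, latest = click_rates[0], click_rates[-1]
    return {
        "improving": latest < first,
        # Negative relative_change means fewer clicks than the baseline.
        "relative_change": (latest - first) / first,
    }

quarterly = [0.30, 0.24, 0.18, 0.12]  # hypothetical quarterly click rates
print(awareness_trend(quarterly))
```

Here the click rate fell 60% from baseline — evidence the program is working, which a lone "12%" figure could never show.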

Executive Buy-In

  • Program must have visible executive support — executives participate visibly, not just mandate
  • Budget allocation tied to risk reduction metrics
  • Regular reporting to leadership on program effectiveness
  • Tone from the top: If executives ignore security practices, employees will too

Offensive Context

Social engineering is the offensive technique that bypasses every technical control. Firewalls, encryption, zero-trust architecture — none of it matters if someone hands over their credentials to a convincing phishing email. Security awareness training is the defensive counter to social engineering, and it works best when it’s informed by actual offensive techniques. A phishing simulation designed by someone who understands how real attackers craft pretexts is more effective than a generic “don’t click suspicious links” CBT module. The attacker’s playbook should inform the defender’s training curriculum.

LABS FOR THIS OBJECTIVE