SOC 2 Tabletop Exercises: What Your Auditor Actually Wants

SOC 2 never uses the words “tabletop exercise.” Nowhere in the Trust Services Criteria will you find a requirement that says “run a TTX.” What it does say — across CC7.3, CC7.4, and CC7.5 — is that you need to test your incident response plan and prove it actually works.

Most orgs satisfy this with a tabletop exercise. Most orgs also do it badly, document it worse, and then act surprised when their auditor writes it up.

Here’s what the criteria actually say, what auditors are looking for, and how to produce evidence that doesn’t generate a finding.

The criteria that matter

Everything lives in CC7 — System Operations. This falls under the Security category, which is required in every SOC 2 engagement. No exceptions, no “we didn’t select that Trust Services Category” escape hatch.

Three criteria are relevant:

CC7.3 — Did you evaluate whether your response actually works?

“The entity evaluates security events to determine whether they could or have resulted in a failure of the entity to meet its objectives.”

The point of focus that matters here is Evaluates the Effectiveness of Incident Response. Translation: you can’t just have an IR plan. You have to periodically check whether the plan works. This is the hook that makes tabletop exercises relevant to SOC 2.

CC7.4 — Can you actually execute the plan?

“The entity responds to identified security incidents by executing a defined incident response program to understand, contain, remediate, and communicate security incidents.”

This one covers the mechanics — roles and responsibilities, containment procedures, communication protocols, remediation. Your tabletop should test whether your team can do each of these without stopping to Google “who do we call at legal again?”

CC7.5 — Did you learn anything and fix what was broken?

“The entity identifies, develops, and implements activities to recover from identified security incidents.”

Here’s where the explicit testing language lives. The Implements Incident Recovery Plan Testing point of focus gets specific:

  1. Scenarios based on threat likelihood and magnitude — not a generic “data breach happens” prompt
  2. Consideration of system components that can impair availability
  3. Scenarios that account for key personnel being unavailable (the CISO-is-in-Cabo scenario)
  4. Revision of plans based on test results

That last one is critical. It’s not enough to run the exercise. You have to fix what the exercise found. Auditors care about the closed loop.

What auditors actually look for

There’s a gap between what the criteria say and what auditors actually scrutinize. Here’s the real list:

The documentation

  • Who was in the room. Names and roles, not “the security team.” Cross-functional participation matters. If legal, comms, and engineering weren’t there, your auditor will notice.
  • What you simulated. The scenario — attack type, systems in scope, threat model. “We discussed a breach” is not a scenario.
  • What your team actually decided. The specific choices, not a summary. “We isolated the affected workstation before notifying legal because containment was time-sensitive” is evidence. “We discussed containment options” is a meeting note.
  • What you found. An after-action report with actual findings — gaps, slowdowns, confusion about roles, procedures that didn’t match reality.
  • What you fixed. Evidence that those findings turned into changes. Updated runbooks, revised escalation contacts, new monitoring rules. This closes the loop.
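The five elements above can be treated as a completeness check before you hand anything to an auditor. Here is a minimal sketch in Python — the class and field names are illustrative, not an auditor-mandated schema:

```python
from dataclasses import dataclass, field

# Hypothetical evidence record. Field names mirror the five
# documentation elements; they are illustrative, not a standard.
@dataclass
class ExerciseEvidence:
    participants: list[str] = field(default_factory=list)  # names and roles
    scenario: str = ""             # attack type, systems in scope, threat model
    decisions: list[str] = field(default_factory=list)     # specific choices made
    findings: list[str] = field(default_factory=list)      # after-action gaps
    remediations: list[str] = field(default_factory=list)  # evidence of fixes

    def missing_elements(self) -> list[str]:
        """Return which of the five documentation elements are absent."""
        checks = {
            "participants": bool(self.participants),
            "scenario": bool(self.scenario.strip()),
            "decisions": bool(self.decisions),
            "findings": bool(self.findings),
            "remediations": bool(self.remediations),
        }
        return [name for name, present in checks.items() if not present]

record = ExerciseEvidence(
    participants=["A. Rivera (IR lead)"],
    scenario="Compromised CI/CD pipeline",
)
print(record.missing_elements())  # ['decisions', 'findings', 'remediations']
```

An exercise write-up that still has missing elements at submission time is the kind of artifact that generates follow-up questions — or a finding.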

How often

SOC 2 says “periodic.” Your auditor interprets this as at least annually. That’s the floor. More frequent testing strengthens your position but isn’t required.

For Type II, the exercise must fall within the observation window — typically 3 to 12 months. One well-documented exercise within that window passes. Zero does not.

What counts

  • Tabletop exercises — discussion-based walkthrough of a scenario. The minimum viable option.
  • Functional exercises — simulations involving actual technical response. Stronger evidence.
  • Post-incident reviews — documented analysis of a real incident that evaluates the plan’s effectiveness. These count, but you can’t depend on them because incidents don’t happen on your audit schedule.

What doesn’t count

  • An IR plan in a SharePoint folder that nobody’s opened since onboarding.
  • An exercise that happened but wasn’t documented. No evidence, no credit.
  • A calendar invite titled “Tabletop Exercise” with no attached notes, scenario, or action items.
  • A “last reviewed on” stamp on the plan doc with no record of what the review actually covered.

Look — auditors have seen every version of “we technically did this.” They can tell the difference between a meaningful exercise and a box-checking ceremony.

Type I vs. Type II: why this matters

Type I checks whether your controls are designed properly at a single point in time. For IR testing, the auditor verifies that a plan exists and that you have a process for testing it. You can pass Type I with a plan that has never been tested. It just needs to look like it would work.

Type II checks whether your controls actually operated effectively over 3-12 months. This is where organizations get caught. The auditor wants evidence that testing happened during the observation period — not just that a testing process exists on paper.

Most SOC 2 engagements are Type II. If yours is, you need a documented tabletop exercise inside the observation window. No amount of policy language compensates for missing evidence.

The findings auditors keep writing

Same gaps, every year, across industries:

No testing at all. The plan exists. It’s never been exercised. The most common finding and the easiest one to avoid.

Undocumented exercises. The team says they ran a tabletop last quarter. There’s no scenario document, no attendance list, no after-action report. From the auditor’s perspective, it didn’t happen.

No remediation follow-through. The exercise surfaced gaps — escalation contacts were wrong, the cloud team didn’t know the containment playbook existed, nobody knew which Slack channel to use. Six months later, nothing changed. The auditor sees an exercise that identified problems and an org that ignored them.

Missing cross-functional participation. An exercise run entirely by the security team, without legal, comms, or executive leadership. This signals that your IR plan doesn’t account for what actually happens during a real incident — which involves a lot more people than the SOC.

Stale plans. The IR plan references an employee who left two years ago, a monitoring tool you decommissioned, and a Slack channel that was archived. A real tabletop would have caught this. The fact that it didn’t tells the auditor the exercises aren’t rigorous enough.

Running an exercise that passes

Pick a scenario your auditor would respect

CC7.5 requires scenarios “based on threat likelihood and magnitude.” A generic “someone breached us” prompt doesn’t meet the bar. If you’re a SaaS company, simulate a compromised CI/CD pipeline or credential stuffing against your auth system. If you’re in healthcare, run a PHI exfiltration scenario. Make it relevant to your actual threat model.

Get the right people in the room

At minimum: IR lead, engineering, security ops, legal, communications. For scenarios involving customer data, add your privacy officer. For availability scenarios, include on-call engineering.

Run at least one exercise where a key player is “unavailable.” CC7.5 explicitly calls this out. What happens when the CISO is unreachable and the SOC manager has to make the containment call? That’s the muscle memory you need.

Document decisions, not attendance

The difference between useful evidence and a waste of everyone’s time comes down to decision capture. At each decision point: What did the team know? What were the options? What did they choose? Why? What happened next?

This is what separates an exercise from a meeting.
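Those five questions map naturally onto a structured log entry. A sketch, again with hypothetical field names — the point is that each decision gets a timestamp and a rationale, not just a mention:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative decision-log entry. Timestamps are what make the
# log auditable evidence rather than a meeting summary.
@dataclass
class DecisionEntry:
    timestamp: datetime   # when the decision was made
    known_facts: str      # what the team knew at that point
    options: list[str]    # alternatives that were on the table
    choice: str           # what they chose
    rationale: str        # why
    outcome: str          # what happened next

    def to_log_line(self) -> str:
        return f"[{self.timestamp.isoformat()}] chose '{self.choice}': {self.rationale}"

entry = DecisionEntry(
    timestamp=datetime(2026, 3, 1, 14, 5, tzinfo=timezone.utc),
    known_facts="EDR alert on a build runner; scope unknown",
    options=["isolate the runner", "monitor and gather more telemetry"],
    choice="isolate the runner",
    rationale="containment was time-sensitive",
    outcome="runner isolated; legal notified afterward",
)
print(entry.to_log_line())
```

A log built from entries like this reads as evidence on its own; a list of attendees does not.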

Write findings worth reading

“The exercise went well” is not a finding. Useful after-action reports identify:

  • Procedures that worked as intended
  • Gaps the exercise exposed — unclear roles, stalled handoffs, communication breakdowns
  • Decisions that took too long or involved too much ambiguity
  • Tools or systems that were unavailable or unfamiliar to responders

Close the loop

For every gap, create a task with an owner and a deadline. When it’s done, document the change. Updated the escalation procedure? Save the diff. Added a new monitoring rule? Screenshot the config. This evidence trail closes the loop — and it’s what moves your program from “checkbox” to “actually improving.”
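The loop-closing rule is mechanical enough to encode. A minimal sketch, assuming a task is "closed" only when closure evidence is attached (the task data here is invented for illustration):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical remediation task. The "closed loop" test is simple:
# no attached evidence means the finding is still open.
@dataclass
class RemediationTask:
    finding: str
    owner: str
    deadline: date
    evidence: Optional[str] = None  # e.g. link to the runbook diff or config screenshot

    def loop_closed(self) -> bool:
        return self.evidence is not None

def open_loops(tasks: list[RemediationTask]) -> list[str]:
    """Findings that still lack closure evidence."""
    return [t.finding for t in tasks if not t.loop_closed()]

tasks = [
    RemediationTask("Stale escalation contacts", "J. Park", date(2026, 4, 1),
                    evidence="link to updated contact-list diff"),
    RemediationTask("Containment playbook unknown to cloud team", "M. Chen",
                    date(2026, 4, 15)),
]
print(open_loops(tasks))  # ['Containment playbook unknown to cloud team']
```

Run a check like this before the audit, not during it: every item it returns is a finding you identified and then ignored, which is exactly the pattern auditors write up.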

What’s shifting in 2026

Scenario specificity matters more. Auditors increasingly want exercises that reflect your actual risk profile, not a template pulled off the internet. An exercise tailored to your infrastructure carries more weight than a boilerplate ransomware walkthrough everyone has seen.

Pen tests alongside tabletop exercises. Most auditors now expect both — tabletop exercises test the plan, pen tests test the defenses. Having one without the other raises questions.

Supply chain scenarios are viewed favorably. If your IR plan doesn’t address vendor compromise, auditors notice. Exercises that simulate third-party incidents demonstrate awareness of where the real risk lives for most SaaS companies.

The documentation checklist

Here’s what a clean exercise artifact looks like — the kind an auditor reviews without asking follow-up questions:

  • Exercise date and duration: proves testing fell within the observation window
  • Participant names and roles: proves cross-functional involvement
  • Scenario with threat model: proves relevance to actual risk (CC7.5)
  • Decision log with timestamps: proves meaningful participation, not just attendance
  • Competency scores or assessment: quantifies performance for trend tracking
  • After-action findings: proves the exercise surfaced actionable gaps
  • Remediation status: proves findings were addressed — the closed loop

Producing this manually — typing up facilitator notes, formatting a report, chasing people for their observations afterward — takes hours. That overhead is a big reason most teams only test once a year, right at the compliance minimum.

We built Breachdeck to handle this automatically. Every exercise produces a scored debrief with timestamped decisions, competency breakdowns across four dimensions, and a one-click PDF that covers every row in the table above. Run the demo — it takes five minutes, and the debrief it generates is the artifact your auditor is asking for.

