A Systems Approach to After-Action Reports
How complexity science can transform your post-crisis learning

Dear reader,
In this week's edition of my Wag The Dog newsletter, I'm exploring how we can fundamentally improve the way we learn from crises.
After reading Adrian Hornsby's brilliant analysis of why traditional root cause methods fall short in complex systems, I realised we might be approaching after-action reports all wrong.
The implications for how we actually learn from our mistakes are, to say the least, interesting.
Enjoy the article.
Philippe.
PS: the Wag The Dog newsletter crossed the 1,100 active-subscriber mark last week! Thank you for being part of the community. 🙏
Before we go to the main article…
🛡️ Want to strengthen your crisis comms playbook?
“Defending Your Organization Against Rage Farming”
I’m launching a self-paced online course with step-by-step frameworks, templates, and the full 24-hour response system.
I had a great time speaking at Cyabra's live session last week discussing rage farming. They kindly offered to share the full recording of the webinar. Check it out below.
A Systems Approach to After-Action Reports
How complexity science can transform your post-crisis learning
We've all sat through them. The crisis has passed, the media storm weathered, and now comes the inevitable after-action report (AAR)².
Teams gather round meeting tables with timelines, root cause templates, and that familiar question: "Right, let's work out what went wrong so this never happens again."
Here's the uncomfortable truth: despite years of thorough AARs, the same communication breakdowns keep happening. Crisis responses still hit the same coordination snags.
What if we're not doing AARs badly, but simply asking the wrong questions?
The Root Cause Trap
Adrian Hornsby's recent piece, "Beyond Root Cause: A Better Approach to Understanding Complex System Failures", made me completely rethink our own AAR process.
His critique of traditional root cause analysis (RCA) methods like the "5 Whys"¹ exposes a basic mismatch between our investigative approach and the messy reality of modern crises. Frankly, it also explains why so many of our post-crisis reviews felt unsatisfying despite our best efforts.
I'm grateful to Adrian for this insight because it reframes something we all struggle with.
The issue with linear RCA? It assumes complex failures work like broken machines: trace the fault backwards, find the broken bit, fix it, job done.
But communication crises don't unfold like mechanical breakdowns. They emerge from countless interactions between people, processes, technology, and context in ways that resist neat cause-and-effect explanations.
Think about your last major crisis. Dig beneath the surface and you'll likely find:
Multiple factors that built up over time
People who improvised under pressure (often successfully)
Assumptions and expectations that shaped decisions
Context (organisational culture, past experiences, external pressures) that influenced behaviour
Trade-offs with no clear "right" answer
Traditional AARs struggle with this complexity. Worse, they often end in blame games and surface-level fixes that miss deeper patterns.
Reframing the Questions
What would AARs look like if we started from complexity science rather than mechanical thinking?
Instead of hunting root causes, we'd explore system behaviour. Instead of assigning blame, we'd seek understanding. Instead of generating to-do lists, we'd focus on learning and adaptation.
This shift transforms the fundamental questions we ask:
From "What went wrong?" to "Where was the system under strain, and how did it respond?"
This moves us from fault-finding to understanding system dynamics. It reveals pressure points, bottlenecks, and unexpected connections that create vulnerability.
From "Did we follow the plan?" to "What were our expectations, and what assumptions drove them?"
Plans exist on paper, but mental models drive real-time decisions. This question exposes the gap between what we planned and what actually happened.
From "What should we do differently?" to "What do we understand differently now, and what should we test?"
This encourages reflection and experimental thinking rather than rushing to implement fixes that might miss the underlying dynamics.
From "What went well?" to "What conditions enabled our successes?"
Moving beyond surface-level wins to understand what made good outcomes possible so we can strengthen and repeat them.
Systems Thinking in Practice
When we apply these complexity-informed questions to real incidents, patterns emerge:
Adaptation becomes visible. Teams discover valuable improvisation and resilience that deserves recognition, not correction back to protocol.
Context matters. Organisational culture, past experiences, and external pressures reveal themselves as powerful forces shaping behaviour, forces that must be acknowledged in any meaningful change.
Trade-offs surface. The messy reality of competing priorities and limited resources becomes explicit, leading to more realistic improvements.
Hidden conditions emerge. Long-standing vulnerabilities, such as understaffing patterns, misaligned incentives, and information silos, existed well before the crisis and made failure more likely.
A Practical Shift for Comms Teams
For communication and PR professionals, this approach offers particular value:
Stakeholder dynamics become clearer. Instead of focusing solely on message failures, explore how different audiences interpreted and responded to information under pressure.
Channel performance tells a story. Rather than just tracking what didn't work, understand how information flowed through formal and informal networks during the crisis.
Cultural factors surface. Organisational dynamics, past reputation issues, and external relationships that shaped public response often remain invisible in traditional AARs.
Real-time adaptation gets recognised. The brilliant improvisation your team did when the original strategy hit reality deserves documentation and institutionalisation.
Moving Forward
Transforming your AAR process doesn't mean scrapping everything. Start with these shifts:
Broaden your scope. Look beyond what broke to how the system adapted under pressure. Seek both weakness and strength.
Create safety. Frame AARs as learning exercises, not blame sessions. People need to feel safe sharing the full complexity of their decisions.
Accept uncertainty. Not every incident needs neat conclusions and clear action points. Sometimes the most valuable outcome is better questions.
Think in systems. Consider how your organisation connects with external partners, stakeholders, and audiences. Friction at these boundaries often drives problems.
Design for learning. Build follow-up mechanisms to test assumptions, verify changes, and track whether new approaches actually improve performance.
The Real Prize
The goal isn't perfect AARs; it's organisational learning that builds resilience.
When we move beyond root cause thinking to embrace complexity, we create organisations that don't just recover from crises but actually get better at handling uncertainty.
Next time you're running an AAR, resist hunting for the smoking gun. Instead, explore how your system really behaves under pressure.
You might discover things about both your vulnerabilities and your hidden strengths that surprise you.
What complexity-informed questions has your team started asking in post-crisis reviews? Drop me a line; the patterns across different sectors are often remarkably similar.
PS: If you’ve missed it, check out my previous article on systems thinking and crisis communication here.
References and further reading
¹ 5 Whys. (2023, January 26). Lean Enterprise Institute. https://www.lean.org/lexicon-terms/5-whys/
² Ponciano, E. (2025, May 8). Understanding After-Action Reports (AARs) and Their Role in Emergency Management. Tidal Basin Group. https://www.tidalbasingroup.com/understanding-after-action-reports-aars-and-their-role-in-emergency-management/
Sponsor
Stay up-to-date with AI
The Rundown is the most trusted AI newsletter in the world, with 1,000,000+ readers and exclusive interviews with AI leaders like Mark Zuckerberg, Demis Hassabis, Mustafa Suleyman, and more.
Their expert research team spends all day learning what’s new in AI and talking with industry experts, then distills the most important developments into one free email every morning.
Plus, complete the quiz after signing up and they’ll recommend the best AI tools, guides, and courses – tailored to your needs.
What I am reading/testing/checking out:
If there’s one thing you should download and read this weekend, it’s the freshly published “Reputation, Risk and Resilience” report from friend and colleague Rod Cartwright.
In the expanded 2025 edition, he summarises and analyses 11 major global reports – 834 pages of source material – on these interlocking topics from the past 12 months.
Let’s meet!
Here are the events and conferences I'll be speaking at. If you're around, feel free to message me, and we can meet up.
How satisfied were you with the content in this edition? 📚
PS: I hope you've enjoyed this newsletter! Creating it each weekend is a labour of love that I provide for free. If you've found my writing valuable, the best way to support it is by sharing it with others. Thank you for reading!
Parts of this newsletter were created using AI technology to draft content. In addition, all AI-generated images include a caption stating, 'This image was created using AI'. These measures are in line with the transparency requirements of the EU AI Act for AI-generated content. Some links in this newsletter may be affiliate links, meaning I earn a small commission if you click and make a purchase; however, I only promote tools and services that I have tested, use myself, or am convinced will make a positive difference.