Where Do Your Facts Come From?

By Paul Shotton, Advocacy Strategy
How confident are you that the facts underpinning your advocacy are still up to date?
How often do you pause to ask whether a statistic you cite has quietly aged out of relevance?
And if influence is partly about providing information, are you sure you are using your evidence as deliberately as you think?
These are uncomfortable questions. In my experience, they rarely get asked until something goes wrong.
Over the past months, I've worked with several organizations on what initially looked like straightforward assignments: codifying position papers, policy briefs, FAQs, and fact sheets. The brief was familiar. Collect what exists. Clean it up. Create a common template. Align structure, length, tone. Bring some order.
At first glance, this feels like document management. And to a degree, it is. Knowing which documents exist, who owns them, where they sit, and when they were last updated is basic hygiene. Most teams have some version of this in place, whether in SharePoint, Google Drive, Dropbox, or somewhere similar.
But it doesn't take long before a deeper issue surfaces.
What I kept running into was not missing documents, but invisible knowledge. Facts reused across multiple files with no clear source. Data that everyone recognized, but no one could date. Studies that had become part of the institutional memory, long after anyone remembered when they were produced or whether newer evidence existed.
At that point, it becomes clear that managing documents is not the same thing as managing knowledge. Documents are containers. Knowledge is what gives them weight. When the two are treated as the same thing, credibility slowly erodes without anyone noticing.
Here, it's reasonable to ask whether I'm overcomplicating things. Whether this is a consultant seeing patterns that aren't really there, or inventing problems just to have something to solve. These teams are busy. They're responding to live political situations. The last thing they need is another system to maintain, another layer of process, another person telling them their work isn't rigorous enough.
And maybe I am. Maybe this is elaborate scaffolding for what should be simple work. Maybe it's consultant overengineering dressed up as strategic thinking.
But here's what keeps bringing me back to it.
I've seen priority advocacy positions rest on strikingly thin evidence, and I've sat in rooms where someone asks "where did we get this?" and the answer is a shrug, or worse, "it's always been in the deck."
These aren't hypothetical failures. They're real ones. And they matter because in public affairs, credibility is the only currency that counts. Once it's gone, it's almost impossible to rebuild.
So when teams start seeing the distinction between managing documents and managing knowledge, a different set of questions emerges. Where do our key facts actually come from? How often are they reused? Are they still valid? And who, if anyone, is responsible for keeping them current?
That is usually where the real work begins.
Knowledge in Motion
If documents aren't the real unit of value, then where is knowledge actually doing its work?
In public affairs, facts rarely speak for themselves. They're mobilized through messages. This is why message houses matter far more than they're often given credit for.
In many organizations, the message house is where impact assessments, technical analysis, studies, and expertise get translated into argumentation. It provides the logic that connects evidence to narrative. From there, that logic should flow consistently into position papers, policy briefs, press releases, FAQs, talking points, and stakeholder meetings.
Seen from this angle, the message house isn't just a communications tool. It's a central knowledge hub.
When that hub is weak or implicit, fragmentation follows. Different documents tell slightly different stories. The same statistic appears in multiple places, rewritten each time. Priority issues receive very different levels of evidentiary rigor, often depending on who last worked on them or how much time they had.
I've mapped this enough times now to recognize the pattern. Some priority issues are well supported. Others rely on thin or outdated evidence. Technical experts hold crucial knowledge that never quite makes it into shared assets. Public affairs staff, under constant pressure to respond, default to what's already written.
At this stage, the challenge isn't just about information quality. It becomes a governance question. Who ensures coherence between priorities, message houses, and evidence? Who coordinates updates with technical experts? And who makes sure that knowledge doesn't quietly decay while documents continue to circulate?
This is usually the point where the conversation turns to tools.
The Discipline Before the Tool
And understandably so. But tools are rarely the place to start.
Effective knowledge management begins with deciding what actually matters enough to track. In most public affairs contexts, that means being explicit about facts, sources, publication dates, where those facts are used, and who is responsible for reviewing them.
Other professions have treated this as non-negotiable for decades. Academics manage citations as assets in their own right. Journalists build source trails to defend their claims. Policy analysts track evidence separately from narrative outputs. The common thread isn't technology. It's discipline and traceability.
For many public affairs teams, simple tools like Excel or Google Sheets are a surprisingly effective starting point. They force clarity. They make assumptions visible. They quickly reveal where knowledge is thin, outdated, or overused.
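To make that concrete, here is a minimal sketch of what such a register might contain, and how quickly explicit dates expose stale entries. The column names, sample rows, and twelve-month review window are illustrative assumptions rather than a prescribed template; the same check could just as easily be a spreadsheet formula.

```python
# A minimal sketch of a fact register of the kind that lives in Excel or
# Google Sheets. Column names, sample rows, and the 12-month review window
# are illustrative assumptions, not a prescribed template.
import csv
import io
from datetime import date

REGISTER_CSV = """fact,source,published,used_in,owner,last_reviewed
"Sector employs 1.2m people","Industry impact study","2019-03-01","Position paper; FAQ","J. Smith","2022-06-15"
"Compliance costs rose 8% in 2023","Member survey","2024-02-10","Policy brief","A. Lee","2024-03-01"
"""

REVIEW_WINDOW_DAYS = 365  # assumption: every fact gets re-checked at least yearly


def stale_facts(csv_text: str, today: date) -> list[dict]:
    """Return register rows whose last review is older than the window."""
    overdue = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        last_reviewed = date.fromisoformat(row["last_reviewed"])
        if (today - last_reviewed).days > REVIEW_WINDOW_DAYS:
            overdue.append(row)
    return overdue


if __name__ == "__main__":
    for row in stale_facts(REGISTER_CSV, date.today()):
        print(f'Overdue: "{row["fact"]}" (owner: {row["owner"]}, '
              f'last reviewed {row["last_reviewed"]})')
```

The script is beside the point. What matters is that once facts, sources, dates, and owners are explicit, staleness becomes a query rather than a guess.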
As complexity increases, those tools start to creak. One fact supports multiple message houses. One message house feeds multiple documents. Review cycles need coordination. This is where more relational tools like Airtable can add real value, allowing teams to connect facts, sources, priorities, messages, and assets without heavy technical overhead.
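As a rough sketch of what that relational structure buys you, the toy model below links facts to message houses and message houses to assets, so that when one fact is superseded, every dependent document can be listed. The entity names and fields are assumptions about one possible model, not Airtable's schema or API.

```python
# A toy relational model: facts feed message houses, message houses feed
# assets. Entity names and fields are illustrative assumptions; in practice
# these would be linked tables in Airtable or a similar tool.
from dataclasses import dataclass, field


@dataclass
class Fact:
    fact_id: str
    statement: str
    source: str
    published: str  # ISO date of the underlying study or dataset


@dataclass
class MessageHouse:
    name: str
    fact_ids: list[str] = field(default_factory=list)


@dataclass
class Asset:
    title: str  # position paper, FAQ, talking points, press release, ...
    message_houses: list[str] = field(default_factory=list)


def assets_affected_by(fact_id: str,
                       houses: list[MessageHouse],
                       assets: list[Asset]) -> list[str]:
    """List every asset that ultimately relies on a given fact."""
    house_names = {h.name for h in houses if fact_id in h.fact_ids}
    return [a.title for a in assets if house_names & set(a.message_houses)]


# Usage: if the employment figure is superseded, which documents need review?
facts = [Fact("F1", "Sector employs 1.2m people", "Impact study", "2019-03-01")]
houses = [MessageHouse("Jobs and growth", fact_ids=["F1"])]
assets = [Asset("Position paper: market reform", message_houses=["Jobs and growth"]),
          Asset("Q3 press release", message_houses=["Jobs and growth"])]
print(assets_affected_by(facts[0].fact_id, houses, assets))
```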
Document management systems remain essential, but they're not sufficient. Storing files well doesn't guarantee that the knowledge inside them is current, traceable, or used deliberately.
Across all of this, the limiting factor is rarely the software. It's ownership and habit. Who maintains the system? How are updates triggered? How are technical experts involved in a sustainable way? And how is knowledge management integrated into everyday public affairs work, rather than treated as an occasional clean-up exercise?
The Judgment Call
I realize this might sound like a lot. More process, more tracking, more coordination. And I'm genuinely uncertain about where the line is between useful discipline and bureaucratic overhead. I don't have a neat formula for that. Different organizations will draw it differently, depending on their size, their political environment, their risk tolerance.
But influence increasingly depends on the quality and currency of evidence. The standards have risen. The scrutiny is sharper. And organizations that treat knowledge management as an afterthought are taking risks they may not fully appreciate.
In the end, this isn't a back-office issue. Knowledge management is a strategic capability. It underpins credibility, coherence, and influence. Organizations that take it seriously are better equipped to adapt, respond, and make evidence work for them, rather than quietly against them.
Whether that's worth the investment of building systems, changing habits, and maintaining discipline—that's a judgment call each organization has to make. But they should make it deliberately, not by default.


