Photo by lisaleo on Pixabay

Black-and-White Thinking as a Driver of Organisational Distortion

Conceptual Overview

Dichotomous thinking — reducing complex matters to two mutually exclusive categories — is well-studied in psychology. When this cognitive style is reproduced at the senior leadership level, it does not remain a private bias. It reshapes the information architecture of the entire organisation. Two major distortions follow:

False Oppositional Structuring

Leaders who cannot hold nuanced positions tend to see internal viewpoints as competing camps. Over time, these camps crystallise into:

  • policy “sides,”
  • departmental identities,
  • and mutually exclusive narratives.

This acts as a kind of epistemic black-and-white filter: the system becomes defined by its poles rather than its continuum.

Inverted Compartmentalisation (Stovepipe Distortion)

Normally, compartmentalisation protects sensitive data. Here, however, compartmentalisation is inverted: it emerges as a defence against the exposure of an internal paradox, not against external risk.

The organisation erects firewalls to keep emotional or political conflicts within senior management from being displayed. This is similar to what organisational theorists call defensive structuring, or siloing as conflict-avoidance.

Observed Phenomena

This effect has been seen across multiple fields:

Gatekeeping Theory: Gatekeeping Around Senior Leadership

“Gatekeeping around senior leadership” refers to the practice of individuals (often executive assistants, managers, or other staff) controlling access to information, resources, and communication with senior executives. This can be a legitimate time-management function or a toxic control tactic, depending on the intent and implementation. Research consistently shows that executive assistants and chiefs of staff act as political buffers:

  • They decide what information reaches senior leaders.
  • They limit contact between rival executives.
  • They prevent open conflict by controlling timing and framing.

This is documented in:

Mintzberg (1973) — A classic work showing that executives rely heavily on gatekeepers to filter information flow.

Henry Mintzberg’s 1973 analysis, The Nature of Managerial Work, observed that managers often rely on various methods to cope with the sheer volume of information they receive. One key finding was the significant role of gatekeepers, often PAs or administrative assistants, in filtering, summarising, and managing information flow to make it digestible for executives.
This delegation of information processing allowed managers to focus on less formal, more verbal communication channels and essential decision-making rather than getting bogged down in every piece of incoming data.

Feldman & March (1981) — Information in Organizations as Signal and Symbol: leaders receive only information that is politically safe or emotionally manageable. The article proposed that information in organisations serves not just instrumental purposes (making better decisions) but also symbolic ones, acting as a signal of legitimacy, competence, and adherence to norms rather than purely for its content or rational utility; this explains why organisations collect far more data than they use.

Published in Administrative Science Quarterly, the article suggested that information use is deeply embedded in social contexts, making its visible deployment important for impression management and for maintaining organisational processes, even when the data is not directly used in decisions.

Organisational politics literature — shows that assistants and chiefs of staff often become de facto conflict managers by filtering interactions. This literature frequently highlights that executive assistants (EAs) and chiefs of staff (CoSs) assume this position largely through their function as gatekeepers, filtering and managing interactions between their principal (the executive) and other staff members or external parties. Key aspects of this dynamic include:

Gatekeeping: EAs and CoSs control access to the executive’s time and attention. By deciding who gets a meeting, which emails reach the principal’s desk, and the nature of the information flow, they implicitly manage potential disputes and information asymmetries that could lead to conflict.

Information Management: They act as central hubs for information, allowing them to frame issues, provide context, or strategically withhold information to smooth over disagreements or prevent them from escalating to the executive level.

Buffer Role: They serve as a buffer, absorbing tension and interpersonal friction before it reaches the principal, thus protecting the executive’s time and emotional energy for strategic decision-making.

Influence and Negotiation: These roles require significant informal influence. They often negotiate solutions between warring parties or align different stakeholders on a path forward without formal authority, using their proximity to power as leverage.

This literature suggests that while not explicitly part of their formal job description, these conflict management activities are a critical component of their effectiveness in supporting the executive’s strategic goals within a politically charged environment.

Concealment of Leadership Disagreements

Organisational sociology and political science describe several mechanisms:

Issue Suppression

When conflicts cannot be resolved, they are structurally buried. This is documented in:

Bachrach & Baratz’s “Two Faces of Power” (1962) — power is exercised by preventing issues from appearing in the first place. The article argues that power isn’t just about making decisions (the first face), but also about the hidden ability to prevent issues from ever reaching the decision-making stage, which the authors call “non-decision-making”.

This “second face” involves consciously or unconsciously shaping societal values, institutions, and political processes to keep certain contentious topics off the public agenda, thereby protecting existing power structures and interests.

The Two Faces Explained:

The First Face (Decision-Making): This is the visible power described by pluralists, where someone influences or controls concrete decisions (e.g., passing a law, funding a project).

The Second Face (Non-Decision-Making): This is the more subtle, “latent” power where dominant groups use resources to set the agenda, filter out challenges, and create barriers so that certain issues (like radical policy changes) never even get discussed in the public arena.

Key Idea: Power is exercised not just by what is decided, but by what is not decided, because powerful actors maintain their advantage by controlling the scope of conflict itself. They create a “mobilization of bias” that favours certain outcomes and suppresses others, making it seem like consensus exists when it’s actually manufactured.

This creates a self-limiting organisation pervaded by an atmosphere of uncertainty, as standard, efficient business practices are replaced by local narratives of control built on false assumptions and on the fear of exposing impotence and indecision.

Pfeffer (1981) — managers hide disagreements to maintain perceived unity. This is a key theme in Jeffrey Pfeffer’s work on organisational dynamics, detailed in his book Power in Organizations (1981), where Pfeffer argued that power and influence are central to organisational life, often manifesting through symbolic actions and the strategic management of perception. Managers engage in “consensus management” or “impression management”, suppressing internal disagreements or handling them privately in order to present a unified, rational front to external stakeholders, upper management, or subordinates. This perceived unity is used to:

Maintain Legitimacy: A united front makes decisions seem more rational, intentional, and authoritative.

Minimize Conflict Perception: Open conflict can be perceived as a sign of weakness or poor leadership, which managers seek to avoid.

Control the Narrative: By controlling the appearance of agreement, managers maintain control over how organisational actions are interpreted.

Information Buffering

Gatekeepers soften or distort messages to avoid escalation. This is seen in:

Eisenberg’s “strategic ambiguity” (1984) — organisations rely on vagueness to allow conflicting views to co-exist. Eisenberg explains how organisations use vague language and concepts (such as mission statements or values) to intentionally allow diverse, even conflicting, interpretations to co-exist, enabling different groups to find common ground and work together without absolute clarity, thus maintaining flexibility and managing complex environments. It is a deliberate communication tactic for achieving goals, fostering unity, and navigating disagreements by agreeing on broad ideas rather than specific details. Key aspects of strategic ambiguity:

Coexistence of Diverse Views: Ambiguous messages allow different factions (stakeholders, departments) to project their own meanings, uniting them under a shared, abstract concept.

Flexibility & Change: It provides room for adaptation and innovation, as strict clarity can hinder change or force difficult choices, whereas ambiguity allows for evolving interpretations.

Manages Conflict: By not committing to one specific interpretation, organizations can appeal to multiple audiences and avoid alienating groups with competing interests.

Relational Variable: Clarity isn’t just in the message; it’s how the sender, message, and receiver interact, with ambiguity arising from a lack of shared interpretation.

Example: A vague promise of “growth” can satisfy both those seeking massive expansion and those prioritising stable, incremental gains.

In essence, ambiguity becomes a valuable organizational resource, creating unity in diversity and allowing for progress without requiring everyone to agree on the exact same thing.

Weick’s sensemaking theory (1995) — contradictions are absorbed into working narratives rather than reconciled. Strictly speaking, Weick does not claim that contradictions are absorbed into the structure without resolution; rather, his theory describes how individuals and groups attempt to impose order and rationality on ambiguous or contradictory situations in order to make sense of them. The key aspect of Weick’s theory is the process of sensemaking itself:

Enactment: People actively create the environment they face through their actions and interpretations. The environment doesn’t just exist objectively; it is “enacted” by participants.

Retrospection: Sensemaking is a backward-looking process. People look back at what has already happened and interpret those cues to construct a coherent, plausible story or explanation of events.

Plausibility over Accuracy: The goal of sensemaking is often to create a plausible or sensible account that allows action to continue, rather than finding a perfectly objective or “true” answer.

Ongoing and Social: Sensemaking is a continuous process and inherently social, involving communication and shared meaning-making with others.

When contradictions arise, the process of sensemaking allows individuals to fit these inconsistencies into a working understanding or narrative. This involves constructing interpretations that bridge the gap or normalize the contradiction, making the overall situation understandable enough to proceed. Rather than the structure itself absorbing the contradiction, the interpretation is adjusted to accommodate it within a cohesive story.

Organisational Shadow Structures

Unofficial channels emerge to keep the “official” structure clean of conflict:

  • Side conversations
  • Selective meetings
  • “Quiet” coalitions

These shadow structures do the real work of conflict navigation.

Why Black-and-White Thinking Produces Stovepiped Systems

Here’s the sequence that the research points to:

Leadership relies on binary interpretations: “Either we do X or we do Y.”

This removes space for nuanced or hybrid solutions: Middle-ground strategies lose legitimacy.

Followers quickly learn what each leader wants to hear: They align themselves accordingly.

Gatekeepers begin filtering information to avoid upsetting contradictions: Each leader receives only the version of reality that fits their dichotomy.

Departments reorganise around the two poles: Silo formation becomes political, not functional.

The organisation develops an inverted stovepipe system: Instead of information flowing vertically for clarity, it flows selectively and asymmetrically to maintain internal peace. Thus, stovepipes become emotional buffers, not operational ones.

This matches what is known in organisational theory as:

  • Defensive decoupling
  • Conflict-avoiding structural differentiation
  • Symbolic unity masking functional fragmentation

Contemporary Perspectives

Modern research ties these dynamics to:

Cognitive biases in leadership: Binary thinking increases when leaders are stressed, overloaded, or facing political threat (Kahneman, Tversky, and successors). Kahneman and Tversky established the foundation for understanding these systematic errors in thinking, demonstrating that humans often rely on mental shortcuts, or heuristics, which become more prominent when executive function is impaired by stress.

When leaders face these pressures, their capacity for nuanced analysis decreases, making the simplicity of “binary thinking” (seeing issues in stark, either/or terms) a common, though often flawed, coping mechanism.

Affective organisational politics: Emotions, not strategy, drive the creation of information firewalls. Affective organisational politics refers to the idea that workplace decisions and actions, such as the creation of “information firewalls” (barriers to information sharing), are primarily driven by emotions and interpersonal relationships rather than purely rational strategic objectives.

Key aspects include:

  • Emotion-Driven Decisions: The core premise is that negative emotions like fear, insecurity, distrust, or resentment between individuals or groups are the root cause of political behaviour, leading people to hoard information or sabotage collaboration.
  • Interpersonal Conflict: Firewalls are often a defence mechanism or an act of retaliation resulting from historical or ongoing emotional conflicts, personality clashes, and a perceived need for self-protection.
  • Subjectivity over Objectivity: The focus shifts from what is best for the organization’s overall strategy (a rational view) to what feels emotionally safe or advantageous for the individuals involved (an affective view).

This concept argues that to understand office politics and information blockages, one must look beyond formal power structures and analyse the underlying emotional dynamics and relationships within the workplace.

Complexity theory: Systems under cognitive stress simplify themselves adaptively — even if simplification distorts reality. Complexity theory, particularly within fields like systems thinking and organizational behaviour, suggests that when systems or individuals face high cognitive stress or information overload, they adaptively seek simplification. This phenomenon is a cognitive shortcut designed to reduce the mental load and enable faster decision-making. This drive for simplification often involves:

Heuristics: Relying on mental shortcuts or rules of thumb rather than engaging in full, complex analysis.

Filtering Information: Selectively paying attention to information that is easiest to process or confirms existing beliefs, while ignoring contradictory data.

Pattern Recognition: Imposing familiar patterns on new, complex situations, sometimes leading to a distorted or inaccurate understanding of reality.

While this simplification can be efficient in the short term for survival or quick decision-making, it carries a key risk: it can distort reality by oversimplifying nuances and ignoring crucial context.

Neuroleadership: Threat responses (social or institutional) make leaders more rigid, increasing dichotomous thinking. Threat responses, whether social or institutional, make leaders more rigid and increase dichotomous (black-and-white) thinking by shifting brain activity from the rational prefrontal cortex (PFC) to the primal limbic system, specifically the amygdala, in a process often called an “amygdala hijack”.

The Neurological Mechanism

Amygdala Activation: The amygdala acts as the brain’s “alarm system,” constantly scanning for threats. When a social threat (e.g., a challenge to status, autonomy, or fairness) is perceived, the brain reacts similarly to a physical threat, triggering an immediate survival response.

PFC Deactivation: As the limbic system becomes highly active, the prefrontal cortex – responsible for executive functions like planning, rationalizing, complex decision-making, and problem-solving – experiences reduced capacity. The higher the emotional stress, the more the logical brain shuts down.

Hormonal Influence: The stress response involves a flood of hormones like cortisol and adrenaline, which heighten alertness but impair the PFC’s ability to see issues clearly, think analytically, and work with others.

Resulting Rigidity and Dichotomous Thinking

This neurological shift leads to the “threat-rigidity effect,” where leaders exhibit specific rigid behaviours and a tendency toward either/or thinking.

Information Processing Narrows: Leaders under threat develop tunnel vision, focusing narrowly on short-term concerns or immediate, tangible goals. They become less open to new information or alternative perspectives, effectively sealing off valuable input and losing the “shades of grey” that open up possibilities.

Reversion to Habits: The brain, seeking to conserve energy under duress, defaults to well-learned or dominant responses and existing practices, even if those are inappropriate for the current situation. This reliance on old habits bypasses the critical analysis needed for complex, novel challenges.

Constriction of Control: In institutional settings, a crisis or threat often leads to power and influence becoming concentrated at higher levels of the hierarchy. Decision-making becomes centralized and more directive, as leaders feel a need to control the situation through established chains of command, further reducing flexibility and collaboration.

Impulsive or Avoidant Decisions: Overloaded by stress, leaders may either freeze, avoiding decisions altogether, or make impulsive choices based on fear or urgency, rather than thoughtful consideration.

Reduced Empathy: The faculties for balanced judgment, attentive listening, and the ability to empathize are among the first to go when the amygdala takes over, damaging trust and team dynamics.

In essence, the threat response system, designed for immediate physical survival, compromises the cognitive flexibility and rational thinking required for effective, nuanced leadership, producing a rigid and often counterproductive “fight or flight” approach to complex social and institutional challenges. Some businesses, it should be noted, want and even force their employees to operate on the edge of panic; they want to see those “headless chickens”. It must therefore be concluded that this rigidity is sometimes induced deliberately rather than merely endured.

Behavioural governance: Oversimplification at the top propagates downward as a culture of fear or loyalty-based framing. Behavioural governance refers to the application of insights from behavioural psychology to understand and influence the decisions and actions of individuals within an organization, particularly focusing on those in leadership roles.

This observation suggests a causal link: the complex realities of decision-making are condensed by senior leadership into simple, easily digestible narratives. This oversimplification can lead to problematic organisational dynamics:

Culture of Fear: When complex issues are reduced to simple “right vs. wrong” or “success vs. failure” scenarios, there is little room for nuance, mistakes, or open discussion. Subordinates may become afraid to present bad news, challenge assumptions, or admit errors for fear of being judged as disloyal or incompetent. This hinders transparency and effective risk management.

Loyalty-Based Framing: Simplification often aligns with the leader’s personal worldview or agenda. Dissenting opinions are framed not as valid business disagreements but as personal disloyalty. Success is attributed to loyalty and adherence to the simple plan, while failures are blamed on deviations or lack of commitment, further solidifying an insular, echo-chamber culture.

In essence, behavioural governance advocates for structures and processes that acknowledge human cognitive biases, encouraging leaders to seek diverse perspectives and avoid the traps of oversimplification that can undermine ethical and effective decision-making.

Summary Model

Black-and-White Thinking → Leadership Polarisation → Gatekeeper Buffering → Shadow Conflict-Management Structures → Inverted Compartmentalisation → Distorted Decision Architecture

This is a well-recognised phenomenon, though typically described through different academic lenses (organisational politics, sensemaking, behavioural decision theory).

The Inverse-Compartmentalisation Model of Organisation

A formal, organisational-level theory of how binary leadership cognition produces stovepiping and covert information firewalls

Core claim

When senior leaders rely on dichotomous (black/white) framing under conditions of stress, competition, or political threat, organisational actors create and enforce information-filters (gatekeepers, shadow channels) that preserve apparent unity while displacing unresolved tensions into inverted compartmentalised stovepipes — damaging decision quality and situational awareness.

Model components and causal sequence

Leadership Cognitive Framing (LCF) — senior leaders adopt binary frames (A vs B) for complex issues (causes: stress, competition, identity threat).

→ Political Pressure & Identity Stakes (PPIS) — disagreements at the top become politically costly to expose.

→ Gatekeeper Buffering (GB) — executive assistants, chiefs of staff, and trusted deputies adopt filtering roles to shield leaders from disconfirming information (information triage and reframing).

→ Shadow Channels & Selective Disclosure (SCSD) — side conversations, ad-hoc coalitions, and selective meetings form to handle contentious matters out of public view.

→ Inverted Compartmentalisation (IC) — formal organisational structures reconfigure so that information flows along politically safe, not functionally optimal, pathways (departments align to political poles rather than domains).

→ Decision Distortion & Reduced Resilience (DDRR) — decisions are based on partial, curated knowledge; the system loses error-detection, feedback loops are broken, and risk accumulates unnoticed.

Each arrow denotes a causal tendency backed by organisational research (see references below).
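
For analysts who want to work with the model computationally, the chain can also be encoded directly as a directed graph and later annotated with observed indicators. What follows is a minimal sketch in Python, assuming the networkx library is available; the stage identifiers follow the acronyms above, and the example query is illustrative rather than part of the model:

```python
# Minimal sketch: the model's causal chain as a directed graph (networkx).
# Stage identifiers follow the acronyms above; the query is illustrative.
import networkx as nx

STAGES = ["LCF", "PPIS", "GB", "SCSD", "IC", "DDRR"]

model = nx.DiGraph()
model.add_nodes_from(STAGES)
# Each edge is a causal tendency, not a deterministic law.
model.add_edges_from(zip(STAGES, STAGES[1:]))

# Example query: every stage downstream of Gatekeeper Buffering,
# i.e. the distortions that buffering tends to set in motion.
print(sorted(nx.descendants(model, "GB")))  # ['DDRR', 'IC', 'SCSD']
```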

Mechanisms

Defensive sense-making: Leaders avoid cognitive dissonance and social threat by simplifying frames (Weick, 1995).

Two-faces of power / agenda control: Power operates by preventing issues from surfacing (Bachrach & Baratz, 1962).

Gatekeeper incentives: Assistants and chiefs of staff minimise conflict and preserve the leader’s psychological capital by filtering (Mintzberg, 1973).

Strategic ambiguity: Organisations tolerate vagueness to co-exist with internal contradictions (Eisenberg, 1984).

Confirmation & motivated reasoning: Subordinates prefer and circulate “safe” narratives that align with leadership identity (Kahneman; Kunda, 1990).

Formal propositions

P1 (Buffering proposition): Greater perceived political cost of disagreement increases the probability that gatekeepers will filter disconfirming information (measurable by number of briefings filtered, meeting cancellations, editorial changes).

P2 (Shadowing proposition): Stronger LCF correlates with increased use of informal channels (measurable by frequency of off-agenda meetings, email/thread audits).

P3 (Siloing proposition): Organisations with persistent inverted compartmentalisation show higher functional misalignment (measurable by divergence between formal org chart and information flow network; a first-pass measure is sketched below).

P4 (Failure proposition): The degree of decision distortion (DDRR) is positively associated with the extent of information filtering and negatively associated with cross-cutting feedback loops.
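
P3’s divergence measure admits a simple first-pass operationalisation. The sketch below is a minimal illustration, assuming the formal org chart and the observed communication network are both available as edge lists; the function name, the Jaccard-distance choice, and the example edges are assumptions for demonstration, not prescribed by the model:

```python
# Minimal sketch for P3: divergence between the formal org chart and the
# observed information-flow network, as Jaccard distance over undirected
# edge sets. All names and edges below are illustrative.

def jaccard_divergence(formal_edges, observed_edges):
    """0.0 when the two networks coincide, 1.0 when they share no ties."""
    formal = {frozenset(e) for e in formal_edges}      # ignore direction
    observed = {frozenset(e) for e in observed_edges}
    union = formal | observed
    return 0.0 if not union else 1.0 - len(formal & observed) / len(union)

org_chart = [("CEO", "CFO"), ("CEO", "COO"), ("CFO", "Controller")]
email_flow = [("CEO", "ChiefOfStaff"), ("ChiefOfStaff", "CFO"),
              ("CEO", "COO")]

print(f"Structural divergence: {jaccard_divergence(org_chart, email_flow):.2f}")
```

In this toy example only one of five distinct ties appears in both networks, giving a divergence of 0.80; persistently high values across repeated snapshots are exactly the functional misalignment P3 predicts.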

Operational indicators

  • High reliance on executive assistants/chiefs of staff to schedule/triage inputs.
  • Repeated cancellation of cross-functional meetings or limited attendee lists.
  • Memos that are heavily redacted or pre-framed before leader review.
  • Emergence of “quiet rooms” or ad-hoc task forces operating outside formal governance.
  • Pattern of decisions that ignore contrary evidence later revealed (post-hoc surprises).
  • Low whistleblower reporting, or reports routed through informal advisers rather than formal compliance channels.

Short diagnostics

  • Network analysis: map email/meeting networks vs org chart — high mismatch suggests IC.
  • Interview sample: ask mid-level managers whether they ever withhold information to avoid upsetting a particular leader (yes/no frequency).
  • Meeting audit: measure proportion of meetings where only “friendly” stakeholders are invited.
  • Document trail: look for repeated edits by gatekeepers that change framing or omit dissent (quantified in the sketch after this list).
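
The document-trail check can likewise be given a first-pass number. Here is a minimal sketch using Python’s standard difflib, assuming successive memo versions are retrievable as plain text; the 0.25 review threshold and the sample memos are illustrative assumptions, not empirically validated values:

```python
# Minimal sketch for the document-trail diagnostic: how much did a memo
# change between the author's draft and the version the leader saw?
# The 0.25 threshold is an illustrative assumption.
import difflib

def edit_magnitude(draft: str, reviewed: str) -> float:
    """Fraction of the text changed: 0.0 (identical) to 1.0 (rewritten)."""
    return 1.0 - difflib.SequenceMatcher(None, draft, reviewed).ratio()

draft = "Field teams report the schedule is not achievable without risk."
reviewed = "Field teams report good progress; minor schedule questions remain."

magnitude = edit_magnitude(draft, reviewed)
if magnitude > 0.25:
    print(f"Flag for independent review: {magnitude:.0%} of the memo changed")
```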

Remedies / interventions (how to break the cycle)

Deliberate red-teaming: independent critical review teams with direct reporting lines to board/audit.

Institutionalised dissent forums: scheduled cross-cutting panels with protected time and immunity for minority views.

Transparent gatekeeper protocols: define what assistants may and may not filter; require logs.

Rotate briefings & broaden attendee lists to include contrarians.

Structural check & balance: route certain types of evidence (safety, legal, compliance) to independent channels.

Leadership cognitive training: bias awareness and stress-resilience (reduce LCF).

Whistleblower protections and anonymous reporting channels routed to an independent office.

Real-world examples (mapped to model stages)

Below are short case summaries, each mapped to the model stage(s) it exemplifies.

Challenger Shuttle (NASA, 1986) — Gatekeeper buffering, suppressed dissent, decision distortion

What happened (short): Engineers raised concerns about O-ring performance in low temperatures; management framed the launch decision within schedule and political pressures; dissenting voices were downplayed and problematic data were not fully surfaced; result: launch failure and loss of life.

Model mapping: LCF (schedule vs safety) → PPIS (political/agency pressure) → GB (engineering concerns not escalated to decision forum) → SCSD (informal reassurances) → DDRR (launch decision based on incomplete info).

Research: Investigations (e.g., Vaughan, 1996) identify managerial normalization of deviance, buffering of technical warnings, and organisational sense-making failures.

BP Deepwater Horizon (2010) — Competing frames, shadow handling, structural misalignment

What happened (short): Safety warnings and risk indicators were present in operations; organisational incentives, cost/efficiency framing, and segmented responsibilities meant critical risk signals failed to alter operational choices; post-spill inquiries highlighted fragmented decision pathways and inadequate safety governance.

Model mapping: LCF (production/efficiency vs safety) → PPIS (commercial incentives) → SCSD & IC (operational units aligned to production goals; safety signals insufficiently centralised) → DDRR (failure to act on warnings).

Research: Independent commission reports documented governance failures, poor communication, and offshore decision bottlenecks.

Enron (early 2000s) — Strategic ambiguity, stovepiping, shadow channels

What happened (short): Financial reporting and off-balance-sheet vehicles obfuscated risk; executive narratives emphasised growth and market genius while dissenting controls were weak or bypassed; complex shadow transactions insulated executives from accountability.

Model mapping: LCF (growth/market success vs accounting reality) → GB (selective exposure of analyses to board) → SCSD (SPVs and off-books channels) → IC (accounting and trading functions misaligned with corporate governance) → DDRR (financial collapse).

Research: Post-mortems (Healy & Palepu; ethics scholars) document controlled information flow and hidden risks.

Intelligence failure / 9/11 Commission (2001) — Stovepiping and suppression of cross-cutting evidence

What happened (short): Fragmented intelligence collection systems, stovepiped reporting lines, and failures to connect disparate signals contributed to missed signals about impending attacks. Certain reporting channels did not share details across agencies due to organizational boundaries and political constraints.

Model mapping: LCF (bureau/mission protection) → PPIS (inter-agency competition) → GB (analysts vet reports to protect turf) → SCSD (restricted dissemination) → IC (information not fused across nodes) → DDRR (missed integrated warning).

Research: 9/11 Commission Report discusses stovepiping and lack of integrated analysis as central failures.

Corporate boardrooms / political administrations (generalized) — political smoothing and assistant gatekeeping

What happens (typical): Executive assistants and chiefs of staff triage flows, sometimes shielding leaders from dissent or defusing conflicts before they surface publicly. This can produce an illusion of consensus while unresolved strategic disagreements persist.

Model mapping: GB is primary; SCSD and IC follow.

Research: Mintzberg’s executive studies and organizational politics literature document the central political role of gatekeepers and chiefs of staff in agenda control and buffering.

Mapping table

Model stage — real-world case(s):

  • LCF (binary framing) — Challenger, BP, Enron
  • PPIS (political cost) — Challenger, Enron
  • GB (gatekeeper buffering) — NASA Challenger, political administrations
  • SCSD (shadow channels) — Enron (SPVs), corporate side negotiations
  • IC (inverted stovepipes) — intelligence stovepiping, BP operational silos
  • DDRR (decision distortion) — all of the above (final common pathway)

How these observations were studied

  • Sense-making & normalization of deviance: Weick; Vaughan (Challenger).
  • Gatekeeper & executive work: Mintzberg (managerial roles; assistants as filters).
  • Power as agenda control / issue suppression: Bachrach & Baratz.
  • Strategic ambiguity as a social tool: Eisenberg.
  • Organisational politics and defensive structuring: Pfeffer, organisational politics literature.
  • Network / stovepipe analysis in intelligence: Commission reports and network analyses of information flow.

Practical checklist for analysts / auditors

  1. Map who filters what to whom (document edits, meeting invites).
  2. Identify topics that never appear on open agendas.
  3. Audit the diversity of voices present at decision points.
  4. Run network centrality vs formal role analysis (discrepancies are red flags; see the sketch after this checklist).
  5. Institute independent review triggers when gatekeeper edits exceed threshold.
  6. Test evidence fusion across units (simulate red-teaming).
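
Item 4 can be prototyped in a few lines. The sketch below is a minimal illustration, again assuming networkx; the names, ranks, edges, and thresholds are invented for demonstration. The red flag it looks for is a formally junior actor who nonetheless brokers most information paths:

```python
# Minimal sketch for checklist item 4: informal communication centrality
# vs formal rank. Names, ranks, edges, and thresholds are illustrative.
import networkx as nx

# Formal rank: lower number = more senior in the hierarchy.
formal_rank = {"CEO": 1, "CFO": 2, "COO": 2, "EA": 4, "AnalystA": 5}

# Observed communication network (e.g. from meeting or e-mail metadata).
comms = nx.Graph([
    ("CEO", "EA"), ("EA", "CFO"), ("EA", "COO"),
    ("CFO", "AnalystA"), ("EA", "AnalystA"),
])

centrality = nx.betweenness_centrality(comms)

# Red flag: junior actors who sit on most shortest communication paths.
for person, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    if formal_rank[person] >= 4 and score > 0.3:
        print(f"{person}: centrality {score:.2f} despite junior formal rank")
```

In this toy network the executive assistant scores roughly 0.83, far above anyone else: precisely the gatekeeper-as-bottleneck pattern the checklist is probing for.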

Caveats & limits

Not every instance of filtering is pathological — some gatekeeping is necessary (triage, confidentiality). The model distinguishes functional filtering from politically motivated inverted compartmentalisation.

Causal inferences require evidence beyond case narratives: network data, meeting logs, interview testimony, and document trails are ideal.

Remedies require cultural as well as structural change; simply mandating more meetings without psychological trust will not suffice.

References

Bachrach, P. & Baratz, M. (1962). “Two Faces of Power.” American Political Science Review.

Eisenberg, E. M. (1984). “Ambiguity as Strategy in Organizational Communication.” Communication Monographs.

Feldman, M. S. & March, J. G. (1981). “Information in Organizations as Signal and Symbol.” Administrative Science Quarterly.

Healy, P. M. & Palepu, K. G. (2003). “The Fall of Enron.” Journal of Economic Perspectives.

Kahneman, D. (2011). Thinking, Fast and Slow.

Kunda, Z. (1990). “The Case for Motivated Reasoning.” Psychological Bulletin.

Mintzberg, H. (1973). The Nature of Managerial Work.

Pfeffer, J. (1981). Power in Organizations.

Vaughan, D. (1996). The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA.

Weick, K. E. (1995). Sensemaking in Organizations.

National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling (2011). Report to the President.

The National Commission on Terrorist Attacks Upon the United States (2004). 9/11 Commission Report.
