Information Economics: Why Transparency Isn't Always Optimal
Your leadership team preaches transparency. Open communication. Radical candor. Information sharing. All decisions should be visible. All data should be accessible. Sunlight is the best disinfectant.
Meanwhile, your negotiations fail because you've revealed your reservation price. Your strategy is copied by competitors who see exactly what you're doing. Your teams are paralyzed by too many opinions on every decision. Your employees game metrics because they know exactly what's being measured.
More information isn't always better. Sometimes ignorance is valuable. Sometimes opacity is strategic. Sometimes limiting information flow improves outcomes.
The question isn't whether to be transparent. It's what information should flow where, when, and to whom.
Information Has Costs
The conventional wisdom treats information as free and more as better. Information economics reveals this is wrong on both counts.
Gathering costs: Someone must collect data, verify it, structure it, and maintain it. Market research requires time and budget. Customer feedback requires systems to capture and organize. Competitive intelligence requires dedicated resources.
Processing costs: Humans have limited cognitive capacity. More information creates cognitive overload and analysis paralysis. The executive buried in reports makes worse decisions than one with curated summaries.
Coordination costs: When more people have information, more people have opinions. More opinions require more meetings, more consensus-building, more compromise. Decisions slow down or never get made.
Strategic costs: Information revealed to your team can leak to competitors. Information about your reservation price undermines negotiations. Information about strategic plans constrains flexibility.
The optimal amount of information isn't "all of it." It's the amount where marginal benefits equal marginal costs.
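As a toy illustration, a stopping rule makes this concrete. The benefit and cost curves below are invented for the sake of arithmetic; the point is the structure, not the numbers: keep consuming information while the next unit's benefit exceeds its cost.

```python
# Illustrative only: hypothetical diminishing benefits and rising costs
# of each additional report an executive reviews.
def marginal_benefit(n):
    return 100 / (n + 1)   # each extra report adds less insight

def marginal_cost(n):
    return 10 + 2 * n      # each extra report adds more processing load

# Keep adding information while the next unit is worth more than it costs.
n = 0
while marginal_benefit(n + 1) > marginal_cost(n + 1):
    n += 1

print(n)  # the optimum under these assumed curves: well short of "all of it"
```

With any diminishing-benefit, rising-cost pair of curves, the optimum lands well before exhausting the available information.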
Types of Information Problems
Information economics identifies distinct problems that arise when information is asymmetric or imperfect.
Adverse Selection: Hidden Information Before Transaction
One party knows something the other doesn't, and this asymmetry shapes decisions.
Hiring is the classic example. Candidates know their actual capabilities, work ethic, and culture fit. Employers don't. This creates adverse selection; employers can't distinguish strong candidates from weak ones before hiring, leading to risk aversion and inefficient hiring processes.
Used car markets illustrate this perfectly. Sellers know whether their car is reliable or a lemon. Buyers don't. Buyers assume the worst, offering low prices. Only owners of actual lemons accept low prices. Good cars exit the market. The market collapses to only lemons.
This dynamic appears throughout business: acquiring companies without knowing their internal problems, choosing vendors without knowing their delivery capability, forming partnerships without knowing hidden liabilities.
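The collapse can be sketched as a small simulation. Every number here is invented; the mechanism is what matters: buyers offer the average quality of what's on the market, above-average sellers exit, and the offer ratchets down.

```python
import random

random.seed(0)
# Hypothetical market: 1,000 used cars whose true quality only sellers know.
qualities = [random.uniform(0, 2000) for _ in range(1000)]

price = sum(qualities) / len(qualities)  # buyers open by offering average quality
for _ in range(20):
    # Only sellers whose car is worth no more than the offer will sell.
    on_market = [q for q in qualities if q <= price]
    if not on_market:
        break  # nobody is willing to sell at this price
    price = sum(on_market) / len(on_market)  # buyers re-anchor on what's left

print(round(price, 1))  # the offer spirals down toward the worst lemons
```

Each round, the good cars above the current offer leave, which drags the average down further, which drives out the next tier of cars. The spiral is the adverse selection.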
Moral Hazard: Hidden Action After Transaction
One party takes actions the other can't observe, and doesn't bear the full consequences of those actions.
Insurance creates moral hazard: once insured, people take more risks because they don't bear full costs of bad outcomes. This is why insurers require deductibles and exclude certain behaviors.
Employment relationships have identical dynamics. Employees with guaranteed compensation can reduce effort without immediate consequences. Managers can take organizational risks where they capture upside but the organization bears downside.
This is the principal-agent problem from Article 3 of this series, viewed through an information lens. Agents take hidden actions that principals can't fully monitor.
Signaling: Costly Actions That Reveal Hidden Information
When you can't observe quality directly, you look for costly signals that correlate with quality.
Education is signaling. A degree doesn't necessarily teach job-relevant skills, but completing a degree signals conscientiousness, intelligence, and ability to finish what you start. Employers pay for the signal, not just the education.
In organizations, signaling appears everywhere:
Working long hours signals commitment (even if actual productivity is low)
Detailed presentations signal thoroughness (even if decisions don't improve)
Using complex jargon signals expertise (even if simpler language would communicate better)
The problem with signaling: it creates costs without creating value. People invest in signals instead of substance. The organization pays for education that doesn't improve performance, rewards visible busyness over actual output, and values presentation polish over decision quality.
Screening: Design Choices That Reveal Hidden Information
Screening flips signaling; instead of the informed party sending signals, the uninformed party designs choices that reveal information.
Insurance companies screen through coverage options: high deductibles attract low-risk customers, low deductibles attract high-risk ones. Price separates customers by risk profile.
Hiring screens through interview design: coding challenges reveal technical capability, case studies reveal problem-solving approach, reference checks reveal past performance. Each screen reveals information candidates can't credibly claim.
The effectiveness of screening depends on designing choices that separate types in observable ways.
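A minimal sketch of the insurance menu above, with made-up premiums, deductibles, and risk probabilities. Neither customer type announces its risk; the menu separates them anyway, because each type minimizes its own expected cost.

```python
# Hypothetical contract menu; all dollar amounts and probabilities are invented.
LOSS_PROB_LOW, LOSS_PROB_HIGH = 0.05, 0.40

contracts = {
    "high_deductible": {"premium": 300, "deductible": 5_000},
    "low_deductible": {"premium": 1_300, "deductible": 500},
}

def expected_cost(loss_prob, contract):
    # The premium is always paid; the deductible only when a loss occurs.
    return contract["premium"] + loss_prob * contract["deductible"]

def best_choice(loss_prob):
    return min(contracts, key=lambda name: expected_cost(loss_prob, contracts[name]))

low_risk_pick = best_choice(LOSS_PROB_LOW)    # rarely claims, so cheap premium wins
high_risk_pick = best_choice(LOSS_PROB_HIGH)  # claims often, so pays up to cut the deductible
print(low_risk_pick, high_risk_pick)
```

The design work is in pricing the menu so the types actually separate: too small a premium gap and everyone buys the same plan, and the screen reveals nothing.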
Why More Information Isn't Always Better
If information were free and costless to process, more would always be better. Reality is different.
Processing Costs: Cognitive Overload and Decision Paralysis
Humans can process perhaps 7 pieces of information in working memory at once. Beyond that, we don't get better decisions; we get overwhelmed people making worse decisions or avoiding decisions entirely.
The executive reviewing 50-page reports misses critical insights buried in detail. The team with access to all data can't identify what matters. The organization measuring 100 metrics can't prioritize any of them.
This is why executive summaries exist. Why dashboards show 5 key metrics, not 50. Why "too much information" is a real problem, not just a complaint.
Coordination Costs: More Informed People, Harder Coordination
In centralized decision-making, one person decides. Add more informed stakeholders and you add coordination costs.
Each person has preferences. Each sees different aspects of the problem. Each wants input. Reaching consensus requires meetings, compromise, and politics. The more people informed, the harder to coordinate.
This explains why successful organizations often limit information: not to hide things, but to keep decision-making efficient. The product team doesn't need to know financial details irrelevant to product decisions. Finance doesn't need to know technical implementation details irrelevant to budget allocation.
Strategic Costs: Information Helps Competitors and Reduces Flexibility
Transparent organizations telegraph strategy to competitors. Competitors can copy what works and avoid what doesn't, getting benefits without bearing experimentation costs.
Public companies face this constantly: detailed disclosure helps competitors more than it helps shareholders. The tension between transparency requirements and competitive advantage is real.
Internal transparency creates similar issues. If everyone knows you're considering a reorganization, people optimize for the expected structure rather than current reality. If teams know budget is tight, they hoard resources instead of sharing.
Information that would improve decisions in a cooperative world enables gaming in a competitive or political environment.
Gaming: When You Measure It, People Optimize for It
Goodhart's Law: "When a measure becomes a target, it ceases to be a good measure."
Make metrics public and people optimize for the metrics rather than underlying performance. Customer service reps optimize handle time over problem resolution. Sales reps optimize pipeline metrics over actual deals. Engineers optimize visible commits over actual code quality.
The more transparent the metric, the more aggressively it gets gamed. This doesn't mean people are bad; it means they respond rationally to incentives created by visible measurement.
Sometimes opacity is valuable precisely because it prevents gaming. If people don't know exactly what's measured, they can't optimize for it. They must actually perform well rather than perform well on measurements.
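A deliberately crude sketch of that logic. The effort and value functions are invented; they encode only the assumption that effort spent gaming a visible metric crowds out the unmeasured work that actually matters.

```python
# Hypothetical model: an agent splits effort between a visible metric
# (e.g. handle time) and unobserved true performance (problem resolution).
def effort_on_metric(metric_visibility):
    # A rational agent directs effort toward whatever is visibly rewarded;
    # visibility is a fraction in [0, 1].
    return metric_visibility

def true_value(gaming_effort):
    # True performance depends mostly on the unmeasured work (assumed slope).
    return 1.0 - 0.8 * gaming_effort

opaque = true_value(effort_on_metric(0.2))       # agent unsure what's measured
transparent = true_value(effort_on_metric(1.0))  # metric fully visible, fully gamed
print(opaque, transparent)
```

Under these assumptions, full metric visibility produces the worst true performance, not because people are bad, but because they optimize exactly what they can see being rewarded.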
The Transparency Trade-Off
Transparency has real benefits. But it has real costs. The optimal level depends on context.
Benefits of Transparency
Trust: People trust what they can verify. Transparency builds trust by making actions and decisions visible. "Trust but verify" requires information to verify.
Informed decisions: Better information enables better decisions—when people can process it and when it's actually relevant.
Accountability: Visibility prevents abuse. When actions are visible, people can't easily hide mistakes, corner-cutting, or self-dealing.
Reduced information asymmetry: Making information available reduces adverse selection and signaling costs. Markets work better with symmetric information.
Costs of Transparency
Loss of negotiating leverage: Revealing your reservation price, your alternatives, or your constraints weakens your position.
Strategic exposure: Telegraphing plans helps competitors, reduces flexibility to pivot, and creates commitment devices you didn't intend.
Gaming and compliance theater: Visible metrics get gamed. People optimize for measurement over performance.
Coordination costs: More informed stakeholders mean more opinions, more politics, more time to reach decisions.
The trade-off is context-dependent. There's no universal right answer about transparency level.
When to Reduce Information Flow
Negotiations
Don't reveal your reservation price. Don't show your full hand. Don't telegraph your constraints.
Negotiation is adversarial. Information asymmetry is leverage. Giving away information is giving away value.
This doesn't mean lying. It means controlling what you reveal, when, and how.
Strategy Development
Early-stage strategic thinking benefits from opacity. If every iteration is public, people react to drafts as if they're final. If competitors see plans before execution, they can preempt.
Some strategic thinking should happen in small groups with limited information flow until plans are solid enough to communicate broadly.
Sensitive Personnel Issues
Privacy matters. Not everything should be public. Performance issues, compensation details, personal circumstances; these require discretion.
Transparency in aggregate (how compensation works, how performance is evaluated) is valuable. Transparency at individual level is invasive and creates unproductive comparison and resentment.
Early-Stage Projects
Too much feedback too early kills innovation. If every experiment requires consensus, nothing experimental happens.
Giving teams room to explore without constant visibility creates space for ideas that wouldn't survive premature scrutiny. Some projects should be stealth until they're ready for broader input.
When to Increase Information Flow
Coordination-Dependent Work
When success requires coordination across teams, people need shared context. Cross-functional projects. Integrated systems. Synchronized releases.
Limiting information in coordination-dependent work creates failures where teams optimize locally without understanding global constraints.
Crises
Information vacuums fill with rumors and speculation. In crises, uncertainty is worse than bad news.
Over-communicate in crises. Provide updates even when there's nothing new. Make information flow frequent and predictable. Reduce uncertainty by being transparent about what's known and what isn't.
Change Management
People resist change partly because uncertainty is threatening. Transparency about what's changing, why, and how reduces resistance.
This doesn't mean revealing everything. It means providing enough information that people can understand and prepare for changes affecting them.
Accountability Situations
When abuse or incompetence is possible, transparency prevents it. Financial oversight. Quality audits. Performance reviews that affect compensation.
Opacity enables hiding problems. Transparency exposes them, creating incentives for better performance.
Designing Information Architecture
Rather than defaulting to "transparent" or "opaque," design information flow deliberately.
Who Needs to Know?
Not everyone needs all information. Target information to who can use it.
Engineers need technical details. Leadership needs strategic context. Finance needs budget data. Not everyone needs everything.
Over-sharing creates noise. Under-sharing creates blind spots. Match information to audience.
What Do They Need to Know?
Different audiences need different granularity. Executives need summaries. Analysts need details. Stakeholders need highlights.
Provide appropriate detail level. Too much overwhelms. Too little leaves people unable to make informed decisions.
When Do They Need to Know?
Timing matters. Real-time information for operational decisions. Periodic updates for strategic oversight. Final outcomes for accountability.
Don't provide information before people can use it; that's just noise. Don't withhold information past when it's needed; that's information asymmetry creating problems.
How Should They Receive It?
Medium matters. Dashboards for ongoing monitoring. Reports for periodic review. Meetings for discussion and alignment.
Match medium to content and purpose. Not everything needs a meeting. Not everything should be asynchronous.
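The four questions can be collected into a single routing table. This is one hypothetical sketch of such a table; the audiences, cadences, and media are illustrative, not prescriptive.

```python
from dataclasses import dataclass

@dataclass
class InfoRoute:
    audience: str     # who needs to know
    granularity: str  # what level of detail
    cadence: str      # when they need it
    medium: str       # how they receive it

# Illustrative architecture: each audience gets a different slice, not everything.
architecture = [
    InfoRoute("engineering", "full technical detail", "real-time", "dashboard"),
    InfoRoute("leadership", "strategic summary", "monthly", "report"),
    InfoRoute("finance", "budget figures", "quarterly", "report"),
]

for route in architecture:
    print(f"{route.audience}: {route.granularity} via {route.medium}, {route.cadence}")
```

Writing the table down forces the deliberate choices this section argues for: every row is a decision about who gets what, when, and how, rather than a default to "everyone, everything, always."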
Information Cascades and Herding
A subtle problem with information sharing: people observe others' actions and discount their own information.
Classic example: A restaurant looks empty. Is it bad, or are you just early? You walk past. The next person sees the empty restaurant, sees you walking past, and walks past too. Each person treats earlier decisions as information and discounts their own judgment. The restaurant stays empty even if it's good.
This is an information cascade: each person's decision provides information to others, creating herding behavior that can be collectively wrong.
Organizations exhibit identical dynamics:
Someone objects to a plan. Others pile on, ignoring their own assessment.
A project gets funded. Others assume it was vetted thoroughly and support it.
A strategy seems popular. People support it to avoid looking contrarian.
Information cascades can lead to correct convergence or collective mistakes. The problem is that later decisions aren't truly independent—they're contaminated by observing earlier decisions.
This argues for sometimes collecting information or decisions independently rather than sequentially. Don't let early opinions cascade into false consensus.
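The restaurant dynamic can be simulated directly. The 70% signal accuracy and the "follow a two-vote lead" rule below are standard textbook-style assumptions, not empirical values; the output counts how often a sequence locks into the wrong answer even though most private signals are correct.

```python
import random

random.seed(3)
TRUE_STATE = "good"
SIGNAL_ACCURACY = 0.7  # each person's private read is right 70% of the time

def private_signal():
    return TRUE_STATE if random.random() < SIGNAL_ACCURACY else "bad"

def decide(prior_choices, signal):
    # Cascade rule: if earlier choices lean two or more one way,
    # follow the crowd and ignore your own signal.
    lead = prior_choices.count("good") - prior_choices.count("bad")
    if lead >= 2:
        return "good"
    if lead <= -2:
        return "bad"
    return signal

def run_sequence(n=20):
    choices = []
    for _ in range(n):
        choices.append(decide(choices, private_signal()))
    return choices

# Count sequences of 20 people that ended locked into the wrong answer.
wrong_cascades = sum(run_sequence()[-1] == "bad" for _ in range(1000))
print(wrong_cascades)
```

Even with mostly accurate private signals, a meaningful fraction of sequences converge on the wrong answer, because two early mistakes are enough to make everyone after them ignore what they privately know. Collecting those signals independently, before anyone sees anyone else's choice, breaks the cascade.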
Signaling and Credibility
Some information is cheap talk—easy to claim, hard to verify. Other information is a costly signal: expensive to fake, credible when demonstrated.
Cheap Talk vs. Costly Signals
Saying "we value quality" is cheap talk. Anyone can say it. It reveals nothing.
Slowing releases to fix quality issues is a costly signal. It's expensive, so claiming to value quality while shipping broken products isn't credible.
Employees learn quickly whether words or actions matter. If you say collaboration is important but reward individual heroics, people believe the rewards, not the words.
Making Commitments Credible
Commitments without costs aren't credible. "We're committed to this strategy" means nothing if there's no cost to abandoning it.
Credible commitments require:
Irreversible investments (hiring specialized teams, building dedicated infrastructure)
Public declarations (reputation cost to backing out)
Contractual obligations (actual penalties for non-delivery)
In negotiations and strategy, credibility comes from demonstrating costly commitment, not just stating intent.
When to Be Transparent to Build Trust
Transparency can be a costly signal of trustworthiness. Voluntarily revealing information you could hide signals you have nothing to hide.
Open-source code signals confidence in quality. Public roadmaps signal commitment to direction. Transparent pricing signals you're not exploiting information asymmetry.
These signals work because they're costly; they give up information advantage, make you accountable, and constrain flexibility.
When to Be Opaque to Preserve Flexibility
Sometimes you want to preserve flexibility, not signal commitment. Revealing plans creates expectations and commitment devices.
If you're uncertain about direction, opacity preserves optionality. If you're negotiating, opacity preserves leverage. If you're exploring, opacity prevents premature reactions.
Opacity signals that commitment is not yet made. This can be valuable when you genuinely aren't ready to commit.
Bringing It Together
This article concludes the Strategic Design Frameworks series. Information economics ties together the preceding frameworks:
Game theory (Article 1) requires information. Cooperation depends on knowing others' actions. Competition depends on hiding yours. The games people play depend on what they know and when.
Network structure (Article 2) determines information flow. Brokers control information between groups. Central nodes aggregate information. Structural holes create information asymmetries.
Agency problems (Article 3) stem from information asymmetry. Agents know things principals don't. This information advantage enables pursuing their own interests. Solving agency problems requires reducing information gaps.
Systems dynamics (Article 4) depend on feedback: information about system state feeding back to affect decisions. Cut off information feedback and you're flying blind, making interventions that backfire.
All of these frameworks involve information: who has it, who doesn't, how it flows, and what effects this creates.
What This Means for You
Stop defaulting to "transparent" or "opaque" as binary choices. Instead, design information architecture deliberately:
Identify what information exists (strategic plans, financial data, performance metrics, competitive intelligence)
Assess costs and benefits of sharing:
Costs: Processing load, coordination burden, gaming risk, strategic exposure
Benefits: Trust, informed decisions, accountability, coordination
Design who gets what, when, how:
Who: Match information to who can use it
What: Appropriate granularity for different audiences
When: Timing aligned with when decisions get made
How: Medium matched to content and purpose
Adjust based on context:
Increase transparency for accountability and coordination
Reduce transparency for negotiations, early exploration, and gaming prevention
Use transparency strategically as signal:
Reveal to build trust when you have nothing to hide
Conceal to preserve flexibility when you haven't committed
The default advice, "be transparent", treats information as free and universally beneficial. Information economics reveals this is wrong. Information has costs. Transparency has trade-offs. The optimal information design depends on what you're trying to achieve.
Your organization has an information architecture whether you designed it or it emerged accidentally. The question is whether it serves your objectives or undermines them.
Design it deliberately. Recognize that more information isn't always better. Understand that sometimes strategic opacity is more valuable than naive transparency.
Information is a tool. Use it strategically.