
Claude Elwood Shannon

Also known as: Claude E. Shannon, Father of Information Theory

Claude Elwood Shannon (1916–2001) was an American mathematician and electrical engineer whose work founded information theory and transformed modern conceptions of communication, randomness, and information. Trained at the University of Michigan and MIT, Shannon first gained attention by demonstrating that Boolean logic and electrical relay circuits were structurally isomorphic, providing an abstract, algebraic foundation for digital computing. During World War II, work on cryptography and secure communication at Bell Labs sharpened his sense that information could be rigorously quantified, regardless of its semantic content. In his groundbreaking 1948 paper “A Mathematical Theory of Communication,” Shannon introduced information entropy, channel capacity, and coding theorems that established limits on reliable communication in the presence of noise. While his theory was explicitly "syntactic"—concerned with structure rather than meaning—it indirectly reshaped philosophy of language, epistemology, philosophy of mind, and metaphysics by framing information as a measurable, physical and probabilistic quantity. Philosophers and theorists later extended and criticized this framework in debates about semantic information, cognition as information processing, and information-based accounts of reality. Shannon’s austere, engineering-oriented perspective thus became a central reference point for analytic philosophy of information and for broader reflections on the digital and cybernetic condition.

At a Glance

Quick Facts
Field: Thinker
Born: 1916-04-30, Petoskey, Michigan, United States
Died: 2001-02-24, Medford, Massachusetts, United States
Cause: Complications of Alzheimer's disease
Active in: United States, North America
Interests: Information theory; communication systems; cryptography; digital logic; error-correcting codes; randomness and probability; machine intelligence; mathematical foundations of communication
Central Thesis

Information can be treated as a quantifiable, probabilistic property of messages, independent of their meaning, and the effectiveness and limits of communication systems can be rigorously analyzed in terms of this measured information, the capacity of channels, and the presence of noise.

Major Works
A Symbolic Analysis of Relay and Switching Circuits (extant)
Composed: 1937–1938

Communication Theory of Secrecy Systems (extant)
Composed: 1945–1949

A Mathematical Theory of Communication (extant)
Composed: 1945–1948

The Mathematical Theory of Communication (extant)
Composed: 1948–1949

Prediction and Entropy of Printed English (extant)
Composed: 1950–1951

Coding Theorems for a Discrete Source with a Fidelity Criterion (extant)
Composed: 1958–1959

Key Quotes
The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.
Claude E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, 1948.

Opening formulation of the communication problem; it exemplifies Shannon's deliberately austere, structural approach, which later provoked philosophical reflection on what such an approach leaves out regarding meaning and understanding.

Frequently the messages have meaning; that is, they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem.
Claude E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, 1948.

Shannon explicitly brackets semantics in order to construct a "syntactic" theory of communication, a move that became central to philosophical debates about whether and how semantic information can be reduced to or built on top of Shannon information.

Information is the resolution of uncertainty.
Paraphrased formulation derived from Shannon’s definition of information as reduction in logarithmic uncertainty; widely attributed to Shannon in secondary literature drawing on his 1948 paper.

A concise characterization used by philosophers and theorists to capture the epistemic dimension of Shannon’s measure: information as that which narrows the space of possibilities.

The significant aspect is that the actual message is one selected from a set of possible messages.
Claude E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, 1948.

Emphasizes that information depends on the range of alternatives, inspiring later philosophical analyses in which information and content are tied to counterfactual possibilities and probabilistic spaces.

My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea: 'You should call it entropy...'
Recounted by Myron Tribus from a 1961 conversation with Shannon, in M. Tribus and E. C. McIrvine, "Energy and Information," Scientific American (1971); an anecdote about the naming of entropy that is widely repeated in the secondary literature.

Illustrates the conceptual kinship between Shannon’s information measure and thermodynamic entropy, a connection that has generated extensive philosophical debate about information, physical law, and the arrow of time.

Key Terms
Shannon Information (Information Entropy): A quantitative measure of the uncertainty or "surprise" associated with a random variable, defined as the expected value of the negative logarithm of its probability distribution.
Channel Capacity: The maximum rate at which information can be transmitted over a communication channel with arbitrarily low error, given the channel’s noise characteristics.
Signal-to-Noise Ratio (SNR): A measure comparing the strength of the desired signal to the level of background noise, crucial in Shannon’s analysis of how much information can be reliably transmitted.
Redundancy: The portion of a message that is predictable or repeated, which in Shannon’s theory can be exploited both to compress data and to correct errors in noisy communication channels.
Shannon–Weaver Model of Communication: A structural model distinguishing information source, transmitter, channel, receiver, and destination, focusing on the technical transmission of signals rather than their semantic content.
Shannon Entropy vs. Thermodynamic Entropy: A conceptual linkage between the probabilistic measure of information and the physical measure of disorder in statistical mechanics, inviting philosophical analysis of information as a physical quantity.
Computational Theory of Mind: A family of views in [philosophy](/topics/philosophy/) and cognitive science that conceive mental states and processes as forms of information processing, heavily indebted to Shannon’s formalization of information.
Intellectual Development

Formative Years and Engineering Intuition (1916–1936)

Growing up in rural Michigan, Shannon showed early talent for tinkering with radios, telegraph lines, and mechanical devices. Undergraduate studies in electrical engineering and mathematics at the University of Michigan exposed him to Boolean logic and classical physics, nurturing a habit of thinking simultaneously in concrete hardware and abstract symbolic terms.

Digital Logic and Abstraction at MIT (1936–1941)

As a graduate student at MIT, Shannon worked with Vannevar Bush’s differential analyzer and produced his master’s thesis demonstrating how Boolean algebra could describe relay and switching circuits. This phase crystallized his characteristic style: representing physical systems with minimalist mathematical structures, a methodological stance that later informed his treatment of information and communication.

War Work and the Birth of Information Theory (1941–1949)

At Bell Labs during World War II, Shannon studied error, noise, and secrecy in communication systems and wrote a classified report on cryptography. These experiences led directly to his 1948–49 work on the mathematical theory of communication, where philosophical questions about meaning were bracketed in favor of structural properties—yet his formalism implicitly challenged existing views on language, signal, and knowledge.

Consolidation, Generalization, and Play (1950–1966)

In the decades after his seminal paper, Shannon explored coding theory, sequential machines, and early machine intelligence (including chess-playing machines and mechanical mice). After moving to MIT in 1956, he played a key role in shaping communications science; philosophically, this phase deepened the reach of his information-theoretic concepts into cybernetics, systems theory, and emerging cognitive science.

Late Life and Posthumous Influence (1966–2001 and beyond)

After early retirement from MIT, Shannon published little and spent much of his time building whimsical gadgets and juggling robots. However, his mathematical ideas gained increasing prominence in philosophy, especially as digital computing, cognitive science, and the philosophy of information developed. Posthumously, his work is integral to debates on semantic information, computational theories of mind, and information-centric metaphysics.

1. Introduction

Claude Elwood Shannon (1916–2001) is widely regarded as the originator of modern information theory, a mathematical framework that reshaped understandings of communication, randomness, and information across the sciences and in philosophy. His 1948 paper A Mathematical Theory of Communication introduced a precise quantitative measure of information—now called Shannon entropy—and a general model of communication that separates source, channel, and receiver. These constructions became foundational for digital communication and for later philosophical reflection on information and meaning.

Shannon’s background in both electrical engineering and mathematics enabled him to connect Boolean logic with relay circuits, giving an abstract basis for digital computation, and to treat messages as probabilistic structures transmitted through noisy channels. Proponents of his approach emphasize its economy: it deliberately brackets questions of semantics and interpretation in order to analyze how reliably symbols can be encoded, transmitted, and decoded.

Philosophers, cognitive scientists, and theorists of language later appropriated Shannon’s ideas in divergent ways. Some interpret his work as supplying a neutral, syntactic core on which richer theories of semantic information and mental representation can be built; others read it as emblematic of a broader “information age” mentality that risks reducing meaning, agency, and social context to quantifiable signals.

The following sections examine Shannon’s life and context, the development of his thinking, his principal technical achievements, and the diverse philosophical uses and critiques of his conception of information, while maintaining the distinction he drew between engineering problems and questions of meaning.

2. Life and Historical Context

Shannon’s life unfolded alongside the emergence of radio, telephony, digital computing, and Cold War communications research, and his career is often situated within this broader transformation of twentieth‑century technology.

Biographical outline and milieu

Born in rural Michigan in 1916, Shannon studied electrical engineering and mathematics at the University of Michigan before moving to MIT in 1936, a period when analog devices like Vannevar Bush’s differential analyzer dominated computation. His early exposure to telegraphy, radio, and mechanical gadgets is frequently cited as a formative backdrop to his later interest in abstraction grounded in hardware.

World War II marked a decisive historical context. Working at Bell Telephone Laboratories, Shannon contributed to military communication and cryptography efforts. Historians of technology emphasize that Bell Labs, combining industrial research with national defense projects, provided an environment where practical engineering problems (noise, bandwidth, secrecy) intersected with high-level mathematical tools.

Historical positioning

Shannon’s major achievements appeared in the late 1940s, just as:

Domain | Concurrent developments (approx.)
Computing | Early digital machines (ENIAC, EDVAC)
Physics | Consolidation of quantum mechanics, statistics
Cybernetics | Norbert Wiener’s Cybernetics (1948)
Communication | Expansion of telephone, radio, and radar

Scholars disagree on how tightly Shannon should be linked to cybernetics. Some frame him as a central cybernetic thinker, given shared emphases on feedback and communication; others stress his relative independence from Wiener’s explicitly interdisciplinary and philosophical program.

Cold War conditions—including interest in secure communication, command-and-control, and early computing—also shaped the reception of Shannon’s work. Critics in media and social theory later interpreted this context as contributing to an instrumental, militarized conception of information, whereas engineering historians more often portray it as an enabling background for a generalizable theory of communication.

3. Intellectual Development

Shannon’s intellectual trajectory is often divided into phases that reflect changing problems and methods, while retaining a distinctive tendency toward structural abstraction.

Early formation and dual training

During his undergraduate years at the University of Michigan, Shannon studied electrical engineering and mathematics in parallel. Commentators note that this dual training encouraged him to move easily between circuit diagrams and symbolic expressions, setting the stage for his later unification of Boolean algebra and relay circuits.

At MIT, which he entered as a graduate student in 1936, work with Vannevar Bush’s differential analyzer led to his master’s thesis, where he modeled switching circuits using Boolean logic. This step from physical machinery to algebraic representation is frequently identified as his first major conceptual leap.

War-time focus and the birth of information theory

At Bell Labs, wartime projects on cryptography, fire-control systems, and noisy telephone channels sharpened Shannon’s interest in probability, error, and redundancy. His classified report on secrecy systems anticipated many ideas later published in “Communication Theory of Secrecy Systems.” Historians of science argue that this period consolidated his characteristic style: treating diverse communication problems as instances of one abstract structure.

Postwar generalization and playful exploration

In the 1950s and early 1960s, Shannon broadened his formal tools to include Markov processes, coding theory, and models of sequential machines, while also building chess-playing programs and maze-solving mechanical mice. Some interpreters see these experiments as tentative explorations toward machine intelligence; others view them mainly as demonstrations of information‑theoretic principles rather than precursors to full-fledged artificial intelligence.

After his early retirement from MIT in 1966, Shannon’s published output declined, but his private tinkering with gadgets, juggling robots, and unicycles is often read as a continuous extension of his interest in combinatorics, randomness, and games into more whimsical domains.

4. Major Works

Shannon’s major publications span switching theory, cryptography, and information theory proper. They are frequently treated as landmarks in the mathematical and engineering literature.

Key texts

Work | Focus | Typical significance attributed
A Symbolic Analysis of Relay and Switching Circuits (1937) | Application of Boolean algebra to relay circuits | Foundation for digital circuit design and switching theory
Communication Theory of Secrecy Systems (1945, declassified 1949) | Mathematical analysis of cryptographic systems | Formal conditions for “perfect secrecy”; links between cryptography and communication theory
A Mathematical Theory of Communication (1948) | General theory of information transmission | Introduction of entropy, channel capacity, and coding theorems
The Mathematical Theory of Communication (1949, with Weaver) | Book version of 1948 paper plus philosophical introduction | Vehicle for cross‑disciplinary dissemination, especially via Weaver’s interpretive essay
Prediction and Entropy of Printed English (1951) | Estimating redundancy and entropy of English text | Empirical grounding for information measures applied to language
Coding Theorems for a Discrete Source with a Fidelity Criterion (1959) | Rate–distortion theory | Extension of capacity concepts to lossy compression with controlled distortion
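
A minimal Python sketch may help convey the kind of estimate pursued in "Prediction and Entropy of Printed English": it computes a zeroth-order (single-character frequency) entropy for a short sample. The sample text and function name are illustrative only; Shannon's own estimates relied on n-gram statistics and human prediction experiments and arrived at much lower per-character figures (on the order of one bit).

```python
# Zeroth-order (unigram) entropy estimate for a text sample, in bits per character.
# Illustrative only: Shannon's 1951 estimates used higher-order statistics
# and human prediction experiments, not this simple frequency count.
from collections import Counter
from math import log2

def unigram_entropy(text: str) -> float:
    """Entropy in bits per character under an i.i.d. single-character model."""
    counts = Counter(text.lower())
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

sample = "the fundamental problem of communication is that of reproducing a message"
print(f"zeroth-order estimate: {unigram_entropy(sample):.2f} bits per character")
```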

Interpretive perspectives

Engineering and mathematical commentators typically emphasize the technical originality of these works, especially the coding theorems and capacity formulas. Historians of computing focus on the 1937 thesis as a turning point toward digital logic, whereas information theorists regard the 1948 paper as inaugurating an autonomous discipline.

Philosophically oriented readers often single out the combination of the 1948 paper with Warren Weaver’s interpretive introduction as a crucial moment where a strictly syntactic communication theory was presented alongside broader speculations about human communication, meaning, and society, which then influenced subsequent philosophical debates.

5. Core Ideas and Concepts

Shannon’s core concepts form a tightly interrelated framework for analyzing communication systems, emphasizing probabilistic structure over semantic content.

Information entropy and uncertainty

Shannon defined entropy as a function of a probability distribution over possible messages, quantifying expected “surprise” or uncertainty. Higher entropy corresponds to more unpredictable sources. For many interpreters, this formalization underlies the oft‑quoted paraphrase that “information is the resolution of uncertainty.”

Proponents highlight several features:

  • It depends on a set of alternatives and their probabilities, not on message meaning.
  • It is additive for independent sources.
  • It provides natural limits on compression and transmission.
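
In standard modern notation (a formulation equivalent to Shannon's own), the entropy of a discrete source X with symbol probabilities p_i is:

```latex
H(X) = -\sum_{i} p_i \log_2 p_i \qquad \text{(bits per symbol)}
```

The choice of logarithm base fixes the unit (base 2 gives bits), and additivity for independent sources corresponds to H(X, Y) = H(X) + H(Y).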

Channel capacity, noise, and coding

Shannon’s channel capacity concept specifies the maximum rate at which information can be sent with arbitrarily low error over a given channel. His noisy-channel coding theorem asserts that, below this rate, appropriate coding can make error probabilities as small as desired.

Associated ideas include:

  • Signal-to-noise ratio (SNR) as a physical determinant of capacity in analog channels.
  • Redundancy as both a cost (for compression) and a resource (for error correction).
  • The Shannon–Weaver model separating source, transmitter, channel, receiver, and destination.
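
Stated compactly, the capacity of a noisy channel is the maximum mutual information between input and output over input distributions, and for a band-limited channel with additive Gaussian noise it takes the familiar Shannon–Hartley form:

```latex
C = \max_{p(x)} I(X;Y), \qquad I(X;Y) = H(X) - H(X \mid Y), \qquad C = B \log_2\!\left(1 + \frac{S}{N}\right)
```

where B is the channel bandwidth in hertz and S/N the signal-to-noise power ratio; the last expression is in bits per second.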

Relation to thermodynamic entropy

The mathematical resemblance between Shannon entropy and thermodynamic entropy has prompted extensive discussion. Some writers stress a deep conceptual link, treating information as a physical magnitude, while others argue that any connection is largely formal and depends on how probability is interpreted in statistical mechanics.
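
The resemblance at issue can be stated exactly: the Gibbs entropy of statistical mechanics,

```latex
S = -k_B \sum_{i} p_i \ln p_i ,
```

has the same functional form as Shannon's measure, differing only in the logarithm base and the factor of Boltzmann's constant k_B; whether this formal identity reflects a deeper physical connection is precisely what these discussions dispute.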

Scope and limits

Shannon repeatedly emphasized that his theory concerns the technical aspects of communication. Later sections examine how this restriction has been interpreted, extended, or questioned in efforts to connect Shannon information with semantics, cognition, and physical reality.

6. Methodology and Style of Reasoning

Shannon’s work exhibits a distinctive methodological profile, often cited as exemplary of mid‑twentieth‑century mathematical engineering.

Structural abstraction from concrete systems

Commentators emphasize his habit of beginning with practical communication or circuit problems and then stripping away detail to reveal a minimal abstract structure. In his switching-circuit thesis, physical relays become realizations of Boolean variables; in his communication theory, diverse media (telephone lines, radio, telegraph) are modeled as channels characterized by probability distributions.

This style is characterized by:

  • Model minimalism: retaining only properties relevant to a specific question (e.g., reliability, rate), leaving aside meaning or psychological states.
  • Probabilistic formalization: systematic use of random variables, distributions, and expectations.

Balance of rigor and engineering intuition

Shannon’s proofs, particularly in the 1948 paper, have been described as combining rigorous arguments with coding constructions that are conceptually clear but sometimes non‑constructive (e.g., proofs that good codes exist without specifying them in detail). Mathematicians later refined aspects of these proofs, while engineers appreciated their problem‑driven clarity.

Experimental and playful reasoning

His laboratory work on mechanical mice, juggling machines, and game‑playing devices reveals a complementary experimental style: exploring information-processing ideas through physical artifacts. Some analysts view these devices as heuristic tools that made abstract concepts more tangible; others emphasize their role in testing limits of automation and randomness.

Attitude toward interpretation

Shannon tended to avoid extensive philosophical elaboration of his own results. He framed his theory as an engineering tool, leaving broader interpretive questions largely to others. This methodological restraint has been praised as preserving clarity of scope, and also criticized by those who see it as encouraging later overextensions or misreadings of his formalism.

7. Philosophical Relevance and Key Contributions

While Shannon himself presented his theory as an engineering discipline, philosophers and theorists have treated his ideas as central to the philosophy of information, language, and mind.

Syntax–semantics distinction

Shannon’s explicit bracketing of meaning—his claim that “semantic aspects of communication are irrelevant to the engineering problem”—has been foundational for later debates about the relation between syntactic information and semantic content. Proponents of an information‑theoretic approach to meaning (e.g., Fred Dretske, Luciano Floridi) typically begin by distinguishing Shannon information from semantic information, then argue for ways to build the latter on the former (via causal correlation, truth conditions, or constraints on possibilities).

Critics contend that Shannon’s framework, when generalized, risks treating meaning as a mere by‑product of signal correlation, ignoring intentionality, context, or normativity.

Epistemology and evidence

Entropy and mutual information have been interpreted as measures of uncertainty reduction, inspiring accounts where learning and evidence are modeled as probabilistic updates. Some epistemologists use Shannon measures to formalize confirmation, surprise, or relevance. Others argue that such measures capture only part of epistemic appraisal, omitting justification, warrant, or understanding.

Philosophy of mind and cognition

Computational and information‑processing theories of mind rely heavily on Shannon-style concepts of representation and coding. Neural and cognitive models often treat perception and cognition as information transmission and transformation under uncertainty. Supporters regard Shannon’s framework as providing a neutral vocabulary for such models. Opponents maintain that mental content involves more than informational correlations—for instance, normative or phenomenological aspects that cannot be reduced to Shannon information.

Metaphysics and philosophy of science

Shannon entropy’s similarity to thermodynamic entropy has motivated proposals in which information is treated as a fundamental physical quantity, on par with matter and energy. “Information‑centric” metaphysical views use Shannon’s formalism to argue that reality at base may be informational. Alternative perspectives caution that Shannon’s concept is explicitly model‑relative and system‑bound, and therefore should not be straightforwardly reified into a universal metaphysical substrate.

8. Impact on Information Theory and Cognitive Science

Shannon’s influence within information theory and cognitive science is both direct and mediated through subsequent formalisms and empirical programs.

Development of information theory as a discipline

Following the 1948 paper, researchers extended Shannon’s framework to:

  • Error‑correcting codes and coding theory (Hamming, Reed–Solomon, etc.); a toy illustration of redundancy-based error correction appears after this list.
  • Rate–distortion theory for lossy compression.
  • Multiuser and network information theory.
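
As a toy illustration of how redundancy purchases reliability in this tradition, the following sketch implements a (3,1) repetition code with majority-vote decoding; the message and flip probability are arbitrary assumptions, and the example is pedagogical rather than a Hamming or Reed–Solomon construction.

```python
# Toy (3,1) repetition code: each bit is transmitted three times and decoded
# by majority vote, which corrects any single flipped bit per three-bit block.
import random

def encode(bits):
    return [b for b in bits for _ in range(3)]

def noisy_channel(bits, p_flip=0.05):
    return [b ^ 1 if random.random() < p_flip else b for b in bits]

def decode(received):
    blocks = [received[i:i + 3] for i in range(0, len(received), 3)]
    return [1 if sum(block) >= 2 else 0 for block in blocks]

message = [1, 0, 1, 1, 0, 0, 1, 0]
decoded = decode(noisy_channel(encode(message)))
print(decoded == message)  # usually True at this low flip probability
```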

Within this tradition, Shannon’s concepts of entropy, mutual information, and channel capacity are regarded as the basic toolkit. Some later theorists emphasize the ongoing relevance of his asymptotic, probabilistic viewpoint; others argue for alternative frameworks (e.g., algorithmic information theory, Kolmogorov complexity) as complementary or, in some contexts, more fundamental.

Influence on cognitive science and neuroscience

Cognitive scientists adopted Shannon measures to characterize sensory coding, perceptual efficiency, and neural information transmission. Examples include:

  • Studies of retinal and sensory neuron responses framed in terms of information per spike.
  • Models of perception as optimal information extraction under resource constraints.
  • Applications of mutual information to quantify dependence between stimuli and neural responses.
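
A minimal sketch of the computation behind such applications: the mutual information between a stimulus variable and a response variable, calculated from a joint probability table. The 2×2 table below is a made-up illustration, not data from any study; the dependence built into it yields roughly 0.28 bits.

```python
# Mutual information I(S;R) in bits between a stimulus S and a response R,
# computed from a (hypothetical) joint probability table.
import numpy as np

def mutual_information(p_joint: np.ndarray) -> float:
    p_s = p_joint.sum(axis=1, keepdims=True)   # marginal over responses
    p_r = p_joint.sum(axis=0, keepdims=True)   # marginal over stimuli
    p_indep = p_s @ p_r                        # product of marginals
    mask = p_joint > 0
    return float(np.sum(p_joint[mask] * np.log2(p_joint[mask] / p_indep[mask])))

# Rows index stimulus values, columns index response values (illustrative numbers).
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])
print(f"I(S;R) = {mutual_information(p):.3f} bits")
```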

Supporters claim that Shannon information offers a neutral metric for evaluating how well biological systems encode environmental variables. Critics suggest that such measures may overlook the role of internal goals, semantics, and higher‑level cognitive structures.

In theoretical neuroscience, efficient coding and predictive coding principles often explicitly reference Shannon’s capacity and redundancy ideas, although their philosophical implications remain debated.

Relation to artificial intelligence and machine learning

Early AI work on game‑playing and pattern recognition drew directly on information‑theoretic notions of search, heuristics, and uncertainty. In contemporary machine learning, cross‑entropy loss, mutual information objectives, and information bottleneck methods trace conceptually back to Shannon, even when not explicitly framed as “Shannonian.”
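
One concrete point of contact: the cross-entropy loss minimized when training classifiers decomposes into the Shannon entropy of the target distribution plus the Kullback–Leibler divergence from target to prediction, so minimizing it with respect to the model is equivalent to minimizing that divergence. A minimal sketch with illustrative distributions:

```python
# Cross-entropy H(p, q) = H(p) + D_KL(p || q), computed in bits.
import numpy as np

def entropy(p: np.ndarray) -> float:
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def cross_entropy(p: np.ndarray, q: np.ndarray) -> float:
    mask = p > 0
    return float(-np.sum(p[mask] * np.log2(q[mask])))

p = np.array([1.0, 0.0, 0.0])   # one-hot target distribution (illustrative)
q = np.array([0.7, 0.2, 0.1])   # model's predicted distribution (illustrative)
kl = cross_entropy(p, q) - entropy(p)
print(f"H(p,q) = {cross_entropy(p, q):.3f} bits  H(p) = {entropy(p):.3f}  KL = {kl:.3f}")
```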

Some researchers propose that information theory provides a unifying language for learning and inference; others argue that many advances in machine learning proceed largely independently of classical information theory, relying instead on optimization and statistical learning theory, with Shannon measures used mainly as convenient loss functions or diagnostics.

9. Critiques, Limitations, and Misinterpretations

Shannon’s work has inspired a substantial critical literature that distinguishes between the original engineering theory and later uses or extrapolations.

Scope and semantic limitations

A frequent critique emphasizes that Shannon’s theory, by design, does not address meaning, reference, or intentionality. Philosophers of language and semioticians argue that attempts to treat his notion of information as sufficient for semantics conflate correlation with aboutness. Proponents of information‑based semantics respond that Shannon measures provide a necessary quantitative backbone, to be supplemented with additional conditions (e.g., truth, context, or use).

Human communication and social context

Communication and media theorists contend that the Shannon–Weaver model, when generalized to human communication, can obscure power relations, pragmatic nuances, and mutual interpretation. They argue that treating communication as transmission of signals through a channel neglects dialogue, miscommunication, and context. Defenders of Shannon’s framework typically reply that such extensions exceed his intended domain and that the model is best viewed as one layer within richer theories.

Overextension into metaphysics and physics

Some critics caution against reifying Shannon information as a universal ontological substrate, suggesting that its definition is model‑dependent and tied to an agent’s partition of possibilities. Similarly, the analogy with thermodynamic entropy is seen by skeptics as sometimes overstated, especially when used to argue for sweeping conclusions about time’s arrow or the nature of reality.

In popular and cross-disciplinary writing, “information” is often used loosely, leading to conflation of:

Term as used | Typical misreading
Shannon information | Meaning, knowledge, or significance
Entropy (information) | Disorder or chaos in a literal physical sense
Redundancy | Mere repetition, without role in error correction

Scholars across fields stress the importance of distinguishing Shannon’s precise technical definitions from colloquial uses to avoid attributing to his theory claims it does not make—for example, that it explains consciousness, culture, or understanding by itself.

10. Legacy and Historical Significance

Shannon’s legacy spans engineering practice, theoretical science, and philosophical reflection on the “information age.”

Transformation of communication and computing

Within engineering and computer science, Shannon is often named the “father of information theory”. His results underpin digital communication standards, data compression, and error‑correcting codes. Historians highlight his 1937 switching-circuit work as pivotal for the conceptualization of digital computers, even though practical machines were designed by others.

Position in twentieth‑century intellectual history

Shannon is frequently situated alongside figures such as Norbert Wiener, John von Neumann, and Alan Turing in shaping the conceptual foundations of cybernetics, computing, and systems theory. Some narratives emphasize his relative reticence about philosophical speculation compared with Wiener or von Neumann, suggesting that this restraint contributed to the durability and wide applicability of his formalism.

Symbol of the information age

In broader culture, Shannon’s name has become associated with the shift from industrial to informational infrastructures. Sociologists and media theorists refer to his work when tracing how communication, control, and data have become central to economic and political organization. Some critical perspectives portray Shannon’s theory as emblematic of a technocratic orientation that privileges quantification and control; others see it as a neutral toolkit adaptable to diverse values and purposes.

Continuing influence and reinterpretation

Shannon’s measures and models remain standard references in fields as varied as genetics (information in DNA), neuroscience (neural coding), and cosmology (information in black holes), even when these applications extend beyond his original intentions. Contemporary philosophy of information often treats his framework as a starting point to be generalized, constrained, or supplemented, rather than as a complete theory of information or meaning.

In this way, Shannon’s historical significance lies not only in the specific theorems he proved, but also in establishing information as a central, mathematically tractable concept in twentieth‑ and twenty‑first‑century thought.

How to Cite This Entry

Use these citation formats to reference this entry in your academic work.

APA Style (7th Edition)

Philopedia. (2025). Claude Elwood Shannon. Philopedia. https://philopedia.com/thinkers/claude-elwood-shannon/

MLA Style (9th Edition)

"Claude Elwood Shannon." Philopedia, 2025, https://philopedia.com/thinkers/claude-elwood-shannon/.

Chicago Style (17th Edition)

Philopedia. "Claude Elwood Shannon." Philopedia. Accessed December 11, 2025. https://philopedia.com/thinkers/claude-elwood-shannon/.

BibTeX
@online{philopedia_claude_elwood_shannon,
  title = {Claude Elwood Shannon},
  author = {Philopedia},
  year = {2025},
  url = {https://philopedia.com/thinkers/claude-elwood-shannon/},
  urldate = {2025-12-11}
}

Note: This entry was last updated on 2025-12-10. For the most current version, always check the online entry.