Claude Shannon

Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, computer scientist, cryptographer and inventor known as the "father of information theory" and the "father of the Information Age",[1] credited with laying the foundations of the latter.[5][6][7] Shannon was the first to describe the Boolean gates (electronic circuits) that are essential to all digital electronic circuits, and was one of the founding fathers of artificial intelligence.[2][3][4][1]

Shannon graduated from the University of Michigan in 1936 with a Bachelor of Science in both electrical engineering and mathematics. As a 21-year-old master's student in electrical engineering at the Massachusetts Institute of Technology (MIT), he wrote a thesis on switching circuit theory, demonstrating that electrical applications of Boolean algebra could construct any logical numerical relationship,[8] thereby establishing the theory behind digital computing and digital circuits.[9] The thesis has been claimed to be the most important master's thesis of all time:[8] in 1985, Howard Gardner described it as "possibly the most important, and also the most famous, master's thesis of the century",[10] while Herman Goldstine described it as "surely ... one of the most important master's theses ever written ... It helped to change digital circuit design from an art to a science."[11] It has also been called the "birth certificate of the digital revolution",[12] and it won the 1939 Alfred Noble Prize.[13] Shannon then earned a PhD in mathematics from MIT in 1940,[14] with a dissertation on theoretical genetics that derived important results but went unpublished.[15]

Shannon contributed to the field of cryptanalysis for national defense of the United States during World War II, including his fundamental work on codebreaking and secure telecommunications. His paper on the subject is considered one of the foundational pieces of modern cryptography,[16] and his work has been described as "a turning point, and marked the closure of classical cryptography and the beginning of modern cryptography."[17] Shannon's work is the foundation of secret-key cryptography, including the work of Horst Feistel, the Data Encryption Standard (DES), the Advanced Encryption Standard (AES), and more.[17] As a result, Shannon has been called the "founding father of modern cryptography".[18]

His mathematical theory of communication laid the foundations for the field of information theory,[19][14] and his famous paper has been called the "Magna Carta of the Information Age" by Scientific American,[6][20] with his work described as being at "the heart of today's digital information technology".[21] Robert G. Gallager referred to the paper as a "blueprint for the digital era".[22] Regarding Shannon's influence on the digital age, Solomon W. Golomb remarked, "It's like saying how much influence the inventor of the alphabet has had on literature."[19] Shannon's theory is widely used and has been fundamental to the success of many scientific endeavors, such as the invention of the compact disc, the development of the Internet, the feasibility of mobile phones, and the understanding of black holes, and it sits at the intersection of numerous important fields.[23][24] Shannon also formally introduced the term "bit".[25][7]

Shannon made numerous contributions to the field of artificial intelligence,[2] including his immensely influential papers on programming a computer for playing chess.[26][27] His Theseus machine was the first electrical device to learn by trial and error, making it one of the first examples of artificial intelligence.[28][29] He also co-organized and participated in the Dartmouth workshop of 1956, considered the founding event of the field of artificial intelligence.[30][31]

Rodney Brooks declared that Shannon was the 20th century engineer who contributed the most to 21st century technologies,[28] and Solomon W. Golomb described the intellectual achievement of Shannon as "one of the greatest of the twentieth century".[32] His achievements are considered to be on par with those of Albert Einstein, Sir Isaac Newton, and Charles Darwin.[5][19][4][33]

Biography

Childhood

The Shannon family lived in Gaylord, Michigan, and Claude was born in a hospital in nearby Petoskey.[3] His father, Claude Sr. (1862–1934), was a businessman and, for a while, a judge of probate in Gaylord. His mother, Mabel Wolf Shannon (1880–1945), was a language teacher, who also served as the principal of Gaylord High School.[34] Claude Sr. was a descendant of New Jersey settlers, while Mabel was a child of German immigrants.[3] Shannon's family was active in their Methodist Church during his youth.[35]

Most of the first 16 years of Shannon's life were spent in Gaylord, where he attended public school, graduating from Gaylord High School in 1932. Shannon showed an inclination towards mechanical and electrical things. His best subjects were science and mathematics. At home, he constructed such devices as models of planes, a radio-controlled model boat and a barbed-wire telegraph system to a friend's house a half-mile away.[36] While growing up, he also worked as a messenger for the Western Union company.

Shannon's childhood hero was Thomas Edison, whom he later learned was a distant cousin. Both Shannon and Edison were descendants of John Ogden (1609–1682), a colonial leader and an ancestor of many distinguished people.[37][38]

Logic circuits

In 1932, Shannon entered the University of Michigan, where he was introduced to the work of George Boole. He graduated in 1936 with two bachelor's degrees: one in electrical engineering and the other in mathematics.

In 1936, Shannon began his graduate studies in electrical engineering at the Massachusetts Institute of Technology (MIT), where he worked on Vannevar Bush's differential analyzer, an early analog computer composed of electromechanical parts that could solve differential equations.[39] While studying the complicated ad hoc circuits of this analyzer, Shannon designed switching circuits based on Boole's concepts. In 1937, he wrote his master's thesis, A Symbolic Analysis of Relay and Switching Circuits,[40] a paper from which was published in 1938.[40] A revolutionary work for switching circuit theory, it diagrammed switching circuits that could implement the essential operators of Boolean algebra. Shannon then proved that his switching circuits could be used to simplify the arrangement of the electromechanical relays that were used at the time in telephone call routing switches. Next, he expanded this concept, proving that these circuits could solve all problems that Boolean algebra could solve. In the last chapter, he presented diagrams of several circuits, including a digital 4-bit full adder.[40] His work differed significantly from that of previous engineers such as Akira Nakashima, who still relied on the existent circuit theory of the time and took a grounded approach.[41] Shannon's ideas were more abstract and relied on mathematics, thereby breaking new ground, and his approach dominates modern-day electrical engineering.[41]
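
As a modern illustration of the thesis's central idea (a sketch in Python rather than Shannon's relay diagrams), the Boolean operations AND, OR, and XOR suffice to build a full adder, and chaining four of them yields the kind of 4-bit adder Shannon diagrammed:

```python
# Illustrative sketch (not Shannon's relay diagrams): a 1-bit full adder
# built only from Boolean operations, chained into a 4-bit ripple-carry adder.

def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """Add three bits; return (sum_bit, carry_out) using only AND/OR/XOR."""
    sum_bit = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return sum_bit, carry_out

def add_4bit(x: list[int], y: list[int]) -> list[int]:
    """Add two 4-bit numbers given as bit lists, least significant bit first."""
    carry = 0
    result = []
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]  # the final carry becomes a fifth output bit

# 6 (0110) + 7 (0111) = 13 (01101), bits listed least significant first
print(add_4bit([0, 1, 1, 0], [1, 1, 1, 0]))  # [1, 0, 1, 1, 0]
```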

Using electrical switches to implement logic is the fundamental concept that underlies all electronic digital computers. Shannon's work became the foundation of digital circuit design as it became widely known in the electrical engineering community during and after World War II. The theoretical rigor of Shannon's work superseded the ad hoc methods that had prevailed previously. Howard Gardner hailed Shannon's thesis as "possibly the most important, and also the most noted, master's thesis of the century."[42] One of the reviewers of his work commented, "To the best of my knowledge, this is the first application of the methods of symbolic logic to so practical an engineering problem. From the point of view of originality I rate the paper as outstanding."[43] Shannon's master's thesis won the 1939 Alfred Noble Prize.

Shannon received his PhD in mathematics from MIT in 1940.[37] Vannevar Bush had suggested that Shannon work on his dissertation at the Cold Spring Harbor Laboratory, in order to develop a mathematical formulation for Mendelian genetics. This research resulted in Shannon's PhD thesis, An Algebra for Theoretical Genetics.[44] The thesis went unpublished after Shannon lost interest, but it contained important results.[15] Notably, he was one of the first to apply an algebraic framework to study theoretical population genetics.[45] In addition, Shannon devised a general expression for the distribution of several linked traits in a population after multiple generations under a random mating system, which was original at the time,[46] a theorem that other population geneticists of the era had not worked out.[47]

In 1940, Shannon became a National Research Fellow at the Institute for Advanced Study in Princeton, New Jersey. In Princeton, Shannon had the opportunity to discuss his ideas with influential scientists and mathematicians such as Hermann Weyl and John von Neumann, and he also had occasional encounters with Albert Einstein and Kurt Gödel. Shannon worked freely across disciplines, and this ability may have contributed to his later development of mathematical information theory.[48]

Wartime research

Shannon had worked at Bell Labs for a few months in the summer of 1937,[49] and returned there to work on fire-control systems and cryptography during World War II, under a contract with section D-2 (Control Systems section) of the National Defense Research Committee (NDRC).

Shannon is credited with the invention of signal-flow graphs, in 1942. He discovered the topological gain formula while investigating the functional operation of an analog computer.[50]

For two months early in 1943, Shannon came into contact with the leading British mathematician Alan Turing. Turing had been posted to Washington to share with the U.S. Navy's cryptanalytic service the methods used by the British Government Code and Cypher School at Bletchley Park to break the cyphers used by the Kriegsmarine U-boats in the north Atlantic Ocean.[51] He was also interested in the encipherment of speech and to this end spent time at Bell Labs. Shannon and Turing met at teatime in the cafeteria.[51] Turing showed Shannon his 1936 paper that defined what is now known as the "universal Turing machine".[52][53] This impressed Shannon, as many of its ideas complemented his own.

In 1945, as the war was coming to an end, the NDRC was issuing a summary of technical reports as a last step prior to its eventual closing down. Inside the volume on fire control, a special essay titled Data Smoothing and Prediction in Fire-Control Systems, coauthored by Shannon, Ralph Beebe Blackman, and Hendrik Wade Bode, formally treated the problem of smoothing the data in fire-control by analogy with "the problem of separating a signal from interfering noise in communications systems."[54] In other words, it modeled the problem in terms of data and signal processing and thus heralded the coming of the Information Age.

Shannon's work on cryptography was even more closely related to his later publications on communication theory.[55] At the close of the war, he prepared a classified memorandum for Bell Telephone Labs entitled "A Mathematical Theory of Cryptography", dated September 1945. A declassified version of this paper was published in 1949 as "Communication Theory of Secrecy Systems" in the Bell System Technical Journal. This paper incorporated many of the concepts and mathematical formulations that also appeared in his A Mathematical Theory of Communication. Shannon said that his wartime insights into communication theory and cryptography developed simultaneously, and that "they were so close together you couldn't separate them".[56] In a footnote near the beginning of the classified report, Shannon announced his intention to "develop these results … in a forthcoming memorandum on the transmission of information."[57]

While he was at Bell Labs, Shannon proved that the cryptographic one-time pad is unbreakable in his classified research that was later published in 1949. The same article also proved that any unbreakable system must have essentially the same characteristics as the one-time pad: the key must be truly random, as large as the plaintext, never reused in whole or part, and kept secret.[58]
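
A toy sketch of the scheme (the proof itself is information-theoretic, not computational): XOR-ing the message with a fresh, truly random key of equal length produces a ciphertext that reveals nothing about the plaintext, and applying the same key again recovers it.

```python
# Toy illustration of a one-time pad using XOR. Security requires the key to
# be truly random, as long as the message, kept secret, and never reused.
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    assert len(key) == len(plaintext), "key must be as long as the message"
    return bytes(p ^ k for p, k in zip(plaintext, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # truly random, single-use key
ciphertext = otp_encrypt(message, key)
recovered = otp_encrypt(ciphertext, key)  # XOR is its own inverse
assert recovered == message
```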

Information theory

In 1948, the promised memorandum appeared as "A Mathematical Theory of Communication", an article in two parts in the July and October issues of the Bell System Technical Journal. This work focuses on the problem of how best to encode the message a sender wants to transmit. Shannon developed information entropy as a measure of the information content in a message, which is a measure of uncertainty reduced by the message. In so doing, he essentially invented the field of information theory.
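
For a source that emits symbol $i$ with probability $p_i$, Shannon defined the entropy, measured in bits per symbol, as

$$H = -\sum_i p_i \log_2 p_i,$$

which is maximal when all symbols are equally likely and zero when the next symbol is certain.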

The book The Mathematical Theory of Communication[59] reprints Shannon's 1948 article and Warren Weaver's popularization of it, which is accessible to the non-specialist. Weaver pointed out that the word "information" in communication theory is not related to what you do say, but to what you could say. That is, information is a measure of one's freedom of choice when one selects a message. Shannon's concepts were also popularized, subject to his own proofreading, in John Robinson Pierce's Symbols, Signals, and Noise.

Information theory's fundamental contribution to natural language processing and computational linguistics was further established in 1951 with Shannon's article "Prediction and Entropy of Printed English", which gives upper and lower bounds on the entropy of the statistics of English, providing a statistical foundation for language analysis. In addition, he showed that treating the space as the 27th letter of the alphabet actually lowers the uncertainty of written language.
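
A minimal sketch of the zeroth-order version of such an estimate, assuming a toy sample text (Shannon worked from large frequency tables and from higher-order statistics, so the numbers below are illustrative only):

```python
# Minimal sketch of a zeroth-order (single-symbol) entropy estimate for
# English text, over 26 letters and over 27 symbols including the space.
from collections import Counter
from math import log2

def entropy_bits_per_symbol(text: str, alphabet: str) -> float:
    counts = Counter(c for c in text if c in alphabet)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

sample = "the quick brown fox jumps over the lazy dog " * 100  # arbitrary stand-in
print(entropy_bits_per_symbol(sample, "abcdefghijklmnopqrstuvwxyz"))   # 26 letters
print(entropy_bits_per_symbol(sample, "abcdefghijklmnopqrstuvwxyz "))  # 27 with space
```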

Another notable paper, published in 1949, is "Communication Theory of Secrecy Systems", a declassified version of his wartime work on the mathematical theory of cryptography, in which he proved that all theoretically unbreakable cyphers must have the same requirements as the one-time pad. He is also credited with the introduction of the sampling theorem, which he had derived as early as 1940,[60] and which concerns representing a continuous-time signal by a (uniform) discrete set of samples. This theory was essential in enabling telecommunications to move from analog to digital transmission systems in the 1960s and later. He further wrote a paper in 1956 on coding for a noisy channel, which also became a classic in the field of information theory.[61]
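
In its usual statement, the theorem says that a signal containing no frequencies above $W$ hertz is completely determined by samples taken $1/(2W)$ seconds apart, and can be reconstructed exactly via the interpolation formula

$$x(t) = \sum_{n=-\infty}^{\infty} x\!\left(\frac{n}{2W}\right)\,\mathrm{sinc}(2Wt - n), \qquad \mathrm{sinc}(u) = \frac{\sin \pi u}{\pi u}.$$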

Claude Shannon's influence on the field has been immense: in a 1973 collection of the key papers in the field of information theory, he was author or coauthor of 12 of the 49 papers cited, while no one else appeared more than three times.[62] Even beyond his original 1948 paper, he is still regarded as the most important post-1948 contributor to the theory.[62]

In May 1951, Mervin Kelly received a request from the director of the CIA, General Walter Bedell Smith, for Shannon's services, Shannon being regarded, on "the best authority", as the "most eminently qualified scientist in the particular field concerned".[63] As a result of the request, Shannon became part of the CIA's Special Cryptologic Advisory Group, or SCAG.[63]

Artificial intelligence

In 1950, Shannon designed and built, with the help of his wife Betty, a learning machine named Theseus. It consisted of a maze on a surface through which a mechanical mouse could move. Below the surface were sensors that followed the path of the mouse through the maze. After much trial and error, the device learned the shortest path through the maze and would direct the mouse through it. The pattern of the maze could be changed at will.[29]

Mazin Gilbert stated that Theseus "inspired the whole field of AI. This random trial and error is the foundation of artificial intelligence."[29]

Shannon wrote multiple influential papers on artificial intelligence, such as his 1950 paper titled "Programming a Computer for Playing Chess", and his 1953 paper titled "Computers and Automata".[64] Alongside John McCarthy, he co-edited a book titled Automata Studies, which was published in 1956.[61] The categories in the articles within the volume were influenced by Shannon's own subject headings in his 1953 paper.[61] Shannon shared McCarthy’s goal of creating a science of intelligent machines, but also held a broader view of viable approaches in automata studies, such as neural nets, Turing machines, cybernetic mechanisms, and symbolic processing by computer.[61]

Shannon co-organized and participated in the Dartmouth workshop of 1956, alongside John McCarthy, Marvin Minsky and Nathaniel Rochester; the workshop is considered the founding event of the field of artificial intelligence.[65][31]

Teaching at MIT

In 1956, Shannon joined the MIT faculty, holding an endowed chair, and worked in the Research Laboratory of Electronics (RLE). He continued to serve on the MIT faculty until 1978.

Later life

Shannon developed Alzheimer's disease and spent the last few years of his life in a nursing home; he died in 2001, survived by his wife, a son and daughter, and two granddaughters.[66][67]

Hobbies and inventions

The Minivac 601, a digital computer trainer designed by Shannon

Outside of his academic pursuits, Shannon was interested in juggling, unicycling, and chess. He also invented many devices, including a Roman numeral computer called THROBAC and juggling machines.[68][69] He built a device that could solve the Rubik's Cube puzzle.[37]

Shannon also invented flame-throwing trumpets, rocket-powered frisbees, and plastic foam shoes for navigating a lake, which to an observer would make it appear as if Shannon were walking on water.[70]

Shannon designed the Minivac 601, a digital computer trainer intended to teach business people how computers functioned. It was sold by the Scientific Development Corp starting in 1961.[71]

He is also considered the co-inventor of the first wearable computer along with Edward O. Thorp.[72] The device was used to improve the odds when playing roulette.

Personal life

Shannon married Norma Levor, a wealthy, Jewish, left-wing intellectual, in January 1940. The marriage ended in divorce after about a year. Levor later married Ben Barzman.[73]

Shannon met his second wife, Mary Elizabeth Moore (Betty), when she was a numerical analyst at Bell Labs. They were married in 1949.[66] Betty assisted Claude in building some of his most famous inventions.[74] They had three children.[75]

Shannon presented himself as apolitical and an atheist.[76]

Tributes and legacy

Statue of Claude Shannon at AT&T Shannon Labs

There are six statues of Shannon sculpted by Eugene Daub: one at the University of Michigan; one at MIT in the Laboratory for Information and Decision Systems; one in Gaylord, Michigan; one at the University of California, San Diego; one at Bell Labs; and another at AT&T Shannon Labs.[77] The statue in Gaylord is located in the Claude Shannon Memorial Park.[78] After the breakup of the Bell System, the part of Bell Labs that remained with AT&T Corporation was named Shannon Labs in his honor.

In June 1954, Shannon was listed as one of the top 20 most important scientists in America by Fortune.[79] In 2013, information theory was listed as one of the top 10 revolutionary scientific theories by Science News.[80]

According to Neil Sloane, an AT&T Fellow who co-edited Shannon's large collection of papers in 1993, the perspective introduced by Shannon's communication theory (now called "information theory") is the foundation of the digital revolution, and every device containing a microprocessor or microcontroller is a conceptual descendant of Shannon's publication in 1948:[81] "He's one of the great men of the century. Without him, none of the things we know today would exist. The whole digital revolution started with him."[82] The cryptocurrency unit shannon (a synonym for gwei) is named after him.[83]

Shannon is credited by many as single-handedly creating information theory and for laying the foundations for the Digital Age.[84][85][15][21][86][7]

The artificial intelligence large language model family Claude was named in Shannon's honor.

A Mind at Play, a biography of Shannon written by Jimmy Soni and Rob Goodman, was published in 2017.[87] They described Shannon as "the most important genius you’ve never heard of, a man whose intellect was on par with Albert Einstein and Isaac Newton".[88] Consultant and writer Tom Rutledge, writing for Boston Review, stated that "Of the computer pioneers who drove the mid-20th-century information technology revolution—an elite men’s club of scholar-engineers who also helped crack Nazi codes and pinpoint missile trajectories—Shannon may have been the most brilliant of them all."[33] Electrical engineer Robert Gallager stated about Shannon that "He had this amazing clarity of vision. Einstein had it, too – this ability to take on a complicated problem and find the right way to look at it, so that things become very simple."[22] In an obituary, Neil Sloane and Robert Calderbank stated that "Shannon must rank near the top of the list of major figures of twentieth century science".[89] Due to his work in multiple fields, Shannon is also regarded as a polymath.[90][91]

Historian James Gleick noted the importance of Shannon, stating that "Einstein looms large, and rightly so. But we’re not living in the relativity age, we’re living in the information age. It’s Shannon whose fingerprints are on every electronic device we own, every computer screen we gaze into, every means of digital communication. He’s one of these people who so transform the world that, after the transformation, the old world is forgotten."[1] Gleick further noted that "he created a whole field from scratch, from the brow of Zeus".[1]

On April 30, 2016, Shannon was honored with a Google Doodle to celebrate his life on what would have been his 100th birthday.[92][93][94][95][96][97]

The Bit Player, a feature film about Shannon directed by Mark Levinson, premiered at the World Science Festival in 2019.[98] Drawn from interviews conducted with Shannon in his house in the 1980s, the film was released on Amazon Prime in August 2020.

The Mathematical Theory of Communication

Weaver's contribution

Shannon's The Mathematical Theory of Communication[59] begins with an interpretation of his own work by Warren Weaver. Although Shannon's entire work is about communication itself, Warren Weaver communicated his ideas in such a way that those not acclimated to complex theory and mathematics could comprehend the fundamental laws he put forth. The coupling of their abilities and ideas generated the Shannon-Weaver model, although the mathematical and theoretical underpinnings emanate entirely from Shannon's work after Weaver's introduction. For the layman, Weaver's introduction better communicates The Mathematical Theory of Communication,[59] but Shannon's subsequent logic, mathematics, and expressive precision were responsible for defining the problem itself.

Other work

Shannon and his electromechanical mouse Theseus (named after Theseus from Greek mythology), which he built to solve a maze in one of the first experiments in artificial intelligence
Theseus Maze in MIT Museum

Shannon's mouse

"Theseus", created in 1950, was a mechanical mouse controlled by an electromechanical relay circuit that enabled it to move around a labyrinth of 25 squares.[99] The maze configuration was flexible and it could be modified arbitrarily by rearranging movable partitions.[99] The mouse was designed to search through the corridors until it found the target. Having travelled through the maze, the mouse could then be placed anywhere it had been before, and because of its prior experience it could go directly to the target. If placed in unfamiliar territory, it was programmed to search until it reached a known location and then it would proceed to the target, adding the new knowledge to its memory and learning new behavior.[99] Shannon's mouse appears to have been the first artificial learning device of its kind.[99]

Shannon's estimate for the complexity of chess

In 1949, Shannon completed a paper (published in March 1950) that estimates the game-tree complexity of chess at approximately 10¹²⁰. This number is now often referred to as the "Shannon number", and is still regarded today as an accurate estimate of the game's complexity. The number is often cited as one of the barriers to solving the game of chess using an exhaustive (brute force) analysis.[100][101]
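
The estimate comes from a deliberately rough calculation in the paper: on the order of $10^3$ possible continuations per pair of moves (one by White and one by Black), over a typical game of about 40 such pairs, giving

$$\left(10^3\right)^{40} = 10^{120}.$$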

Shannon's computer chess program

On March 9, 1949, Shannon presented a paper called "Programming a Computer for Playing Chess" at the National IRE (Institute of Radio Engineers) Convention in New York. He described how to program a computer to play chess based on position scoring and move selection, and proposed basic strategies for restricting the number of possibilities to be considered in a game of chess. In March 1950 the paper was published in Philosophical Magazine, and it is considered one of the first articles published on the topic of programming a computer for playing chess and using a computer to solve the game.[100][102] In 1950, Shannon also wrote an article titled "A Chess-Playing Machine",[103] which was published in Scientific American. Both papers have had immense influence and laid the foundations for future chess programs.[26][27]

His process for having the computer decide which move to make was a minimax procedure based on an evaluation function of a given chess position. Shannon gave a rough example of an evaluation function in which the value of the black position was subtracted from that of the white position. Material was counted according to the usual chess piece relative values (1 point for a pawn, 3 points for a knight or bishop, 5 points for a rook, and 9 points for a queen).[104] He also considered some positional factors, subtracting ½ point for each doubled pawn, backward pawn, and isolated pawn; mobility was incorporated by adding 0.1 point for each legal move available.
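
A minimal sketch of such an evaluation function using the weights quoted above; the position encoding (dicts of piece counts, pawn weaknesses, and mobility) is a made-up convenience, not Shannon's notation:

```python
# Minimal sketch of a Shannon-style evaluation function using the weights
# described above. The position representation here is hypothetical.

PIECE_VALUES = {"P": 1, "N": 3, "B": 3, "R": 5, "Q": 9}

def evaluate(white, black):
    """Score a position as (white total) - (black total).

    Each side is a dict with piece counts, a count of weak pawns
    (doubled + backward + isolated), and a count of legal moves.
    """
    def side_score(s):
        material = sum(PIECE_VALUES[p] * s["pieces"].get(p, 0) for p in PIECE_VALUES)
        return material - 0.5 * s["weak_pawns"] + 0.1 * s["mobility"]
    return side_score(white) - side_score(black)

white = {"pieces": {"P": 8, "N": 2, "B": 2, "R": 2, "Q": 1}, "weak_pawns": 1, "mobility": 30}
black = {"pieces": {"P": 7, "N": 2, "B": 1, "R": 2, "Q": 1}, "weak_pawns": 2, "mobility": 22}
print(evaluate(white, black))  # positive means the position favors White
```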

Shannon's maxim

Shannon formulated a version of Kerckhoffs' principle as "The enemy knows the system". In this form it is known as "Shannon's maxim".

Miscellaneous

Shannon also contributed to combinatorics and detection theory.[105] His 1948 paper introduced many tools used in combinatorics, and his 1944 work on detection theory was one of the earliest expositions of the “matched filter” principle.[105]

He was known as a successful investor who gave lectures on investing. A report in Barron's on August 11, 1986 detailed the recent performance of 1,026 mutual funds, and Shannon had achieved a higher return than 1,025 of them. Comparing Shannon's portfolio from the late 1950s to 1986 with Warren Buffett's from 1965 to 1995, Shannon had a return of about 28 percent, compared to 27 percent for Buffett.[106] One such method of Shannon's was labeled Shannon's demon: forming a portfolio of equal parts cash and a stock, and rebalancing regularly to take advantage of the stock's randomly jittering price movements.[107] Shannon reportedly long thought of publishing about investing, but ultimately did not, despite giving multiple lectures.[107] He was one of the first investors to download stock prices, and a snapshot of his portfolio in 1981 was worth $582,717.50 (about $1.5 million in 2015 dollars), excluding another one of his stocks.[107]
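
A toy simulation of Shannon's demon under stated assumptions (a stock that doubles or halves each period with equal probability, no trading costs or taxes): the stock itself has zero geometric growth, yet the rebalanced half-cash, half-stock portfolio compounds at roughly 6% per period.

```python
# Toy simulation of "Shannon's demon": keep half the portfolio in cash and
# half in a volatile stock, rebalancing every period. The stock here doubles
# or halves with equal probability; trading costs and taxes are ignored.
import random

def shannons_demon(periods=1000, seed=42):
    rng = random.Random(seed)
    wealth, stock_price = 1.0, 1.0
    for _ in range(periods):
        move = 2.0 if rng.random() < 0.5 else 0.5
        stock_price *= move                 # buy-and-hold just tracks this price
        wealth *= 0.5 + 0.5 * move          # rebalanced: half cash, half stock
    return wealth, stock_price

wealth, hold = shannons_demon()
print(f"rebalanced: {wealth:.3g}, buy-and-hold: {hold:.3g}")
```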

Commemorations

Shannon centenary


The Shannon centenary, 2016, marked the life and influence of Claude Elwood Shannon on the hundredth anniversary of his birth on April 30, 1916. It was inspired in part by the Alan Turing Year. An ad hoc committee of the IEEE Information Theory Society, including Christina Fragouli, Rüdiger Urbanke, Michelle Effros, Lav Varshney and Sergio Verdú,[108] coordinated worldwide events. The initiative was announced in the History Panel at the 2015 IEEE Information Theory Workshop in Jerusalem[109][110] and in the IEEE Information Theory Society newsletter.[111]

A detailed listing of confirmed events was available on the website of the IEEE Information Theory Society.[112]

Some of the activities included:

  • Bell Labs hosted the First Shannon Conference on the Future of the Information Age on April 28–29, 2016, in Murray Hill, New Jersey, to celebrate Claude Shannon and the continued impact of his legacy on society. The event included keynote speeches by global luminaries and visionaries of the information age exploring the impact of information theory on society and our digital future, informal recollections, and leading technical presentations on subsequent related work in other areas such as bioinformatics, economic systems, and social networks, as well as a student competition.
  • Bell Labs launched a Web exhibit on April 30, 2016, chronicling Shannon's hiring at Bell Labs (under an NDRC contract with the US Government), his subsequent work there from 1942 through 1957, and details of its Mathematics Department. The exhibit also displayed bios of colleagues and managers during his tenure, as well as original versions of some of the technical memoranda that subsequently became well known in published form.
  • The Republic of Macedonia issued a commemorative stamp,[113] and a USPS commemorative stamp was proposed, with an active petition.[114]
  • A documentary on Claude Shannon and on the impact of information theory, The Bit Player, was produced by Sergio Verdú and Mark Levinson.[115]
  • A trans-Atlantic celebration of both George Boole's bicentenary and Claude Shannon's centenary, led by University College Cork and the Massachusetts Institute of Technology. A first event was a workshop in Cork, When Boole Meets Shannon,[116] with further exhibits at the Boston Museum of Science and at the MIT Museum.[117]
  • Many organizations around the world held observance events, including the Boston Museum of Science, the Heinz-Nixdorf Museum, the Institute for Advanced Study, Technische Universität Berlin, University of South Australia (UniSA), Unicamp (Universidade Estadual de Campinas), University of Toronto, Chinese University of Hong Kong, Cairo University, Telecom ParisTech, National Technical University of Athens, Indian Institute of Science, Indian Institute of Technology Bombay, Indian Institute of Technology Kanpur, Nanyang Technological University of Singapore, University of Maryland, University of Illinois at Chicago, École Polytechnique Federale de Lausanne, The Pennsylvania State University (Penn State), University of California Los Angeles, Massachusetts Institute of Technology, Chongqing University of Posts and Telecommunications, and University of Illinois at Urbana-Champaign.
  • A centenary logo was crowdsourced on Crowdspring.[118]
  • The Math Encounters presentation of May 4, 2016, at the National Museum of Mathematics in New York, titled Saving Face: Information Tricks for Love and Life, focused on Shannon's work in information theory. A video recording and other material are available.[119]

Awards and honors list

The Claude E. Shannon Award was established in his honor; he was also its first recipient, in 1973.[120][121]

Selected works

  • Claude E. Shannon: A Symbolic Analysis of Relay and Switching Circuits, master's thesis, MIT, 1937.
  • Claude E. Shannon: "A Mathematical Theory of Communication", Bell System Technical Journal, Vol. 27, pp. 379–423, 623–656, 1948 (abstract).
  • Claude E. Shannon and Warren Weaver: The Mathematical Theory of Communication. The University of Illinois Press, Urbana, Illinois, 1949. ISBN 0-252-72548-4
  • N. J. A. Sloane and Aaron D. Wyner, eds. (1993): Claude Elwood Shannon: Collected Papers, IEEE Press. ISBN 978-0-7803-0434-5

References

  1. ^ a b c d Roberts, Siobhan (April 30, 2016). "The Forgotten Father of the Information Age". The New Yorker. ISSN 0028-792X. Retrieved September 28, 2023.
  2. ^ a b Slater, Robert (1989). Portraits in Silicon. Cambridge, Mass.: MIT Pr. pp. 37–38. ISBN 978-0-262-69131-4.
  3. ^ a b c James, Ioan (2009). "Claude Elwood Shannon 30 April 1916 – 24 February 2001". Biographical Memoirs of Fellows of the Royal Society. 55: 257–265. doi:10.1098/rsbm.2009.0015.
  4. ^ a b Horgan, John (April 27, 2016). "Claude Shannon: Tinkerer, Prankster, and Father of Information Theory". IEEE. Retrieved September 28, 2023.
  5. ^ a b Atmar, Wirt (2001). "A Profoundly Repeated Pattern". Bulletin of the Ecological Society of America. 82 (3): 208–211. ISSN 0012-9623. JSTOR 20168572.
  6. ^ a b Goodman, Jimmy Soni and Rob (July 30, 2017). "Claude Shannon: The Juggling Poet Who Gave Us the Information Age". The Daily Beast. Retrieved October 31, 2023.
  7. ^ a b c Tse, David (December 22, 2020). "How Claude Shannon Invented the Future". Quanta Magazine. Retrieved September 28, 2023.
  8. ^ a b Poundstone, William (2005). Fortune's Formula : The Untold Story of the Scientific Betting System That Beat the Casinos and Wall Street. Hill & Wang. p. 20. ISBN 978-0-8090-4599-0.
  9. ^ Chow, Rony (June 5, 2021). "Claude Shannon: The Father of Information Theory". History of Data Science. Retrieved January 11, 2024.
  10. ^ Gardner, Howard (1985). The Mind's New Science: A History of the Cognitive Revolution. Basic Books. p. 144. ISBN 978-0-465-04635-5.
  11. ^ Goldstine, Herman H. (1972). The Computer from Pascal to von Neumann (PDF). Princeton, N.J.: Princeton University Press. pp. 119–120. ISBN 978-0-691-08104-5.
  12. ^ Vignes, Alain (2023). Silicon, From Sand to Chips, 1: Microelectronic Components. Hoboken: ISTE Ltd / John Wiley and Sons Inc. pp. xv. ISBN 978-1-78630-921-1.
  13. ^ Rioul, Olivier (2021), Duplantier, Bertrand; Rivasseau, Vincent (eds.), "This is IT: A Primer on Shannon's Entropy and Information", Information Theory: Poincaré Seminar 2018, Progress in Mathematical Physics, vol. 78, Cham: Springer International Publishing, pp. 49–86, doi:10.1007/978-3-030-81480-9_2, ISBN 978-3-030-81480-9, retrieved July 28, 2024
  14. ^ a b "Claude E. Shannon | IEEE Information Theory Society". www.itsoc.org. Retrieved October 31, 2023.
  15. ^ a b c Gallager, Robert G. (2001). "Claude E. Shannon: A Retrospective on His Life, Work, and Impact" (PDF). IEEE Transactions on Information Theory. 47 (7): 2681–2695. doi:10.1109/18.959253.
  16. ^ Shimeall, Timothy J.; Spring, Jonathan M. (2013). Introduction to Information Security: A Strategic-Based Approach. Syngress. p. 167. ISBN 978-1597499699.
  17. ^ a b Koç, Çetin Kaya; Özdemir, Funda (2023). "Development of Cryptography since Shannon" (PDF). Handbook of Formal Analysis and Verification in Cryptography: 1–56. doi:10.1201/9781003090052-1. ISBN 978-1-003-09005-2.
  18. ^ Bruen, Aiden A.; Forcinito, Mario (2005). Cryptography, Information Theory, and Error-Correction: A Handbook for the 21st Century. Hoboken, N.J: Wiley-Interscience. p. 3. ISBN 978-0-471-65317-2. OCLC 56191935.
  19. ^ a b c Poundstone, William (2005). Fortune's Formula : The Untold Story of the Scientific Betting System That Beat the Casinos and Wall Street. Hill & Wang. pp. 15–16. ISBN 978-0-8090-4599-0.
  20. ^ Goodman, Rob; Soni, Jimmy (2018). "Genius in Training". Alumni Association of the University of Michigan. Retrieved October 31, 2023.
  21. ^ a b Guizzo, Erico Marui (2003). The Essential Message: Claude Shannon and the Making of Information Theory (Master's thesis). University of Sao Paulo. hdl:1721.1/39429. Retrieved January 11, 2024.
  22. ^ a b "Claude Shannon: Reluctant Father of the Digital Age". MIT Technology Review. July 1, 2001. Retrieved June 26, 2024.
  23. ^ Chang, Mark (2014). Principles of Scientific Methods. Boca Raton: CRC Press, Taylor & Francis Group. p. 217. ISBN 978-1-4822-3809-9.
  24. ^ Jha, Alok (April 30, 2016). "Without Claude Shannon's information theory there would have been no internet". The Guardian. Retrieved July 21, 2024.
  25. ^ Keats, Jonathon (November 11, 2010). Virtual Words: Language from the Edge of Science and Technology. Oxford University Press. p. 36. doi:10.1093/oso/9780195398540.001.0001. ISBN 978-0-19-539854-0.
  26. ^ a b Apter, Michael J. (2018). The Computer Simulation of Behaviour. Routledge Library Editions: Artificial intelligence. London New York: Routledge Taylor & Francis Group. p. 123. ISBN 978-0-8153-8566-0.
  27. ^ a b Lowood, Henry; Guins, Raiford, eds. (June 3, 2016). Debugging Game History: A Critical Lexicon. The MIT Press. pp. 31–32. doi:10.7551/mitpress/10087.001.0001. ISBN 978-0-262-33194-4.
  28. ^ a b Brooks, Rodney (January 25, 2022). "How Claude Shannon Helped Kick-start Machine Learning". ieeespectrum. Retrieved October 31, 2023.
  29. ^ a b c Klein, Daniel (2019). Dragoon, aLICE (ed.). "Mighty mouse". MIT News (January/February). Cambridge Massachusetts: MIT Technology Review: 6–7.
  30. ^ McCarthy, John; Minsky, Marvin L.; Rochester, Nathaniel; Shannon, Claude E. (December 15, 2006). "A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence, August 31, 1955". AI Magazine. 27 (4): 12. doi:10.1609/aimag.v27i4.1904. ISSN 2371-9621.
  31. ^ a b Solomonoff, Grace (May 6, 2023). "The Meeting of the Minds That Launched AI". ieeespectrum. Retrieved June 19, 2024.
  32. ^ Golomb, Solomon W. (January 2002). "Claude Elwood Shannon (1916–2001)" (PDF). Notices of the American Mathematical Society. 49 (1).
  33. ^ a b Rutledge, Tom (August 16, 2017). "The Man Who Invented Information Theory". Boston Review. Retrieved October 31, 2023.
  34. ^ Sloane & Wyner (1993), p. xi.
  35. ^ Soni, J.; Goodman, R. (2017). A Mind at Play: How Claude Shannon Invented the Information Age. Simon & Schuster. p. 6. ISBN 978-1-4767-6668-3. Retrieved May 2, 2023.
  36. ^ Gleick, James (December 30, 2001). "THE LIVES THEY LIVED: CLAUDE SHANNON, B. 1916; Bit Player". The New York Times Magazine: Section 6, Page 48.
  37. ^ a b c "MIT Professor Claude Shannon dies; was founder of digital communications". MIT News office. Cambridge, Massachusetts. February 27, 2001.
  38. ^ Sloane, N.J.A; Wyner, Aaron D., eds. (1993). Claude Elwood Shannon: Collected Papers. Wiley/IEEE Press. ISBN 978-0-7803-0434-5. Retrieved December 9, 2016.
  39. ^ Price, Robert (1982). "Claude E. Shannon, an oral history". IEEE Global History Network. IEEE. Retrieved July 14, 2011.
  40. ^ a b c Shannon, C. E. (1938). "A Symbolic Analysis of Relay and Switching Circuits". Trans. AIEE. 57 (12): 713–723. doi:10.1109/T-AIEE.1938.5057767. hdl:1721.1/11173. S2CID 51638483.
  41. ^ a b Kawanishi, Toma (2019). "Prehistory of Switching Theory in Japan: Akira Nakashima and His Relay-circuit Theory". Historia Scientiarum. Second Series. 29 (1): 136–162. doi:10.34336/historiascientiarum.29.1_136.
  42. ^ Gardner, Howard (1987). The Mind's New Science: A History of the Cognitive Revolution. Basic Books. p. 144. ISBN 978-0-465-04635-5.
  43. ^ Guizzo, Erico Marui (2003). The Essential Message: Claude Shannon and the Making of Information Theory (PDF) (Master of Science thesis). Massachusetts Institute of Technology. p. 12. Retrieved July 29, 2024.
  44. ^ Shannon, Claude Elwood (1940). An Algebra for Theoretical Genetics (Thesis). Massachusetts Institute of Technology. hdl:1721.1/11174. — Contains a biography on pp. 64–65.
  45. ^ Chalub, Fabio A. C. C.; Souza, Max O. (December 1, 2017). "On the stochastic evolution of finite populations". Journal of Mathematical Biology. 75 (6): 1735–1774. doi:10.1007/s00285-017-1135-4. ISSN 1432-1416. PMID 28493042.
  46. ^ Hanus, Pavol; Goebel, Bernhard; Dingel, Janis; Weindl, Johanna; Zech, Juergen; Dawy, Zaher; Hagenauer, Joachim; Mueller, Jakob C. (November 27, 2007). "Information and communication theory in molecular biology". Electrical Engineering. 90 (2): 161–173. doi:10.1007/s00202-007-0062-6. ISSN 0948-7921.
  47. ^ Pachter, Lior (November 6, 2013). "Claude Shannon, population geneticist". Bits of DNA. Retrieved July 29, 2024.
  48. ^ Guizzo, Erico Marui (2003). The Essential Message: Claude Shannon and the Making of Information Theory (Thesis). Massachusetts Institute of Technology. hdl:1721.1/39429.
  49. ^ Gertner, Jon (2013). The idea factory: Bell Labs and the great age of American innovation. London: Penguin Books. p. 118. ISBN 978-0-14-312279-1.
  50. ^ Okrent, Howard; McNamee, Lawrence P. (1970). "3. 3 Flowgraph Theory" (PDF). NASAP-70 User's and Programmer's manual. Los Angeles, California: School of Engineering and Applied Science, University of California at Los Angeles. pp. 3–9. Retrieved March 4, 2016.
  51. ^ a b Hodges, Andrew (1992), Alan Turing: The Enigma, London: Vintage, pp. 243–252, ISBN 978-0-09-911641-7
  52. ^ Turing, A.M. (1936), "On Computable Numbers, with an Application to the Entscheidungsproblem", Proceedings of the London Mathematical Society, 2, vol. 42 (published 1937), pp. 230–65, doi:10.1112/plms/s2-42.1.230, S2CID 73712
  53. ^ Turing, A.M. (1938), "On Computable Numbers, with an Application to the Entscheidungsproblem: A correction", Proceedings of the London Mathematical Society, 2, vol. 43, no. 6 (published 1937), pp. 544–6, doi:10.1112/plms/s2-43.6.544
  54. ^ Mindell, David A. (October 15, 2004). Between Human and Machine: Feedback, Control, and Computing Before Cybernetics. JHU Press. pp. 319–320. ISBN 0801880572.
  55. ^ Kahn, David (1966). The Codebreakers: The Comprehensive History of Secret Communication from Ancient Times to the Internet. Macmillan and Sons. pp. 743–751. ISBN 0684831309.
  56. ^ quoted in Kahn, The Codebreakers, p. 744.
  57. ^ Quoted in Erico Marui Guizzo, "The Essential Message: Claude Shannon and the Making of Information Theory", Archived May 28, 2008, at the Wayback Machine unpublished MS thesis, Massachusetts Institute of Technology, 2003, p. 21.
  58. ^ Shannon, C. E. (1949). "Communication Theory of Secrecy Systems". Bell System Technical Journal. 28 (4): 656–715. doi:10.1002/j.1538-7305.1949.tb00928.x.
  59. ^ a b c Shannon, Claude Elwood (1998). The mathematical theory of communication. Warren Weaver. Urbana: University of Illinois Press. ISBN 0-252-72546-8. OCLC 40716662.
  60. ^ Stanković, Raromir S.; Astola, Jaakko T.; Karpovsky, Mark G. (September 2006). Some Historic Remarks On Sampling Theorem (PDF). Proceedings of the 2006 International TICSP Workshop on Spectral Methods and Multirate Signal Processing.
  61. ^ a b c d Kline, Ronald (2011). "Cybernetics, Automata Studies, and the Dartmouth Conference on Artificial Intelligence". IEEE Annals of the History of Computing. 33 (4): 5–16. doi:10.1109/MAHC.2010.44. ISSN 1058-6180.
  62. ^ a b McEliece, Robert J. (2004). The Theory of Information and Coding (Student ed.). Cambridge: Cambridge University Press. p. 13. ISBN 978-0-521-83185-7.
  63. ^ a b Soni, J.; Goodman, R. (2017). A Mind at Play: How Claude Shannon Invented the Information Age. Simon & Schuster. pp. 193–198. ISBN 978-1-4767-6668-3.
  64. ^ Cordeschi, Roberto (April 25, 2007). "AI Turns Fifty: Revisiting ITS Origins". Applied Artificial Intelligence. 21 (4–5): 259–279. doi:10.1080/08839510701252304. ISSN 0883-9514.
  65. ^ McCarthy, John; Minsky, Marvin L.; Rochester, Nathaniel; Shannon, Claude E. (December 15, 2006). "A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence, August 31, 1955". AI Magazine. 27 (4): 12. doi:10.1609/aimag.v27i4.1904. ISSN 2371-9621.
  66. ^ a b Weisstein, Eric. "Shannon, Claude Elwood (1916–2001)". World of Scientific Biography. Wolfram Research.
  67. ^ "Claude Shannon – computer science theory". www.thocp.net. The History of Computing Project. Retrieved December 9, 2016.
  68. ^ "People: Shannon, Claude Elwood". MIT Museum. Retrieved December 9, 2016.
  69. ^ Boehm, George A. W. (March 1, 1953). "GYPSY, MODEL VI, CLAUDE SHANNON, NIMWIT, AND THE MOUSE". Computers and Automation 1953-03: Vol 2 Iss 2. Internet Archive. Berkeley Enterprises. pp. 1–4.
  70. ^ Cavanaugh, Ray (April 29, 2016). "Claude Shannon: The Juggling Unicyclist Who Pedaled Us Into the Digital Age". Time. Retrieved October 15, 2024.
  71. ^ Advertisement: Minivac 601. October 1961. p. 33.
  72. ^ Thorp, Edward (October 1998). "The invention of the first wearable computer". Digest of Papers. Second International Symposium on Wearable Computers (Cat. No.98EX215). pp. 4–8. doi:10.1109/iswc.1998.729523. ISBN 0-8186-9074-7. S2CID 1526.
  73. ^ Jimmy Soni; Rob Goodman (2017). A Mind At Play: How Claude Shannon Invented the Information Age. Simon and Schuster. pp. 63, 80.
  74. ^ "Betty Shannon, Unsung Mathematical Genius". Scientific American Blog Network. Retrieved July 26, 2017.
  75. ^ Horgan, John (April 27, 2016). "Claude Shannon: Tinkerer, Prankster, and Father of Information Theory". IEEE Spectrum. Retrieved June 19, 2020.
  76. ^ William Poundstone (2010). Fortune's Formula: The Untold Story of the Scientific Betting System. Macmillan. p. 18. ISBN 978-0-374-70708-8. Shannon described himself as an atheist and was outwardly apolitical.
  77. ^ "Claude Shannon Statue Dedications". Archived from the original on July 31, 2010.
  78. ^ "Michigan Roadside Attractions: Claude Shannon Park, Gaylord". Travel the Mitten. TravelTheMitten.com. August 11, 2018. Retrieved September 8, 2022. Gaylord, Michigan is home to a small park honoring Claude Shannon…
  79. ^ Soni, J.; Goodman, R. (2017). A Mind at Play: How Claude Shannon Invented the Information Age. Simon & Schuster. p. 188. ISBN 978-1-4767-6668-3.
  80. ^ Siegfried, Tom (November 13, 2013). "Top 10 revolutionary scientific theories". Science News. Retrieved November 5, 2024.
  81. ^ Shannon, C. E. (1948). "A mathematical theory of communication". Bell System Technical Journal. 27 (3): 379–423, 623–656. doi:10.1002/j.1538-7305.1948.tb01338.x.
  82. ^ Coughlin, Kevin (February 27, 2001). "Bell Labs digital guru dead at 84— Pioneer scientist led high-tech revolution". The Star-Ledger.
  83. ^ "Gwei". Investopedia.
  84. ^ "Claude Shannon". The Telegraph. March 12, 2001. Retrieved January 11, 2024.
  85. ^ Calderbank, Robert; Sloane, Neil J. A. (April 12, 2001). "Claude Shannon (1916–2001)". Nature. 410 (6830): 768. doi:10.1038/35071223. ISSN 1476-4687. PMID 11298432.
  86. ^ Collins, Graham P. (October 14, 2002). "Claude E. Shannon: Founder of Information Theory". Scientific American. Retrieved January 11, 2024.
  87. ^ George Dyson (July 21, 2017). "The Elegance of Ones and Zeroes". Wall Street Journal. Retrieved August 15, 2017.
  88. ^ Soni, Jimmy; Goodman, Rob (August 1, 2017). "10,000 Hours With Claude Shannon: How a Genius Thinks, Works and Lives". Observer. Retrieved October 31, 2023.
  89. ^ Calderbank, Robert; Sloane, Neil J. A. (2001). "Claude Shannon (1916–2001)". Nature. 410 (6830): 768. doi:10.1038/35071223. ISSN 0028-0836. PMID 11298432.
  90. ^ Goodman, Rob; Soni, Jimmy (August 30, 2017). "How a polymath transformed our understanding of information". Aeon. Retrieved November 7, 2024.
  91. ^ Guldi, Jo (2023). The Dangerous Art of Text Mining: A Methodology for Digital History. Cambridge: Cambridge University Press. pp. 144–145. ISBN 978-1-009-26298-9.
  92. ^ "Claude Shannon's 100th birthday". Google. 2016.
  93. ^ Katie Reilly (April 30, 2016). "Google Doodle Honors Mathematician-Juggler Claude Shannon". Time.
  94. ^ Menchie Mendoza (May 2, 2016). "Google Doodle Celebrates 100th Birthday Of Claude Shannon, Father Of Information Theory". Tech Times.
  95. ^ "Google Doodle commemorates 'father of information theory' Claude Shannon on his 100th birthday". Firstpost. May 3, 2016.
  96. ^ Jonathan Gibbs (April 29, 2016). "Claude Shannon: Three things you'll wish you owned that the mathematician invented". The Independent.
  97. ^ David Z. Morris (April 30, 2016). "Google Celebrates 100th Birthday of Claude Shannon, the Inventor of the Bit". Fortune.
  98. ^ Feder, Toni (July 19, 2019). "Review: The Bit Player, an homage to Claude Shannon". Physics Today (7): 5159. Bibcode:2019PhT..2019g5159F. doi:10.1063/PT.6.3.20190719a. S2CID 243548904. Retrieved August 3, 2019.
  99. ^ a b c d "Bell Labs Advances Intelligent Networks". Archived from the original on July 22, 2012.
  100. ^ a b Claude Shannon (1950). "Programming a Computer for Playing Chess" (PDF). Philosophical Magazine. 41 (314). Archived from the original (PDF) on July 6, 2010. Retrieved January 2, 2018.
  101. ^ Grime, James (July 24, 2015). How many chess games are possible?. Numberphile.
  102. ^ "Early Computer Chess Programs by Bill Wall". billwall.phpwebhosting.com.
  103. ^ Shannon, Claude E. (1950). "A Chess-Playing Machine". Scientific American. 182 (2): 48–51. Bibcode:1950SciAm.182b..48S. doi:10.1038/scientificamerican0250-48. ISSN 0036-8733. JSTOR 24967381. PMID 15402252.
  104. ^ Hamid Reza Ekbia (2008), Artificial Dreams: The Quest for Non-biological Intelligence, Cambridge University Press, p. 46, ISBN 978-0-521-87867-8
  105. ^ a b Effros, Michelle; Poor, H. Vincent (2017). "Claude Shannon: His Work and Its Legacy". EMS Newsletter. 2017–3 (103): 29–34. doi:10.4171/NEWS/103/5. ISSN 1027-488X.
  106. ^ Poundstone, William (2005). Fortune's Formula : The Untold Story of the Scientific Betting System That Beat the Casinos and Wall Street. Hill & Wang. p. 307. ISBN 978-0-8090-4599-0.
  107. ^ a b c Benello, Allen C.; Biema, Michael van; Carlisle, Tobias E. (2016). Concentrated Investing: Strategies of the World's Greatest Concentrated Value Investors. Hoboken, New Jersey: Wiley. pp. 79–81. ISBN 978-1-119-01204-7.
  108. ^ "Newsletter". IEEE Information Theory Society. IEEE. June 2015. Archived from the original on July 9, 2015.
  109. ^ "Videos". Israel: Technion. Archived from the original on July 6, 2015. Retrieved July 5, 2015.
  110. ^ "Sergio Verdú". Twitter.
  111. ^ "Newsletter". IEEE Information Theory Society. IEEE. September 2014. Archived from the original on September 4, 2015.
  112. ^ "Shannon Centenary". IEEE Information Theory Society. IEEE.
  113. ^ "Postage Stamps 2016". А.Д. Пошта на Северна Македонија. posta.com.mk. Retrieved July 26, 2024.
  114. ^ "Shannon's centenary US postal stamp — Information Theory Society". www.itsoc.org.
  115. ^ "Celebrating the Work and Life of Claude Elwood Shannon". IEEE Foundation. Archived from the original on August 3, 2019. Retrieved August 3, 2019.
  116. ^ "George Boole 200-Conferences". Archived from the original on September 6, 2015. Retrieved September 21, 2015.
  117. ^ "Compute and Communicate | A Boole/Shannon Celebration".
  118. ^ "Claude Shannon centennial logo, a Logo & Identity project by cfrag1". www.crowdspring.com.
  119. ^ "Saving Face: Information Tricks for Love and Life (Math Encounters Presentation at the National Museum of Mathematics". ).
  120. ^ "Claude E. Shannon". www.itsoc.org. Retrieved September 13, 2024.
  121. ^ Roberts, Siobhan (April 30, 2016). "Claude Shannon, the Father of the Information Age, Turns 1100100". The New Yorker. Retrieved April 30, 2016.
  122. ^ "Claude Elwood Shannon". The Franklin Institute. January 11, 2014.
  123. ^ "Claude Elwood Shannon". February 9, 2023.
  124. ^ "Harvey Prize". Technion — Israel Institute of Technology. Haifa, Israel.
  125. ^ "American Society of Civil Engineers Alfred Noble Prize". American Society of Civil Engineers. Retrieved April 27, 2020.
  126. ^ "The President's National Medal of Science: Recipient Details | NSF – National Science Foundation". www.nsf.gov.
  127. ^ "Claude Elwood Shannon | Kyoto Prize". 京都賞.
  128. ^ "IEEE Morris N. Liebmann Memorial Award Recipients" (PDF). IEEE. Archived from the original (PDF) on March 3, 2016. Retrieved February 27, 2011.
  129. ^ "Claude Shannon". National Academy of Sciences. July 2, 2015. Retrieved March 25, 2019.
  130. ^ "IEEE Medal of Honor Recipients" (PDF). IEEE. Archived from the original (PDF) on April 22, 2015. Retrieved February 27, 2011.
  131. ^ "Golden Plate Awardees of the American Academy of Achievement". www.achievement.org. American Academy of Achievement.
  132. ^ "C.E. Shannon (1916–2001)". Royal Netherlands Academy of Arts and Sciences. Retrieved July 17, 2015.
  133. ^ "APS Member History".
  134. ^ "Royal Irish Academy Acadamh Ríoga na hÉireann Annual Report 2001–2002" (PDF).
  135. ^ "Award Winners (chronological)". Eduard Rhein Foundation. Archived from the original on July 18, 2011. Retrieved February 20, 2011.
  136. ^ "Marconi Lifetime Achievement Award". marconisociety.org.
  137. ^ Staff (February 27, 2001). "MIT Professor Claude Shannon dies; was founder of digital communications". MIT News. Retrieved April 4, 2023.
