Grammatical Man: Information, Entropy, Language, and Life is a 1982 book written by Jeremy Campbell, then Washington correspondent for the Evening Standard.[1] The book examines the topics of probability, information theory, cybernetics, genetics, and linguistics. Information processes are used to frame and examine all of existence, from the Big Bang to DNA to human communication to artificial intelligence.
Author | Jeremy Campbell
---|---
Subject | Information theory, Systems theory, Cybernetics, Linguistics
Publisher | Simon & Schuster
Publication date | 1982
Pages | 319
ISBN | 0671440616
> For Laplace's "intelligence," as for the God of Plato, Galileo and Einstein, the past and future coexist on equal terms, like the two rays into which an arbitrarily chosen point divides a straight line. If the theories I have presented are correct, however, not even the ultimate computer -- the universe itself -- ever contains enough information to specify completely its own future states. The present moment always contains an element of genuine novelty and the future is never wholly predictable. Because biological processes also generate information and because consciousness enables us to experience those processes directly, the intuitive perception of the world as unfolding in time captures one of the most deep-seated properties of the universe.
Campbell also discusses John von Neumann's work in relating information theory, evolution, and linguistics to machines. The chapter closes with an examination of emergent systems and their relation to Gödel incompleteness: to understand complex systems, such as a large computer or a living organism, we cannot use ordinary, formal logic, which deals only with events that definitely will or definitely will not happen. A probabilistic logic is needed, one that makes statements about how likely or unlikely it is that various events will happen.