Science Building, Room 227
For further information contact Paul Grobstein.
Summary by Paul Grobstein
Given that I know what I intended to say, I'm probably not the best person to try to summarize ("compress"?) this particular conversation, at least the part for which I was the stimulus. On the other hand, some bits resonated strongly ("expanded"?) for me, and there is always the forum area for others to provide additional/corrective stories/expansions of what happened. I trust others will do so.
My suggestion that information needs to be seen as "relational" rather than as an intrinsic characteristic of things clearly generated controversy. People seemed comfortable with the idea that "meaning" is relational/receiver-dependent, but not with the idea that "information" is.
A discussion of proteins seemed helpful in thinking more about this and related matters. Protein tertiary structure is NOT a simple expression of "information" in DNA. It depends both on a complex "decoder" and a context, which in turn allows it to be variable in functionally significant ways. In the absence of the decoder, there is no "information" in a DNA sequence. Two different, statistically random, strands of DNA are distinguishable in terms of information content only in the presence of a relevant decoder. And even then, the tertiary structure at any given time is determined by "addition" of information (the surroundings for the protein).
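The relational point can be made concrete with a toy sketch (my own illustration, not part of the discussion; the decoders and sequences are invented). The same string yields entirely different "information" depending on which decoder reads it, and with no decoder at all it yields none:

```python
# Illustrative sketch: "information" as a relation between a sequence
# and a decoder, not a property of the sequence alone.

def codon_decoder(seq):
    """Reads three letters at a time, using a toy fragment of a genetic code."""
    table = {"ATG": "Met", "TGG": "Trp", "TAA": "Stop"}
    codons = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    return [table.get(c, "?") for c in codons]

def gc_decoder(seq):
    """A different decoder: extracts only the G/C fraction of the same sequence."""
    return sum(base in "GC" for base in seq) / len(seq)

seq = "ATGTGGTAA"
print(codon_decoder(seq))  # ['Met', 'Trp', 'Stop']
print(gc_decoder(seq))     # 0.333...
```

Neither decoder's output is "in" the string by itself; each is a joint product of string and decoder, which is the sense in which two statistically random strands are distinguishable only in the presence of a relevant decoder.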
The introduction of the immune system as an instructive example was, I thought, even more productive. Antibodies are (to a significant extent) randomly generated sequences of amino acids. The presence or absence of biologically significant "information" in such sequences is entirely a function of whether they do or do not "fit" antigens (here the notion of "potential information" is important ... sequences that do not yet fit antigens might in the future). Moreover, the immune system is a clear example of a "decoder", and one where the notion of a pre-existing catalogue of possible states (a sine qua non for classical information theory) is missing (there are an infinite number of possible states). In addition, the immune system is an information "creator" with the capacity to discriminate and respond to "newness" (using a fundamental randomness in a way akin to evolution). It is also a "model", with the capacity both to be altered by past inputs and to "predict" to some degree future ones.
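The generate-randomly-then-select logic can be caricatured in a few lines (a deliberately crude sketch of my own; the "antigen", alphabet, and matching rule are all invented). A repertoire of random strings carries no information about an antigen until a fit is tested:

```python
import random

# Toy sketch of the immune-system point: candidate "antibodies" are
# random strings; whether any one of them carries "information" depends
# entirely on whether it fits a given "antigen".

random.seed(0)  # for reproducibility of the random repertoire

def fit(antibody, antigen):
    # Hypothetical matching rule: number of positions where letters agree.
    return sum(a == b for a, b in zip(antibody, antigen))

def random_antibody(length=5):
    return "".join(random.choice("ABCD") for _ in range(length))

antigen = "ABCAB"
repertoire = [random_antibody() for _ in range(200)]
best = max(repertoire, key=lambda ab: fit(ab, antigen))
print(best, fit(best, antigen))
```

Note that the repertoire is generated with no catalogue of possible antigens in advance; selection against whatever antigen actually shows up is what retroactively makes some random sequences informative, and sequences that fit nothing today remain "potential information" for antigens not yet seen.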
The latter was additionally relevant to Steve's brief synopsis of "Clock-Watchers" outlining Hartle's notion that "time" may have its origins in information-processing. Hartle uses a "consciousness" processor to get the "internal experience" of time but his core idea is expressed by the notion of successive storage registers through which information moves. An intriguing possibility here is that an adequate understanding of "information" may require not only a fusion of that concept with those of matter and energy but with that of "time" as well.
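The successive-registers notion can be sketched mechanically (my own minimal illustration, not Hartle's formalism): each new input pushes earlier inputs one register deeper, so "pastness" is nothing more than register depth, and "time" emerges from the ordering the shifts impose.

```python
# A minimal sketch of "time" as successive storage registers: each step
# shifts stored items one register deeper; the oldest item falls off the end.

class RegisterChain:
    def __init__(self, depth):
        self.registers = [None] * depth  # registers[0] is the "present"

    def step(self, new_input):
        # Shift everything one register deeper; the deepest item is lost.
        self.registers = [new_input] + self.registers[:-1]

    def present(self):
        return self.registers[0]

chain = RegisterChain(depth=3)
for observation in ["a", "b", "c", "d"]:
    chain.step(observation)
print(chain.registers)  # ['d', 'c', 'b'] — most recent first
```

Nothing in the chain "experiences" time; the before/after structure is just the geometry of where information sits, which is the flavor of the idea that time may originate in information-processing.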
An intriguing, self-reflective issue that arose at the end of the discussion related to the homogeneity (along some dimensions if not others) of the working group participants in light of "expansion/contraction" cycles that are perhaps fundamental to information-processing devices. The intuition of at least some working group participants is that available materials may be adequate for a fundamentally new and improved understanding of information if one can just winnow and tweak them in the right way. That sort of "contraction" process may take substantial time, offer few intermediate points by which to evaluate progress, and require an ability/inclination to discard things that might or might not be relevant without knowing in any immediate sense which is the case. In short, it could be that the group is "wasting time just like (some people) think we are". There was a consensus that compression and expansion are equally significant parts of meaningful information-processing. Whether we are wasting time in the current compressive effort will take time (iterations) to tell.
In the next session (beginning at 9:30 instead of 9), Steve will talk a bit about von Baeyer's Information. One issue that has already arisen from that book is the question of whether there is or is not additional "information" in the laborious expansions of Einstein's equations that have been a major contribution to 20th century physics. In the following session, Eric will talk a bit about some work he and I have been doing in relation to linguistics. Additional topics that seem worth taking up include algorithmic complexity, Hartle's more formal paper on time/information-processing, and Bayesian inference.
© 1994- , by Center for Science in Society, Bryn Mawr College and Serendip
Last Modified: Wednesday, 26-May-2004 08:50:58 EDT