Brown Bag 2004 - 2005 Forum
|response to Will and Anne|
Name: Liz McCorm
Date: 2004-11-18 15:28:41
Link to this Comment: 11654
I would argue that the external judgment you both seek is that Real, physical world out there! We know it imperfectly and incompletely, and "the view from everywhere" is perhaps the closest we can ever get to it (this is what we mean by "distillation" I think). So I would argue that Reality is the arbiter between memory and knowledge. (Granted this may be a circular definition of what I mean by knowledge...)
My work has led me to view Reality as interaction and transformation. Our articulation of Reality can be (and often is) parameterized by a notion of time in order to tell the story of cause and effect and in order for us to relate to it, but I would argue that the parameterization of change by using time is not actually required, as in demanded by, the physical world out there.
Could it simply be that the "past, present and future" are complications that arise from our particular biology's interaction with the Real world?
Name: Paula Vite
Date: 2004-09-18 13:23:19
Link to this Comment: 10861
Name: Paul Grobstein
Date: 2004-09-17 14:42:29
Link to this Comment: 10859
As always, this is a place for informal discussion, for sharing ideas you have that might be useful to others and finding ideas from others that might be useful to you. There are no "last words" or "authorities" here, only the kind of sharing of thoughts that makes for good conversation. So join in, and let's see what sense we can make together of "Science's Audiences". Thanks to Sharon, Wil, and Paula for getting us together.
|memory, emotional needs, and ...|
Name: Paul Grobstein
Date: 2004-11-18 11:36:35
Link to this Comment: 11651
A few quick thoughts along these lines, as a contribution to further discussion? I do indeed think "historical memory springs from an illusion that we can better survive if there is one narrative to sustain us." In accepting the term "illusion", I don't intend to be derogatory. ALL understandings are "illusion" in the sense that they are "stories" or "paintings" of things that can in principle be painted/told differently. And most exist because they have "worked" in the past.
Hence, the operative question in this case is whether the notion of "one narrative to sustain us" continues to work well in the present or ought to be replaced by other stories/paintings that might work better. My own sense is that the reliance on ONE narrative (whether of "home" or of anything else) has proven to be a problem in a variety of contexts, and so we might look for/try and create a story that doesn't have this characteristic. "I Am, and I Can Think, Therefore ..." makes this argument more extensively and suggests what forms a new story might take.
Looking forward to further conversation along these lines, here and/or in several other relevant places on Serendip, including The Place of the US in the World Community - November 2004.
| Why and how memories get constructed|
Name: simone wad
Date: 2004-11-16 20:45:58
Link to this Comment: 11618
|Deconstructing goals of large intro courses|
Name: Wil Frankl
Date: 2004-11-22 12:09:41
Link to this Comment: 11711
I think we have a very dysfunctional educational system. In college we are teaching to such a diverse group we can never serve them all. Who should college serve? What should college serve? In keeping with Science in Society’s named goal, we cannot abandon the hope that science becomes accessible to all in some form or another. Here’s an idea; then we can deconstruct what it’s based on:
In K-12, abolish all content-based benchmarks in everything but reading, writing and mathematics. As soon as students reach a certain level of proficiency in these areas, they are released to undertake more electives. All other curriculum is chosen by students. …In fact each student can choose their own topics (within very wide limits) and research them in a specified but flexible format (again, within limits that model different ways of learning, but achieve stated goals). The format reinforces critical thinking, making observations and marshalling relevant observations to construct and support reflective conclusions/arguments. At each grade level, projects become more sophisticated and the number of them to be completed is increased. In this way we have a student body that knows how to think for themselves and does not assume that we all are thinking about the same thing.
In college the goals change. Now we teach content. The teaching goal is only to raise the bar high and expose them to as much as possible. Those who wish to be scientists do whatever it takes for them to learn the material; those who find other topics more engaging dive head first into those disciplines.
I think this decouples two separate goals of college that cannot efficiently be met, namely teaching students to think and teaching them what to think.
Can it be done? Is it worth doing?
|A Fact Is an Act|
Name: Anne Dalke
Date: 2004-09-26 22:42:25
Link to this Comment: 10955
So, here's Anne, belatedly weighing in, and dragging along with her
What I actually want to say is something not said in the brown bag, and so not offered among the options Paula lists: it's a recuperation of "fact" from the O.E.D., where (in the first four instances) it means "act" (from L. fact-um, thing done; was adopted as "feat"). It's fact as act that I want to highlight, fact as action which most interests me: fact as praxis, pusher, mover, provocateur....
a fact, I'll go so far as to provocatively claim, is that which has the power to cause action. It's not just any one of an infinite number of stories, but only the story with the punch (perhaps it's the punch that fiction lacks?)
I can think of multiple examples of this from the summaries: Liz observing, for example, that we are all anxious writers because we fear connecting a particular set of dots, choosing a particular narrative (and so neglecting another), and facing the consequences of that act/fact. There are even stronger examples from the diversity discussions: conversations about what happens when a customs group becomes a "fact" (i.e., an "end in itself, protective of its borders, not just a pass-through to other things"), about what happens when history becomes a "fact," and so is "intractably" re-enacted in the present Middle East.
So: here's my contribution to the kind of productive provocation that keeps the ball moving: a fact is that which does just this--moves things along. Until the ball stops rolling, and is replaced by another....
Someone else's turn.
Name: Paul Grobstein
Date: 2004-09-23 20:46:57
Link to this Comment: 10938
WAS, I confess, taken aback by the introduction of the term "fact". USEFULLY so, but let me back up a few steps, and this is I'm afraid going to sound snide before I get to the useful parts. I stopped believing in "facts" sometime in my late teens or early twenties. And as I, having to face an apparent oddity in myself, think back on it (here comes the hopefully useful part), the dropping of the term "facts" from my conceptual vocabulary had nothing whatsoever to do with my education or experiences as a scientist. It had to do instead with the Vietnam war and the realization that one person's "fact" was another person's fiction, that "fact" meant nothing more than, at worst, something one person or group of people said to get other people to behave the way they wanted, and, at best, something that was understood in common by lots of people who had, for one reason or another, no reason/inclination to question it.
So, I really WAS puzzled by the term "fact", and have spent some time since trying to see where, if anywhere, it fits in my conception of "science". I'm quite comfortable with "observation" as Al used it (with the caveat I expressed at the time, that there is inevitably a brain-related construction going on). And I am quite comfortable with "story", as "a way to make sense of ("summarize in a convenient form that motivates further inquiry") observations". I don't even have serious problems with "fact" in the sense of "constants of nature, independent of human action", as long as it is understood that "nature" is a story (subject to future revision as it has been in the past), and that neither "independent of human action" nor "constant" can be understood as anything more than a summary of observations made to date (and so "constants" may change and "independent of human action" may turn out not to be so as we come to better understand humans).
What I DON'T like (as a "scientist", irrespective of what radical social constructivists/deconstructivists/any one else thinks), is the idea that there is something "established" about knowledge. Yes indeed "science" is a social enterprise, and yes indeed there is at any given time a dominant, negotiated, consensus "story" which is represented by some people (both scientists and non-scientists) as THE story. And yes indeed historians of science can (and do) usefully look into the factors and processes at play in the evolution of any given "dominant, negotiated, consensus story". That far I'm happy to go, BUT ...
For historians of science (or anyone else) to call THAT story "fact", given the term's overtones of universal agreement and permanence (to say nothing of cosmic idealism), is to miss entirely the deeper and more important characteristic that science is fundamentally about skepticism as well as the associated fundamental scientific phenomena of continual story diversification and evolution (cf Science Matters ... How?). Assuming historians of science are actually trying to help humans better understand science as a human institution, to demystify it, doesn't it seem a bit odd to use a term that implies to most listeners something quite different: that scientists are engaged in an activity that gives them a special and distinctive claim to the cosmic, to stories entitled to be regarded as permanent and worthy of universal agreement?
There IS, for better or worse (I think better, for the most part) a social dynamic to science, and it is well worth studying and describing. And there is, not only despite it but also because of it, some "progress" in science. The progress is not, however, toward increasing the number or quality of "facts" (in the sense either of "constants of nature independent of human interaction OR of negotiated dominant consensus). The progress can be measured only as the spread of productive skepticism and (more or less the same thing) the increasing elaboration of stories that productively summarize ever greater numbers of observations (cf . Science, Pragmatism, and Multiplism).
I trust that at the least I am now immune to any charge that I neglected my contributions to the kind of productive provocation that keeps the ball moving?
Name: Paul Grobstein
Date: 2004-11-15 11:07:01
Link to this Comment: 11572
An additional thing that I heard myself say (in response to questions, challenges, as above; thanks to all for helping me think through these things) is that similarities in "memory" between people, as well as over time within individuals, are probably due not only to the fact that many of us have similar experiences (and hence similar traces) but also (even more?) to the fact that we have, as humans, great similarities (for both genetic and cultural reasons) in things OTHER than traces generated by similar experiences, ie in "genetic" traces as well as in the mechanisms we use for trace and story selection.
With that said, there probably ARE generalities about "Why and how ... particular memories get constructed" that could be developed (and have been developed, in other terms, in existing disciplinary literatures). But, at the same time, they are unlikely to fully account for why/how any PARTICULAR person constructs any PARTICULAR memory, and it is unlikely to be the case that starting from a particular person and a particular memory will yield any general insights. The brain is, if nothing else, the best existing exemplar of diversity. And so memory is likely to be HIGHLY idiosyncratic ... the generalities to be found will not explain any particular memory (here, particularity reigns supreme) but may help to explain why there are common features of memory creation across populations of individuals (the question I said ought to replace the question of why people differ in their memories).
|Knowledge & memory?|
Name: Wil Frankl
Date: 2004-11-15 11:07:46
Link to this Comment: 11573
Name: Anne Dalke
Date: 2004-10-31 22:33:16
Link to this Comment: 11294
Cannot help but think the readers of today's (10/31/04) New York Times article, "When No Fact Goes Unchecked," might find this forum (in particular, our earlier discussions about and definitions of "fact") of use. According to the Times,
"volunteer armies of nitpickers are taking facts down to the subatomic level where they can become as meaningless as a nose-to-canvas perspective on a pointillist painting...increasingly, these smallest indivisible hunks of information are being subjected to microscopic scrutiny and high-energy attacks in the realm of public discourse, which has made things look a little less solid and more malleable than they might once have seemed."
Indeed. For further discussion of this phenomenon, check out the account of Don Barber's interesting presentation, last Friday, on Interpreting Climatic Catastrophe, which includes (among other things) a conversation about the inadequacy of the past as a predictor of the future, and on the "irrational" (?) way we humans handle probability and calculate risk.
|In dialogue with history?|
Name: Anne Dalke
Date: 2004-11-03 10:17:48
Link to this Comment: 11330
Watching the returns last night and this morning, being told repeatedly (and convincingly) that the past works well as a guide to the future ("these states went for Bush four years ago, statistically it's unlikely that....") I was reminded of the conversation provoked by Don's talk last week: whether--and how-- the past works as an adequate predictor of the future.
I had two other related encounters w/ history (or: engagement with stories about history) this week. The first occurred in Coetzee's review of Philip Roth's The Plot Against America, in the 11/18/04 NYReview of Books:
the history we learn from history books is only a domesticated version of the real thing. Real history is the unpredictable, "the relentless unforeseen." The terror of the unforeseen is what the science of history hides....a history book...chronicles the irruption of the relentless unforeseen...poetry is truer than history...because of its power to condense and represent the multifarious in the typical.
The second took place in the exhibit up now in Canaday of Works by Mark J. Aeschliman, which is prefaced with the passage,
"Maintain a dialogue with history, but do not be overwhelmed by it."
The exhibit is of images of the heads of Roman emperors, painted on packing crates, and the commentary to the exhibit asks us to note how these rulers were revered (and so portrayed) as gods, incarnating a classical conception of just omnipotence and heroic power:
"But politicians today are profane...transcendent divinity incarnate in human form is a metaphysical possibility distant from us.... Where today does divinity reside? A place...vacant, unoccupied..."
So: I'm looking forward to whatever intervention Elliott and Paul will have to offer, when they talk next week about History, Memory and the Brain. What lessons for the future are to be learned from the outcomes of this election?
|translation and access|
Name: Anne Dalke
Date: 2004-11-19 18:04:06
Link to this Comment: 11687
I want to write a little bit here about the conference on Signs and Voices: Language, Arts, and Identity from Deaf to Hearing which was held in the Tri-Co last weekend. Several things in that astonishing four-day event struck me as being highly relevant to, and useful for, our conversations here. First: the good groundwork that had been laid to assure that the varieties of people who came together around the topic of deafness could all hear one another (every event was represented in three ways: spoken, signed and synchronously-translated-and-displayed-on a-big-screen; there were also individual hearing assistance devices available). It was of great interest to me to listen--and simultaneously watch--two performances of everything that was said, to notice the different representations of all that was offered, and to note (bounded by my own capacities for interpretation) the particular limits and expansiveness of each form.
In light of our repeated discussions in this series about "translation"--that is: how effectively to convey the work of scientists to the public--I was particularly impressed by David Corina's presentation on Saturday morning. Because I was responsible for the Bryn Mawr portion of the conference, I was worried about attendance for that session, and fretful ahead of time that--if anyone came!--they would have difficulty understanding what David said, or its relevance to their own experience. How could/would such "scientific" and "specialized" work "translate"? I needn't have worried. David is himself multi-lingual (=signs) and speaks clearly; many members of the audience spoke/signed afterward about the ways in which their own brain experiences corroborated or challenged the work David was doing. This was for me a paradigmatic expression of a different dynamic from the one traced by Kalala Ngalamulume and Don Barber in their talks--of an
"'informed' group who thinks they have some helpful ideas to contribute to a broader society, and find their efforts to convey scientific ideas confounded because each person has ideas on the topic that have been shaped by their own experiences."
What we got on Saturday morning was rather something much more insistently and effectively bi-directional: a brain researcher was telling people (who have particular sorts of brains) what studies on language processing were being done on folks like them; those folks talked back to him. Everybody (including listeners-like-me) learned.
This seems to me of even larger relevance. In a nearby location, called Making Sense of Diversity, we've been talking a lot about how to keep conversations inclusive, not close them down, not shut folks out. During Brenda Jo Bruegemann's presentation on Saturday evening, she observed that ASL is the fastest-growing "foreign language" in American colleges and universities today; @ the same time, fewer deaf children are learning it--either because they are getting cochlear implants or because they are being educated orally. What is happening, then, to the uniqueness, the specialness, of "Deaf" culture, which has so insistently in recent years refused to define itself as "disabled" (and thereby instantiated a hierarchy between itself and those seen as "less abled"?) Who will "own" sign language--and the Deaf culture it figures--if more hearing people learn to use it? (Many interesting pedagogical observations arise here, too, around why so many hearing students are interested in this new-to-them language; might it have to do w/ teaching styles that are newer, more visually and kinesthetically oriented than those employed in the more-conventionally-taught foreign languages?)
All of this seems highly related to today's session on Exploring Pathways to Access and my own challenge that supplementary learning programs perpetuate "bad" learning practices in the "primary" locations (i.e. large introductory science lecture courses), which I hope we can discuss further....
|what works ...|
Name: Paul Grobstein
Date: 2004-11-14 18:12:52
Link to this Comment: 11559
"Useful" shouldn't be understood as a "guide" for "what works" anymore than "accuracy" should be, though for interestingly different yet related reasons. "Accuracy" in reconstruction of the past isn't a "guide" because the present is NOT the past (even assuming such a thing as the "past" actually exists); what worked in the past (again assuming it exists) will not necessarily work in the present. "Useful" is not a "guide" for "what works" because there is no way to determine what works without doing it. To put it differently, "useful" and "what works" are the same thing, and that attribute cannot be assessed independently of action.
Another way of saying this is that just as the past is a somewhat arbitrary reconstruction by the brain based on present traces and some somewhat arbitrary mechanisms of backward extrapolation, so too is the future a somewhat arbitrary reconstruction based on present traces and some somewhat arbitrary mechanisms of anticipatory or forward extrapolation. There is no way to know with any degree of certainty what will "work/be useful" in the "future" anymore than there is any ability to recover with any certainty what existed in the "past".
As in biological evolution, what exists in the present is, by definition, what "works/is useful", in all its diverse forms. Extrapolating from biological evolution (with its associated presumption that the concepts of "past", "future", and "time" itself are meaningful), some things that exist in the present will persist in the future, other things will be influential for things that exist in the future, still other things will more or less disappear in the future, and there will continue to be a great multiplicity of things that "work/are useful". One may (usefully?) try and guess which present things will fall into which categories but there is no way to actually know except by doing the experiment (ie by letting the process of iterated present change take place).
And so it goes ... ?
|on wanting more....|
Name: Anne Dalke
Date: 2004-11-14 14:58:31
Link to this Comment: 11551
So: I just wrote up a report of our discussion on History, Memory and the Brain. Am still not satisfied. I buy the argument--that memories don't exist, in our brains, in a sort of "file" we can access; that we, rather, construct them out of neural traces....
but this just seems to push the interesting questions back a couple of steps: why and how then do particular traces get accessed, why and how do particular memories get constructed? And--once the rubric "useful" replaces "accuracy" as the guide for "what works"--where is that judgment located, how and by whom does it get exercised? How individual, how local is it, how collective, how global? If I find it "useful" to deny that the Holocaust took place (because I can escape the guilt, or the reparations...)? If I find it "useful" to tell the story of the elections in a red and blue, rather than a purple map, because I want to exacerbate the polarities in this country...?
Liz was suggesting that the process of selection is one of "distillation," of finding the places where the "memory," the "history," is shared by the largest number of people....But if the shared story is not useful to me...I'm "free" to re-write the history? Based on....? Are there any limits to my story? (If so,) where do they come from?
The Emergent group also arrived (via a different path: the history of the universe) @ this same series of queries last week: once we acknowledge the profound unrecoverability of the past and similar unpredictability of the future: where then lies the ability to recover any of the past (much less make predictions at all....)?
|observing good teaching|
Name: Anne Dalke
Date: 2004-10-03 15:45:38
Link to this Comment: 11009
I've just finished writing/putting up the summary of the discussion Tamara led last Friday on Presenting Science Within and Outside of the Lab. Took note, while doing so, of the closing observation that
The hard discipline of "attending to the observations, not the story" is not unique to science, and might be useful for all of us as we (for example) watch a presidential debate: how open are we to actually hearing what is said? How bound is what we notice to the story we already have at hand (i.e. which candidate we prefer)?
My daughter reported that her high school American history teacher played a portion of the debate for them on Friday with the sound turned off; then with the sound on, but no visuals; then gave them a transcript of the conversation to review. They were, in each case, to record the details they observed --a good way of teaching them, I thought, to "attend to their observations," rather than to the story they had going in....
|preserving the perverse: the use of scientific fic|
Name: Anne Dalke
Date: 2004-09-28 20:53:36
Link to this Comment: 10983
I'm teaching the core course on Gender and Sexuality over @ Haverford this semester. Today we read (among other things) Samuel Delany's essay "Aversion/Perversion/Diversion," which ends w/ the observation, "I am trying to put a bit of the perversity back into perversion." So much of the queer theory we've been working w/ in this class tries to do just this: preserve queerness, refuse its incorporation into any fixed, any "factual" (especially "gay") identity....
This is not as unrelated (in my queer mind, at least) to the current conversation as it may appear. I understand the gesture to preserve "queerness" (aka the refusal to make the strange familiar) as a particular expression of a more general theoretical refusal of the binary between a "confirmed" fact and an "interpretation" of it. Another very useful recent text in this class was Making Sex: Body and Gender from the Greeks to Freud (1990) by the historian Thomas Laqueur, who says,
"Like the other sciences," writes Francois Jacob, winner of the 1965 Nobel Prize for medicine, biology...is no longer seeking for truth. It is building its own truths. Reality is seen as an ever-unstable equilibrium...swinging to and fro between...the identity of phenomena and the diversity of being." The instability of difference and sameness lies at the very heart of the biological enterprise....Auguste Comte, the guiding spirit of 19th century positivism, confessed that "there seems no sufficient reason why the use of scientific fictions...should not be introduced into biology." And Emile Durkheim...argued that "we buoy ourselves up with a vain hope if we believe that the best means of preparing for the coming of a new science is first patiently accumulating all the data it will use. For we cannot know what it will require unless we have already formed some conception of it." Science does not simply investigate, but itself constitutes, the difference [it] explores.....
In other words: constructing facts in order to act, building (temporary) structures for standing on and pushing against. Or, as Jody Cohen suggested in the Descartes forum some time ago, if we acknowledge that solid foundations are not really there--that they are instead our creations--then we can use these kind of like a springboard to push off into action and change.
|getting the observations straight ...|
Name: Paul Grobstein
Date: 2004-09-28 19:13:24
Link to this Comment: 10982
There is NO equivalence between "profound skepticism" and nihilism (see Writing Descartes for extensive discussion of this issue). As for "relativism" (which is closer but also not the same thing), it has a much better historical safety record than that of any of the other ism's (eg capitalism, catholicism, colonialism, communism) that use "facts" in any of the senses earlier considered ("constants of nature", "established knowledge", "negotiated consensus", etc). "Facts are agreed pieces of knowledge that empower" ... some people in relation to other people.
Productive provocations aside, I think there is, as Paula says, less disagreement between us than it might first appear, and that most of that disagreement can be chalked up to our different relations to the subject of discussion. Paula is an historian of science, an observer from the outside, for whom the description of broad patterns ("facts") is important as a foundation from which to achieve further understandings by applying the tools of her profession. I, on the other hand, am a scientist. As a participant in the enterprise, I am necessarily also a shaper of it, at least by default. And being congenitally unable to participate in the shaping of something without thinking about that shape, I can't but notice when others (both scientists and non-scientists) describe what I am involved in in ways that differ from what I myself am trying to achieve (and others as well). For me a "cultural consensus" (if it in fact exists) is nothing more (or less) than an observation relevant to helping me decide how to better do my job. There may indeed be a tension between what I observe and what I aspire to help create but there is no contradiction (though I can see that such could exist for an outside observer).
The term "fact" is now ok with me (though I doubt I'll ever myself use it), as long as it is understood in terms of one or another of the various clarifications that have arisen in this (it seems to me productive) conversation. And I'm perfectly happy about and think quite useful the addition of "pusher, mover, provocateur" (though I'm a little dubious about the etymological connection).
Since we've gotten productively this far, let me though poke a little further. Gould's "fact" as that from which "it would be PERVERSE to withhold provisional assent" (emphasis mine) is right on the edge of unacceptability for me. It's ok IF it is explicitly understood that what is "perverse" is appropriately and necessarily an individual judgement, and that an individual may at one or another time not only reasonably but wisely decide to withhold even "provisional" assent to see what that brings into view that couldn't be seen before.
The key here is the understanding that there really ARE, at any given time, an infinite number of stories yet to be told/tested (as well as, yes, a very large number that have been told/tested/found wanting and been appropriately discarded), and that being "perverse" is a very good way to get to many of them. The expanse of potential stories need NOT be "paralyzing". Indeed, the infinitude of the future is, I would argue, exactly what gives science its distinctive flavor and energy. Like evolution, it proceeds by exploration of a space of opportunities whose extent is continually expanded by its own activity.
Yes, I think there is an equation of some kind between "the number of stories one can access and implement" and how "privileged" one is or isn't. But what that relationship suggests to me is not that people need more "facts" (in the "traditional" sense of that word) so their choices are more limited, but rather that people, all people, are better off believing in the infinitude of the future and being provided with the wherewithal to maximize the choices they can pursue to participate in exploring it. The really important issue for me, both as a scientist and as a human being, is not life "as we experience it here and now". In life, as in science, I am not an observer but a participant, and so I am interested in the description of what is now only insofar as it is relevant in deciding where to try to go next.
Looking forward to further observations and stories. Here and elsewhere.
|a f-act is an act forward|
Date: 2004-09-28 15:23:14
Link to this Comment: 10979
|Stephen Jay Gould's facts|
Date: 2004-09-28 15:48:57
Link to this Comment: 10980
|re-constructing the illusion|
Name: Anne Dalke
Date: 2004-11-18 17:07:56
Link to this Comment: 11657
Thanks, Liz--but I don't think the Real World quite gets me what I'm after.
I'm smiling just now @ Simone's evocation of Eiseley's work. I hauled out his essay on The Starthrower for my Sex and Gender course a few weeks ago, wanting to share with the students (who were looking insistently for an "origin story" that would guide present behavior; and share now again w/ anyone interested here) Eiseley's observation that
form is an illusion of the time dimension...the eternal struggle of the immediate species against its dissolution into something other...Form, once arisen, clings to its identity. Each species and each individual holds tenaciously to its present nature....
I've been thinking an awful lot, since last Friday, of my investment in nostalgia, my own desire to preserve "the form" of what I was (yesterday, last year, a decade ago). Being invited, by Paul and Elliott, to step outside that "form," to recognize it as "an illusion" that I might re-construct, whatever the form of the external world...
well: that's helpful.
|to the "audience"|
Name: Paul Grobstein
Date: //2005-01-29 09:56:21 :
Link to this Comment: 12303
Thanks, all, for thoughts/questions/reactions this afternoon. A few things that stick in my mind ...
The issue of whether one needs a certain level of background/skills/sophistication to contribute to science ...
|addendum, added thought|
Name: Paul Grobstein
Date: //2005-01-29 10:01:20 :
Link to this Comment: 12304
A "science" / "research" distinction ...
Pursuing the "ethics of science" notion further ... I think the "getting it less wrong" perspective should also preclude scientists from masquerading as "objective" (as in courtrooms and some political arenas), as well as promising things in the name of science that they can't deliver, eg a cure for cancer or a technological fix for terrorism. And it follows from this that there may be a need to be clearer about the difference between "research" and "science". The business of "science", on the present story, is to create the new "uncertain" story by creating and challenging good stories based on observations. And the recursive method of science can, I think, be trusted to do this, always and indefinitely. That METHOD can ALSO be used for other purposes, ie to develop technology, medical or otherwise. BUT whether it will work in these cases (ie whether a particular end can be achieved over a particular time interval with a particular financial expenditure) is, it seems to me, much less clear. And so I'd be inclined to distinguish between "research" (using the scientific method to try and obtain a particular objective) and "science" (with a foundation in "profound skepticism").
I fully realize I lay myself open here to the charge that I am being elitist, or nostalgic, or naive. In fact I don't think I'm being any of these. Lots of good things have come from the application of scientific method to real world/practical problems (including new stories) so I certainly don't mean to denigrate what I am calling "research" nor to deny its participation in and significance for what I am calling "science". But, equally, lots of bad things have resulted from a confusion between the use of scientific method to achieve particular goals and science as ongoing skeptical inquiry.
Just as I dislike the phrase "It has been proved by science that ... ", I equally mistrust the phrase "I am going to go about this scientifically ..." as a justification. If people want to use the scientific method of recursive observing, story telling, and testing to achieve a particular goal, they should by all means do so. But they oughtn't to be allowed to do so under the mantle of "science", whose success/integrity depends on NOT knowing (or caring) whether a particular objective can be achieved. Insisting on this distinction may well cost "science" social support. But not insisting on it has led and will continue to lead to popular disenchantment with science, and, even more importantly, to a lost opportunity for science to contribute to the rest of humanity a distinctive moral: "uncertainty" and "being wrong" are not only not eliminatable but should be celebrated as the source of all meaningful human creativity.
Name: Anne Dalke
Date: //2005-02-22 23:02:07 :
Link to this Comment: 13146
As I was typing up my summary of Hiroshi's talk last Friday on "Imaginary Dialogues," and as I have been discussing it since with colleagues around campus, I found myself still puzzling over his embrace (joined by @ least one scientist and one historian of science) of "illusory" as a description of the story-making (=audience-deceiving?) process. My usual source (the OED) confirmed that the "derisive" associations of the word constitute its primary meanings…which is why it doesn’t work for me as a descriptor of what any of us—humanists or scientists-- are doing (nor “should” be doing) w/ our audiences.
Now, I’ve been accused of an ineradicable belief that words have meanings independent of context; not so…but there are better and less good words for any given context. And I think, given the particular context of this year’s discussion of “Science’s Audiences,” and the even more particular context of Hiroshi’s comparing ‘em to theater audiences…
that "illuding" does not "fit," is not a useful word here. Although I’m familiar w/ the philosophical argument from illusion (that the objects of sense-experience cannot exist independently of the perceiver, since they vary according to her condition, her environment), I think that, in this context, “illusion,” in its 1st/2nd/3rd meanings as “an act of deceiving the bodily eye by false or unreal appearances, or the mental eye by false prospects, statements, etc.; deception, delusion, befooling”—is a decided misnomer. It’s the emphasis on falseness (which depends, of course, on a conviction of the true) which just doesn’t work for me.
Does it--"really"?--for others?
|illusion vs reality|
Name: Hiroshi Iw
Date: //2005-02-23 15:41:51 :
Link to this Comment: 13169
|more on the realness of the theatrical....and of t|
Name: Anne Dalke
Date: //2005-02-25 22:05:59 :
Link to this Comment: 13230
Cherríe Moraga spoke @ BMC on Friday afternoon about (among other things) the power of the theater to teach: its greatness "not dependent on literacy, but on the ways the body moves and can speak," w/out reading--w/out even using, sometimes-- words...
This intersected very powerfully for me w/ our year-long discussion about the need to open up the practice of science to include those not trained as professional scientists, to allow (for instance, again in Moraga's terms) the "citation of experience as justification for ideas," to recognize that "language influences ideas," and so not to insist on the use of a "foreign language (credentialized academic discourse)" which can "truncate thought." The academy is "not the only place people think w/ complexity," and we need to seek out and learn from the stories being told in other places....
|science as illusion?|
Name: the above
Date: //2005-02-27 14:02:08 :
Link to this Comment: 13257
The definitions of illusion Anne cites are not the only ones in the OED. In addition to being defined as an artifice to deceive, thus implying what Anne called a “derisive” action, illusion may also mean a passive, unintended result of such an act:
Something that deceives or deludes by producing a false impression; a deceptive or illusive appearance, statement, belief; the fact or condition of being deceived or deluded by appearances, or an instance of this.
(It’s also interesting that the Merriam-Webster considers the definition of illusion as "an act of deceiving" "obsolete.")
This is an instance of a concept that can have two meanings, one intentional, the other extensional. There’s a funny example of another of these notions (aiming at), which I read once in a philosophy article: Suppose Bob intends to shoot his donkey, but by mistake aims a gun at and kills his neighbor’s donkey, which looks very much like his own. “To aim” here has both an intentional meaning (Bob aims at the donkey with the intention or objective to kill his own donkey), and an extensional one (Bob aims or points his gun at the donkey and kills it).
When I defended scientific explanations as illusions (and since then I’ve revised my position, as explained below), I was ascribing to them an extensional quality. My reasoning was that sophisticated scientists construct explanations with the realization that they are temporary, that they are the best ones (i.e., most useful, most agreed upon by the scientific community) in the current scientific and social contexts. They do not claim these explanations are the only, ultimate Truths, but simply that they are contextualized (thus temporary), negotiated and stabilized (for the time being) truths. These scientists are not trying to deceive their colleagues or the public. They may believe that the ultimate reality of the natural phenomena they study is intangible, and thus that their scientific theories are ultimate illusions, but this is unavoidable and unintentional. To be a good scientist means to make this clear. Anne is right when she points to the ethical implication of naively equating scientific stories with illusions.
But this is only one way to approach the problem, and I am now unsure I fully subscribe to it. The reason is that to call a scientific explanation an illusion presupposes the belief that there is a True Reality out there science tries to capture, that this Reality matters, and that it can be reached through science. Whereas I see the existence of a reality as plausible and consequential, I do not believe in the possibility of ever knowing it fully (there are too many internal and external mediators). Thus, arriving at a particular Truth becomes irrelevant for me. But while science does not reveal Truths, it enables us to eliminate explanations that are less true, that is, less useful (as defined above). For some, this is an indirect way to get at reality. Whether scientists believe they will arrive at Truths (or truths) or not, it is important that they aim (intentionally) at Truths (or truths).
Name: Jan Trembl
Date: //2005-03-20 17:54:47 :
Link to this Comment: 13677
Name: Jan Trembl
Date: //2005-03-20 18:01:06 :
Link to this Comment: 13678
|the dreary work of data-collection|
Name: Anne Dalke
Date: //2005-03-22 22:52:23 :
Link to this Comment: 13898
I was on the other "lab visit," where we learned about the manipulation of lasers. I cannot describe, in nearly the detail Jan described above, what we did and observed--but I can testify to the main thing I learned from our hour in that windowless, noisy basement room: the tedium involved in preparing to gather and then making multiple precise observations. The amount of dreary repetitive work that goes into collecting the data that contributes to the ideas that I find so interesting--well: I don't see why y'all do it. I couldn't keep @ it, even spurred on, as Liz and Bob seem to be, by the fact that one is making observations that no one else is making. I need a quicker pay-off for my work. And I need a setting that is aesthetically pleasing.
|the central dogma lives on|
Name: Paula Vite
Date: //2005-04-02 23:05:59 :
Link to this Comment: 14224
And the full reference:
FIELDS, R. Douglas. "Making Memories Stick." Scientific American 292, no. 2 (Feb 2005): 74-81.
It turns out I gave a very poor, erroneous explanation of the article. It does not challenge (at least obviously) the central dogma. Wil, Sharon, Anne, you were right to be skeptical.
Let's see if I can do a better job now. What Fields suggests happens, in order to transform a short-term into a long-term memory, is the following:
1) Responding to stimuli, two (usually more) neurons fire and communicate (establish a synapse, or electro-chemical connection)
2) Repeated stimulation temporarily strengthens the synapse
3) A (still hypothetical) molecule travels from the synapse to the neuron's nucleus
4) In the nucleus this molecule activates a protein called CREB
5) CREB activates certain genes which code for proteins necessary to the long-term strengthening of the synapse
So, it's not that certain (repeated, important, useful, contextualized) memories become encoded in our genes. Our DNA already contains the coding for a protein responsible for making certain memories stick. It just needs to be "told" which memories are to be made long-term.
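For anyone who thinks better in code than in numbered steps, the logic of Fields's proposed pathway can be caricatured in a few lines. This is a toy sketch only: the class, the threshold value, and names like `CONSOLIDATION_THRESHOLD` are invented for illustration and have nothing to do with the actual biochemistry.

```python
# Toy model of the five consolidation steps above. Purely illustrative;
# all names and numbers are invented, not drawn from Fields's article.

CONSOLIDATION_THRESHOLD = 3  # step 2: repeated stimulation is required


class Synapse:
    def __init__(self):
        self.temporary_strength = 0.0  # short-term potentiation (step 2)
        self.long_term = False         # durable change (step 5)

    def stimulate(self):
        """Steps 1-2: each firing temporarily strengthens the synapse."""
        self.temporary_strength += 1.0
        if self.temporary_strength >= CONSOLIDATION_THRESHOLD and not self.long_term:
            self._signal_nucleus()     # step 3: hypothetical messenger molecule

    def _signal_nucleus(self):
        """Steps 4-5: the messenger activates CREB, which switches on the
        genes whose proteins lock in the strengthening."""
        creb_active = True
        if creb_active:
            self.long_term = True


s = Synapse()
for _ in range(3):
    s.stimulate()
assert s.long_term  # only repeated stimulation "makes the memory stick"
```

The point the sketch makes is the same one Paula makes in prose: the capacity for long-term change (CREB and its target genes) is already in the system; stimulation merely decides which synapses get to use it.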
Fear not, this is not yet the end of the central dogma! Too bad for the Neo-Lamarckists among you... Or??
|Science VS the Human Spirit?|
Name: Wil Frankl
Date: //2005-04-05 08:45:11 :
Link to this Comment: 14321
Inside the Injured Brain, Many Kinds of Awareness
By BENEDICT CAREY
Published in the New York Times:
Yet these statistics cannot explain the stories of remarkable recovery that surfaced during the debate over Ms. Schiavo's fate. There was Terry Wallis, a mechanic in Arkansas who regained awareness in 2003, more than 18 years after he fell into unconsciousness from a car accident; Sarah Scantlin, a Kansas woman who, also a victim of a car accident, emerged from a similar state after 19 years; and several others, whose collective human spirit seemed to defy the experts, and trump science.
The human spirit triumphing over science, beating the odds, that is the story that everyone wants to hear. That is the story that plucks our heart strings and carves deep traces/memories. How are we, the skeptical, ever to compete with that? It’s slanderous, really, as if science is opposed to the human spirit. The bold is my emphasis. Uurrgh!
|April Fool's Day, from the Sci American|
Date: //2005-04-05 19:17:30 :
Link to this Comment: 14332
Date: //2005-04-11 16:05:31 :
Link to this Comment: 14451
For once, I am not going to criticize anything (nor disagree with Paul!!)
And I don't think it's just the nice Spring weather. I really found Scott's presentation excellent, among the best in our BBag series this semester. It was important for our discussions about audiences, and interdisciplinary. Scott's delivery was informative, entertaining, provocative, and generated a good discussion at the end. The wine was good too... So, let's all sing once again:
Appropriately enough, do you know that "alma" is the Portuguese word for soul?
|From singing to punning...|
Name: Anne Dalke
Date: //2005-04-12 09:21:55 :
Link to this Comment: 14506
As I told Scott after his (yes, excellent!) presentation, I had a particular reason for wanting to know how/why he thought that "metonyms offered a different way of thinking"-- and I'm still curious. Part of my reason for asking is on record in a paper Paul G, Liz McC and I wrote last year, which actually explored some of the possibilities that metonymic thinking might open up--particularly in venues such as these brown bags.
Thanks to all for contributions to the landscape--'tis quite hilly and interesting!
Date: //2005-04-17 15:42:22 :
Link to this Comment: 14638
The Chronicle of Higher Education has a recent (4/08/05) piece on "The Thoughtful Distinction between Embryo and Human," which offers a perhaps-helpful extension of the conversation Scott led here last week on Fictions and Fetuses:
something else is at work here: our own brain's need to form beliefs...we all seem to be in agreement that there must be a point at which moral status should be conferred on an embryo. However, we seem to have a harder time defining that point....Mere possession of the genetic material for a future human being does not make a human being [insert here an analogy comparing embryos created for stem-cell research to a Home Depot]
Maybe the problem lies in the way the question is framed--that is, in the brain's need to form beliefs?
Name: Michelle F
Date: //2005-04-19 10:21:26 :
Link to this Comment: 14729
Reading many of the science blogs linked to some of those highlighted by Laura, I discovered "The Tangled Bank" which is a regular round-up of science blogging, hosted in turn by various bloggers.
|on Gender and Science|
Name: Arshiya Ur
Date: //2005-04-29 16:25:58 :
Link to this Comment: 14992
|a new year, a new forum|
Name: Ann Dixon,
Date: //2005-08-26 10:15:31 :
Link to this Comment: 15926