Bryn Mawr College Professor of Biology Paul Grobstein recently (fall, 1997) gave a presentation describing his sense of how technology can be used to enhance undergraduate education, using as an example a new introductory biology course he is teaching. Jan Richard, Acting Director of Computing at Haverford College, asked whether the new course would adequately prepare students for medical school, and how she (or anyone else) could know whether it would or not. Paul Grobstein's response reflected a distaste for allowing medical school admissions requirements to dictate the style or content of undergraduate science courses, but clearly failed to address larger issues on Jan Richard's mind. Those larger issues began to be explored in a subsequent email exchange between Grobstein and Richard. Excerpts from that exchange are provided here, in hopes that they might help to further stimulate and focus discussion of the quite general issues posed by technological advance in relation to educational practice.
I've been mulling your question from last Friday, which caught me a bit off guard and from which I leapt to an issue of particular concern to me, but one perhaps not at all related to your concerns. If so, I apologize.
A much simpler (and perhaps more appropriate) answer would have been: Biology 103 is intended for students other than those who know they are committed to medical school. A different two-semester sequence is available for such students. Students who take Biology 103 and subsequently decide to go to medical school need to do an additional semester from the two-semester sequence. In either case, and irrespective of web usage, hard copy course syllabi are available to show exactly what topics have been treated and at what depth. That, at least, is the basic answer if the question was "am I providing adequate background for medical school?". To which I would add, as I have in the preweb past (and again irrespective of web use), that I think students do better in the long run (in medical school and elsewhere) if they are given the chance to work through a conceptual framework for biology to which they can tie "facts", rather than simply being deluged with large amounts of information. I have increasingly taught biology this way over the years, and have good reports back from (at least some) students who have taken such courses from me in preweb days.
Moving in this direction does, however, elicit concerns from both students and other faculty, of precisely the kind you expressed. Was it a concern of yours personally? Or one you were expressing by proxy for people who have expressed such concerns to you as a representative of the new technology approach to education? In either case, I don't think it's a matter of new technology but rather one of a new form of pedagogy which is emerging independently of the new technology, but which, as I argued, the new technology both encourages and makes more practical. And yes, it raises new questions about both expectations and evaluations of "education" (hence my getting up on my hobby horse about medical school demands on undergraduate education). These are, however, issues not about technology use itself, but rather about a new (and I believe preferable) form of pedagogy.
The bigger question was exactly what I was getting at. We are beginning, at
both schools, to talk about what expectations we should have for faculty
about integrating technology into their teaching. But I don't think we
know how to address the question of what will happen to education when we
do this. I'm not sure that most of us have the ability to envision what a
classroom will (or can) look like in this new paradigm, or what our
students will look like when they've been through this type of education.
We've basically been teaching the same way since the time of Socrates, so
how do we envision a different way? And how do we know that technology, and
the new form of pedagogy it enables, can solve some of the problems we
perceive in our current system, and won't create worse ones?
I should be in a position to help lead the revolution, and I myself am not
sure what our cause is. I look at some of the big problems in society --
violence, poverty, people who are disenfranchised from the system -- and
am not convinced that technology is the answer. Maybe this is looking at
too big a question. But even within the classroom -- how do we assess what
students are learning in these new learning environments, and whether this
is what they need to learn? We constantly hear how our students can't
compete in math and science with students from other countries with more
structured educational systems (e.g., Japan). Is making our classrooms
more exploratory the answer to this problem? Is it a problem we want to solve?
I don't know the answer to any of these questions, but I would like to see
us address them, along with the simpler questions like "What level of
technological competence should we expect faculty and students to have?"
How do we begin, as colleges, to talk about changing our entire concept of
education? Can we make changes without knowing what the outcomes will be?
Can we determine outcomes before making changes? Can we create prototypes
of the "classroom of the future" to experiment with? Can we afford the
inevitable mistakes and failures in the transition period? Do other
faculty (I know many students do) see a need for change?
So, yes, my question about premed advising was getting at a question that I
think is in a lot of faculty members' minds: if we change our educational
system with technology, will we come out with students who are better
prepared for whatever they will need to be prepared for, or will we just
end up with students who are badly prepared for what we currently prepare
them for? If we don't know the answer to this question, is it worth the risk?
Responding to ...
Nice and appropriate questions, which have been very much on my mind since you wrote. Sorry about the delay in getting back to you. As I suggested in my talk, I do think we can envision a better way, not so much because of technology (which doesn't in and of itself show a better way) but rather from thinking about what demonstrably doesn't work as well as it should in the current educational mode. That's something I've been thinking about (and even writing about, see The Scientist/Teacher: A Call to Arms and Science Education: What's It All About) for quite a while, long before the technology revolution hit. So my interest in all this is really rooted in educational issues, and in an interest in what technology can do to improve education rather than in how to deal with the technology revolution itself. Yes, I see mutual reinforcement between technology and new forms of pedagogy as desirable, because it can correct some known educational problems. Will it create new ones? Probably. Change always does. But at least they will be NEW ones, rather than continuing to wallow in old ones. We may not be right, but we will at least be "less wrong", which is, in my scheme of things, the best one can ever do.
Responding to ...
Nicely put. Thanks. It is, from my point of view, PRECISELY the big problems we should have in mind. And no, I don't think technology per se is the solution to them. I do, however, for reasons too long to go into in detail here, believe that technology cum new pedagogy is the best route we currently have to attacking the big problems. Basically, the argument says that what humanity needs is more uniform respect for, and hence enfranchisement of, the enormous diversity of individuals and individual perspectives which make up our species. And that this in turn depends on an educational system (world wide) which makes available to each individual the different experiences each needs to maximize their own distinctive creative abilities. No, I don't take more successful competition in standardized examinations (or tasks) by American students as the educational objective. And yes, we will need to develop new mechanisms of assessment more in tune with a broader educational objective. But it does seem to me that the educational objective is reasonably clear ... and that existing and future technology makes it possible in a way that it has never before been possible. With web access and educators who conceive their task as guides/facilitators/critical responders, I honestly think the broader educational objective can be achieved, both locally and world wide.
Responding to ...
How many faculty see a need for change? I don't know for sure, but certainly enough to get started. Can we afford mistakes? My feeling is that the mistakes resulting from the technological revolution (an inevitable change) will be worse UNLESS we give it some foundation in educational reform. And there are certainly people willing and able to support the latter (see Alison Cook-Sather's piece on educational theory on Serendip).
Responding to ...
Exactly the core of the issue. Neither you, nor I, nor anyone else, can possibly know enough to say what it is that today's students need to be prepared for, except that they must be capable of thinking flexibly and for themselves. The rate of change of human society is simply too fast at the moment to bet on the importance of preparing them in terms of less ambitious, though better defined, skills.
As you can probably tell, I am struggling with the challenge of being in the role of promoting technology to faculty when I'm not convinced myself. I guess my assessment of where we stand today in terms of revolutionizing education is this: we all know that children learn by experiencing things with all of their senses and accepting random experiences into their brains without questioning or needing them to fit into a neat framework. Then we go to school, and we learn to weed out the important from the incidental (critical thinking) and organize them into coherent frameworks or theories that we then learn how to present in coherent, linear arguments. Then along comes the Web, this disorganized, non-linear mass of important and unimportant information. And now we're supposed to tell faculty that this is a better (or, at least, useful), more natural, more creative, more expressive way of expressing knowledge than the linear, ordered, painstakingly crafted manuscripts that we have always called scholarship. And, on top of that, they need to accept (and let their students experience) the inaccurate spelling, poor grammar, and often bad writing that one finds on almost every Web site. Peer-reviewed, edited, published text is no longer the standard we expect. On top of that, while "going off on tangents" was always a criticism in lectures, now following tangents will be the norm in class. Can I really argue to faculty that this is progress in terms of education and scholarship? Or are scholars to tame this uncontrolled frontier and turn it into something more like what they're comfortable with and good at -- or into something new altogether? I imagine the latter.
You don't need to respond to these questions. I'm just trying to think them through in writing.
Very much the right questions to be asking, and to be looking for answers to. So, needless to say, I can't resist trying to give some.
No, I don't think that the "disorganized, non-linear mass of important and unimportant information" which is the web should be held up as the standard for expression of knowledge. Critical and coherent synthesis will always be the ideal, and achieving the underlying skills will increasingly (I think) be recognized as the principal objective of education.
At the same time, teaching based primarily on "the linear, ordered, painstakingly crafted manuscripts that we have always called scholarship" has, demonstrably, its own serious problems. One is the linearity, which is increasingly being recognized as a serious limitation in creative intellectual activities generally. A second is the conflict between attempted uniformity of presentation and the steadily more apparent diversity among receiving minds. What is clear and coherent to one student is profoundly opaque to another and what is structured to be made crystal clear to the second turns out to be in consequence incomprehensible to the first. The third, and perhaps most important problem with trying to teach as we have been is that what is learned is too frequently the product rather than the process. The best and most coherent synthesis at any given time needs to be clearly understood not as the end but rather as a starting point from which one works to achieve still greater synthesis of the "disorganized, non-linear mass of important and unimportant information" which has been and will always be around, with or without the web, but which is more visible, accessible, and useful because of the web. It is the process of making sense of information, including deciding what is and is not important, that has always been the touchstone of good liberal arts education ... and will increasingly, I think, be the needed center point for education for everyone in an increasingly complex and interconnected world.
No, obviously, I don't think we should try to "tame this uncontrolled frontier" and turn it into something scholars are "more comfortable with and good at". I think we should instead regard it as what it is: an enormous amplification of the grist for the mill of scholarly activity, of the needed raw materials for efforts to synthesize and make sense of, a previously inconceivable wealth of opportunity rather than a challenging of sacred standards. And, even more importantly, we should regard it as the educator's dream, a way to transcend the limitations under which we have always worked in the past.
The web, as I've come to see it during the past several years of gaining experience with it, is literally the means to overcome educational shortcomings which everyone is aware of but presumed were unavoidable. Yes, of course, real knowledge and understanding is multidimensional, rather than linear, and the web provides the means to materialize that, so it can be made rigorous and further explored. Yes, different students need different forms of presentation of information and knowledge, and the web provides that. I certainly share your feeling that much of the material on the web is amateurish and unuseful, but I've also discovered enormous riches of material synthesized in ways that are not only as good as that available in standard scholarly material but qualitatively better. People whose forms of synthesis were not consistent with existing scholarly standards are blossoming in the new environment and creating different but no less wonderful, coherent, and useful products. And students (among others) are finding them, through their own efforts as well as guidance from faculty.
Which, of course, brings me to the final point. If what we want students to do is to learn to digest and make coherent, there is no better resource for them to learn with and from than the web. Not only can they find forms of presentation that connect to them, they can find things which relate to questions they themselves have and use those to themselves ask the next question and the next and the next ... in principle learning things we think it important for them to know through their own inquiries and interactions with others rather than because we told them what they had to learn. And they can do it feeling they are a meaningful part of the process of making sense of things, instead of simply the passive recipients of the sense made by others. The very chaos of the web, the half baked thoughts and ideas, is precisely the welcome students need to engage themselves in the process, to feel they are active participants, sharing steps in the process, instead of an audience for the "painstakingly crafted" and hence apparently (but terribly misleadingly) final performance.
Something new altogether? Yes, in one sense: the wherewithal to shape education to the interests, abilities, inclinations of individual students, and for them to learn out of their own motivations, half baked ideas, and experiences. But, in another sense, no. The objective is unchanged: to provide an environment and context in which students can develop as independent, critical, creative, imaginative thinkers, ideally in ways that contribute in the process to everyone (faculty included) learning and conceiving in new ways. And the fundamental role of the faculty too is unchanged: to show students the excitement of continual creating and recreating, to react to and hence sharpen what students discover and make, and to share with them the process of learning and relearning. Yes, something new in practice, but very old in aspiration: new capabilities to do what we have always wanted to do. Hence my own enthusiasm and excitement, not for technology in search of a mission, but rather for education and intellectual activity, and what technology makes possible for them.