Dangers of the Paradigm Shift

**Abstract**: With the use of new technologies, teaching methods are
experiencing a dramatic change. But learning takes place not only through
*what* we teach, but through *how* we teach it. It is
difficult to assess how this teaching paradigm shift will affect the
knowledge our students will have, but studying how the
ubiquitous pocket calculator has affected learning might give us some
clues.

Introduction

New technologies change the way we can teach, but not the way people
learn. That has probably not changed much in the last few millennia.
What has been taught has changed substantially through the ages, but
the *how* has not, until recently. This change in how we teach
has been called the *paradigm shift*
[1].

But learning takes place through both *what* we teach and
*how* we teach it. Profound knowledge has been woven into the
'how' of traditional education through centuries of evolution.
Therefore, changing the teaching paradigm changes
what we teach, and thus what the students learn. Unfortunately, we are not
aware of the knowledge that is transmitted through the 'how'; it is not
evident. If we suddenly change the way we teach, what is
going to happen to this woven knowledge? This is the danger of the
paradigm shift.

These dangers have been overlooked. There has been plenty of research
on the *difficulties* involved and on the short-term differences
[1,4] between traditional
and new education. Only Peter G. Neumann, in his "Inside Risks"
column, has hinted at the risks involved in electronic
education [3]. Outside the CS community,
Steven Krantz has shown serious concern about the use of computers
in teaching mathematics [2]. But there is no
reference to the dangers of the educational changes that come with the
paradigm shift.

And dangers there are. Not tremendous, obvious dangers: those we see and avoid. The dangers lying ahead are subtle and fuzzy, but we should identify them and control them. If we are too enthusiastic in our use of new technologies, some important knowledge and skills will be unnecessarily lost.

Identifying these dangers is difficult given that we do not know what knowledge is woven into the traditional teaching paradigm. It is equally difficult to know what the new paradigm might unweave. Fortunately, by analyzing the past we might get some clues about what the future might bring. Twenty years ago there was a smaller paradigm shift, when the 'other' PC appeared: the Pocket Calculator. In this paper I present some specific knowledge losses that can be directly attributed to the abusive use of calculators.

The pocket calculator

I have used pocket calculators since they first appeared. I was the first kid
in my class to use one, and the first kid in my class to own one. I
learned what a factorial was by programming a pocket calculator, and I
remember spending quite some time calculating *n!* and being
fascinated by how quickly it grew with *n*. I did numerical
integration with my programmable HP-21 almost as soon as I knew what an
integral was. Almost all I know about numerical methods I learned on my
pocket calculator, long before computers became widely available.

In the courses I teach I rely on my students having calculators. In this
way I can add to my problem list interesting problems that involve a large
amount of computation, problems that cannot be done without the aid of a
calculator. I can tell them to compare a large number of cases with
varying parameters, because I do not have to worry about the amount of
calculation involved. Actually, I do not even *consider* the
amount of computation that goes into a problem.

But I was a lucky kid. My father and a neighbor were experienced computers (in the compute-with-pencil-and-paper sense) and showed me several pitfalls the calculator had. I had a high-school teacher who would not allow us to use our calculators, so we had to learn advanced arithmetical skills using logarithms and the like. In college, I clearly remember curve-fitting some lab data and proudly showing it to a professor. He asked me what the curve meant. I did not know. "Then it is useless," he answered. Of course, at the time I did not agree with my father, my neighbor, my teacher, or my professor. They were old-fashioned; they did not understand the new world.

But I slowly grew wary of the abusive use of calculators all around me. I began to get the feeling that something was getting lost, though I couldn't say what. It definitely wasn't that people didn't know how to compute square roots. Perhaps it was that kids were not very proficient in basic arithmetic... I did not know, but I began to observe. And I slowly began to identify the sources of my unease. Up to now I have identified four knowledge losses that I can blame on the bad use of pocket calculators.

Loss of knowledge

The calculator as a limit

When doing a computation by hand, we knew that it would take more or less time, but that it could be done. If we wanted a division performed to 27 decimal digits, we could do it.

Last year I asked my students to find the decimal number represented by a floating-point number. I wanted them to realize what kind of precision they had when using floating point. One student quickly worked out the mathematical expression he needed to convert the number, keyed it into his calculator, wrote down the result on his paper, and sat back. His procedure was correct. I told him that his result was incorrect. He stared at me, unbelieving. I told him that he had only 9 digits and the number had 15. And then came the question: "How can I do it if the calculator won't give me more than 9 digits?"

I myself used a calculator to aid me in the computation. The difference is that I knew what I had to do and used the calculator as a tool to do it more quickly; my student ordered the calculator to do the task, and since the calculator couldn't handle it, neither could he: it was his limit.
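The kind of exercise involved can be sketched in a few lines of Python (my own illustration, not the student's actual assignment): decoding a 32-bit IEEE-754 bit pattern into its exact decimal value using exact rational arithmetic, something a nine-digit display simply cannot show.

```python
from fractions import Fraction

def float_bits_to_exact(bits: int) -> Fraction:
    """Exact value of a 32-bit IEEE-754 float, given its bit pattern."""
    sign = (bits >> 31) & 1
    exp = (bits >> 23) & 0xFF
    frac = bits & 0x7FFFFF
    if exp == 0:                     # subnormal number
        value = Fraction(frac, 1 << 23) * Fraction(1, 1 << 126)
    else:                            # normal number: implicit leading 1
        value = (1 + Fraction(frac, 1 << 23)) * Fraction(2) ** (exp - 127)
    return -value if sign else value

# 0x3DCCCCCD is the float32 closest to 0.1; written out exactly, its
# decimal expansion has 27 fractional digits -- far more than a
# nine-digit calculator display can produce.
print(float_bits_to_exact(0x3DCCCCCD))   # 13421773/134217728
```

The point is not the code but the method: knowing what the conversion *means*, the tool's display width is irrelevant.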

All operations are equal

While grading some assembly language programs my students had written, I was
surprised to see that they often used a 'multiply' or 'divide'
instruction where an addition, a shift, or a judicious bit transformation
would have sufficed. One day I suddenly realized the reason for
this behavior. I know by *experience* that an addition is easy and a
multiplication is harder. I have done so many that I've got this knowledge
engraved in my brain. I don't have to think about it; I automatically
notice when I can replace a multiplication or a division with an addition or
some other simple operation. My students don't have that knowledge. They
have *been told* that multiplication is harder than addition, but
what they know by *experience* is that it costs just the same: you
only have to press a different button.

My students have, for all practical purposes, lost a fundamental piece of arithmetical knowledge: that addition and subtraction are easier operations than multiplication and division. One direct result of this loss is clear: slower programs. There may be more consequences I am unaware of.
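The substitutions I notice automatically are easy to show. A small illustrative sketch (in Python for readability; the same idea applies directly in assembly, where the cycle-count difference is what matters):

```python
# Replacing a general multiply or divide with cheaper operations.
# On most processors a shift or an add takes fewer cycles than a
# general-purpose multiply, and far fewer than a divide.

def times_8(x: int) -> int:
    return x << 3                  # instead of x * 8

def times_10(x: int) -> int:
    return (x << 3) + (x << 1)     # 8x + 2x instead of x * 10

def div_by_4(x: int) -> int:
    return x >> 2                  # instead of x // 4 (non-negative x)

print(times_8(5), times_10(7), div_by_4(20))   # 40 70 5
```

Someone who has multiplied by hand reaches for these forms instinctively; someone who has only pressed buttons sees no reason to.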

Loss of insight

In his thought-provoking book *Silicon Snake Oil* [5], Clifford Stoll tells us how, as an
astronomy grad student, he went to China on a scholarship. There he
met Dr. Li, who, using old Chinese stellar data, was trying to obtain
information about the movements of the Earth's north pole. Stoll was
astounded to see that Dr. Li was calculating Fourier transforms with
only the aid of an abacus. Stoll had brought his HP-85 along
and quickly programmed it to do Fourier transforms. He showed Dr. Li
how a five-month job could be done in a few minutes. Dr. Li was not
impressed... because the results were wrong. How could that be? As
it happened, Dr. Li was not just number-crunching on an abacus: he
was carefully analyzing the old Chinese data, compensating for
observer errors, cross-checking with other input... and getting not
only correct results, but also the insight that a barrelful of numbers
spewed out by a
computer cannot give you.

The unreliability of hand computation forces you to double-check your numbers, to make sure they correlate correctly with other data you have, to make assumptions about the approximate value of the results to test the correctness of your computations... The reliability of machine calculation makes all this 'unnecessary'. You gain time, and you lose insight, lots of insight. Also, when you are wrong, you don't find out. You trust your calculator too much. And that is the fourth loss.

The great sorcerer

When I perform a computation by hand, I know what I am doing. If I
compute a division to 4 decimal places, I *know* whether I have
obtained an exact result or a value approximated to four
digits. When my calculator performs even the simplest of operations,
I do not know what it is doing. Does it represent the numbers
internally as binary values or as decimal values? If it works in binary,
it cannot represent a number as simple as 0.1 *exactly*; if it
uses a decimal representation, it can. I know how many digits it
shows (nine), but how many digits does it work with internally? I
don't know. I trust my calculator, but it isn't a matter of
knowing anymore. It has become a matter of *faith.*

While writing this paper I suddenly asked myself whether the log values my calculator produced were correct. How can I even check? I have compared the results of two different calculators. Sometimes there is a difference in the ninth digit. So one of them is wrong. Which one? Or are both wrong? Again, I trust my calculator. Even with an error in the ninth digit it is much better than log tables (which were not necessarily correct either). But again, I do not know; I believe.

If, after a life of double-checking results, I have come to trust my
calculator perhaps a bit too much, what about my students? They trust
their calculators blindly. There are two ways I can tell. First:
they never second-guess the machine, and are capable of handing in any
kind of absurd result (e.g., probabilities greater than 1) without
even realizing it. Second: they believe their calculator knows not
only about the operations it computes, but about the *problem*
they are solving. They commonly write down results with 9-digit
accuracy when the input data had only 2-digit accuracy. When asked
why they do that, the answer more often than not is "That's what the
calculator shows". They believe their calculator would never have
output nine digits if only two were good. The calculator has become
the great sorcerer.
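The significant-figures point can be made concrete. In this sketch, `round_sig` is a small helper of my own (an illustration, not a built-in function): it keeps only as many significant figures as the input data supports.

```python
from math import floor, log10

def round_sig(x: float, sig: int) -> float:
    """Round x to `sig` significant figures (illustrative helper)."""
    if x == 0:
        return 0.0
    return round(x, sig - 1 - floor(log10(abs(x))))

# Two measurements known to only two significant figures:
length, width = 1.2, 3.4
area = length * width
# The calculator happily shows every digit it has; only two are good.
print(round_sig(area, 2))   # 4.1
```

The machine cannot know how accurate its inputs were; only the person holding it can.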

On a side note, one of the main complaints I have heard about students during the last decade or so is that they lack critical thinking. Could it be related?

Conclusion

Changing the way we teach will change the knowledge our students receive. The use of the pocket calculator over the last 20 years has shown at least four knowledge losses that, although subtle, are important. The massive and indiscriminate use of new technologies can only make such losses worse. We must be very careful when applying new technologies to our teaching, so that we can use them to our advantage while minimizing the losses.

References

[2] Steven G. Krantz. *How to teach mathematics:
a personal perspective.* American Mathematical Society.
Providence, RI 1993.

[3] Peter G. Neumann. Risks of e-Education. Inside risks column.
*Communications of the ACM,* 40(10). October 1998.

[4] Marian Petre. Assessing innovation in teaching: An example.
*SIGCSE Bulletin*, 30(2): 40 -- 42, June 1998.

[5] Clifford Stoll. *Silicon Snake Oil.* Pages 27 -- 29.
Anchor Books, Doubleday. New York 1995.

Author: Joe Miro

Dept. Matemàtiques i Informàtica

Universitat de les Illes Balears.