An illustrated calculus textbook

Aleks sends along this (by Robert Ghrist). It reminds me a lot of my class notes from high school.

I took a look at the book, starting on page 1 with the description of functions, and it made me think that maybe a more historical approach would be useful. The idea presented in this book, of a function as an arbitrary mapping from inputs to outputs, is a relatively modern idea (I associate it with Fourier in the 1800s, but my remembered history might be wrong here), and it might be helpful to build up to it from simpler ideas.

Similarly, instead of starting with definitions of limits etc., I suspect it would be better to start with the theorems and then motivate the definitions as the things you need to prove the theorems rigorously. That’s the historical order, right? Theorem, then proof, then defining the conditions of the theorem.

I sent the above paragraphs to Aleks, who commented:

Agreed strongly. But you’re assuming that theorems are intrinsically interesting, or that proving theorems is something one might want to do. History shows that all these tools have been developed to solve real problems in engineering – the same way Fisher developed his tools to solve real problems in agricultural experimentation and the same way probability was developed to make sense of gambling.

Yup. But then that just adds a couple more steps to the beginning of the process: first the real problem, then the series of possible solutions, then the theorem, etc. Definitions still come at the end.

That said, this attitude of mine is not new, yet it remains standard to teach math in this definition-theorem-proof style. So there must be some good arguments on the other side. To me, the Ghrist book looks like an excellent effort but from a backwards perspective. But I’m sure Ghrist could provide some good reasons why I’m wrong.

8 thoughts on “An illustrated calculus textbook”

  1. In general I think that math and physics books are too “decontextualized” in that the methods and techniques are almost never put into the context of a real problem of great interest to the reader. It is hard for such a book to get written, in part because the people working on real problems and the people working on the theoretical justifications now sit very far apart. At the undergraduate level this is even worse because the people who write undergraduate textbooks (at least in physics) and the people who do research are starting to sit even further apart.

  2. Andrew: There are two books on analysis and Lebesgue integration which teach the topics via their historical development: “A Radical Approach to Real Analysis” and “A Radical Approach to Lebesgue’s Theory of Integration” both by David Bressoud. It’s not basic calculus, but they’re written as you describe.

  3. Going through all the history takes longer. Of course, it can help to know the history, and can be interesting. But, it does mean you cover less material. And, history can be messy. It can take a long time to figure out the cleanest way to develop the material.

    I think explaining functions as a mapping from inputs to outputs is the right place to start. (Then, at some point, the student can learn the definition as ordered pairs.) Too many people still think of functions as expressions. It took a long time for mathematicians to figure out the modern point of view. If we show students the older notion, they’ll probably never unlearn it. Just look at how many books say f(x) is a function when the function is really f.

    My impression of the history is that it wasn’t Fourier who came up with the modern idea of function. But, it was his work that forced people to take a closer look at what a function was. I think Fourier claimed some theorems were true for all functions, e.g., every function has a Fourier series that equals the function. Whether this is true depends on what you mean by “function”.

  4. I have always believed that if kids learned the history of advanced forms of mathematics, and in particular their origins, many more of them would find learning how to do them exciting. I had occasion to feel this way recently when I read Edward Dolnick’s The Clockwork Universe: Isaac Newton, the Royal Society, and the Birth of the Modern World (http://www.amazon.com/Clockwork-Universe-Newton-Society-Modern/dp/006171951X), which had a super engaging discussion of the development of “the calculus,” including the feud between Newton & Leibniz (the former comes off as a major jerk, the latter as a lovable goof). Made me want to run out & do a lot of calculus! BTW, it would be odd to describe their motivation for developing calculus as “solving engineering problems”! They were mathematician/scientists (actually, it was Newton who mathematized natural science). Likewise, lots of other innovations were purely theory driven: think of imaginary numbers, which basically were developed by Gauss to help him explore theorems relating to prime numbers. See Marcus du Sautoy’s Music of the Primes (http://www.musicoftheprimes.com/thebook.htm). Most of the cool things in number theory were also theory driven; the only “engineering” advances, if you want to think of it that way, were the focus of Pascal & others on probability & gambling!

  5. Yup. To clarify a bit: the Ghrist book was charming but I was distressed that it seemed to accept without question the traditional definition-theorem-proof approach to mathematics. To me, it’s ridiculous to start a calculus course with the epsilon-delta definition of a limit. It makes much more sense to me to start by proving and using the theorems, then stepping back and considering under what conditions the theorems are not true, which takes you back to the necessary conditions and definitions.

  6. It makes sense to learn things in the context of interesting examples. Just two examples from my (long-time-ago) high school experience: the limit lim_{x -> 0} sin(x)/x = 1. We learnt this when it was needed in the analysis of the pendulum. The proof is by drawing a figure.
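    The figure the commenter alludes to is presumably the standard squeeze argument comparing areas in the unit circle; a sketch:

```latex
% For 0 < x < pi/2, comparing the areas of the inner triangle, the
% circular sector, and the outer triangle in the unit circle gives
%   (1/2)\sin x \le (1/2)x \le (1/2)\tan x,
% which rearranges to
\[
  \cos x \;\le\; \frac{\sin x}{x} \;\le\; 1 .
\]
% Since \cos x \to 1 as x \to 0, the squeeze theorem gives
\[
  \lim_{x \to 0} \frac{\sin x}{x} = 1 .
\]
```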

    Then, after physics experiments, we had to write reports including “error analysis”. For this we were taught (in my present language!) the use of the “delta rule” to approximate standard errors of nonlinear functions of the data. Again, the proof is by drawing a picture.
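    The delta rule the commenter describes can be checked numerically. A minimal sketch, where the choice of function g(x) = log(x) and the parameter values are my own illustration, not from the comment: the rule says sd[g(X)] is approximately |g'(mu)| * sd[X], here sigma/mu.

```python
import math
import random

random.seed(0)
mu, sigma, n = 10.0, 0.5, 100_000

# Delta rule: for g(x) = log(x), g'(mu) = 1/mu, so sd[g(X)] ~ sigma/mu.
delta_approx = sigma / mu

# Compare against a direct simulation of log(X) for X ~ Normal(mu, sigma).
samples = [math.log(random.gauss(mu, sigma)) for _ in range(n)]
mean = sum(samples) / n
sd = math.sqrt(sum((s - mean) ** 2 for s in samples) / (n - 1))

# The two should agree to roughly two decimal places.
print(f"delta rule: {delta_approx:.4f}, simulated: {sd:.4f}")
```

    The approximation works because a smooth function is nearly linear over the spread of the data when sigma is small relative to mu, which is exactly the "proof by drawing a picture" the commenter mentions.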
