After finishing this post about the derivative of the sine function, I decided to hunt around online to see how common its approach is.
It’s not common. Most sites take the derivative of sine by considering the limit

$$\lim_{h \to 0} \frac{\sin(x+h) - \sin x}{h}$$

and working from there.
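For contrast with the geometric route, that standard computation runs roughly like this (a sketch, using the angle-addition formula):

$$\begin{aligned}
\frac{d}{dx}\sin x &= \lim_{h\to 0}\frac{\sin(x+h)-\sin x}{h} \\
&= \lim_{h\to 0}\frac{\sin x\cos h + \cos x\sin h - \sin x}{h} \\
&= \sin x\,\lim_{h\to 0}\frac{\cos h - 1}{h} \;+\; \cos x\,\lim_{h\to 0}\frac{\sin h}{h} \\
&= \sin x\cdot 0 + \cos x\cdot 1 = \cos x.
\end{aligned}$$

The whole burden then falls on the two limits in the last step, which is where those sites spend most of their effort.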
Eventually, after wading through three pages of results, I found another write-up of the geometric argument from, of all places, a site called Biblical Christian World View. It is apparently the personal site of a guy who’s good at math and also thinks it makes sense to write things like,
I illustrated Biblical truths with mathematical expressions. For an example, I illustrated the Biblical truth, “With God, nothing is impossible” as “two negatives equate to a ringing positive.” In the arithmetic of negative numbers -(-7) = +7! Two negatives equal a positive.
So. There’s that.
But just a little further along the Google results I found one more presentation of the same idea. This one is from Victor J. Katz, a mathematician who wrote a book about the history of math, and was writing from the historical point of view.
His article is much better than mine. The proof is clearer and surrounded with tons of other insight.
Katz delightfully points out how great a term “arcsine” is – it’s the length of the arc associated with that value of the sine function. Then, at the end, he gives Leibniz’s original argument that the sine function satisfies $\mathrm{d}^2 y = -y\,\mathrm{d}x^2$, and it’s crazy! Differentials are applied willy-nilly and manipulated algebraically in ways nobody does any more. I felt disoriented at first, adapting to this new way of thinking about calculus, and then wondered why I’d never seen it until now.
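To give a flavor of that style – this is a paraphrase in the same spirit, not Leibniz’s exact steps – put $x = \cos s$ and $y = \sin s$ on the unit circle, where $s$ is arc length. Then $x^2 + y^2 = 1$ and $\mathrm{d}x^2 + \mathrm{d}y^2 = \mathrm{d}s^2$. Differentiating the first relation gives $x\,\mathrm{d}x = -y\,\mathrm{d}y$, so $\mathrm{d}x = -(y/x)\,\mathrm{d}y$; substituting into the second,

$$\mathrm{d}y^2\left(1 + \frac{y^2}{x^2}\right) = \mathrm{d}s^2 \quad\Longrightarrow\quad \frac{\mathrm{d}y^2}{x^2} = \mathrm{d}s^2 \quad\Longrightarrow\quad \mathrm{d}y = x\,\mathrm{d}s,$$

that is, $\mathrm{d}(\sin s) = \cos s\,\mathrm{d}s$ (taking the positive root). Differentials get squared, divided, and substituted as if they were ordinary numbers – exactly the kind of manipulation that modern courses banish.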
It’s true that there are a lot of old techniques no one uses, and that’s because now we have better ones. Indeed, modern analysis, with its deltas and epsilons, is much better, mathematically, than manipulating differentials in dubious ways. It’s rigorous and logical.
It’s also hard. I’ve been asked to teach delta-epsilon proofs to quite a few people, and I’ve never been able to get it across. I’m giving up on that for beginners. I am going to teach the geometry stuff, and I’m not going to feel guilty about it.
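To show what beginners are up against, here is about the smallest delta-epsilon proof there is, that $\lim_{x \to 2} x^2 = 4$: given $\varepsilon > 0$, let $\delta = \min(1, \varepsilon/5)$. If $0 < |x - 2| < \delta$, then $|x - 2| < 1$, so $|x + 2| \le |x - 2| + 4 < 5$, and therefore

$$|x^2 - 4| = |x - 2|\,|x + 2| < 5\delta \le \varepsilon.$$

Every step is elementary algebra, but $\delta$ has to be chosen backwards, before the computation that justifies the choice – which, in my experience, is exactly the move students find opaque.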
It is okay to learn a thing the wrong way the first time. That first pass is only there to get you used to the main ideas, and the main idea of calculus is applying derivatives, integrals, and series. It is not the mean value theorem.
Once you learn a rough version, you practice it in the field until you’re comfortable. Do some physics. Learn some differential equations. After all that, it’s nice to come back, study calculus again, and finally understand all that’s really going on.
Actually, I like it better that way. Lots of my college classes made me think, “Oh, wow – so that’s what was behind the curtain!” But if you had shown me all the wheels and gears up front, I’d have been too busy checking how each one fit into the next to see what they accomplished.
A case in point is linear algebra. I remember almost nothing from my freshman linear algebra course. It wasn’t a bad course, but it was rigorous, proving theorems from the axioms of vector spaces, and it was beyond the level I was ready for at the time.
A couple years later, I found I really did need to know linear algebra to get through quantum mechanics, so I watched Gilbert Strang’s video lectures, which are far more concrete.
They were wonderful. I understood what was happening. I could do all the calculations and answer all the conceptual questions.
Then, finally, I went back to read Sheldon Axler’s Linear Algebra Done Right, a book that goes back again to the axioms-of-a-vector-space point of view, and thought it was wonderful.
Keith Devlin disagrees. Devlin takes up multiplication, claiming one should not tell young children that multiplication is repeated addition. Multiplication is its own fundamental operation. (The field axioms treat multiplication and addition independently.)
I was taught multiplication as repeated addition as a child, and then retaught multiplication as a fundamental operation in college. Do you know how confused I was by that? Not at all. Zero confusion, ever. In fact I never even noticed the discrepancy until Devlin pointed it out. I thought about multiplication as repeated addition when it was convenient, and thought about it as multiplication when that was convenient, and never realized I was switching.
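The switch is easy to see in code. Here’s a toy sketch (the function name is mine): repeated addition reproduces integer multiplication perfectly well, but it doesn’t even parse as a definition once the multiplier stops being a whole number – which is, more or less, Devlin’s point.

```python
def times_by_repeated_addition(a, n):
    """Multiply a by n by adding a to itself n times.

    Only makes sense when n is a non-negative whole number.
    """
    total = 0
    for _ in range(n):
        total += a
    return total

print(times_by_repeated_addition(7, 3))   # agrees with 7 * 3
print(7 * 2.5)                            # fine as multiplication...
# times_by_repeated_addition(7, 2.5)      # ...but a TypeError as repeated addition,
#                                         # since range() rejects non-integers
```

And yet the child who learns the repeated-addition version seems to shed it painlessly when the time comes.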
I do the same for the geometric and analytic modes of thinking about calculus now. When I’m solving a physics problem, I don’t even notice whether I’m doing calculus or algebra at a given moment – it’s all just problem solving.
Why, then, do introductory calculus classes spend a month learning limits? Better just to ignore them and press on to the good stuff. There will be time later for learning what the difference between “continuous”, “differentiable”, and “smooth” is – modern medical science is working new miracles all the time.
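(For what it’s worth, when that time comes the standard examples are tidy ones: $f(x) = |x|$ is continuous everywhere but not differentiable at $0$, while $g(x) = x|x|$ is differentiable everywhere, with $g'(x) = 2|x|$, but not smooth, since $g''(0)$ fails to exist.)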