Should We Replace Trigonometry with Basic Complex Geometry?

This is a follow-up to my previous post. I think it says quite a bit about a mathematician when you ask what their preferred initial definitions are. If you ask about the main trigonometric functions, sine and cosine, there are plenty of options to choose from:

• classical geometry as corresponding to points on the unit circle,
• power series,
• solutions to the differential equation $y'' = -y$ with appropriate initial conditions,
• real and imaginary parts of the function $e^{it}$,
• and plenty of others (including one found in M. Spivak’s classic book Calculus, which strangely enough is not a formalization of the classical model).

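As a quick illustration that these definitions all pin down the same function, the power-series definition can be checked numerically against a library implementation. A minimal sketch in Python (the helper name `cos_series` is my own):

```python
import math

def cos_series(t, terms=20):
    # Partial sum of the power series: sum_{n>=0} (-1)^n t^(2n) / (2n)!
    return sum((-1) ** n * t ** (2 * n) / math.factorial(2 * n)
               for n in range(terms))

# The partial sums converge quickly for moderate t:
print(cos_series(1.2), math.cos(1.2))
```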
Honestly, my initial thoughts always land on the classical model, despite the difficulty of dealing with it formally. But the truth is that almost any of the other definitions is easier to work with directly. The problem with most of them is that we usually want to define these functions before calculus, and most require at least a little knowledge of calculus to understand.

But there is an exception: if you define cosine and sine as the real and imaginary parts of $e^{it}$, then there’s no calculus in sight. Of course, this requires introducing the complex numbers, and you’re only pushing the hidden calculus into the exponential function. But my claim is that the latter is something we’re already comfortable doing, and the former seems justified (at least to some extent), since introducing the complex numbers and their geometric picture gives an algebraic framework for understanding plane geometry.
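This definition can be sketched directly in Python using `cmath` for the complex exponential (which, of course, still hides the calculus under the hood); the names `cos_` and `sin_` are my own:

```python
import cmath
import math

def cos_(t):
    # cosine defined as the real part of e^{it}
    return cmath.exp(1j * t).real

def sin_(t):
    # sine defined as the imaginary part of e^{it}
    return cmath.exp(1j * t).imag

# Matches the library functions to machine precision:
print(cos_(0.7) - math.cos(0.7), sin_(0.7) - math.sin(0.7))
```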

And justifying the geometric picture seems no more hand-wavy to me than how we currently do things. With this approach, $\pi$ is a constant with an actual definition (i.e. the smallest positive $t$ for which $e^{it} = -1$), as opposed to the never-quite-defined number that most people think of it as. The addition formulas become consequences of the exponent rules, which hopefully are well known by that point.

I think that a major criticism is that Euler’s formula is no longer something special, but just the definition. A fair criticism, but I would say it’s a pretty common phenomenon in math that a major result eventually becomes a definition.


About minimalrho
Unemployed guy with a PhD in math.