I think math is beautiful. I want to talk about it here. In writing down some of the stuff I've learned I hope to solidify my own understanding of it and coax out discussion from anyone who's interested.
I had my mind blown this semester when I was exposed to Fourier series.
For the layman, a signal is a quantity that varies (usually with respect to time). An example of a signal is the temperature outside at any time during the day. We can represent this temperature signal by a function, F(t), where t is the time of day, and F(t) is the temperature outside at that time. If a signal repeats itself after a fixed interval of time, the signal is called periodic. F(t) is (roughly) periodic because, after 24 hours, the day's temperature cycle will repeat itself again. If a signal is periodic, then it has a frequency, which simply refers to how many cycles the signal goes through per unit of time. F(t) has a frequency of 1/24 cycles per hour, since it completes a full cycle every 24 hours.
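If you like code, here's a toy Python version of that temperature signal. The specific shape and numbers are invented purely for illustration:

```python
import math

# A made-up daily "temperature" signal: coolest before dawn, warmest in
# the afternoon, repeating every 24 hours. The numbers are illustrative.
def F(t):
    return 15 + 10 * math.sin(2 * math.pi * (t - 10) / 24)

# Periodicity: the value at time t matches the value 24 hours later.
print(abs(F(3.0) - F(3.0 + 24)) < 1e-9)  # True
```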
As it turns out, if a signal is periodic, it can be represented as an infinite sum of scaled and shifted sinusoids, where the frequency of each individual sinusoid is an integer multiple of the original signal's frequency. This representation is called a Fourier series.
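To make this concrete, here's a rough numerical sketch (my own, not any library routine) that estimates the coefficients a_n and b_n of the cosine and sine at the n-th multiple of the base frequency, by integrating the signal against those sinusoids over one period:

```python
import math

# Estimate the Fourier coefficients a_n, b_n of a signal f with period T:
#   f(t) ~ a_0/2 + sum_n [ a_n*cos(2*pi*n*t/T) + b_n*sin(2*pi*n*t/T) ]
# using a simple Riemann sum in place of the integrals.
def fourier_coeffs(f, T, n, samples=10000):
    dt = T / samples
    a = sum(f(k * dt) * math.cos(2 * math.pi * n * k * dt / T)
            for k in range(samples)) * 2 * dt / T
    b = sum(f(k * dt) * math.sin(2 * math.pi * n * k * dt / T)
            for k in range(samples)) * 2 * dt / T
    return a, b

# Sanity check on f(t) = 3*sin(2*pi*t): it IS already a single sinusoid,
# so its only nonzero coefficient should be b_1 = 3.
f = lambda t: 3 * math.sin(2 * math.pi * t)
a1, b1 = fourier_coeffs(f, 1.0, 1)
print(round(b1, 3), round(a1, 3))
```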
Look how powerful this is! The only restriction I placed on the original signal is that it be periodic. Then, according to Fourier, that signal is equivalent to a bunch of (infinitely many) smooth, curvy sinusoids, all added together.
Is this not completely unintuitive? What if, for example, our signal has sharp edges, like a square wave?
How can this signal possibly be represented by smooth, curvy things? Fourier's contemporaries had the same qualms, but, indeed, even square waves can be represented by sinusoids, although any finite number of terms leaves a small overshoot near the jumps:
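Here's a small Python sketch of that: the square wave's Fourier series contains only odd harmonics, each weighted by 4/(pi*n), and adding more terms hugs the flat tops more and more closely:

```python
import math

# Partial Fourier sum for a square wave that is +1 on (0, 0.5) and -1 on
# (0.5, 1): only odd harmonics appear, each with coefficient 4/(pi*n).
def square_partial(t, terms):
    return sum(4 / (math.pi * n) * math.sin(2 * math.pi * n * t)
               for n in range(1, 2 * terms, 2))

# At t = 0.25 the square wave equals 1. One term alone gives 4/pi
# (about 1.273); with many more terms, the sum settles toward 1, though
# a small overshoot always persists right next to the jumps.
print(round(square_partial(0.25, 1), 3))  # 1.273
print(round(square_partial(0.25, 200), 2))
```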
Here's a neat visualization of the Fourier series in action:
Although Fourier series are quite the mathematical curiosity, their applications have had a foundational impact on modern technology, and thus modern society.
A system is an object that takes in one signal and outputs another according to some rule. For example, there's the system that takes in a signal x(t) and multiplies it by 2 to produce the output signal 2*x(t). A system is called linear if
- if you normally input some signal x(t) into the system and get out y(t), then when you input x(t) scaled by some constant, the output should be y(t) scaled by that same constant
- if you normally input some signal x1(t) into the system and get out y1(t), and you normally input x2(t) into the system and get out y2(t), then when you input x1(t) and x2(t) into the system at the same time (i.e. x1(t) + x2(t)), the system should spit out y1(t) + y2(t)
A system is called shift-invariant (or time-invariant) if you get the same output from the system tomorrow as you would today, provided you supply the same input both times.
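All three conditions are easy to check numerically. Here's a toy Python sketch with a made-up system (delay the input by 1, then double it) that happens to be linear and shift-invariant:

```python
# A toy system, invented for illustration: y(t) = 2 * x(t - 1).
def system(x):
    return lambda t: 2 * x(t - 1)

x1 = lambda t: t ** 2
x2 = lambda t: 3 * t

# Linearity, part 1: scaling the input scales the output by the same amount.
print(system(lambda t: 5 * x1(t))(2.0) == 5 * system(x1)(2.0))  # True
# Linearity, part 2: the response to a sum is the sum of the responses.
print(system(lambda t: x1(t) + x2(t))(2.0)
      == system(x1)(2.0) + system(x2)(2.0))                     # True
# Shift-invariance: shifting the input just shifts the output.
print(system(lambda t: x1(t - 4))(6.0) == system(x1)(2.0))      # True
```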
These properties are really just saying the system needs to be well-behaved (how could we work with a system that outputs one thing today and another tomorrow, given the same input?). And it's not a stretch to mandate that systems have these properties, because real-life systems often exhibit them. Furthermore, even non-linear (ill-behaved) systems can be approximated by linear ones. But that's the boring stuff... here's the really incredible fact:
If a system is linear and shift-invariant, then we automatically know what it will output for any periodic input signal. Why? Because of the Fourier series! If any periodic signal can be represented as a sum of scaled and shifted sinusoids, and we know how the system responds to a plain sinusoid of each frequency (we can determine this just by feeding in such sinusoids and recording the outputs), then by linearity and shift-invariance, we know how the system will respond to the entire signal!
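Here's a small Python illustration of that last point, using another made-up LSI system: average the signal with a delayed copy of itself. Feed in a pure sinusoid, and what comes out is a sinusoid of the exact same frequency, just scaled and shifted:

```python
import math

# An invented LSI system: y(t) = (x(t) + x(t - d)) / 2, with delay d.
d = 0.3
def system(x):
    return lambda t: (x(t) + x(t - d)) / 2

# Feed in cos(w*t). The sum-to-product identity
#   cos(A) + cos(B) = 2*cos((A+B)/2)*cos((A-B)/2)
# says the output is cos(w*d/2) * cos(w*(t - d/2)): the SAME frequency w,
# scaled by cos(w*d/2) and shifted by d/2. Sinusoids keep their shape.
w = 2 * math.pi
y = system(lambda t: math.cos(w * t))
predicted = lambda t: math.cos(w * d / 2) * math.cos(w * (t - d / 2))
print(all(abs(y(t / 10) - predicted(t / 10)) < 1e-9 for t in range(20)))  # True
```

This is why decomposing a signal into sinusoids is so useful: sinusoids are the inputs an LSI system treats most simply, so once you know the scale and shift at each frequency, you know everything.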
This fact underlies image processing, audio processing, wireless data transfer, GPS, etc. etc. Basically, the modern world wouldn't be possible without the insight of Mr. Fourier, who at the time had no idea of the practical consequences his work would have.
And that, of course, is another one of the incredible things about mathematics. Mathematical discoveries which at the time of their inception had no foreseeable applications have a strange way of worming their way into some practical pursuit later on; sometimes, as in this case, well over a century later.