What does analog mean, in math/computer terms?
For an analog signal, the information is carried directly in the variation of the signal itself. For a monitor, the video level is the voltage of the signal; for an FM radio, it is the frequency deviation from the carrier wave. Analog signals are (in theory) infinitely variable: back to the monitor example, there is an infinite number of levels the signal can take between the 0 V minimum and the 0.7 V maximum. However, this makes the signal very susceptible to noise, since any disturbance of the voltage is indistinguishable from the information itself.

For a digital signal, the information is encoded in the sequence of highs and lows. This means the signal has to switch much faster than the analog one to carry the same information, and it needs extra processing to put it into a digital format at one end and to interpret it at the other. However, because the level of a digital signal is only ever high or low, it is very resistant to noise: anything smaller than about half the signal swing is simply rejected when the receiver decides which side of the threshold the level sits on.
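To make the noise point concrete, here is a minimal Python sketch. The 0 V and 0.7 V levels come from the monitor example above; the 50 mV noise amplitude and the mid-swing threshold are arbitrary assumptions for illustration. It shows the same channel noise corrupting an analog value while the thresholded digital bits come through unchanged.

    # Sketch: the same noisy channel corrupts an analog value
    # but not a thresholded digital one (assumed figures, see above).
    import random

    random.seed(0)

    NOISE = 0.05  # +/- 50 mV of channel noise (hypothetical figure)

    def transmit(level):
        """Add uniform random noise, a crude stand-in for a noisy wire."""
        return level + random.uniform(-NOISE, NOISE)

    # Analog: the information IS the voltage, so any noise is lost information.
    analog_in = 0.35                      # some level between 0 V and 0.7 V
    analog_out = transmit(analog_in)
    print(f"analog sent {analog_in:.3f} V, received {analog_out:.3f} V")

    # Digital: the information is the high/low sequence; a threshold in the
    # middle of the swing rejects any noise smaller than half the swing.
    bits_in = [1, 0, 1, 1, 0]
    HIGH, LOW, THRESHOLD = 0.7, 0.0, 0.35
    bits_out = [1 if transmit(HIGH if b else LOW) > THRESHOLD else 0
                for b in bits_in]
    print(f"digital sent {bits_in}, received {bits_out}")  # identical bits

Running it, the received analog voltage differs from what was sent by up to 50 mV with no way to tell, while the received bit sequence matches exactly, which is the whole trade-off described above: more bandwidth and processing in exchange for noise immunity.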