What is a dynamical system?
A dynamical system consists of two parts: an abstract phase space or state space, whose coordinates describe the state of the system at any instant, and a dynamical rule that specifies the immediate future of all state variables, given only the present values of those same variables. Mathematically, a dynamical system is described by an initial value problem.

Dynamical systems are “deterministic” if every state has a unique consequent, and “stochastic” or “random” if a state has more than one possible consequent, chosen from some probability distribution (the “perfect” coin toss has two consequents with equal probability for each initial state). Most of nonlinear science, and everything in this FAQ, deals with deterministic systems.

A dynamical system can have discrete or continuous time. The discrete case is defined by a map, z_1 = f(z_0), which gives the state z_1 resulting from the initial state z_0 at the next time value. The continuous case is defined by a “flow”, z(t) = \phi_t(z_0), which gives the state at time t given that the state was z_0 at time 0.
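To make the map/flow distinction concrete, here is a minimal Python sketch (not part of the original FAQ). The logistic map f(z) = r z (1 - z) and the linear flow \phi_t(z_0) = z_0 e^{a t} are illustrative choices of f and \phi, not anything the text prescribes; the parameter values are likewise arbitrary.

    import math

    # Discrete time: a map z_{n+1} = f(z_n).
    # The logistic map is a standard illustrative example.
    def f(z, r=3.7):
        return r * z * (1.0 - z)

    z = 0.4                          # initial state z_0
    for n in range(5):
        z = f(z)                     # each state has a unique consequent: deterministic
        print(f"z_{n + 1} = {z:.6f}")

    # Continuous time: a flow z(t) = phi_t(z_0).
    # For dz/dt = a*z the flow is known in closed form: phi_t(z_0) = z_0 * exp(a*t).
    def phi(t, z0, a=-1.0):
        return z0 * math.exp(a * t)

    for t in (0.0, 0.5, 1.0, 2.0):
        print(f"z({t}) = {phi(t, 1.0):.6f}")

Note that the flow satisfies \phi_0(z_0) = z_0 and \phi_{t+s} = \phi_t(\phi_s(z_0)), so composing the flow over two time intervals is the same as flowing over their sum; the map plays the analogous role one time step at a time.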