Dynamical system - Foundations and Formal Definitions
Understand the fundamental concepts, formal definitions, and various mathematical formulations (geometric, measure‑theoretic, topological, and category‑theoretic) of dynamical systems.
Summary
Definition and Basics
What Is a Dynamical System?
A dynamical system is a mathematical model that describes how a system changes over time. Rather than just looking at a system at one moment, a dynamical system tracks how the system evolves from some initial state through all future states.
You can think of it this way: if you observe a physical system (like a pendulum, weather patterns, or population growth) and record how its state changes from one moment to the next, you're observing a dynamical system in action.
Formally, a dynamical system provides three key pieces of information:
What describes the system — experimental data, differential equations, or a rule mapping present states to future states
Where the system evolves — a well-defined set of all possible configurations called the state space
How time progresses — through a time parameter $t$
State Space and Time Evolution
The state space is simply the set of all possible configurations or states that your system can occupy. Every point in the state space represents one complete description of the system at a given moment.
For example:
For a pendulum, the state space might be pairs $(\theta, \dot{\theta})$ where $\theta$ is the angle and $\dot{\theta}$ is the angular velocity
For a population model, the state space might be all possible population counts
Time evolution is described by an evolution function (also called a flow or evolution rule), which is a map that takes a state at time $t_0$ and produces the state at a later time $t_0 + t$. If we denote this map as $\Phi$, then:
$$\Phi(t): \text{state at } t_0 \mapsto \text{state at } t_0 + t$$
This map acts on the state space consistently over time, creating a family of transformations parameterized by $t$.
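As a small illustrative sketch (the function name `phi` and the growth model are our choices, not from the text), consider exponential growth, whose evolution map is $\Phi(t, x) = x\,e^{rt}$:

```python
import math

def phi(t, x, r=0.5):
    """Evolution map for exponential growth dx/dt = r*x.

    Takes the state x at some time t0 and returns the state at
    time t0 + t; the rule itself does not depend on t0.
    """
    return x * math.exp(r * t)

x0 = 2.0
# The same map, applied with different t, yields the whole family
# of transformations parameterized by time.
states = [phi(t, x0) for t in (0.0, 1.0, 2.0)]
```

Evaluating `phi` at $t = 0$ returns the state unchanged, and evolving by $2$ then $1$ time units matches evolving by $3$ directly, as the flow properties require.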
Important Properties
Several key properties determine whether a dynamical system is well-behaved and useful:
Existence and uniqueness of solutions means that starting from any initial state, there is exactly one trajectory that evolves from that point. This is crucial—without it, the system's future would be unpredictable or non-unique. For differential equations, this property is guaranteed under mild smoothness conditions (like the Lipschitz condition).
Integrability refers to systems that possess conserved quantities (called integrals of motion or conserved charges). These are quantities that remain constant as the system evolves. For instance, in a pendulum without friction, energy is conserved. Integrable systems are mathematically simpler because the conserved quantities reduce the effective dimensionality of the problem.
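As an illustration of a conserved quantity (our example, with unit constants assumed), the frictionless harmonic oscillator $\ddot{x} = -x$ conserves the energy $E = \tfrac{1}{2}(\dot{x}^2 + x^2)$ along its exact solution:

```python
import math

def state(t, x0=1.0, v0=0.0):
    """Exact solution of the harmonic oscillator x'' = -x."""
    x = x0 * math.cos(t) + v0 * math.sin(t)
    v = -x0 * math.sin(t) + v0 * math.cos(t)
    return x, v

def energy(x, v):
    """Conserved quantity (integral of motion)."""
    return 0.5 * (v * v + x * x)

# The energy is the same at every time along the trajectory.
energies = [energy(*state(t)) for t in (0.0, 0.7, 2.5, 10.0)]
```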
Systems can be classified along several dimensions:
Discrete vs. continuous: Discrete systems update in time steps (like a computer simulation), while continuous systems evolve smoothly in time
Differentiable vs. smooth: The evolution rule may be only finitely differentiable or infinitely differentiable; the degree of smoothness determines which analytical tools apply
Deterministic vs. stochastic: Deterministic systems have unique futures, while stochastic systems involve randomness
Chaotic vs. regular: Chaotic systems show sensitive dependence on initial conditions (small differences lead to vastly different outcomes)
Types of Trajectories
A trajectory (also called an orbit) is the complete path traced out by the system in state space as time progresses. It's the collection of all states visited by the system starting from a given initial point.
There are two fundamental types:
Periodic trajectories repeat exactly after some fixed time interval $T$, called the period. The system returns to the same state again and again. A simple example is a perfect pendulum oscillating back and forth—it traces the same path repeatedly.
Aperiodic trajectories never repeat. The system wanders through the state space visiting new states indefinitely, never returning to exactly the same state. Chaotic systems typically produce aperiodic trajectories.
These different trajectory types are visually distinct in state space—periodic trajectories form closed loops, while aperiodic trajectories create complex patterns.
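The two trajectory types can be seen in the logistic map $x_{n+1} = r x_n(1 - x_n)$ (an illustrative choice of ours, not from the text): for $r = 3.2$ the orbit settles onto a period-2 cycle, while for $r = 4$ it is typically aperiodic:

```python
def logistic(x, r):
    return r * x * (1.0 - x)

def orbit(x0, r, n_transient=1000, n_keep=8):
    """Iterate the map, discard a transient, and return the tail."""
    x = x0
    for _ in range(n_transient):
        x = logistic(x, r)
    tail = []
    for _ in range(n_keep):
        x = logistic(x, r)
        tail.append(x)
    return tail

periodic = orbit(0.3, r=3.2)  # alternates between two values
chaotic = orbit(0.3, r=4.0)   # wanders without repeating
```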
<extrainfo>
Numerical Methods and Approximations
Modern computational power has made it possible to approximate trajectories for most dynamical systems that cannot be solved analytically. Numerical integration methods (like Runge-Kutta methods) allow us to compute approximate solutions step by step on electronic computers.
However, an important caveat: numerical approximations introduce errors, and these errors can accumulate over time. In chaotic systems particularly, small numerical errors can eventually lead the computed trajectory to diverge significantly from the true trajectory. This is why understanding both the power and limitations of numerical simulations is important.
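A minimal sketch of such a method (the classical fourth-order Runge-Kutta step, applied here to the decay equation $\dot{x} = -x$, whose exact solution is $e^{-t}$; the function names are ours):

```python
import math

def rk4_step(f, x, h):
    """One classical Runge-Kutta (RK4) step for dx/dt = f(x)."""
    k1 = f(x)
    k2 = f(x + 0.5 * h * k1)
    k3 = f(x + 0.5 * h * k2)
    k4 = f(x + h * k3)
    return x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

f = lambda x: -x      # simple decay; exact solution is exp(-t)
x, h = 1.0, 0.1
for _ in range(100):  # integrate to t = 10
    x = rk4_step(f, x, h)

# The result is close to exp(-10), but not exact: the per-step
# truncation errors accumulate, as discussed above.
error = abs(x - math.exp(-10.0))
```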
</extrainfo>
Formal Definitions
Geometrical Definition: The Triple $(T, M, \Phi)$
Mathematicians formalize dynamical systems using a precise definition. A dynamical system is a triple $(T, M, \Phi)$ where:
$T$ is the time domain — typically the real numbers $\mathbb{R}$ (continuous time), the integers $\mathbb{Z}$ (discrete time), or some other semigroup structure
$M$ is a manifold — a smooth geometric space that serves as the state space
$\Phi : T \times M \to M$ is the evolution rule — a map that takes a time $t \in T$ and a state $x \in M$ and produces a new state $\Phi(t, x) \in M$
The evolution rule must satisfy two fundamental properties:
Identity: $\Phi(0, x) = x$ (no time passage means no change)
Composition: $\Phi(t + s, x) = \Phi(t, \Phi(s, x))$ (evolving forward $s$ time units then $t$ more time units is the same as evolving forward $t + s$ units)
If $\Phi(t)$ is a diffeomorphism for each $t$ (a smooth invertible map with a smooth inverse), the system is called smooth.
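The identity and composition axioms can be checked numerically for a concrete flow; here, rotation in the plane (an illustrative choice, with names of our own):

```python
import math

def rotate(t, state, omega=1.0):
    """Flow of the rotation vector field: advance the state by time t."""
    x, y = state
    c, s = math.cos(omega * t), math.sin(omega * t)
    return (c * x - s * y, s * x + c * y)

p = (1.0, 0.5)
# Identity: evolving by t = 0 changes nothing.
same = rotate(0.0, p)
# Composition: evolving by s then by t equals evolving by t + s.
lhs = rotate(0.4, rotate(1.1, p))
rhs = rotate(1.5, p)
```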
Continuous-Time Dynamical Systems
When the time domain is an interval of the real numbers (like $\mathbb{R}$ or $[0, \infty)$), the system is called a flow or a continuous-time dynamical system.
Differentiable dynamical systems are flows where $\Phi$ is continuously differentiable with respect to both time and state. These are typically generated by differential equations of the form:
$$\frac{dx}{dt} = f(x, t)$$
Global flows use the entire real line $\mathbb{R}$ as the time domain, meaning solutions can be extended backward and forward in time indefinitely. In contrast, a semiflow uses only non-negative time $[0, \infty)$, often because the system only makes physical sense moving forward in time (like in dissipative systems with friction).
Discrete Dynamical Systems
When the time domain is the integers (or non-negative integers), the system evolves in discrete time steps. A discrete dynamical system is typically defined by a single map $f: M \to M$, and the evolution is given by:
$$x_n = f^n(x_0)$$
where $f^n$ means applying the map $f$ a total of $n$ times.
Discrete systems are convenient for modeling systems that naturally evolve in steps (like populations counted year by year) or for numerically simulating continuous systems (using a time step). The state space for discrete systems can be a manifold, a graph, a lattice, or even a finite set with discrete topology.
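A minimal sketch of iterating a discrete map (our example: the contraction $f(x) = x/2 + 1$, whose fixed point is $x = 2$):

```python
def f(x):
    """A simple contraction map; its unique fixed point is x = 2."""
    return 0.5 * x + 1.0

def iterate(f, x0, n):
    """Compute x_n = f^n(x0) by applying f a total of n times."""
    x = x0
    for _ in range(n):
        x = f(x)
    return x

x50 = iterate(f, 0.0, 50)  # the orbit converges toward the fixed point 2
```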
<extrainfo>
Lagrangian and Hamiltonian Formulations
In many physical systems, the dynamics follow from a variational principle. The Lagrangian formulation describes the evolution by making an action integral stationary:
$$S = \int L(q, \dot{q}, t) \, dt$$
where $L$ is the Lagrangian (typically kinetic minus potential energy). This elegant principle shows that nature "selects" paths for which the action is stationary (often, though not always, a minimum).
The Hamiltonian formulation is an alternative description where the evolution is generated by a Hamiltonian function $H$ on a symplectic manifold (or more generally, a Poisson manifold). The evolution equations take the form:
$$\dot{x} = \{x, H\}$$
where $\{·, ·\}$ denotes the Poisson bracket. Hamiltonian systems have special properties—they conserve energy and preserve the symplectic structure, making them particularly important in classical mechanics and physics.
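As a numerical illustration of these structure-preserving properties (our example: the harmonic oscillator $H = \tfrac{1}{2}(p^2 + q^2)$ integrated with the symplectic Euler method, a standard structure-preserving scheme), the computed energy oscillates but shows no secular drift:

```python
def symplectic_euler(q, p, h):
    """One symplectic Euler step for H = (p^2 + q^2)/2,
    i.e. Hamilton's equations q' = p, p' = -q."""
    p = p - h * q   # update momentum first...
    q = q + h * p   # ...then position, using the new momentum
    return q, p

q, p = 1.0, 0.0
h = 0.05
energies = []
for _ in range(2000):  # integrate to t = 100
    q, p = symplectic_euler(q, p, h)
    energies.append(0.5 * (p * p + q * q))

# The energy stays close to its initial value 0.5 for all time;
# a non-symplectic scheme would show systematic growth or decay.
drift = max(energies) - min(energies)
```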
</extrainfo>
<extrainfo>
Measure-Theoretic Definition
From a more abstract perspective, a dynamical system can be defined using measure theory. A measure-theoretic dynamical system is a measure-preserving transformation $\Phi$ on a probability space $(X, \Sigma, \mu)$, where:
$X$ is a set of states
$\Sigma$ is a collection of measurable subsets
$\mu$ is a probability measure (like a normalized distribution)
The key property is that measure is preserved: for any measurable set $A \in \Sigma$,
$$\mu(\Phi^{-1}(A)) = \mu(A)$$
This definition is particularly useful for studying invariant measures — probability distributions that remain unchanged as the system evolves. In Hamiltonian systems, the natural invariant measure is the Liouville measure, which reflects the symplectic structure.
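As an illustrative check (our example), the doubling map $T(x) = 2x \bmod 1$ preserves Lebesgue measure on $[0, 1)$: the preimage of any interval has the same total length as the interval itself.

```python
def doubling(x):
    """The doubling map T(x) = 2x mod 1 on the unit interval."""
    return (2.0 * x) % 1.0

# Take A = [0.2, 0.5), so mu(A) = 0.3.  Its preimage under T is
# [0.1, 0.25) U [0.6, 0.75), which also has total length 0.3.
n = 100_000
grid = [(i + 0.5) / n for i in range(n)]  # uniform sample of [0, 1)
in_preimage = sum(1 for x in grid if 0.2 <= doubling(x) < 0.5)
measure_estimate = in_preimage / n        # close to mu(A) = 0.3
```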
Invariant Measures and Ergodic Theory
Different dynamical systems can have different invariant measures. For chaotic dissipative systems (like systems with friction that gradually lose energy), the invariant measures are often singular — they don't behave like ordinary distributions. These measures are typically supported on strange attractors, which are fractal-like structures in the state space.
</extrainfo>
<extrainfo>
Topological Dynamical Systems
For systems where the state space has a topological structure but not necessarily a smooth manifold structure, we use topological dynamical systems. A topological dynamical system is a triple $(T, X, \Phi)$ where:
$X$ is a locally compact Hausdorff space (a topological space with nice separation properties)
$\Phi : T \times X \to X$ is a continuous evolution rule such that $\Phi(t, \cdot)$ is a homeomorphism of $X$ for each $t$ (a continuous map with a continuous inverse)
This definition is more general than the smooth case and allows for state spaces with more complex topological structures.
Compactification Concepts
An important technique in topological dynamics is compactification, which adds "points at infinity" to make the space compact. The one-point compactification adds a single "point at infinity" to capture the behavior of orbits escaping to infinity.
</extrainfo>
<extrainfo>
Category-Theoretic Definition and Invariant Sets
From the most abstract perspective, invariant subsets play a fundamental role in understanding dynamical systems. A subset $S \subset X$ is $\Phi$-invariant if:
$$\Phi(t, S) \subseteq S \quad \text{for all } t \in T$$
This means that if the system starts in $S$, it stays in $S$ forever. Invariant sets are the "building blocks" of dynamical systems, and studying their structure reveals much about the long-term behavior of orbits.
Category-theoretic definitions provide the most general abstract framework, but these are rarely needed for practical problems and are primarily of interest in advanced pure mathematics contexts.
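As a small illustration of the invariance condition (our example), the unit circle is invariant under the rotation flow: a point that starts on it stays on it for all $t$:

```python
import math

def rotate(t, state):
    """Rotation flow in the plane; it preserves distance from the origin."""
    x, y = state
    c, s = math.cos(t), math.sin(t)
    return (c * x - s * y, s * x + c * y)

def on_unit_circle(state, tol=1e-9):
    x, y = state
    return abs(x * x + y * y - 1.0) < tol

p = (math.cos(0.3), math.sin(0.3))  # a point on the unit circle
# Phi(t, S) stays inside S: the orbit never leaves the circle.
stays = all(on_unit_circle(rotate(t, p)) for t in (0.1, 1.0, 5.0, 100.0))
```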
</extrainfo>
Summary
A dynamical system provides a mathematical framework for describing how systems evolve in time. The core concepts—state space, evolution functions, and trajectories—apply across all the different formulations, whether you're working with continuous flows, discrete maps, smooth manifolds, or abstract topological spaces. Most exam questions will focus on these fundamentals and their applications to concrete systems.
Flashcards
What does a dynamical system describe?
How a system evolves in time.
By what three methods can the description of a dynamical system be given?
Experimental data
Differential equations
A map from the present state to a future state
In what predefined environment and with what parameter does the evolution of a dynamical system take place?
A state space with a time parameter $t$.
What term refers to the set of all possible configurations of a system?
State space.
What mathematical structure acts on the state space to represent time evolution as a family of maps?
A semigroup.
What two properties determine whether a trajectory in a dynamical system is well defined?
Existence and uniqueness of solutions.
What property does a system possess if it has conserved quantities that simplify its analysis?
Integrability.
What is the definition of a trajectory (or orbit)?
The collection of states visited by the system from a given initial point.
What is the difference between periodic and aperiodic trajectories?
Periodic trajectories repeat after a fixed time interval, while aperiodic trajectories wander without repeating.
What are the components of the triple $(T, M, \Phi)$ that geometrically defines a dynamical system?
$T$ is the time domain, $M$ is a manifold, and $\Phi$ is the evolution rule.
What is a dynamical system called if the evolution rule $\Phi(t)$ is a diffeomorphism for each $t$?
A smooth dynamical system.
What is a continuous-time dynamical system called when the time domain $T$ is an interval of the real numbers?
A flow.
What is the distinction between a global flow and a semiflow?
A global flow uses the whole real line for time, whereas a semiflow uses only non-negative time.
From what principle do dynamics follow in the Lagrangian formulation?
A variational principle applied to an action integral.
In the Hamiltonian formulation, what generates the evolution on a symplectic or Poisson manifold?
A Hamiltonian function.
How is a dynamical system defined in measure-theoretic terms?
As a measure-preserving transformation $\Phi$ on a probability space $(X, \Sigma, \mu)$.
What is the condition for a measure $\mu$ to be invariant under the transformation $\Phi$?
$\mu(\Phi^{-1}(A)) = \mu(A)$ for every measurable set $A$.
What specific invariant measure naturally arises in Hamiltonian systems?
Liouville measure.
In chaotic dissipative systems, what is the nature of invariant measures relative to the Lebesgue measure?
They are singular and may be supported on strange attractors.
What are the components of the triple $(T, X, \Phi)$ in a topological dynamical system?
$T$ is the time domain, $X$ is a locally compact Hausdorff space, and $\Phi$ is a homeomorphism for each $t$.
What condition must a subset $S \subset X$ satisfy to be considered $\Phi$-invariant?
$\Phi(t, S) \subset S$ for all $t$ in $T$.
Quiz
Dynamical system - Foundations and Formal Definitions Quiz Question 1: What does a dynamical system describe?
- How a system evolves in time (correct)
- Its equilibrium states
- Its energy spectrum
- Its spatial configuration
Question 2: Which properties determine whether a trajectory of a dynamical system is well defined?
- Existence and uniqueness of solutions (correct)
- Continuity and differentiability
- Stability and hyperbolicity
- Periodicity and ergodicity
Question 3: In the geometrical definition, what are the components of the triple (T, M, Φ)?
- T is a time domain, M a manifold, Φ : T × M → M is the evolution rule (correct)
- T is temperature, M is a matrix, Φ is a force field
- T is torque, M is a metric, Φ is a probability distribution
- T is topology, M is a vector space, Φ is a linear operator
Question 4: What term describes a dynamical system whose time domain T is an interval of the real numbers?
- Flow (correct)
- Map
- Semigroup
- Lattice
Question 5: Which invariant measure commonly appears in Hamiltonian systems?
- Liouville measure (correct)
- Gaussian measure
- Dirac measure
- Uniform measure on a cube
Question 6: Which statement is correct regarding invariant measures?
- Different invariant measures can be associated with a given evolution rule (correct)
- An evolution rule has exactly one invariant measure
- Invariant measures do not exist for chaotic systems
- All invariant measures are absolutely continuous with respect to Lebesgue measure
Question 7: What defines a Φ‑invariant subset S ⊂ X?
- Φ(t, S) ⊂ S for all t in T (correct)
- Φ(t, S) = X for all t
- Φ(t, S) ∩ S = ∅ for all t
- Φ(t, S) = S only for some t
Question 8: In the mathematical description of a dynamical system, how is the passage of time most commonly represented?
- As a semigroup of maps acting on the state space (correct)
- As a vector space of possible states
- As a probability distribution over the state space
- As a group of symmetry transformations of the state space
Key Concepts
Dynamical Systems Concepts
Dynamical system
State space
Flow
Discrete dynamical system
Hamiltonian system
Topological dynamical system
Measure and Invariance
Measure‑preserving transformation
Invariant measure
Ergodic theory
Mathematical Techniques
Compactification
Definitions
Dynamical system
A mathematical model describing how a system’s state evolves over time according to a rule or map.
State space
The set of all possible configurations or states that a dynamical system can occupy.
Flow
The continuous‑time evolution map of a dynamical system, typically defined on a manifold for real‑valued time.
Discrete dynamical system
A system whose state updates occur at integer‑valued time steps via an iterated map.
Hamiltonian system
A dynamical system whose evolution is generated by a Hamiltonian function on a symplectic or Poisson manifold.
Measure‑preserving transformation
A map on a probability space that leaves the measure of every measurable set unchanged.
Invariant measure
A probability measure that remains unchanged under the action of a dynamical system’s evolution rule.
Ergodic theory
The study of the statistical properties of dynamical systems with respect to invariant measures.
Topological dynamical system
A dynamical system defined on a topological space where the evolution maps are homeomorphisms.
Compactification
The process of adding points (e.g., a point at infinity) to a space to make it compact, often used in dynamical systems.