Time scale calculus

In mathematics, time scale calculus is a unification of the theory of difference equations and standard calculus. Introduced in 1988 by the German mathematician Stefan Hilger, it has applications in any field that requires simultaneous modelling of discrete and continuous data.

Basic theory

A time scale or measure chain T is a nonempty closed subset of the real line R.

Define:

σ(t) = inf{s ∈ T : s > t} (forward shift operator)
ρ(t) = sup{s ∈ T : s < t} (backward shift operator)

with the conventions inf ∅ = sup T and sup ∅ = inf T, so that σ and ρ map T to itself.
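As a concrete illustration, here is a minimal Python sketch (not part of the article) that computes σ and ρ on the hypothetical finite time scale T = {0, 1, 2, 3, 5, 8}; a finite set is automatically closed, and the conventions above give σ(max T) = max T and ρ(min T) = min T.

    T = [0, 1, 2, 3, 5, 8]  # a finite, hence closed, subset of R

    def sigma(t):
        """Forward shift: inf{s in T : s > t}; sigma(max T) = max T by convention."""
        later = [s for s in T if s > t]
        return min(later) if later else t

    def rho(t):
        """Backward shift: sup{s in T : s < t}; rho(min T) = min T by convention."""
        earlier = [s for s in T if s < t]
        return max(earlier) if earlier else t

    print(sigma(3), rho(3))  # prints: 5 2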

Let t ∈ T. Then t is:

left dense if ρ(t) = t,
right dense if σ(t) = t,
left scattered if ρ(t) < t,
right scattered if σ(t) > t,
dense if both left dense and right dense (as the sketch below illustrates).
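Continuing the sketch above (same hypothetical T, sigma and rho), the classification can be read off directly. Note that on a finite time scale every interior point is left and right scattered; the endpoints min T and max T need care, since the inf/sup conventions fix them under ρ and σ respectively.

    def classify(t):
        """Return the labels from the list above that apply to t."""
        labels = []
        if rho(t) == t:
            labels.append("left dense")
        if sigma(t) == t:
            labels.append("right dense")
        if rho(t) < t:
            labels.append("left scattered")
        if sigma(t) > t:
            labels.append("right scattered")
        if rho(t) == t == sigma(t):
            labels.append("dense")
        return labels

    print(classify(3))  # ['left scattered', 'right scattered']
    print(classify(5))  # ['left scattered', 'right scattered']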

Define the graininess μ of a measure chain T by:

μ(t) = σ(t) − t.
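In the running sketch the graininess is a one-liner; it measures the gap from t to the next point of T and vanishes exactly at right dense points.

    def mu(t):
        """Graininess: mu(t) = sigma(t) - t."""
        return sigma(t) - t

    print([mu(t) for t in T])  # [1, 1, 1, 2, 3, 0]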

Take a function:

f : T → R,

(where R could be any Banach space, but we take it to be the real line for simplicity).

Definition: the generalised (delta) derivative f^Δ(t) is the number (provided it exists) with the following property: for every ε > 0 there exists a neighbourhood U of t such that

|f(σ(t)) − f(s) − f^Δ(t)(σ(t) − s)| ≤ ε|σ(t) − s|

for all s in U.
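A standard consequence of this definition is that at a right scattered point t (with f continuous at t) the delta derivative reduces to the difference quotient f^Δ(t) = (f(σ(t)) − f(t)) / μ(t), while at a right dense point it is the usual limit of difference quotients. A minimal sketch of the scattered case, continuing the running example:

    def f(t):
        return t * t  # a sample function on T

    def f_delta(t):
        """Delta derivative at a right scattered point:
        f_delta(t) = (f(sigma(t)) - f(t)) / mu(t)."""
        assert sigma(t) > t, "a right dense point needs a genuine limit"
        return (f(sigma(t)) - f(t)) / mu(t)

    print(f_delta(3))  # (25 - 9) / 2 = 8.0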

Take T = R. Then σ(t) = t, μ(t) = 0, and f^Δ = f′ is the derivative used in standard calculus. Take T = Z (the integers). Then σ(t) = t + 1, μ(t) = 1, and f^Δ = Δf is the forward difference operator used in difference equations.
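Both special cases can be checked numerically; a sketch, with the continuous case approximated by a small step since T = R cannot be enumerated:

    def g(t):
        return t * t

    def delta_on_Z(t):
        """On T = Z: sigma(t) = t + 1 and mu(t) = 1, so the delta derivative
        is the forward difference Delta g(t) = g(t + 1) - g(t)."""
        return g(t + 1) - g(t)

    def deriv_on_R(t, h=1e-8):
        """On T = R: sigma(t) = t and mu(t) = 0; approximate the ordinary
        derivative g'(t) by a small finite difference."""
        return (g(t + h) - g(t)) / h

    print(delta_on_Z(3))    # 7 = 2*3 + 1
    print(deriv_on_R(3.0))  # approximately 6.0 = 2*3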
