Posts

2024-02-11: Symbolic algebra and typing

2023-08-01: Population waves

2023-05-18: Math of telephone billing mystery

2023-05-05: Franklin and DNA

2023-04-25: On angle and dimension

2023-02-20: On Leonardo da Vinci and Gravity

2022-04-29: Fabricating Evidence to catch Carmen Sandiego

2022-03-04: Probabilistic law of the excluded middle

2020-05-04: Archimedes and the sphere

2019-05-16: Glow worms return

2019-04-11: Original memetic sin

2019-01-31: The theory of weight

2018-11-06: Origins of telephone network theory

2018-10-24: Modern thought

2018-09-10: Feeding a controversy

2018-06-11: Glow worm distribution

2018-04-23: Outlawing risk

2017-08-22: A rebuttal on the beauty in applying math

2017-04-22: Free googles book library

2016-11-02: In search of Theodore von Karman

2016-09-25: Amath Timeline

2016-02-24: Math errors and risk reporting

2016-02-20: Apple VS FBI

2016-02-19: More Zika may be better than less

2016-02-17: Dependent Non-Commuting Random Variable Systems

2016-01-14: Life at the multifurcation

2015-09-28: AI ain't that smart

2015-06-24: Mathematical Epidemiology citation tree

2015-03-31: Too much STEM is bad

2015-03-24: Dawn of the CRISPR age

2015-02-12: A Comment on How Biased Dispersal can Preclude Competitive Exclusion

2015-02-09: Hamilton's selfish-herd paradox

2015-02-08: Risks and values of microparasite research

2014-11-10: Vaccine mandates and bioethics

2014-10-18: Ebola, travel, president

2014-10-17: Ebola comments

2014-10-12: Ebola numbers

2014-09-23: More stochastic than?

2014-08-17: Feynman's missing method for third-orders?

2014-07-31: CIA spies even on congress

2014-07-16: Rehm on vaccines

2014-06-21: Kurtosis, 4th order diffusion, and wave speed

2014-06-20: Random dispersal speeds invasions

2014-05-06: Preservation of information asymmetry in Academia

2014-04-16: Dual numbers are really just calculus infinitesimals

2014-04-14: More on fairer markets

2014-03-18: It's a mad mad mad mad prisoner's dilemma

2014-03-05: Integration techniques: Fourier--Laplace Commutation

2014-02-25: Fiber-bundles for root-polishing in two dimensions

2014-02-17: Is life a simulation or a dream?

2014-01-30: PSU should be infosocialist

2014-01-12: The dark house of math

2014-01-11: Inconsistencies hinder pylab adoption

2013-12-24: Cuvier and the birth of extinction

2013-12-17: Risk Resonance

2013-12-15: The cult of the Levy flight

2013-12-09: 2013 Flu Shots at PSU

2013-12-02: Amazon sucker-punches 60 minutes

2013-11-26: Zombies are REAL, Dr. Tyson!

2013-11-22: Crying wolf over synthetic biology?

2013-11-21: Tilting Drake's Equation

2013-11-18: Why \(1^{\infty} \neq 1\)

2013-11-15: Adobe leaks of PSU data + NSA success accounting

2013-11-14: 60 Minutes misreport on Benghazi

2013-11-11: Making fairer trading markets

2013-11-10: L'Hopital's Rule for Multidimensional Systems

2013-11-09: Using infinitesimals in vector calculus

2013-11-08: Functional Calculus

2013-11-03: Elementary mathematical theory of the health poverty trap

2013-11-02: Proof of the circle area formula using elementary methods

Functional Calculus

Before continuing, we need a basic understanding of the operations of functional calculus. Functional calculus extends the ideas of multivariable calculus to problems framed in the calculus of variations. My primary interest in functional calculus comes from the study of spatial ecology and other topics in mathematical biology, but it is also a powerful tool of modern physics [Donoghue'96], and those examples may be more familiar. A rigorous theory of functional calculus is a challenging topic, and I am not qualified to provide formal justification for the basic rules. But the basics aren't too hard to grasp, so let's explore.

The basic idea is to extend the concept of a gradient from vectors indexed over a finite set to functions indexed over a continuum using Dirac delta-functions \(\delta(x)\) and their derivatives. Let \(x\) correspond to a position on a line and \(\theta(x)\) be a function defined on that line. The operator \[ \frac{\delta W[\theta]}{\delta\theta} \] is the functional calculus equivalent of a Jacobian. We have replaced the standard Leibniz notation of \(d\) with \(\delta\) to emphasize the difference between classical and functional derivatives; sometimes \(\mathcal{D}\) is used instead. The functional derivative of \(W[\theta]:=\theta(x)\) is \begin{gather} \label{eq:ddef} \frac{ \delta \theta(x)}{ \delta \theta(y)} = \delta( x - y). \end{gather} The dummy index \(y\) represents the component of \(\theta\) we are differentiating with respect to. In this case, the functional derivative is a delta-function peaked where \(x\) coincides with \(y\) and zero otherwise.
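To see the analogy with ordinary multivariable calculus, compare with the finite-dimensional case: for a vector \(v = (v_1, \ldots, v_n)\) indexed by a finite set, \begin{gather} \frac{\partial v_i}{\partial v_j} = \delta_{ij}, \end{gather} where \(\delta_{ij}\) is the Kronecker delta. The Dirac delta-function plays the same role once the discrete index \(i\) is replaced by the continuous index \(x\).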

In many cases, differentiation will be applied to an integral. For instance, \begin{gather} \frac{ \delta }{ \delta \theta(y) } \int_0^1 \theta(x) dx = \int_0^1 \delta(x-y) dx = \begin{cases} 1 & \text{if \( y \in ( 0, 1)\)}, \\ \text{context-sensitive} & \text{if \( y \in \{ 0, 1\}\)}, \\ 0 & \text{otherwise}. \end{cases} \end{gather}
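As a quick sanity check of this rule, here is a minimal numerical sketch (my own illustration, not part of the original derivation): discretize \(\theta\) on a grid and perturb it with a narrow spike of unit area at a chosen node. The grid spacing, test function, and spike amplitude below are arbitrary choices.

```python
# Minimal sketch: approximate delta W / delta theta(y) for W[theta] = int_0^1 theta dx
# by perturbing a discretized theta with a narrow spike of unit area.
import numpy as np

dx = 1e-2
x = np.arange(0.0, 1.0, dx)          # grid over [0, 1)
theta = np.sin(2 * np.pi * x)        # an arbitrary test function

def W(theta):
    # Riemann-sum discretization of the integral functional
    return np.sum(theta) * dx

j, eps = 25, 1e-8                    # interior node y = x[j], small perturbation
spike = np.zeros_like(theta)
spike[j] = eps / dx                  # discrete stand-in for eps * delta(x - y)

print((W(theta + spike) - W(theta)) / eps)   # ~1.0, since y = 0.25 lies in (0, 1)
```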

Most of the standard rules of calculus hold for functional differentiation. The derivative of a function that does not depend on \(\theta\) vanishes: \begin{gather} \frac{\delta f(x)}{\delta \theta(y)} = 0. \end{gather} The product rule holds, so for example \begin{gather} \frac{ \delta \left[ \theta(x) \theta(x - 2) \right] }{ \delta \theta(y) } = \theta(x-2) \delta(x - y) + \theta(x) \delta( x - 2 - y), \end{gather} and when we extend the product rule to powers, \begin{gather} \frac{ \delta \theta^n(x) }{ \delta \theta(y) } = n \theta^{n-1}(x) \delta(x - y). \end{gather} The chain rule implies \begin{gather} \frac{ \delta \left[ f(\theta(x),x) \right] }{ \delta \theta(y) } = \frac{ \partial f(\theta(x),x)}{\partial \theta} \; \delta(x-y). \end{gather}
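As a small worked example combining the power rule with the integral rule above, \begin{gather} \frac{ \delta }{ \delta \theta(y) } \int_0^1 \theta^2(x) dx = \int_0^1 2 \theta(x) \delta(x - y) dx = 2 \theta(y) \quad \text{for } y \in (0,1). \end{gather}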

In addition to the standard rules of differentiation, the continuity of the index \(x\) introduces some properties that are particular to functional calculus. For instance, using integration-by-parts (and assuming the boundary terms vanish), we can show that \begin{multline} \frac{ \delta }{ \delta \theta(y) } \int f(\theta'(x)) dx = \int \frac{\partial f}{\partial \theta'} \frac{\delta \theta'(x)}{\delta \theta(y)} dx \\ = \int \frac{\partial f}{\partial \theta'} \delta'( x - y ) dx = \left. - \frac{d}{dx}\left[ \frac{\partial f}{\partial \theta'} \right] \right|_{x = y}. \end{multline} Rules for functions of higher-order derivatives can be derived using similar methods. We have abused notation in a way that conveniently lets us exchange the order of some operations; that is something you always have to be careful about, but here we fall back on the rule of thumb to "shoot first, ask questions later".
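The same discretization trick as before gives a quick numerical check of this rule (again my own illustration; the test function and grid are arbitrary). For \(f(\theta') = (\theta')^2/2\), the formula predicts that the functional derivative equals \(-\theta''(y)\).

```python
# Sketch: check that delta/delta theta(y) int (theta')^2 / 2 dx  ~  -theta''(y)
# for a discretized theta perturbed by a narrow spike at an interior node.
import numpy as np

dx = 1e-2
x = np.arange(0.0, 1.0, dx)
theta = np.sin(2 * np.pi * x)

def W(theta):
    # Riemann-sum discretization of int (theta')^2 / 2 dx, forward differences
    dtheta = np.diff(theta) / dx
    return np.sum(dtheta**2) * dx / 2

j, eps = 25, 1e-8
spike = np.zeros_like(theta)
spike[j] = eps / dx

lhs = (W(theta + spike) - W(theta)) / eps
rhs = (2 * np.pi) ** 2 * np.sin(2 * np.pi * x[j])   # -theta''(y) for this test function
print(lhs, rhs)                                     # both approximately 39.5
```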

The rules of functional differentiation are applied to determine critical points of functionals, which may be local minima, local maxima, or saddles. Given a functional \begin{gather} W[ \theta ] = \int f( \theta(x), \theta'(x), x ) dx, \end{gather} the critical points satisfy \begin{gather} \frac{ \delta W[\theta] }{ \delta \theta(y) } = 0 \end{gather} for all \(y\). Evaluating the functional derivative, \begin{multline} \frac{ \delta }{ \delta \theta(y) } \int f( \theta(x), \theta'(x), x ) dx = \int \frac{\partial f}{\partial \theta} \frac{\delta \theta(x)}{\delta \theta(y)} + \frac{\partial f}{\partial \theta'} \frac{\delta \theta'(x)}{\delta \theta(y)} dx \\ = \left. \frac{\partial f}{\partial \theta} \right|_{x = y} - \left. \frac{d}{dx}\left[ \frac{\partial f}{\partial \theta'} \right] \right|_{x = y}, \end{multline} and setting this to zero for every \(y\) recovers the Euler--Lagrange equation from the calculus of variations.
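A classic worked example: take \(f(\theta, \theta', x) = \sqrt{1 + (\theta')^2}\), so that \(W[\theta]\) is the arc length of the graph of \(\theta\). The Euler--Lagrange equation reduces to \begin{gather} -\frac{d}{dx}\left[ \frac{\theta'}{\sqrt{1 + (\theta')^2}} \right] = 0, \end{gather} so \(\theta'\) is constant and the critical points are straight lines, as we should expect of a shortest path.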

We can now derive the wave equation from the principle of least action. The action of a vibrating string is the total kinetic energy minus the total potential energy, or \begin{equation} A[\phi] = \iint \frac{m}{2} \left( \frac{\partial \phi}{\partial t} \right)^2 - \left( k \frac{\partial \phi}{\partial x} \right)^2 dx \, dt. \end{equation} To make the action stationary, we differentiate and set the derivative equal to zero. Using functional differentiation, we can show that \begin{equation} \frac{\delta A[\phi]}{\delta \phi} = - m \frac{\partial^2 \phi}{\partial t^2} + 2 k^2 \frac{\partial^2 \phi}{\partial x^2} = 0. \end{equation} We thus have the standard second-order wave equation for further study.
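For readers who like to double-check such calculations symbolically, here is a short sketch using sympy's euler_equations (my own illustration; the symbols m and k are the constants from the action above).

```python
# Sketch: recover the wave equation from the Lagrangian density with sympy.
from sympy import symbols, Function
from sympy.calculus.euler import euler_equations

x, t, m, k = symbols('x t m k')
phi = Function('phi')(x, t)

# Lagrangian density from the action above: (m/2) phi_t^2 - (k phi_x)^2
L = (m / 2) * phi.diff(t)**2 - (k * phi.diff(x))**2

print(euler_equations(L, phi, [x, t]))
# expect an equation equivalent to -m phi_tt + 2 k**2 phi_xx = 0
```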

Functional differentiation can also be used to develop series approximations of nonlinear functionals like the generalized Taylor series \begin{multline} W[ \theta(x) + \epsilon(x) ] = W [ \theta(x) ] + \int \frac{ \delta W[\theta(x)] }{ \delta \theta(x_1) } \epsilon(x_1) dx_1 \\ + \frac{1}{2} \iint \frac{ \delta^2 W[\theta(x)] }{ \delta \theta(x_1) \delta \theta(x_2) } \epsilon(x_1) \epsilon(x_2) dx_1 dx_2 + \ldots . \end{multline}
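As a concrete example where the expansion is exact: for \(W[\theta] = \int \theta^2(x) dx\), \begin{gather} W[\theta + \epsilon] = \int \theta^2(x_1) dx_1 + \int 2 \theta(x_1) \epsilon(x_1) dx_1 + \int \epsilon^2(x_1) dx_1, \end{gather} which identifies the first functional derivative as \(2\theta(x_1)\) and the second as \(2\delta(x_1 - x_2)\), so the series terminates after the quadratic term.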