2024-02-11: Symbolic algebra and typing

2023-08-01: Population waves

2023-05-18: Math of telephone billing mystery

2023-05-05: Franklin and DNA

2023-04-25: On angle and dimension

2023-02-20: On Leonardo da Vinci and Gravity

2022-04-29: Fabricating Evidence to catch Carmen Sandiego

2022-03-04: Probabilistic law of the excluded middle

2020-05-04: Archimedes and the sphere

2019-05-16: Glow worms return

2019-04-11: Original memetic sin

2019-01-31: The theory of weight

2018-11-06: Origins of telephone network theory

2018-10-24: Modern thought

2018-09-10: Feeding a controversy

2018-06-11: Glow worm distribution

2018-04-23: Outlawing risk

2017-08-22: A rebuttal on the beauty in applying math

2017-04-22: Free Google's book library

2016-11-02: In search of Theodore von Karman

2016-09-25: Amath Timeline

2016-02-24: Math errors and risk reporting

2016-02-20: Apple VS FBI

2016-02-19: More Zika may be better than less

2016-02-17: Dependent Non-Commuting Random Variable Systems

2016-01-14: Life at the multifurcation

2015-09-28: AI ain't that smart

2015-06-24: Mathematical Epidemiology citation tree

2015-03-31: Too much STEM is bad

2015-03-24: Dawn of the CRISPR age

2015-02-12: A Comment on How Biased Dispersal can Preclude Competitive Exclusion

2015-02-09: Hamilton's selfish-herd paradox

2015-02-08: Risks and values of microparasite research

2014-11-10: Vaccine mandates and bioethics

2014-10-18: Ebola, travel, president

2014-10-17: Ebola comments

2014-10-12: Ebola numbers

2014-09-23: More stochastic than?

2014-08-17: Feynman's missing method for third-orders?

2014-07-31: CIA spies even on congress

2014-07-16: Rehm on vaccines

2014-06-21: Kurtosis, 4th order diffusion, and wave speed

2014-06-20: Random dispersal speeds invasions

2014-05-06: Preservation of information asymmetry in Academia

2014-04-16: Dual numbers are really just calculus infinitesimals

2014-04-14: More on fairer markets

2014-03-18: It's a mad mad mad mad prisoner's dilemma

2014-03-05: Integration techniques: Fourier--Laplace Commutation

2014-02-25: Fiber-bundles for root-polishing in two dimensions

2014-02-17: Is life a simulation or a dream?

2014-01-30: PSU should be infosocialist

2014-01-12: The dark house of math

2014-01-11: Inconsistencies hinder pylab adoption

2013-12-24: Cuvier and the birth of extinction

2013-12-17: Risk Resonance

2013-12-15: The cult of the Levy flight

2013-12-09: 2013 Flu Shots at PSU

2013-12-02: Amazon sucker-punches 60 minutes

2013-11-26: Zombies are REAL, Dr. Tyson!

2013-11-22: Crying wolf over synthetic biology?

2013-11-21: Tilting Drake's Equation

2013-11-18: Why \(1^{\infty} \neq 1\)

2013-11-15: Adobe leaks of PSU data + NSA success accounting

2013-11-14: 60 Minutes misreport on Benghazi

2013-11-11: Making fairer trading markets

2013-11-10: L'Hopital's Rule for Multidimensional Systems

2013-11-09: Using infinitesimals in vector calculus

2013-11-08: Functional Calculus

2013-11-03: Elementary mathematical theory of the health poverty trap

2013-11-02: Proof of the circle area formula using elementary methods

L'Hopital's Rule for Multidimensional Systems

I have done a little work using branching process models to study extinction in ecology and emergence of infectious diseases. One of the questions that comes up in those studies is to determine how sensitive an extinction probability is to changes in the system. In some cases, this turns out to be a tricky question, requiring the generalization of L'Hopital's rule to systems of equations. Here's how the story goes...

Reinterpreting the standard result

As it is usually taught to calculus students, L'Hopital's rule is stated as follows:


If \( \lim_{p\rightarrow p_0} f(p) = \lim_{p\rightarrow p_0} g(p) = 0 \) then \begin{gather} \lim_{p\rightarrow p_0} \frac{f(p)}{g(p)} = \lim_{p\rightarrow p_0} \frac{\frac{df}{dp}(p)}{\frac{dg}{dp}(p)} . \end{gather}

One way to explain this result is to reinterpret it as a problem of calculating \(x(p)\), where \(x\) solves an equation of the form \[ g(p) x - f(p) = 0.\] Naturally, in cases where \(g(p) \neq 0\), \[ x = \frac{ f(p) } { g(p) } .\] If \(g(p) = 0\) but \(f(p) \neq 0\), then \(x\) is infinite. If \(g(p_0) = 0\) and \(f(p_0) = 0\), then Taylor expansion comes to the rescue: \begin{align} g(p) x - f(p) &= \left[ g(p_0) + \frac{dg}{dp}(p_0) \Delta p + O(\Delta p^2) \right] x \\ & \quad \quad - \left[ f(p_0) + \frac{df}{dp}(p_0) \Delta p + O(\Delta p^2) \right] \\ &\approx \frac{dg}{dp}(p_0) \Delta p \; x - \frac{df}{dp}(p_0) \Delta p , \end{align} and our solution \(x(p_0)\) is given by L'Hopital's rule. Of course, we've made a subtle omission: \(x\) is an implicit function of \(p\), so we should also include its expansion. However, since that expansion's terms are always of higher order, they have no effect on the first-order solution.
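As a quick sanity check of this singular-equation reading (my own example, not from the post), take \(f(p) = e^p - 1\) and \(g(p) = \sin p\) at \(p_0 = 0\), where both vanish, and compare the direct limit against the ratio of first derivatives using sympy:

```python
import sympy as sp

p = sp.symbols('p')
f = sp.exp(p) - 1   # f(0) = 0
g = sp.sin(p)       # g(0) = 0

# direct limit of the 0/0 ratio...
direct = sp.limit(f / g, p, 0)

# ...versus the first-order Taylor coefficients, as in the rewrite above
lhopital = (sp.diff(f, p) / sp.diff(g, p)).subs(p, 0)

print(direct, lhopital)  # both equal 1
```

Both routes give 1, as the rule promises.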

Extending to linear systems

Now, the observant reader will notice that this alternative interpretation is a way to save our solution to a singular linear equation. But what if, instead of a scalar linear equation, we really had a singular system depending on a parameter? Well, as it just so happens, this problem can also be solved with a little ingenuity.

Suppose we need to solve \[ A(t) x - b(t) = 0 \] at \(t = 0\), where \(A(0)\) is singular and \(b(0)=0\), although for \(t \neq 0\), \(b(t)\) is always in the column space of \(A(t)\). If we Taylor expand, \begin{align} A(t) x - b(t) &= \left[ A(0) + \frac{dA}{dt}(0) \Delta t + O(\Delta t^2) \right] \left[ x(0) + \frac{dx}{dt}(0) \Delta t + O(\Delta t^2) \right] \\ &\quad\quad - \left[ b(0) + \frac{db}{dt}(0) \Delta t + O(\Delta t^2) \right] \\ &= \left[ A(0) x(0) - b(0) \right] \\ &\quad \quad + \left[ \frac{dA}{dt}(0) x(0) + A(0) \frac{dx}{dt}(0) - \frac{db}{dt}(0) \right] \Delta t + O(\Delta t^2) \\ &\approx \left[ A(0) x(0) - b(0) \right] + \left[ \frac{dA}{dt}(0) x(0) + A(0) \frac{dx}{dt}(0) - \frac{db}{dt}(0) \right] \Delta t . \end{align} This expansion is exactly what we are hoping for. Since \(b(0) = 0\) lies in the column space of \(A(0)\), the zeroth-order equation leaves the components of \(x(0)\) in the nullspace of \(A(0)\) undetermined. The first-order equation \[ A(0) \frac{dx}{dt}(0) = \frac{db}{dt}(0) - \frac{dA}{dt}(0) x(0) \] only has a solution if we choose those nullspace components so that \[ \frac{db}{dt}(0) - \frac{dA}{dt}(0) x(0) \] lies in the column space of \(A(0)\). Thus \(x(0)\) is exactly determined by Taylor expansion of the problem. Just as with the traditional L'Hopital's rule, things may still fail at this point; if they do, we need to include higher-order terms, or we've shown there is no solution. But when studying root collisions with positive velocity, this first-order condition is sufficient.
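The procedure above can be sketched numerically (my own construction; the matrices are hypothetical, chosen so that \(A(0)\) is singular with \(b(0)=0\) and the true answer is \(x(0) = (1,2)\)). The SVD splits \(x(0)\) into a minimum-norm particular part plus an undetermined nullspace part, and the first-order solvability condition pins down the nullspace coefficients:

```python
import numpy as np

# hypothetical data: A(t) = [[1,0],[0,t]], b(t) = [1, 2t], so x(t) = [1, 2]
A0 = np.array([[1.0, 0.0], [0.0, 0.0]])  # A(0), singular
dA = np.array([[0.0, 0.0], [0.0, 1.0]])  # dA/dt at t = 0
b0 = np.array([1.0, 0.0])                # b(0), in col(A(0))
db = np.array([0.0, 2.0])                # db/dt at t = 0

# SVD of A(0): particular solution, right-nullspace and left-nullspace bases
U, sv, Vt = np.linalg.svd(A0)
rank = int(np.sum(sv > 1e-12))
xp = Vt[:rank].T @ ((U[:, :rank].T @ b0) / sv[:rank])  # min-norm solve
N = Vt[rank:].T   # basis of nullspace of A(0)
L = U[:, rank:]   # basis of left nullspace (orthogonal complement of col(A(0)))

# choose c so that db - dA (xp + N c) lies in col(A(0)),
# i.e. its projection onto the left nullspace vanishes:  L^T dA N c = L^T (db - dA xp)
c = np.linalg.solve(L.T @ dA @ N, L.T @ (db - dA @ xp))
x0 = xp + N @ c
print(x0)  # [1. 2.]
```

Here the zeroth-order equation only forces the first component of \(x(0)\); the solvability condition on the first-order equation supplies the second.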

Application to double roots of nonlinear systems

For sensitivity analysis in multitype branching processes, we have nonlinear systems of the form \begin{gather} s(t) = G(s(t),t), \end{gather} and we'd like to determine the solution of the linear system \begin{gather} \label{eq:sense} \left[ \frac{\partial s}{\partial s} - \frac{\partial G}{\partial s} \right] \frac{\mathrm{d} s}{\mathrm{d}t} = \frac{\partial G}{\partial t} , \end{gather} where \(\partial s / \partial s\) denotes the identity matrix. However, for a critical process this system is under-determined, because \(\partial G / \partial s\) has an eigenvalue equal to one, so we augment it using the method above. This yields the sensitivity equation for a critical process with PGF \(G\), variable \(s\), and parameter \(t\): \begin{gather} \left( \frac{\partial^2 G}{\partial s^2} \frac{\partial s}{\partial t} + 2 \frac{\partial^2 G}{\partial s \partial t} \right) \frac{\partial s}{\partial t} + \left( \frac{\partial G}{\partial s} - \frac{\partial s}{\partial s} \right) \frac{\partial^2 s}{\partial t^2} = 0 \end{gather}
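As a concrete single-type check (my own example, not from the post): take Poisson(\(\lambda\)) offspring with PGF \(G(s,\lambda) = e^{\lambda(s-1)}\), which is critical at \(\lambda = 1\) with fixed point \(s = 1\). At that point \(\partial G/\partial s = 1\), so the second-derivative term drops out of the sensitivity equation, leaving a quadratic in \(\partial s/\partial t\) whose nonzero root is the sensitivity of the extinction probability to \(\lambda\):

```python
import sympy as sp

s, lam = sp.symbols('s lam')
G = sp.exp(lam * (s - 1))            # Poisson(lam) offspring PGF

crit = {s: 1, lam: 1}                # critical point: mean offspring = 1
Gss = sp.diff(G, s, 2).subs(crit)    # second s-derivative at criticality
Gst = sp.diff(G, s, lam).subs(crit)  # mixed derivative at criticality

# with dG/ds = 1 at criticality, the sensitivity equation reduces to
# (Gss*st + 2*Gst)*st = 0; the nonzero root is the sensitivity we want
st = sp.symbols('st')
sens = [r for r in sp.solve((Gss * st + 2 * Gst) * st, st) if r != 0][0]
print(sens)  # -2
```

The answer \(-2\) matches the classical near-critical expansion for the extinction probability, where the slope is \(-2/\sigma^2\) and the Poisson(1) offspring variance is \(\sigma^2 = 1\).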