2019-05-16: Glow worms return

2019-04-11: Original memetic sin

2019-01-31: The theory of weight

2018-11-06: Origins of telephone network theory

2018-10-24: Modern thought

2018-09-10: Feeding a controversy

2018-06-11: Glow worm distribution

2018-04-23: Outlawing risk

2017-08-22: A rebuttal on the beauty in applying math

2017-04-22: Free googles book library

2016-11-02: In search of Theodore von Karman

2016-09-25: Amath Timeline

2016-02-24: Math errors and risk reporting

2016-02-20: Apple VS FBI

2016-02-19: More Zika may be better than less

2016-02-17: Dependent Non-Commuting Random Variable Systems

2016-01-14: Life at the multifurcation

2015-09-28: AI ain't that smart

2015-06-24: MathEpi citation tree

2015-03-31: Too much STEM is bad

2015-03-24: Dawn of the CRISPR age

2015-02-12: A Comment on How Biased Dispersal can Preclude Competitive Exclusion

2015-02-09: Hamilton's selfish-herd paradox

2015-02-08: Risks and values of microparasite research

2014-11-10: Vaccine mandates and bioethics

2014-10-18: Ebola, travel, president

2014-10-17: Ebola comments

2014-10-12: Ebola numbers

2014-09-23: More stochastic than?

2014-08-17: Feynman's missing method for third-orders?

2014-07-31: CIA spies even on congress

2014-07-16: Rehm on vaccines

2014-06-21: Kurtosis, 4th order diffusion, and wave speed

2014-06-20: Random dispersal speeds invasions

2014-05-06: Preservation of information asymmetry in Academia

2014-04-16: Dual numbers are really just calculus infinitesimals

2014-04-14: More on fairer markets

2014-03-18: It's a mad mad mad mad prisoner's dilemma

2014-03-05: Integration techniques: Fourier--Laplace Commutation

2014-02-25: Fiber-bundles for root-polishing in two dimensions

2014-02-17: Is life a simulation or a dream?

2014-01-30: PSU should be infosocialist

2014-01-12: The dark house of math

2014-01-11: Inconsistencies hinder pylab adoption

2013-12-24: Cuvier and the birth of extinction

2013-12-17: Risk Resonance

2013-12-15: The cult of the Levy flight

2013-12-09: 2013 Flu Shots at PSU

2013-12-02: Amazon sucker-punches 60 minutes

2013-11-26: Zombies are REAL, Dr. Tyson!

2013-11-22: Crying wolf over synthetic biology?

2013-11-21: Tilting Drake's Equation

2013-11-18: Why $1^\infty != 1$

2013-11-15: Adobe leaks of PSU data + NSA success accounting

2013-11-14: 60 Minutes misreport on Benghazi

2013-11-11: Making fairer trading markets

2013-11-10: L'Hopital's Rule for Multidimensional Systems

2013-11-09: Using infinitesimals in vector calculus

2013-11-08: Functional Calculus

2013-11-03: Elementary mathematical theory of the health poverty trap

2013-11-02: Proof of the circle area formula using elementary methods

Why $1^\infty != 1$

A debate that came up recently in some of my work was over the right way to define $1^\infty$. Many people say that since $1^n = 1$ for any integer $n$, $1^\infty = 1$ as well. That's a very reasonable position, but once we start using more powerful mathematics, it's no longer good enough.

One very good way to approach this is to define a power function \[ f(x,y) = x^{1/y} \] for $y > 0$ and $x > 0$. We are interested in the value of \[ f(1,0) = 1^{1/0} = 1^{\infty}. \] However, we need a formal framework, since $\infty$ is not formally a number. Acknowledging Berkeley's criticism of the ambiguity of infinitesimals, we employ Weierstrass's idea of limits: \[ f(1,0) = \lim_{x\rightarrow 1, y\rightarrow 0^+} x^{1/y}. \] We cannot easily calculate a 2-d limit directly, but if a unique limit does exist, then for any parameterization $x(t), y(t)$ where \[ \lim_{t\rightarrow 0} x(t) = 1 \] and \[ \lim_{t\rightarrow 0} y(t) = 0^+, \] we must have \[ f(1,0) = \lim_{t\rightarrow 0} x(t)^{1/y(t)}. \] Now take $x(t) = \exp(t)$ and $y(t) = t/\log(k)$ for some constant $k > 1$. These satisfy our requirements that $x(t) \rightarrow 1$ and $y(t) \rightarrow 0^+$, as long as we restrict $t$ to its positive values. Then \begin{align*} f(1,0) &= \lim_{t\rightarrow 0} x(t)^{1/y(t)} \\ &= \lim_{t\rightarrow 0} \left( \exp(t) \right)^{\log(k)/t} \\ &= \lim_{t\rightarrow 0} \exp(\log(k)) \\ &= k. \end{align*} But $k$ is an arbitrary constant greater than 1 (and taking $t$ negative covers $0 < k < 1$), so the limit IS NOT uniquely defined. So, a reasonable person can make a sound argument that $1^\infty = 2$, or any other positive number, even though at first pass this seems ridiculous. This is one of the weird, counter-intuitive problems with $\infty$ that pops up when you haven't carefully defined what you mean and somebody else is working with a slightly different definition.
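You can check this path-dependence numerically. Here is a minimal Python sketch (the function name `power_along_path` is my own, not anything from the derivation above): along the path $x(t) = \exp(t)$, $y(t) = t/\log(k)$, the quantity $x(t)^{1/y(t)}$ sits at $k$ for every positive $t$, so the limit along this path is $k$, whatever $k > 1$ we chose.

```python
import math

def power_along_path(k, t):
    """Evaluate x(t)**(1/y(t)) along the path x(t) = exp(t),
    y(t) = t / log(k), for a constant k > 1 and t > 0."""
    x = math.exp(t)
    y = t / math.log(k)
    return x ** (1.0 / y)

# As t -> 0+, the value stays pinned at k -- a different "limit
# of x**(1/y) as (x, y) -> (1, 0+)" for every choice of k.
for k in (2.0, 5.0, 10.0):
    for t in (1e-2, 1e-4, 1e-8):
        print(f"k={k}, t={t}: {power_along_path(k, t)}")
```

Up to floating-point rounding, every printed value equals its $k$, independent of $t$: the exponentiation cancels exactly, which is the whole trick of the construction.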

This also provides an example of the Stokes phenomenon outside of complex analysis. The Stokes phenomenon is a classical applied-math result observing that asymptotic approximations of functions in the complex plane are often discontinuous. However, I feel that it's a more general phenomenon, one that's easily overlooked by folks without specific training in asymptotic analysis. In the current example, for a given $x$ value, the asymptotic limit of $x^{1/y}$ as $y\rightarrow 0^+$ is discontinuous in $x$: \begin{align*} \lim_{y\rightarrow 0^+} x^{1/y} = \begin{cases} 0 & \text{if $0 < x < 1$}, \\ 1 & \text{if $x = 1$}, \\ \infty & \text{if $x > 1$}. \end{cases} \end{align*}

Formalism can be very hand-tying, but sometimes it's the only way to get out of a tight squeeze.