Fast algorithms based on asymptotic expansions of special functions
Alex Townsend (MIT)
CAM Colloquium, Cornell University, 23rd January 2015

Introduction: Two trends in the computing era
Tabulations → Software: logarithms, Gauss quadrature, special functions.
Classical analysis + numerical analysis: the fast multipole method, butterfly schemes, iterative algorithms, the finite element method.

A selection of my research
- Algebraic geometry: computing the common zeros f(x, y) = g(x, y) = 0. [Nakatsukasa, Noferini, & T., 2015]
- Approximation theory: low-rank approximation f(x, y) ≈ \sum_{j=1}^{K} c_j(y) r_j(x). [Trefethen & T., 2014], [Battles & Trefethen, 2004]

Many special functions are trigonometric-like
- Trigonometric functions: cos(ωx), sin(ωx)
- Chebyshev polynomials: T_n(x)
- Legendre polynomials: P_n(x)
- Bessel functions: J_ν(z)
- Airy functions: Ai(x)
Also, Jacobi polynomials, Hermite polynomials, parabolic cylinder functions, etc.
Asymptotic expansions of special functions
Legendre polynomials (Stieltjes):
  P_n(cos θ) ~ C_n \sum_{m=0}^{∞} h_{n,m} cos((n + m + 1/2)θ − (m + 1/2)π/2) / (2 sin θ)^{m+1/2},   n → ∞.   [Thomas Stieltjes]
Bessel functions (Hankel):
  J_0(z) ~ sqrt(2/(πz)) ( cos(z − π/4) \sum_{m=0}^{M−1} (−1)^m a_{2m} / z^{2m} − sin(z − π/4) \sum_{m=0}^{M−1} (−1)^m a_{2m+1} / z^{2m+1} ),   z → ∞.   [Hermann Hankel]

Numerical pitfalls of an asymptotic expansionist
  J_0(z) = sqrt(2/(πz)) ( cos(z − π/4) \sum_{m=0}^{M−1} (−1)^m a_{2m} / z^{2m} − sin(z − π/4) \sum_{m=0}^{M−1} (−1)^m a_{2m+1} / z^{2m+1} ) + E_M(z).
Fix M. Where is the asymptotic expansion accurate?

Talk outline
Using asymptotic expansions of special functions for:
I. Computing Gauss quadrature rules. [Hale & T., 2013], [T., Trogdon, & Olver, 2015]
II. Fast transforms based on asymptotic expansions.
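Hankel's expansion is easy to check numerically. The sketch below (my illustration, not code from the talk) truncates both sums after M terms and compares against scipy.special.j0; the coefficients follow the standard formula a_k(ν) = (4ν² − 1²)(4ν² − 3²)···(4ν² − (2k−1)²)/(k! 8^k) with ν = 0.

```python
# Hankel's large-z asymptotic expansion of J_0(z), truncated after M terms
# in each sum, compared with scipy.special.j0. With M fixed, the expansion
# is superb for large z and useless for small z.
import numpy as np
from scipy.special import j0

def a(k):
    """Hankel coefficient a_k(0) = prod_{j=1..k} -(2j-1)^2 / (8j)."""
    val = 1.0
    for j in range(1, k + 1):
        val *= -(2*j - 1)**2 / (8*j)
    return val

def j0_asy(z, M=3):
    w = z - np.pi/4
    c = sum((-1)**m * a(2*m)     / z**(2*m)     for m in range(M))
    s = sum((-1)**m * a(2*m + 1) / z**(2*m + 1) for m in range(M))
    return np.sqrt(2/(np.pi*z)) * (np.cos(w)*c - np.sin(w)*s)

for z in [1.0, 10.0, 100.0]:
    print(z, abs(j0_asy(z) - j0(z)))
```

The printed errors illustrate the pitfall on the slide: accuracy improves rapidly as z grows, but the truncated expansion cannot be made accurate near z = 0 no matter how M is chosen.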
[Hale & T., 2014], [T., 2015]

Part I: Computing Gauss quadrature rules
P_n(x) = Legendre polynomial of degree n.

Definition. An n-point quadrature rule approximates
  \int_{−1}^{1} f(x) dx ≈ \sum_{k=1}^{n} w_k f(x_k).
The Gauss quadrature rule takes
  x_k = kth zero of P_n,   w_k = 2 / ((1 − x_k^2) [P_n'(x_k)]^2).
1. It exactly integrates polynomials of degree ≤ 2n − 1, the best possible.
2. It is classically computed by the Golub–Welsch algorithm. [Golub & Welsch, 1969]
3. As of 2004 there was no scheme for computing Gauss rules in O(n) operations. [Bailey, Jeyabalan, & Li, 2004]

Gauss quadrature experiments
[Figures: execution time and quadrature error versus n.]
The Golub–Welsch algorithm is beautiful, but no longer the state of the art. There is another way...

Newton's method (with asymptotics and a change of variables)
Solve P_n(cos θ)(sin θ)^{1/2} = 0 for θ ∈ (0, π). Three ingredients:
1. An evaluation scheme for P_n(cos θ)(sin θ)^{1/2}.
2. An evaluation scheme for d/dθ [P_n(cos θ)(sin θ)^{1/2}].
3. Initial guesses for θ_k = cos^{−1}(x_k).
Evaluation scheme for P_n
Interior (away from θ = 0 and θ = π):
  P_n(cos θ)(sin θ)^{1/2} ≈ C_n \sum_{m=0}^{M−1} h_{n,m} cos((n + m + 1/2)θ − (m + 1/2)π/2) / (2^{m+1/2} (sin θ)^m).
Boundary (near θ = 0):
  P_n(cos θ)(sin θ)^{1/2} ≈ θ^{1/2} ( J_0(ρθ) \sum_{m=0}^{M−1} A_m(θ)/ρ^{2m} + θ J_1(ρθ) \sum_{m=0}^{M−1} B_m(θ)/ρ^{2m+1} ),   ρ = n + 1/2.
[Baratella & Gatteschi, 1988]

Theorem. For ε > 0, n ≥ 200, and initial guesses θ_k^{init} = (k − 1/2)π/n, Newton's method converges and the nodes are computed to an accuracy of O(ε) in O(n log(1/ε)) operations.
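The behaviour of the interior expansion can be sampled directly. The sketch below is my own illustration, not the talk's code: it keeps only the leading term (M = 1), using the standard Stieltjes normalization C_n = sqrt(4/π) Γ(n+1)/Γ(n+3/2) (an assumption about the constant, taken from the classical expansion), and shows excellent accuracy in the interior with breakdown near θ = 0, which is exactly why a Bessel-type boundary formula is needed there.

```python
# Leading term (M = 1) of the interior asymptotic expansion of P_n(cos t),
# with the classical Stieltjes constant C_n = sqrt(4/pi)*Gamma(n+1)/Gamma(n+3/2)
# (computed via gammaln for stability). Accurate for t well inside (0, pi);
# degrades as t -> 0, where the Bessel-type boundary expansion takes over.
import numpy as np
from scipy.special import eval_legendre, gammaln

def leg_interior(n, t):
    Cn = np.sqrt(4/np.pi) * np.exp(gammaln(n + 1) - gammaln(n + 1.5))
    return Cn * np.cos((n + 0.5)*t - np.pi/4) / np.sqrt(2*np.sin(t))

n = 200
for t in [np.pi/2, 0.002]:   # deep interior vs. near the boundary
    print(t, abs(leg_interior(n, t) - eval_legendre(n, np.cos(t))))
```

Even one term is accurate to several digits at θ = π/2 for n = 200, while near θ = 0 the error is O(1); the real scheme adds more terms and switches expansions near the endpoints.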
Initial guesses for the nodes
Interior: use Tricomi's initial guesses. [Tricomi, 1950]
Boundary: use Olver's initial guesses. [Olver, 1974]
There are now better initial guesses. [Bogaert, 2014]
[Figure: max error in the initial guesses.]

"The race to compute high-order Gauss quadrature" [T., 2015]

Applications
1. Psychological: Gauss quadrature rules are cheap.
2. Experimental mathematics: high-precision quadratures. [Bailey, Jeyabalan, & Li, 2004]
3. Cosmic microwave background: noise removal by L^2-projection.
[Figure: quadrature error for \int_{−1}^{1} 1/(1 + 100x^2) dx.]
Generically, composite rules converge algebraically.
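A bare-bones version of the Newton iteration is easy to prototype. The sketch below (an illustration, not the talk's implementation) uses the classical first-order initial guess x_k ≈ cos(π(k − 1/4)/(n + 1/2)) as a simple stand-in for the refined Tricomi/Olver/Bogaert guesses, and evaluates P_n and P_n' by the three-term recurrence, so each sweep costs O(n) per node; replacing the recurrence with the asymptotic evaluation scheme is what brings the total cost down to O(n).

```python
# Newton's method for the Gauss-Legendre nodes. P_n and P_n' come from the
# three-term recurrence (j+1)P_{j+1} = (2j+1)x P_j - j P_{j-1} and
# P_n'(x) = n(x P_n - P_{n-1})/(x^2 - 1).
import numpy as np

def legendre_and_deriv(n, x):
    p0, p1 = np.ones_like(x), x
    for j in range(1, n):
        p0, p1 = p1, ((2*j + 1)*x*p1 - j*p0)/(j + 1)
    return p1, n*(x*p1 - p0)/(x*x - 1)

def gauss_legendre_nodes(n, steps=5):
    k = np.arange(1, n + 1)
    x = np.cos(np.pi*(k - 0.25)/(n + 0.5))   # classical initial guesses
    for _ in range(steps):
        p, dp = legendre_and_deriv(n, x)
        x = x - p/dp
    return x

x = gauss_legendre_nodes(100)
print(np.max(np.abs(np.sort(x) - np.polynomial.legendre.leggauss(100)[0])))
```

A handful of Newton steps recovers the nodes to machine precision, here checked against numpy's Golub–Welsch-style leggauss.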
Part II: Fast transforms based on asymptotic expansions
The discrete Legendre transform as a matrix-vector product:
  f = P_N c,   (P_N)_{kj} = P_j(cos θ_k),   0 ≤ j, k ≤ N − 1,   θ_k = kπ/(N − 1).

Fast transforms as evaluation schemes
DFT:  f_k = \sum_{j=0}^{N−1} c_j z_N^{jk},   z_N = e^{−2πi/N}
DCT:  f_k = \sum_{j=0}^{N−1} c_j cos(jθ_k)
DST:  f_k = \sum_{j=0}^{N−1} c_j sin(jθ_k)
DLT:  f_k = \sum_{j=0}^{N−1} c_j P_j(cos θ_k)
DHT:  f_k = \sum_{j=0}^{N−1} c_j J_ν(j_{ν,j} j_{ν,k}),   j_{ν,k} = kth root of J_ν(z)

Related work
James Cooley, Leslie Greengard, Emmanuel Candès, Daniel Potts; Ahmed, Tukey, Gauss, Gentleman, Natarajan, Rao; Alpert, Barnes, Hut, Martinsson,
Rokhlin, Mohlenkamp, Tygert; Boag, Demanet, Michielssen, O'Neil, Ying; Driscoll, Healy, Steidl, Tasche. Many others: Candel, Cree, Johnson, Mori, Orszag, Suda, Sugihara, Takami, etc. Fast transforms are very important.

Asymptotic expansions as a matrix decomposition
The asymptotic expansion
  P_n(cos θ_k) = C_n \sum_{m=0}^{M−1} h_{n,m} cos((n + m + 1/2)θ_k − (m + 1/2)π/2) / (2 sin θ_k)^{m+1/2} + E_{n,M}(θ_k)
gives a matrix decomposition (a sum of diagonally scaled DCTs and DSTs):
  P_N = \sum_{m=0}^{M−1} ( D_{u_m} C_N D_{c_m} + D_{v_m} S_N D_{s_m} ) + E_{M,N},
where C_N and S_N are DCT and DST matrices and the D's are diagonal.

Be careful and stay safe
Fix M. Where is the asymptotic expansion accurate? The error term satisfies
  |E_{n,M}(θ_k)| ≤ 2 C_n h_{n,M} / (2 sin θ_k)^{M+1/2}.
[Figure: error curve |E_{n,M}(θ_k)| = ε.]

Partitioning and balancing competing costs
Theorem. The matrix-vector product f = P_N c can be computed in O(N (log N)^2 / log log N) operations.
[Figure: partition of [0, 1]; the split parameter α too small.]
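For contrast with the theorem, the DLT in the table above can be written down directly as a dense matrix-vector product, which costs O(N²); this is the baseline that the diagonally scaled DCT/DST decomposition accelerates. A minimal sketch, with the grid θ_k = kπ/(N − 1) assumed for illustration:

```python
# Direct discrete Legendre transform: f_k = sum_j c_j P_j(cos t_k),
# built as an N x N Legendre-Vandermonde matrix times a coefficient
# vector. Cost O(N^2); the asymptotic DCT/DST decomposition reduces
# this to near O(N (log N)^2 / log log N).
import numpy as np
from scipy.special import eval_legendre

def dlt_direct(c):
    N = len(c)
    t = np.arange(N)*np.pi/(N - 1)                       # t_k in [0, pi]
    j = np.arange(N)
    PN = eval_legendre(j[None, :], np.cos(t)[:, None])   # (P_N)_{kj} = P_j(cos t_k)
    return PN @ c

# Sanity check: coefficients of P_2 alone reproduce P_2 on the grid.
c = np.zeros(8); c[2] = 1.0
t = np.arange(8)*np.pi/7
print(np.max(np.abs(dlt_direct(c) - eval_legendre(2, np.cos(t)))))
```

Forming P_N explicitly is fine for small N and makes the structure of the transform concrete; the decomposition replaces the dense product with FFT-based DCTs/DSTs plus a low-cost correction near the boundary.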
[Figures: partition of [0, 1] with α too large, and with the balanced choice α = O(1/log N), for which the direct-evaluation part and the asymptotic part both cost O(N (log N)^2 / log log N).]

Numerical results
[Figures: execution time versus N, direct approach vs. using asymptotics; modification of rough signals.]
No precomputation. Example transform pair:
  \int_{−1}^{1} P_n(x) e^{−iωx} dx = (−i)^n sqrt(2π/ω) J_{n+1/2}(ω).

Future work: fast spherical harmonic transforms
Spherical harmonic transform:
  f(θ, λ) = \sum_{ℓ=0}^{n−1} \sum_{m=−ℓ}^{ℓ} α_ℓ^m P_ℓ^{|m|}(cos θ) e^{imλ}.
Existing approaches: [Mohlenkamp, 1999], [Rokhlin & Tygert, 2006], [Tygert, 2008].
[Figure: execution time of precomputation, fast transform, and direct approach versus N.]
A new generation of fast algorithms with no precomputation.

Thank you.

References
• D. H. Bailey, K. Jeyabalan, and X. S. Li, A comparison of three high-precision quadrature schemes, 2005.
• P. Baratella and L. Gatteschi, The bounds for the error term of an asymptotic approximation of Jacobi polynomials, Lecture Notes in Mathematics, 1988.
• Z. Battles and L. N. Trefethen, An extension of MATLAB to continuous functions and operators, SISC, 2004.
• I. Bogaert, Iteration-free computation of Gauss–Legendre quadrature nodes and weights, SISC, 2014.
• G. H. Golub and J. H. Welsch, Calculation of Gauss quadrature rules, Math. Comput., 1969.
• N. Hale and A. Townsend, Fast and accurate computation of Gauss–Legendre and Gauss–Jacobi quadrature nodes and weights, SISC, 2013.
• N. Hale and A. Townsend, A fast, simple, and stable Chebyshev–Legendre transform using an asymptotic formula, SISC, 2014.
• Y. Nakatsukasa, V. Noferini, and A.
Townsend, Computing the common zeros of two bivariate functions via Bezout resultants, Numerische Mathematik, 2014.
• F. W. J. Olver, Asymptotics and Special Functions, Academic Press, New York, 1974.
• A. Townsend and L. N. Trefethen, An extension of Chebfun to two dimensions, SISC, 2013.
• A. Townsend, T. Trogdon, and S. Olver, Fast computation of Gauss quadrature nodes and weights on the whole real line, to appear in IMA J. Numer. Anal., 2015.
• A. Townsend, The race to compute high-order Gauss quadrature, SIAM News, 2015.
• A. Townsend, A fast analysis-based discrete Hankel transform using asymptotic expansions, submitted, 2015.
• F. G. Tricomi, Sugli zeri dei polinomi sferici ed ultrasferici, Ann. Mat. Pura Appl., 1950.