### Abstract

Magnus' expansion solves the nonlinear Hausdorff equation associated with a linear time-varying system of ordinary differential equations by forming the matrix exponential of a series of integrated commutators of the matrix-valued coefficient. Instead of the fundamental solution itself, it is the logarithm of the fundamental solution that is expanded. Within some finite interval in the time variable, such an expansion converges faster than direct methods like Picard iteration, and it preserves symmetries of the ODE system, if present. For time-periodic systems, the Magnus expansion in some cases allows one to symbolically approximate the logarithm of the Floquet transition matrix (monodromy matrix) in terms of parameters. Although it has been successfully used as a numerical tool, this use of the Magnus expansion is new. Here we use a version of Magnus' expansion due to Iserles [Iserles A. Expansions that grow on trees. Not Am Math Soc 2002;49:430-40], who reordered the terms of Magnus' expansion for more efficient computation. Though much about the convergence of the Magnus expansion remains unknown, we explore its convergence and apply known convergence estimates. We discuss the possible benefits of using it for time-periodic systems, and we demonstrate the expansion on several examples of periodic systems through the use of a computer algebra system, showing how the convergence depends on parameters.
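To make the method concrete, here is a minimal numerical sketch (not from the paper, which works symbolically in a computer algebra system): the first two Magnus terms for the Mathieu equation x'' + (delta + eps·cos t)x = 0, a standard time-periodic example. The parameter values `delta` and `eps`, the quadrature resolution `n`, and all variable names are illustrative choices. The matrix exponential of the truncated series approximates the monodromy matrix over one period, which is compared against direct numerical integration of the fundamental matrix.

```python
# Illustrative sketch: two-term Magnus approximation of the monodromy matrix
# for the Mathieu equation x'' + (delta + eps*cos t) x = 0, written as a
# first-order system y' = A(t) y with period T = 2*pi.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

delta, eps = 0.3, 0.05         # example parameter values (arbitrary choice)
T = 2 * np.pi                  # period of the coefficient matrix

def A(t):
    return np.array([[0.0, 1.0], [-(delta + eps * np.cos(t)), 0.0]])

# First two Magnus terms by simple quadrature:
#   Omega1 = int_0^T A(t) dt
#   Omega2 = (1/2) int_0^T int_0^{t1} [A(t1), A(t2)] dt2 dt1
n = 400
ts = np.linspace(0.0, T, n + 1)
h = T / n
Omega1 = sum(h * A(t) for t in ts[:-1])          # left Riemann sum
Omega2 = np.zeros((2, 2))
for i, t1 in enumerate(ts[:-1]):
    for t2 in ts[:i]:
        C = A(t1) @ A(t2) - A(t2) @ A(t1)        # commutator [A(t1), A(t2)]
        Omega2 += 0.5 * h * h * C

Phi_magnus = expm(Omega1 + Omega2)               # approximate monodromy matrix

# Reference: integrate Phi' = A(t) Phi, Phi(0) = I, over one period.
def rhs(t, y):
    return (A(t) @ y.reshape(2, 2)).ravel()
sol = solve_ivp(rhs, (0.0, T), np.eye(2).ravel(), rtol=1e-10, atol=1e-12)
Phi_exact = sol.y[:, -1].reshape(2, 2)

print("Magnus (2 terms):\n", Phi_magnus)
print("Direct integration:\n", Phi_exact)
print("max abs difference:", np.abs(Phi_magnus - Phi_exact).max())
```

Note the structure preservation mentioned in the abstract: because A(t) is traceless, every Magnus term is traceless, so exp(Omega) has determinant exactly one, matching the Wronskian identity for the true monodromy matrix, regardless of where the series is truncated.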

Original language | English (US) |
---|---|

Pages (from-to) | 4226-4245 |

Number of pages | 20 |

Journal | Communications in Nonlinear Science and Numerical Simulation |

Volume | 14 |

Issue number | 12 |

DOIs | https://doi.org/10.1016/j.cnsns.2009.02.030 |

State | Published - Dec 2009 |

Externally published | Yes |


### Keywords

- Chebyshev polynomials
- Magnus expansion
- Time-periodic systems

### ASJC Scopus subject areas

- Modeling and Simulation
- Numerical Analysis
- Applied Mathematics

### Cite this

Butcher, E., Sari, M., Bueler, E., & Carlson, T. (2009). Magnus' expansion for time-periodic systems: Parameter-dependent approximations. *Communications in Nonlinear Science and Numerical Simulation*, *14*(12), 4226-4245. https://doi.org/10.1016/j.cnsns.2009.02.030

Research output: Contribution to journal › Article

TY - JOUR

T1 - Magnus' expansion for time-periodic systems

T2 - Parameter-dependent approximations

AU - Butcher, Eric

AU - Sari, Ma'en

AU - Bueler, Ed

AU - Carlson, Tim

PY - 2009/12

Y1 - 2009/12

N2 - Magnus' expansion solves the nonlinear Hausdorff equation associated with a linear time-varying system of ordinary differential equations by forming the matrix exponential of a series of integrated commutators of the matrix-valued coefficient. Instead of the fundamental solution itself, it is the logarithm of the fundamental solution that is expanded. Within some finite interval in the time variable, such an expansion converges faster than direct methods like Picard iteration, and it preserves symmetries of the ODE system, if present. For time-periodic systems, the Magnus expansion in some cases allows one to symbolically approximate the logarithm of the Floquet transition matrix (monodromy matrix) in terms of parameters. Although it has been successfully used as a numerical tool, this use of the Magnus expansion is new. Here we use a version of Magnus' expansion due to Iserles [Iserles A. Expansions that grow on trees. Not Am Math Soc 2002;49:430-40], who reordered the terms of Magnus' expansion for more efficient computation. Though much about the convergence of the Magnus expansion remains unknown, we explore its convergence and apply known convergence estimates. We discuss the possible benefits of using it for time-periodic systems, and we demonstrate the expansion on several examples of periodic systems through the use of a computer algebra system, showing how the convergence depends on parameters.

AB - Magnus' expansion solves the nonlinear Hausdorff equation associated with a linear time-varying system of ordinary differential equations by forming the matrix exponential of a series of integrated commutators of the matrix-valued coefficient. Instead of the fundamental solution itself, it is the logarithm of the fundamental solution that is expanded. Within some finite interval in the time variable, such an expansion converges faster than direct methods like Picard iteration, and it preserves symmetries of the ODE system, if present. For time-periodic systems, the Magnus expansion in some cases allows one to symbolically approximate the logarithm of the Floquet transition matrix (monodromy matrix) in terms of parameters. Although it has been successfully used as a numerical tool, this use of the Magnus expansion is new. Here we use a version of Magnus' expansion due to Iserles [Iserles A. Expansions that grow on trees. Not Am Math Soc 2002;49:430-40], who reordered the terms of Magnus' expansion for more efficient computation. Though much about the convergence of the Magnus expansion remains unknown, we explore its convergence and apply known convergence estimates. We discuss the possible benefits of using it for time-periodic systems, and we demonstrate the expansion on several examples of periodic systems through the use of a computer algebra system, showing how the convergence depends on parameters.

KW - Chebyshev polynomials

KW - Magnus expansion

KW - Time-periodic systems

UR - http://www.scopus.com/inward/record.url?scp=67349085340&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=67349085340&partnerID=8YFLogxK

U2 - 10.1016/j.cnsns.2009.02.030

DO - 10.1016/j.cnsns.2009.02.030

M3 - Article

AN - SCOPUS:67349085340

VL - 14

SP - 4226

EP - 4245

JO - Communications in Nonlinear Science and Numerical Simulation

JF - Communications in Nonlinear Science and Numerical Simulation

SN - 1007-5704

IS - 12

ER -