---
layout: default
math: true
lang: en
subtitle: Curves with modulated stiffness
---
## Parameters:
|$N$|$L$|$M$|
|-|
|$180$|$2\pi$|$0$|
|-|
{:.bordered.center}
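
For orientation, a minimal sketch of the shared discretization (assuming, as in the runs below, $N$ uniformly spaced nodes on a closed curve of length $L$; the actual solver is not reproduced here):

```python
import numpy as np

# Shared discretization of the runs below (illustrative sketch only):
# N uniformly spaced nodes on a closed curve of total length L = 2*pi.
N = 180                 # number of nodes
L = 2 * np.pi           # curve length
h = L / N               # uniform grid spacing
s = np.arange(N) * h    # arc-length positions of the nodes, s in [0, L)
```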
# 23.03.2022 (loss of convexity for $c_0 \neq 0$)
|$\mu$|$c_0$|$\beta$|result|comment|
|-|
|$10^{-1}$|$10^{-1}$|$e^{-\rho}$|[movie](/s/numerics/dAJLR/loss_of_convexity_c0_0.1_mu_0.1.mp4)||
|$10$|$10^{-1}$|$e^{-\rho}$|[movie](/s/numerics/dAJLR/loss_of_convexity_c0_0.1_mu_10.mp4)||
|$10^{-1}$|$1 = 2\pi / L$|$e^{-\rho}$|[movie](/s/numerics/dAJLR/loss_of_convexity_c0_1_mu_0.1.mp4)||
|$10^{-1}$|$4$|$e^{-\rho}$|[movie](/s/numerics/dAJLR/loss_of_convexity_c0_4_mu_0.1.mp4)||
|$10$|$4$|$e^{-\rho}$|[movie](/s/numerics/dAJLR/loss_of_convexity_c0_4_mu_10.mp4)||
{:.bordered.center}
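
For the second row, note that $c_0 = 2\pi/L$ is precisely the curvature of the round circle of length $L$ (here the unit circle), assuming $c_0$ plays the role of a preferred curvature:

$$
L = 2\pi r \;\Longrightarrow\; r = \frac{L}{2\pi} \;\Longrightarrow\; \kappa = \frac{1}{r} = \frac{2\pi}{L} = 1 \qquad (L = 2\pi).
$$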
# 08.02.2022 (focus on convexity)
## Results:
|$\mu$|$c_0$|$\beta$|result|comment|
|-|
|$10^{-2}$|$0$|$e^{-\rho}$|[movie](/s/numerics/dAJLR/exponential_mu=1e-2.mp4)|cigar via "circle"|
|$10^{-1}$|$0$|$e^{-\rho}$|[movie](/s/numerics/dAJLR/exponential_mu=1e-1.mp4)|cigar via "triangle"|
|$10^{-1}$|$2.5$|$e^{-\rho}$|[movie](/s/numerics/dAJLR/exponential_mu=1e-1_c0=2.5.mp4)|non-embedded curve (neg. curv.) at equilibrium|
|$10^{-1}$|$2.5$|$e^{-0.7\rho}$|[movie](/s/numerics/dAJLR/exponential_e=-0.7_mu=1e-1_c0=2.5.mp4)|embedded curve (neg. curv.) at equilibrium|
|$10^{-1}$|$1.9$|$e^{-\rho}$|[movie](/s/numerics/dAJLR/exponential_mu=1e-1_c0=1.9.mp4)|$\dot{E}_{\rho}$ and $\dot{E}_{\theta}$ do not have a definite sign|
|$10^{-1}$|$2$|$e^{-\rho}$|[movie](/s/numerics/dAJLR/exponential_mu=1e-1_c0=2.mp4)|"cigar"-like (neg. curv.) at equilibrium|
|$10^{-1}$|$3$|$e^{-\rho}$|[movie](/s/numerics/dAJLR/exponential_mu=1e-1_c0=3.mp4)|non-embedded curve (neg. curv.) at equilibrium|
|$10^{-1}$|$5$|$e^{-\rho}$|[movie](/s/numerics/dAJLR/exponential_mu=1e-1_c0=5_overstep.mp4)|timestep too large?|
|$10^{-1}$|$10$|$e^{-\rho}$|[movie](/s/numerics/dAJLR/exponential_mu=1e-1_c0=10_clipped.mp4)|clipped|
{:.bordered.center}
# 31.01.2022
## Results:
|$\mu$|$c_0$|$\beta$|$t_{\max}$, $\mathrm{iter}_{\max}$|result|comment|
|-|
|$10^{-2}$|$0$|const.||[movie](/s/numerics/dAJLR/beta=1.mp4)||
|$10^{-2}$|$10$|const.||[movie](/s/numerics/dAJLR/const_c0=10.mp4)||
|$10^{-2}$|$0$|SDW[^1]||[movie](/s/numerics/dAJLR/sym.mp4)||
|$10^{-2}$|$0.1$|SDW||[movie](/s/numerics/dAJLR/sym-c0=0.1.mp4)||
|$10^{-2}$|$1$|SDW||[movie](/s/numerics/dAJLR/sym-c0=1.mp4)||
|$10^{-2}$|$0$|ADW[^2]||[movie](/s/numerics/dAJLR/asym.mp4)||
|-|
{:.bordered.center}
[^1]: symmetric double well, $\beta(x) = 1.01 + (x - 1)^2 (x + 1)^2$
[^2]: asymmetric double well, $\beta(x) = 1.2 + (x - 1)^2 (x + 1)^2 + x$
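
For reference, a small Python sketch of the stiffness profiles used across the runs above (the exponential profiles from the tables and the two double wells from the footnotes); the function names and the plotting are illustrative and not part of the actual code:

```python
import numpy as np
import matplotlib.pyplot as plt

# Stiffness profiles beta appearing in the runs above (illustrative only).
# The argument rho / x is the scalar quantity that modulates the stiffness.

def beta_exp(rho, e=1.0):
    """Exponential profile beta(rho) = exp(-e * rho); e = 0.7 in one run."""
    return np.exp(-e * rho)

def beta_const(x):
    """Constant profile, beta = 1."""
    return np.ones_like(x, dtype=float)

def beta_sdw(x):
    """Symmetric double well: beta(x) = 1.01 + (x - 1)^2 (x + 1)^2."""
    return 1.01 + (x - 1) ** 2 * (x + 1) ** 2

def beta_adw(x):
    """Asymmetric double well: beta(x) = 1.2 + (x - 1)^2 (x + 1)^2 + x."""
    return 1.2 + (x - 1) ** 2 * (x + 1) ** 2 + x

if __name__ == "__main__":
    x = np.linspace(-1.5, 1.5, 400)
    for f, label in [(beta_exp, r"$e^{-\rho}$"), (beta_const, "const."),
                     (beta_sdw, "SDW"), (beta_adw, "ADW")]:
        plt.plot(x, f(x), label=label)
    plt.xlabel(r"$\rho$ (resp. $x$)")
    plt.ylabel(r"$\beta$")
    plt.legend()
    plt.show()
```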