diff --git a/tutorial/01 - Optimization and Math/05 - Tricky Functions - Continuity and Differentiability.ipynb b/tutorial/01 - Optimization and Math/05 - Tricky Functions - Continuity and Differentiability.ipynb
index dd690dd4..9e563c6b 100644
--- a/tutorial/01 - Optimization and Math/05 - Tricky Functions - Continuity and Differentiability.ipynb
+++ b/tutorial/01 - Optimization and Math/05 - Tricky Functions - Continuity and Differentiability.ipynb
@@ -545,7 +545,7 @@
     "\n",
     "Imagine if, instead of directly minimizing $\hat{f}_1(x)$, this was simply a piece of a larger optimization problem. Say, for example, we were instead minimizing $|x| + 0.02x^2 - 1.2x$. This problem turns out to be *unimodal* at $(5, -0.5)$, which means that there is only one local minimum and it is the global minimum. Because the function we're minimizing is convex, we know that it is either unimodal or unbounded.\n",
     "\n",
-    "Now imagine we instead used our approximation for $|x|$ with the smoothing parameter $\alpha = 1$ to get rid of the discontinuity, giving $x \tanh{x} + 0.02 x ^ 2 - 1.2 x$. This turns out to be *multimodal* at both (0.926, -0.419) and (4.979, -0.5)$ - there are multiple local minima. Of course, the optimizer will only give you one of these (IPOPT is a local optimizer, like all other practical optimizers suitable for general nonconvex problems that are even remotely high-dimensional), and it's effectively impossible to know which one you'll get until you run the algorithm.\n",
+    "Now imagine we instead used our approximation for $|x|$ with the smoothing parameter $\alpha = 1$ to get rid of the discontinuity, giving $x \tanh{x} + 0.02 x ^ 2 - 1.2 x$. This turns out to be *multimodal* at both $(0.926, -0.419)$ and $(4.979, -0.5)$ - there are multiple local minima. Of course, the optimizer will only give you one of these. (IPOPT is a local optimizer, like all other practical optimizers suitable for general nonconvex problems that are even remotely high-dimensional.) And, it's effectively impossible to know which one you'll get until you run the algorithm.\n",
     "\n",
     "Let's look at our other approximator.\n",
     "\n",
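
For reference, the multimodality claim in the revised cell is easy to check numerically. Below is a minimal sketch (not part of the notebook; it uses SciPy's generic `minimize` rather than the IPOPT-based setup the tutorial presumably uses) that minimizes the smoothed objective $x \tanh{x} + 0.02 x^2 - 1.2 x$ from two different starting points and lands in different local minima:

```python
import numpy as np
from scipy.optimize import minimize

def f_smoothed(x):
    # Smoothed |x| (alpha = 1) plus the quadratic and linear terms from the text
    return x * np.tanh(x) + 0.02 * x ** 2 - 1.2 * x

for x0 in [0.0, 10.0]:  # two different initial guesses
    res = minimize(lambda v: f_smoothed(v[0]), x0=[x0])
    print(f"start at x0 = {x0:4.1f}  ->  x* = {res.x[0]:.3f},  f(x*) = {res.fun:.3f}")

# With SciPy's default (BFGS), this typically reproduces the two minima quoted above:
# starting near 0 lands in the spurious minimum around (0.926, -0.419), while
# starting near 10 lands in the other one around (4.979, -0.500).
```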