Optimization Learning Notes 7
Publish Date: 2020-09-13 16:40:43.046931
Update Date: 2020-09-13 16:40:43.046931
More on Newton's Method Convergence Rate

Let $f$ be twice continuously differentiable and let $x^\ast$ be a local minimizer of $f$. For some given $\epsilon \gt 0$, assume that:

- there exists $\mu \gt 0$ with $\nabla^2 f(x) \succcurlyeq \mu I$ for any $x \in B_\epsilon(x^\ast)$ ...
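The excerpt breaks off after the first assumption, but this setup is the standard one for proving local quadratic convergence of Newton's method. As an illustration only (the `newton` helper and the toy objective below are a sketch of mine, not code from the original post), here is a minimal NumPy implementation of the plain Newton iteration that this kind of analysis supports:

```python
import numpy as np

def newton(grad, hess, x0, tol=1e-10, max_iter=50):
    """Plain Newton iteration: x_{k+1} = x_k - [Hess f(x_k)]^{-1} grad f(x_k).

    Convergence is only local: the iteration behaves well when started
    inside a ball where the Hessian is positive definite, e.g. when
    Hess f(x) >= mu * I as in the assumption above.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Solve Hess f(x) d = grad f(x) instead of inverting the Hessian.
        d = np.linalg.solve(hess(x), g)
        x = x - d
    return x

# Toy example: f(x, y) = x^2 + x^4 + y^2, whose Hessian satisfies
# Hess f >= 2I everywhere, so the strong-convexity assumption holds.
grad = lambda v: np.array([2 * v[0] + 4 * v[0] ** 3, 2 * v[1]])
hess = lambda v: np.array([[2 + 12 * v[0] ** 2, 0.0], [0.0, 2.0]])
print(newton(grad, hess, [1.0, 1.0]))  # converges to (0, 0)
```

Under the stated Hessian lower bound (plus the usual Lipschitz condition on $\nabla^2 f$), once an iterate lands inside $B_\epsilon(x^\ast)$ the error contracts quadratically, which a run of this toy example exhibits.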