Title: Newtonian Methods in Nonsmooth Optimization via the Lens of Variational Analysis

Abstract: This talk presents local and global convergence results for our Newton-type methods for solving structured nonconvex and nonsmooth optimization problems, using tools from variational analysis and generalized differentiation. The methods employ generalized Hessians (coderivatives of subgradient mappings) associated with the objective functions, which are either prox-bounded or expressed as the sum of a smooth function and an extended-real-valued (not necessarily smooth) function. We also introduce a new line-search method, a generalization of the proximal gradient method, that globalizes our coderivative-based Newton methods by means of forward-backward envelopes. Applications to l0-l2 least-squares regression problems are presented as well.

Thanh Phat Vo
Assistant Professor of Mathematics
Department of Mathematics & Statistics
University of North Dakota
101 Cornell St. Stop 8376
324 Witmer Hall
Grand Forks, ND 58202, USA
Google Site: https://sites.google.com/view/vtphat204
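For context, the proximal gradient method that the abstract generalizes can be illustrated, in its most basic form, on an l0-regularized least-squares model: minimize (1/2)||Ax - b||^2 + lam*||x||_0, where the proximal step reduces to hard thresholding (iterative hard thresholding). The sketch below is a generic baseline under these assumptions, not the speaker's coderivative-based Newton method or line-search scheme; all function names and parameter choices are illustrative.

```python
import numpy as np

def prox_l0(x, tau):
    # Proximal map of tau * ||x||_0: hard thresholding.
    # Keeps entries with |x_i| > sqrt(2*tau), zeros out the rest.
    out = x.copy()
    out[np.abs(out) <= np.sqrt(2.0 * tau)] = 0.0
    return out

def prox_grad_l0(A, b, lam, iters=500):
    # Proximal gradient (iterative hard thresholding) for
    #   min_x  0.5*||A x - b||^2 + lam*||x||_0,
    # with constant step 1/L, L = largest eigenvalue of A^T A.
    L = np.linalg.norm(A, 2) ** 2
    t = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)             # forward (gradient) step
        x = prox_l0(x - t * grad, t * lam)   # backward (prox) step
    return x

# Small demo: recover a sparse vector from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [3.0, -2.0]
b = A @ x_true
x_hat = prox_grad_l0(A, b, lam=0.1)
```

The Newton-type methods of the talk replace the plain gradient step with second-order information from generalized Hessians, and the forward-backward envelope supplies a smooth merit function for the line search that globalizes them.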