[Eoas-seminar] [Seminar-announce] Scientific Computing Seminar with Clayton Webster

eoas-seminar at lists.fsu.edu
Tue Oct 31 14:43:14 EDT 2023


"Smoothing-based gradient descent for high-dimensional nonconvex optimization"

Dr. Clayton Webster
Oden Institute for Computational Engineering and Sciences,
The University of Texas at Austin, Austin, TX
Lirio AI Research & Behavioral Reinforcement and Learning Lab (BReLL),
Lirio LLC, Knoxville, TN

NOTES:

  *   Please feel free to forward this invitation to other groups or disciplines that might be interested in this talk. All are welcome to attend.

  *   IN-PERSON ATTENDANCE IS REQUESTED.
     *   499 DSL SEMINAR ROOM

  *   Zoom access is intended for external participants only.

https://fsu.zoom.us/j/94273595552
Meeting # 942 7359 5552

🎦 Colloquium recordings will be made available here: https://www.sc.fsu.edu/colloquium

Schedule for Tuesday, Nov 7, 2023:

* 3:00 to 3:30 PM Eastern Time (US and Canada)
☕ Nespresso & Teatime - 417 DSL Commons

* 3:30 to 4:30 PM Eastern Time (US and Canada)
🕟 Colloquium - 499 DSL Seminar Room

Abstract:
This talk focuses on a class of smoothing-based gradient descent methods for high-dimensional non-convex optimization problems. In particular, Gaussian smoothing is employed to define a nonlocal gradient that reduces high-frequency noise, small variations, and rapid fluctuations in the computation of descent directions, while preserving the structure and features of the loss landscape. The amount of smoothing is controlled by the standard deviation of the Gaussian distribution: larger values produce broader, more pronounced smoothing, while smaller values preserve more detail of the function. The resulting Gaussian smoothing gradient descent (GSmoothGD) approach helps gradient descent navigate away from, or avoid altogether, local minima, substantially enhancing its performance on non-convex optimization problems. This work also provides rigorous theoretical error estimates on the rate of convergence of the GSmoothGD iterates, which quantify the impact of the underlying function's convexity, smoothness, and input dimension, as well as of the smoothing radius. We also present several strategies for combating the curse of dimensionality and for updating the smoothing parameter, aimed at diminishing the impact of local minima and thereby making global minima easier to attain. Computational evidence complements the theory and shows the effectiveness of GSmoothGD compared with other smoothing-based algorithms, momentum-based approaches, backpropagation-based techniques, and classical gradient-based algorithms from numerical optimization. Finally, applications to personalization tasks on the MNIST, CIFAR10, and Spotify datasets demonstrate the advantage of GSmoothGD in solving reinforcement learning problems.
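
For readers unfamiliar with the technique, the following is a minimal Python/NumPy sketch of the general idea of gradient descent on a Gaussian-smoothed surrogate, using the standard Monte Carlo (central-difference) estimator of the nonlocal gradient. The function names, sample count, step size, and geometric decay schedule for the smoothing parameter are illustrative assumptions, not details taken from the talk or Dr. Webster's method.

import numpy as np

def smoothed_grad(f, x, sigma, n_samples=64, rng=None):
    # Monte Carlo estimate of the gradient of the smoothed surrogate
    #   f_sigma(x) = E_{u ~ N(0, I)}[ f(x + sigma * u) ],
    # via the central-difference estimator
    #   grad f_sigma(x) ~ E[ (f(x + sigma*u) - f(x - sigma*u)) / (2*sigma) * u ].
    rng = rng or np.random.default_rng()
    g = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.standard_normal(x.shape)
        g += (f(x + sigma * u) - f(x - sigma * u)) / (2.0 * sigma) * u
    return g / n_samples

def gsmooth_gd(f, x0, sigma0=1.0, lr=0.1, steps=200, decay=0.99):
    # Descend on the smoothed surrogate; the smoothing radius sigma is
    # shrunk geometrically (a hypothetical schedule) so early iterates
    # see a heavily smoothed landscape and later iterates recover more
    # detail of the original function.
    x, sigma = np.asarray(x0, dtype=float), sigma0
    for _ in range(steps):
        x = x - lr * smoothed_grad(f, x, sigma)
        sigma *= decay  # reduce smoothing as the iterates converge
    return x

# Example: a 1-D non-convex function with many local minima.
if __name__ == "__main__":
    f = lambda x: float(np.sum(x**2 + 2.0 * np.sin(10.0 * x)))
    print(gsmooth_gd(f, x0=np.array([2.5])))

With a large initial sigma, the sin-induced ripples average out and the iterates follow the convex x**2 envelope toward the global basin, which plain gradient descent started at the same point would typically miss.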

Additional colloquium details can be found here:
https://www.sc.fsu.edu/news-and-events/colloquium/1761-seminar-with-clayton-webster-2023-11-07