
3 meetings

How humanoid general-purpose robots can solve the world's labor crisis
June 29th
6:00 PM (2 hours)
Task-specific AI and single-task robotics are common in manufacturing and are slowly making their way into home life, retail, and the hospitality marketplace. As labor problems continue to grow, however, the need for general-purpose robots with AI capable of completing complex tasks has emerged, and humanoid robots will be coming to market soon. We will discuss the development of our Beomni AI and robotics platform and how it will address the eldercare crisis, the shortage of doctors and nurses, and many other applications. Join us for a lively discussion on the future of humanoid robots.


SPS Technical Seminar ("On Asymptotic Linear Convergence of Projected Gradient Descent for Constrained Least Squares")
July 12th
4:00 PM (1 hour)
Kelley Engineering Center

The SPS Chapter of the Oregon Section will host a technical seminar by Mr. Trung Vu (PhD candidate, Oregon State University) titled "On Asymptotic Linear Convergence of Projected Gradient Descent for Constrained Least Squares". The meeting will be held both in person and via Zoom. The Zoom meeting information will be sent to registrants by email a day before the event.

Time & Date: 4:00-5:00 PM PDT, Tuesday, July 12.

Location: Room 1005, Kelley Engineering Center (OSU Corvallis campus), Corvallis, OR.

Please see below for the title, abstract, and speaker bio.

Title: On Asymptotic Linear Convergence of Projected Gradient Descent for Constrained Least Squares

Abstract: Many recent problems in signal processing and machine learning such as compressed sensing, image restoration, matrix/tensor recovery, and non-negative matrix factorization can be cast as constrained optimization. Projected gradient descent is a simple yet efficient method for solving such constrained optimization problems. Local convergence analysis furthers our understanding of its asymptotic behavior near the solution, offering sharper bounds on the convergence rate compared to global convergence analysis. However, local guarantees often appear scattered in problem-specific areas of machine learning and signal processing. This manuscript presents a unified framework for the local convergence analysis of projected gradient descent in the context of constrained least squares. The proposed analysis offers insights into pivotal local convergence properties such as the conditions for linear convergence, the region of convergence, the exact asymptotic rate of convergence, and the bound on the number of iterations needed to reach a certain level of accuracy. To demonstrate the applicability of the proposed approach, we present a recipe for the convergence analysis of projected gradient descent and demonstrate it via a beginning-to-end application of the recipe on four fundamental problems, namely, linear equality-constrained least squares, sparse recovery, least squares with the unit norm constraint, and matrix completion.
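For readers unfamiliar with the method discussed in the abstract, projected gradient descent alternates a gradient step on the least-squares objective with a projection back onto the constraint set. The following is a minimal illustrative sketch only (not the speaker's code); the non-negativity constraint, problem sizes, and 1/L step size are assumptions chosen for the example:

```python
import numpy as np

def projected_gradient_descent(A, b, project, x0, step, iters=500):
    """Minimize 0.5 * ||Ax - b||^2 over a constraint set C via PGD.

    `project` maps a point to its Euclidean projection onto C.
    """
    x = x0
    for _ in range(iters):
        grad = A.T @ (A @ x - b)      # gradient of 0.5 * ||Ax - b||^2
        x = project(x - step * grad)  # gradient step, then projection onto C
    return x

# Example: non-negative least squares, i.e. C is the non-negative orthant.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.abs(rng.standard_normal(5))   # feasible ground truth
b = A @ x_true
# Step size 1/L, where L = ||A||^2 is the Lipschitz constant of the gradient.
step = 1.0 / np.linalg.norm(A, 2) ** 2
x_hat = projected_gradient_descent(
    A, b, lambda v: np.maximum(v, 0.0), np.zeros(5), step
)
```

On this well-conditioned instance the iterates converge linearly to the feasible minimizer, which is the asymptotic behavior the seminar's framework characterizes precisely.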

Speaker bio: Trung Vu received the B.S. degree in Computer Science from Hanoi University of Science and Technology, Hanoi, Vietnam, in 2014. He has been working toward the Ph.D. degree in Computer Science since 2016 at the School of Electrical Engineering and Computer Science, Oregon State University, Corvallis, Oregon, USA. His current research interests include both the theory and practice of scalable optimization methods for machine learning and signal processing.

Oregon Section EXCOM Meeting on Webex (link included)
July 12th
6:30 PM (1.5 hours)

Click on the link below to attend the meeting:

Generated Tuesday, June 28, 2022, at 12:18:22 PM. All times are America/Los_Angeles.