Quarter
Course Type
Location
BUCHN 1920
Units
4
Day and Time
T/R 3:30-4:45pm
Course Description

This course provides a rigorous yet accessible introduction to the algorithmic foundations of optimization, with a focus on problems and applications in computer science and engineering. Topics include convex sets and functions; unconstrained methods such as gradient descent and Newton's method; projection and gradient-projection methods; equality- and inequality-constrained optimization via Lagrange multipliers and KKT conditions; duality theory; stochastic gradient techniques; and subgradient methods for nondifferentiable problems. Throughout the course, you will develop both the theoretical insight needed to analyze convergence behavior and the practical skills to implement these algorithms on real-world problems in machine learning, resource allocation, network design, and beyond.
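To give a flavor of the kind of algorithm covered, here is a minimal sketch of fixed-step gradient descent on a strongly convex quadratic (an illustrative example, not course material): minimizing f(x) = ½ xᵀAx − bᵀx, whose unique minimizer solves Ax = b.

```python
import numpy as np

def gradient_descent(A, b, step, iters=500):
    """Fixed-step gradient descent from the origin on f(x) = 0.5 x^T A x - b^T x."""
    x = np.zeros_like(b)
    for _ in range(iters):
        grad = A @ x - b      # gradient of the quadratic objective
        x = x - step * grad   # descent step
    return x

# Example problem: a symmetric positive-definite A (chosen for illustration)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

# Step size 1/L, where L is the largest eigenvalue (the Lipschitz
# constant of the gradient), guarantees convergence here.
step = 1.0 / np.linalg.eigvalsh(A).max()
x_star = gradient_descent(A, b, step)
print(np.allclose(A @ x_star, b, atol=1e-6))  # prints True: x_star solves A x = b
```

The convergence analysis behind the step-size choice 1/L is exactly the kind of argument developed in the unconstrained-optimization portion of the course.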

Prerequisites:
Students wishing to enroll should have completed the following courses, or have an equivalent background:
- Multivariable Calculus, including directional derivatives and Taylor expansions
- Linear Algebra, covering vector spaces, inner products, eigenvalues, and positive-definite matrices

Once the quarter starts, instructor approval is required to remain enrolled in the course; this applies in particular to students who have not completed the listed prerequisite courses.