
ISE 633 Large Scale Optimization for Machine Learning

Number of units: 03

Location and time: VKC 157, Tuesday/Thursday 5:00-6:20pm

Instructor: Meisam Razaviyayn
Email: [email protected]
Office hours: Thursday, 2-3pm, OHE 310G

Teaching Assistant: Maher Nouiehed
Email: [email protected]
Office hours: Monday, 12:30-1:30pm, GER 242C

Goal: The objective of the course is to introduce large scale optimization algorithms that arise in modern
data science and machine learning applications.

Course Description: Large scale optimization algorithms that arise in modern data science and
machine learning applications. Topics include stochastic optimization, accelerated methods,
parallelization, online optimization, and randomized linear algebra.

Textbook: There is no required textbook for the class. All course materials will be presented in class or
will be available online as notes. The following textbooks cover parts of the course materials and you may
find them useful:

• D. P. Bertsekas, Nonlinear Programming, Belmont: Athena Scientific, 1999.

• S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004.
  o The book is available for free here: http://web.stanford.edu/~boyd/cvxbook

• S. Shalev-Shwartz and S. Ben-David, Understanding Machine Learning: From Theory to
  Algorithms, Cambridge University Press, 2014.
  o The book is available for free here:
    http://www.cs.huji.ac.il/~shais/UnderstandingMachineLearning/

• A. Shapiro, D. Dentcheva, and A. Ruszczynski, Lectures on Stochastic Programming: Modeling and
  Theory, SIAM, 2009.
  o The book is available for free here:
    http://www2.isye.gatech.edu/~ashapiro/publications.html
Tentative Course Plan: (course materials may change depending on progress)

• Week 1: (Aug 21 – Aug 23)
  o Optimization overview, examples in machine learning, large scale optimization,
    memory/time/CPU requirements
  o Mathematical basics

• Week 2: (Aug 28 – Aug 30)
  o Unconstrained optimization, necessary optimality conditions, smooth versus non-smooth
    optimization
  o Sufficient optimality conditions, convex versus non-convex optimization

• Week 3: (Sep 4 – Sep 6)
  o Gradient methods (unconstrained), choices of direction
  o Asymptotic convergence, Newton's method

• Week 4: (Sep 11 – Sep 13)
  o Rate of convergence of gradient descent, first-order oracle model
  o Lower and upper bounds in the oracle model

• Week 5: (Sep 18 – Sep 20)
  o Nesterov's accelerated method
  o Constrained optimization, optimality conditions

• Week 6: (Sep 25 – Sep 27)
  o KKT optimality conditions and Lagrange multipliers
  o Projection and algorithms, examples in machine learning

• Week 7: (Oct 2 – Oct 4)
  o Exploiting the multi-block structure of the problem, examples in machine learning, block
    coordinate descent methods
  o Different block selection rules and convergence analysis

• Week 8: (Oct 9 – Oct 11)
  o Block successive upper-bound minimization and its convergence
  o Midterm

• Week 9: (Oct 16 – Oct 18)
  o Alternating direction method of multipliers
  o Non-smooth optimization and examples in machine learning

• Week 10: (Oct 23 – Oct 25)
  o Necessary and sufficient conditions in non-smooth optimization, successive upper-bound
    minimization, proximal operator
  o Multi-block methods in non-smooth optimization

• Week 11: (Oct 30 – Nov 1)
  o Stochastic/online/incremental optimization
  o Incremental gradient and its analysis

• Week 12: (Nov 6 – Nov 8)
  o Sample average approximation and stochastic approximation
  o Analysis

• Week 13: (Nov 13 – Nov 15)
  o Parallelization: synchronous vs. asynchronous
  o Adversarial viewpoint and regret analysis

• Week 14: (Nov 20)
  o Non-convexity and examples in machine learning: principal component analysis, deep
    learning, non-negative matrix factorization
  o Local optimality results

• Week 15: (Nov 27 – Nov 29)
  o Randomized linear algebra: power method, faster than the power method
  o Randomized linear algebra: analysis
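
To give a flavor of the Week 3-4 material, the following is a minimal, hypothetical Python sketch (not course material) of fixed-step gradient descent on a least-squares problem. The function name, step size, and iteration count are illustrative choices, not anything prescribed by the course.

```python
# Illustrative sketch only: fixed-step gradient descent, one of the
# Week 3-4 topics, minimizing f(x) = 0.5 * ||Ax - b||^2.
import numpy as np

def gradient_descent(A, b, step=0.01, iters=2000):
    """Minimize 0.5*||Ax - b||^2 by iterating x <- x - step * grad f(x)."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)  # gradient of the least-squares objective
        x = x - step * grad
    return x

# Synthetic problem whose exact solution is known by construction.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true
x_hat = gradient_descent(A, b)  # converges to x_true for a small enough step
```

The fixed step size works here because the objective is smooth and strongly convex for a random tall matrix A; the course's Week 4 material covers exactly why and how fast such iterations converge.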

Course Requirement and Grading:

• In-class midterm (30%)
• Final exam (35%)
• Homework assignments (best 4 out of 5: 20%)
• Participation (5%)
• Scribing (10%)
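
As a concrete reading of the weights above, here is a small hypothetical Python helper; the function names and example scores are invented for illustration and are not part of the course.

```python
# Hypothetical illustration of the stated grading scheme: midterm 30%,
# final 35%, homework 20% (best 4 of 5), participation 5%, scribing 10%.

def homework_component(hw_scores, keep=4):
    """Average of the best `keep` homework scores (lowest dropped)."""
    return sum(sorted(hw_scores, reverse=True)[:keep]) / keep

def course_grade(midterm, final, hw_scores, participation, scribing):
    """Weighted total on a 0-100 scale."""
    return (0.30 * midterm + 0.35 * final
            + 0.20 * homework_component(hw_scores)
            + 0.05 * participation + 0.10 * scribing)

# Example: 0.30*80 + 0.35*90 + 0.20*95 + 0.05*100 + 0.10*85 = 88.0
grade = course_grade(80, 90, [95, 95, 95, 95, 60], 100, 85)
```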

Homework assignments:
• All homework assignments are due by 4:30pm on the date indicated.
• Homework assignments must be submitted via Blackboard. Submit a single PDF file for each
assignment; LaTeX-generated PDFs, Word documents converted to PDF, and scanned images
converted to PDF are all acceptable.
• Late homework submissions are not accepted under any circumstances. Start your homework
assignments early.
• There will be five homework assignments. The lowest score will not be considered in your
final grade.
• You are encouraged to discuss homework assignments with other students. However, each
student is required to submit his/her own work.

Scribing: In order to gain experience with technical writing, each student is required to scribe notes for
two lectures. These notes will be revised by the instructor and posted on the course website. The
scribed notes should be written so that they are fully understandable to a student who missed
the class.
University policies:

• Statement for Students with Disabilities. Any student requesting academic accommodations based
on a disability is required to register with Disability Services and Programs (DSP) each semester.
A letter of verification for approved accommodations can be obtained from DSP. Please be sure
the letter is delivered to your course instructor (or TA) as early in the semester as possible. DSP is
located in STU 301 and is open from 8:30am to 5:00pm, Monday through Friday. Website and
contact information for DSP:
http://sait.usc.edu/academicsupport/centerprograms/dsp/home_index.html, (213) 740-0776
(Phone), (213) 740-6948 (TDD only), (213) 740-8216 (FAX), [email protected].

• Statement on Academic Integrity. USC seeks to maintain an optimal learning environment.
General principles of academic honesty include the concept of respect for the intellectual
property of others, the expectation that individual work will be submitted unless otherwise
allowed by an instructor, and the obligations both to protect one's own academic work from
misuse by others as well as to avoid using another's work as one's own. All students are expected
to understand and abide by these principles. SCampus, the Student Guidebook, contains the
Student Conduct Code in Section 11.00, while the recommended sanctions are located in
Appendix A: http://usc.edu/dept/publications/SCAMPUS/gov/. Students will be referred to the
Office of Student Judicial Affairs and Community Standards for further review should there be
any suspicion of academic dishonesty. The review process can be found at:
http://usc.edu/student-affaris/SJACS/. Information on intellectual property at USC is available at:
http://usc.edu/academe/acsen/issues/ipr/index.html.

• Emergency Preparedness/Course Continuity in a Crisis. In case of emergency, when travel to
campus is difficult, if not impossible, USC executive leadership will announce a digital way for
instructors to teach students in their residence halls or homes using a combination of the
Blackboard LMS (Learning Management System), teleconferencing, and other technologies.
Instructors should be prepared to assign students a "Plan B" project that can be completed "at a
distance". For additional information about maintaining your classes in an emergency, please
access: http://cst.usc.edu/services/emergencyprep.html.
