A1
Instructions
1. Attempting all questions is mandatory.
2. You are expected to solve all the questions using the Python programming language.
3. Use of any in-built libraries that directly solve the problem is not allowed.
5. Plagiarism is strictly prohibited. All code will be checked for plagiarism. Any
copied code or suspicious similarity will result in an F grade.
6. If any two students submit the exact same code, both get an F grade.
1 Functions
1.4 Rosenbrock Function
f(x) = Σ_{i=1}^{d−1} [ 100 (x_{i+1} − x_i²)² + (x_i − 1)² ]
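For reference, the sum above can be sketched in NumPy together with its analytic gradient, obtained by differentiating each term with respect to x_i (the function names here are illustrative, not the ones required by functions.py):

```python
import numpy as np

def rosenbrock(x):
    # f(x) = sum_{i=1}^{d-1} [ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 ]
    x = np.asarray(x, dtype=float)
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2
                        + (x[:-1] - 1.0) ** 2))

def rosenbrock_grad(x):
    # Analytic gradient: each x_i appears in at most two terms of the sum.
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) + 2.0 * (x[:-1] - 1.0)
    g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
    return g
```

The global minimum is f(1, …, 1) = 0, which is a convenient sanity check for both functions.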
2 Steepest Descent
Implement the Steepest Descent algorithm using inexact line search. For both the algorithms, use the following stopping condition: terminate the algorithm when the magnitude of the gradient is less than 10⁻⁶ or after 10⁴ iterations.
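A minimal skeleton with the stated stopping condition might look as follows; the `line_search(f, grad, x, d)` signature is an assumption made here so that any inexact line search can be plugged in:

```python
import numpy as np

def steepest_descent(f, grad, x0, line_search, tol=1e-6, max_iter=10**4):
    # Sketch of steepest descent with a pluggable inexact line search.
    # Stops when ||grad|| < 1e-6 or after 1e4 iterations, per the handout.
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g                               # steepest-descent direction
        alpha = line_search(f, grad, x, d)   # inexact step length
        x = x + alpha * d
    return x, k
```

Returning the iteration count alongside the iterate makes it easy to tabulate convergence behaviour in the report.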
2.2 Backtracking with Armijo-Goldstein Condition
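A common way to enforce both Armijo-Goldstein bounds is to bracket the step size, shrinking when the sufficient-decrease (upper) bound fails and growing when the Goldstein (lower) bound fails. The constant c and the bracketing scheme below are illustrative defaults, not values taken from the handout:

```python
import numpy as np

def backtracking_armijo_goldstein(f, grad, x, d, alpha=1.0, c=0.3, max_iter=50):
    # Goldstein conditions for step alpha along descent direction d:
    #   f(x) + (1-c)*alpha*slope <= f(x + alpha*d) <= f(x) + c*alpha*slope
    # with slope = grad(x) . d < 0 and c in (0, 0.5).
    fx = f(x)
    slope = float(grad(x) @ d)
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        fa = f(x + alpha * d)
        if fa > fx + c * alpha * slope:            # upper bound violated: shrink
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif fa < fx + (1 - c) * alpha * slope:    # lower bound violated: grow
            lo = alpha
            alpha = 2.0 * alpha if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha
```

Dropping the `elif` branch reduces this to plain Armijo backtracking, which is often what "backtracking" means in practice.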
3 Newton’s Method
Implement the following variants of Newton's Method. Use the following stopping condition: terminate the algorithm when the magnitude of the gradient is less than 10⁻⁶ or after 10⁴ iterations.
3.1 Pure Newton’s Method
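Pure Newton takes the full step along the Newton direction, with no line search and no Hessian safeguarding. A minimal sketch (function names are our own):

```python
import numpy as np

def pure_newton(grad, hess, x0, tol=1e-6, max_iter=10**4):
    # Pure Newton's method: solve H p = -g and take the full step x <- x + p.
    # Note this may fail when the Hessian is singular or indefinite.
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = np.linalg.solve(hess(x), -g)   # Newton direction
        x = x + p
    return x, k
```

On a strictly convex quadratic this converges in a single step, which is a useful unit test before moving to the harder functions.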
3.3 Levenberg-Marquardt Modification
Figure 6: Levenberg-Marquardt
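Figure 6 is not reproduced here, but a common form of the Levenberg-Marquardt modification shifts the Hessian by μI until it becomes positive definite, then solves the shifted system. The initial μ and the growth factor below are assumed defaults, not values from the figure:

```python
import numpy as np

def lm_direction(H, g, mu0=1e-4):
    # Levenberg-Marquardt safeguard: increase mu until H + mu*I is positive
    # definite, tested by attempting a Cholesky factorization.
    n = H.shape[0]
    mu = 0.0
    while True:
        try:
            L = np.linalg.cholesky(H + mu * np.eye(n))
            break
        except np.linalg.LinAlgError:
            mu = mu0 if mu == 0.0 else 10.0 * mu
    # Solve (H + mu*I) p = -g using the Cholesky factor L (L L^T p = -g).
    y = np.linalg.solve(L, -g)
    return np.linalg.solve(L.T, y)
```

When H is already positive definite the loop exits with μ = 0 and this reduces to the plain Newton direction.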
3.4 Combining Damped Newton’s Method with Levenberg-Marquardt
Figure 7: Combined
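Figure 7 is not reproduced here; one plausible reading of the combination is to take the search direction from the LM-shifted system and then damp the step length with Armijo backtracking instead of always taking the full Newton step. The constants c, β, and the μ schedule below are our assumptions:

```python
import numpy as np

def damped_newton_lm(f, grad, hess, x0, tol=1e-6, max_iter=10**4,
                     c=1e-4, beta=0.5):
    # Combined safeguard sketch: LM shift makes the direction a descent
    # direction; Armijo backtracking damps the step length.
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H, n, mu = hess(x), len(x), 0.0
        while True:                          # LM: shift until positive definite
            try:
                np.linalg.cholesky(H + mu * np.eye(n))
                break
            except np.linalg.LinAlgError:
                mu = 1e-4 if mu == 0.0 else 10.0 * mu
        p = np.linalg.solve(H + mu * np.eye(n), -g)
        alpha, fx, slope = 1.0, f(x), float(g @ p)
        while f(x + alpha * p) > fx + c * alpha * slope:   # Armijo damping
            alpha *= beta
        x = x + alpha * p
    return x, k
```

Far from the minimizer this behaves like a safeguarded gradient step; near it, μ stays 0 and α = 1 is accepted, recovering pure Newton's fast local convergence.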
4 Submission Instructions
2. NumPy
3. Matplotlib
4. PrettyTable
You are not allowed to create additional files or to modify main.py and functions.py for the final submission. In algos.py, you may create new helper functions if needed.
4.3 Report
You are required to submit a Report (as a PDF) that includes:
2. Using those Jacobians and Hessians, manually compute the minima for all functions
except Rosenbrock.
3. Mention which algorithms failed to converge (if any) and under what circumstances.
5. Make a contour plot with arrows indicating the update directions for all 2D functions.
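The contour-plot item can be sketched with Matplotlib's `contour` and `quiver`; the function, axis limits, and file name below are placeholders, not requirements from the handout:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")          # non-interactive backend: save to file
import matplotlib.pyplot as plt

def contour_with_arrows(f, path, xlim=(-2, 2), ylim=(-1, 3),
                        fname="contour.png"):
    # Contour plot of a 2D function f with arrows along the iterate path.
    # `path` is an (n, 2) array of iterates produced by an optimizer.
    xs = np.linspace(*xlim, 100)
    ys = np.linspace(*ylim, 100)
    X, Y = np.meshgrid(xs, ys)
    Z = np.array([[f(np.array([x, y])) for x in xs] for y in ys])
    plt.contour(X, Y, Z, levels=30)
    p = np.asarray(path, dtype=float)
    # One arrow per update: from iterate k to iterate k+1.
    plt.quiver(p[:-1, 0], p[:-1, 1], np.diff(p[:, 0]), np.diff(p[:, 1]),
               angles="xy", scale_units="xy", scale=1, width=0.004)
    plt.savefig(fname)
    plt.close()
```

Using `angles="xy"` with `scale=1` keeps the arrows in data coordinates, so each arrow spans exactly one update step.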
– algos.py
– functions.py
– main.py
– Report.pdf