Propose a non-linear optimization problem and solve it using gradient descent variants (AdaGrad, SGD, GD). The problem should be unique and not copied from the internet.

Important items to include in the submitted Jupyter notebook:

1- A clear description of the problem and its mathematical formulation. Include multiple variations and explain why the chosen formulation was selected.
2- The motivation behind the selected problem and its benefits.
3- A discussion of the proposed solution: which method was used, and why? Also include hyperparameter tuning (step size, momentum, ...).
4- At least two methods for solving the problem.
5- A conclusion section that explains the outcomes of the experiments and the mathematical insights (Was convergence slow? Did you experience divergence? Did you have to try other algorithms first? What do you believe is the mathematical explanation for what you observed?)
6- A Python implementation with clear comments explaining every step.
7- The problem should be moderate in complexity.

Note: the code should be genuine, without using Python optimization libraries.
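To illustrate the expected "from scratch" style, here is a minimal sketch comparing plain gradient descent with AdaGrad on a hypothetical non-linear objective. The objective, starting point, and hyperparameters are illustrative choices, not part of the assignment; only NumPy array arithmetic is used, no optimization libraries.

```python
import numpy as np

# Hypothetical example objective (not prescribed by the assignment):
# f(x, y) = (x - 3)^2 + 5 * (y + 1)^4, minimized at (3, -1).
# The quartic term in y gives a very flat valley, which makes the
# difference in convergence speed between methods visible.
def f(w):
    x, y = w
    return (x - 3) ** 2 + 5 * (y + 1) ** 4

def grad_f(w):
    # Analytic gradient of f, derived by hand (no autodiff libraries).
    x, y = w
    return np.array([2.0 * (x - 3), 20.0 * (y + 1) ** 3])

def gradient_descent(w0, lr=0.05, steps=2000):
    # Plain GD: w <- w - lr * grad_f(w), with a fixed step size.
    w = np.array(w0, dtype=float)
    for _ in range(steps):
        w -= lr * grad_f(w)
    return w

def adagrad(w0, lr=0.5, steps=2000, eps=1e-8):
    # AdaGrad: each coordinate's effective step size shrinks with the
    # accumulated sum of that coordinate's squared gradients.
    w = np.array(w0, dtype=float)
    g_sq = np.zeros_like(w)
    for _ in range(steps):
        g = grad_f(w)
        g_sq += g ** 2
        w -= lr * g / (np.sqrt(g_sq) + eps)
    return w

w_gd = gradient_descent([0.0, 0.0])
w_ada = adagrad([0.0, 0.0])
```

In the notebook each update rule would be accompanied by its formula, and the two runs compared over a grid of step sizes; note how AdaGrad's shrinking denominator slows progress along the flat quartic direction, which is exactly the kind of observation the conclusion section asks you to explain mathematically.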