Question

Use the gradient-based optimization algorithm (grad_opt1) to find the minima of the function f(x) = cos(e^x) + x^2 - 1. Employ rate parameters of -0.02 and -0.001, and start from initial search points in the set {1.5, 2.5, 3}. Set TOL = 10^-6 and IMAX = 1000. For each case, generate a plot of the evolution of the search-point value x versus the iteration index t (six plots in total). Compare your results to the true minima shown in the plot of f(x) over the interval [0, 4]. Does the method always lead to the minimum closest to the initial search point?

Fig. 1: plot of f(x) over the interval [0, 4].
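grad_opt1 is presumably the course's own routine, so the following is only a sketch of one plausible reading: the update rule x_{t+1} = x_t + rate * f'(x_t), where the negative rates (-0.02, -0.001) make this ordinary gradient descent. The function, its derivative, and the stopping criteria (TOL on the step size, IMAX on the iteration count) follow the problem statement; everything else is an assumption.

```python
import math

def f(x):
    # Objective from the problem: f(x) = cos(e^x) + x^2 - 1
    return math.cos(math.exp(x)) + x**2 - 1

def df(x):
    # Analytic derivative: f'(x) = -e^x * sin(e^x) + 2x
    return -math.exp(x) * math.sin(math.exp(x)) + 2 * x

def grad_opt1(x0, rate, tol=1e-6, imax=1000):
    """Sketch of a gradient-based search (assumed update rule).

    Update: x_{t+1} = x_t + rate * f'(x_t). With a negative rate, as
    specified in the problem (-0.02 or -0.001), this steps downhill.
    Returns the full trajectory of x values, so plotting the returned
    list against its index gives the requested x-vs-t plot.
    """
    xs = [x0]
    for _ in range(imax):
        x_new = xs[-1] + rate * df(xs[-1])
        xs.append(x_new)
        if abs(x_new - xs[-2]) < tol:  # TOL stopping criterion
            break
    return xs

if __name__ == "__main__":
    # Six cases: two rates x three initial points.
    for rate in (-0.02, -0.001):
        for x0 in (1.5, 2.5, 3.0):
            traj = grad_opt1(x0, rate)
            print(f"rate={rate:+.3f}, x0={x0}: "
                  f"x_final={traj[-1]:.6f} after {len(traj) - 1} iterations")
```

Each trajectory can then be passed to a plotting routine to produce the six x-vs-t plots; comparing the final x values against the local minima visible in Fig. 1 answers the last part of the question.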