Solving one-dimensional optimization problems
This section presents the solution of one-dimensional optimization problems using SciPy.
Getting ready
The following SciPy function is required for this recipe:
scipy.optimize.minimize(fun, x0, args=(), method=None, jac=None, hess=None, hessp=None, bounds=None, constraints=(), tol=None, callback=None, options=None)
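To see how the signature is used in the simplest case, here is a minimal sketch that minimizes an illustrative one-dimensional function, f(x) = (x - 2)^2 (a function chosen for this example, not from the original text):

```python
from scipy.optimize import minimize

# Illustrative one-dimensional objective: f(x) = (x - 2)^2,
# whose minimum is at x = 2.
f = lambda x: (x - 2) ** 2

# x0 is the initial guess; BFGS is one of the gradient-based methods
# accepted by the `method` parameter.
res = minimize(f, x0=0.0, method='BFGS')
print(res.x)  # approximately [2.]
```

Only `fun` and `x0` are mandatory; the remaining parameters (bounds, constraints, tolerances, and so on) refine how the solver behaves.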
How to do it...
Minimization of a scalar function of one or more variables:

    minimize   f(x)
    subject to g_i(x) >= 0,  i = 1, ..., m
               h_j(x) = 0,   j = 1, ..., p
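The inequality constraints g_i(x) >= 0 and equality constraints h_j(x) = 0 are passed to minimize through the `constraints` parameter as dictionaries with a `type` key ('ineq' or 'eq'). A minimal sketch, using an objective and constraint chosen purely for illustration (minimize x0^2 + x1^2 subject to x0 + x1 - 1 >= 0):

```python
from scipy.optimize import minimize

# Illustrative constrained problem (not from the original text):
# minimize x0^2 + x1^2  subject to  x0 + x1 - 1 >= 0.
fun = lambda x: x[0] ** 2 + x[1] ** 2

# 'ineq' means the constraint function must be non-negative at the solution.
cons = ({'type': 'ineq', 'fun': lambda x: x[0] + x[1] - 1},)

# SLSQP is one of the methods that supports general constraints.
res = minimize(fun, x0=[2.0, 0.0], method='SLSQP', constraints=cons)
print(res.x)  # approximately [0.5, 0.5]
```

The constraint is active at the optimum: the unconstrained minimum (0, 0) violates x0 + x1 >= 1, so the solver lands on the boundary of the feasible region.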
How it works...
This is represented in the following code:
from scipy.optimize import minimize, rosen, rosen_der

x0 = [1.3, 0.7, 0.8, 1.9, 1.2]

# Derivative-free minimization with the Nelder-Mead simplex method
res = minimize(rosen, x0, method='Nelder-Mead', tol=1e-6)
print(res.x)
# array([ 1.,  1.,  1.,  1.,  1.])

# Gradient-based minimization with BFGS, supplying the analytic gradient
res = minimize(rosen, x0, method='BFGS', jac=rosen_der,
               options={'gtol': 1e-6, 'disp': True})
# Optimization terminated successfully.
#          Current function value: 0.000000
#          Iterations: 26
#          Function evaluations: 31
#          Gradient evaluations: 31
print(res.x)
# array([ 1.,  1.,  1.,  1.,  1.])
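For a strictly one-dimensional objective, SciPy also provides scipy.optimize.minimize_scalar, which needs no initial guess. A brief sketch with an illustrative function of my own choosing, f(x) = (x - 1.5)^2 + 1:

```python
from scipy.optimize import minimize_scalar

# Illustrative one-dimensional objective: minimum at x = 1.5, f(1.5) = 1.
res = minimize_scalar(lambda x: (x - 1.5) ** 2 + 1, method='brent')
print(res.x)    # approximately 1.5
print(res.fun)  # approximately 1.0
```

Brent's method brackets the minimum automatically, which makes minimize_scalar a convenient alternative to minimize when the problem truly has a single variable.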