Title: Non-Linear Optimization (Unconstrained) - Direct Search Method
Submitted to
Dr. Abdullahil Azeem
Professor, IPE Dept.
BUET
Submitted by
Group-3
Course: AEM 6112 (Quantitative Analysis)
SL  Roll        Name
1   0419082122  Rahul Carnalious Snal
2   0419082124  Rajib Debnath
3   0419082120  Sujit Saha
4   0419082102  Kh Abdullah Al Masud
5   0419082107  Md. Mahadi Hassan
6   1015082106  Rasel Alam
Contents
Introduction
Golden Section Search Method
Dichotomous Search Method
Problem
Result and plot
Matlab Code
Conclusion
Introduction
The solution methods of nonlinear programming generally can be classified as either direct or indirect algorithms.
The idea of a direct search method for unconstrained non-linear optimization is to progressively narrow the interval of uncertainty that is known to contain the optimum solution point.
There are two commonly used direct search methods:
◦ Dichotomous Search Method
◦ Golden Section Search Method
Golden Section Search Method
The Golden Section Search Method chooses x1 and x2 such that one of the two function evaluations in each step can be reused in the next step.
The golden ratio is the ratio r satisfying r^2 = 1 - r, that is, r = (sqrt(5) - 1)/2 ≈ 0.618.
Golden Section Search Method to maximize f(x) over the interval a ≤ x ≤ b:
STEP 1: Initialize: Choose a tolerance t > 0. (The remaining steps follow the interval-reduction scheme sketched in the MATLAB code below.)
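Since only Step 1 of the procedure is reproduced above, the following is a minimal MATLAB sketch of the standard golden section search for a maximization problem, written in the same style as the dichotomous code later in this presentation. The objective, interval, and tolerance values are illustrative choices (the same ones used in the maximization problem below).

% Golden Section Search (maximization) - illustrative sketch
f   = @(x) -x.^2 - 2*x;       % example objective (same function as the problem below)
xL  = -3;  xR = 2;            % initial interval of uncertainty [a, b]
tol = 1e-4;                   % tolerance t on the interval length
r   = (sqrt(5) - 1)/2;        % golden ratio: r^2 = 1 - r, r ~ 0.618

% Two interior points; one of them is reused in every later iteration
x1 = xR - r*(xR - xL);
x2 = xL + r*(xR - xL);
f1 = f(x1);  f2 = f(x2);

while (xR - xL) > tol
    if f1 < f2                 % maximum lies in [x1, xR]
        xL = x1;
        x1 = x2;  f1 = f2;     % reuse the previous evaluation at x2
        x2 = xL + r*(xR - xL);
        f2 = f(x2);
    else                       % maximum lies in [xL, x2]
        xR = x2;
        x2 = x1;  f2 = f1;     % reuse the previous evaluation at x1
        x1 = xR - r*(xR - xL);
        f1 = f(x1);
    end
end

xStar = (xL + xR)/2;
fprintf('At x = %.5f, Maximum value = %.5f\n', xStar, f(xStar));

For this objective the search should settle near x = -1, where f(-1) = 1.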
Dichotomous Search Method Algorithm
Steps
1. Input the objective function f(x)
2. Input the interval [a, b]
3. Input ∂, the small offset from the interval midpoint [a lower value is better]
4. Input ∆, the stopping tolerance on the difference between f(x1) and f(x2)
5. Assign
• xL = a
• xR = b
• N = 1 [N is the iteration number]
6. N = N + 1
7. x1 = (xL + xR)/2 - ∂
8. x2 = (xL + xR)/2 + ∂
9. If abs(f(x1) - f(x2)) <= ∆, go to Step 14
10. If f(x1) > f(x2), set xR = x2
11. If f(x1) < f(x2), set xL = x1
12. If f(x1) = f(x2), set xL = x1 and xR = x2
13. Go to Step 6
14. Report x* = (x1 + x2)/2 and the maximum value f(x*)
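For the problem solved below (f(x) = -x^2 - 2x on [-3, 2] with ∂ = 0.001), the first pass through Steps 7-10 gives x1 = (-3 + 2)/2 - 0.001 = -0.501 and x2 = -0.499, so f(x1) ≈ 0.75100 and f(x2) ≈ 0.74900. Since f(x1) > f(x2), Step 10 sets xR = x2, and the interval of uncertainty shrinks from [-3, 2] to [-3, -0.499].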
Maximization Problem
Maximize f(x) = -x^2 - 2x over the interval -3 ≤ x ≤ 2.
[Plot: f(x) = -x^2 - 2x over the interval -3 ≤ x ≤ 2]
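As a quick check on the result the search should return: completing the square gives f(x) = -x^2 - 2x = 1 - (x + 1)^2, so the unique maximum on [-3, 2] is f(-1) = 1, and the dichotomous search coded below should converge to x ≈ -1.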
Matlab Code
(Using Dichotomous Search Method Algorithm)
clear all;
clc;
syms f(x);
f(x) = -x^2 - 2*x;   % Objective function to be maximized
a = -3;              % Left limit of the interval
b = 2;               % Right limit of the interval
delta = .001;        % Offset from the midpoint (the parameter ∂ above)
Delta = .0001;       % Stopping tolerance on |f(x1) - f(x2)| (the parameter ∆ above)
xL = a;
xR = b;
n = 1;               % Iteration counter
while(1)
    x1 = (xL + xR)/2 - delta;            % Step 7
    x2 = (xL + xR)/2 + delta;            % Step 8
    x0 = double((x1 + x2)/2);            % Current estimate of the maximizer
    fmax = double(f(x0));                % Current estimate of the maximum value
    fprintf('n=%d, x1=%.5f, x2=%.5f, max=%.5f \n', n, double(x1), double(x2), fmax);
    n = n + 1;
    if (abs(f(x1) - f(x2)) <= Delta)     % Step 9: stopping criterion
        break;
    end
    if (f(x1) > f(x2))                   % Step 10: maximum lies to the left of x2
        xR = x2;
    elseif (f(x1) < f(x2))               % Step 11: maximum lies to the right of x1
        xL = x1;
    else                                 % Step 12: f(x1) = f(x2)
        xL = x1;
        xR = x2;
    end
end
x0 = double((x1 + x2)/2);
fmax = double(f(x0));
fprintf('At x = %.5f , Maximum value = %.5f\n', x0, fmax);
x = -3:.1:2;
plot(x, double(f(x)));                   % Plot the objective over the search interval
Conclusion
The golden section search method converges more rapidly than the dichotomous method because, in the dichotomous method, the narrowing of the interval of uncertainty slows down appreciably as the interval length I approaches 2∂ (each iteration only reduces I to I/2 + ∂). In addition, each iteration of the golden section method requires half the computations, because the method always reuses one of the two function evaluations from the immediately preceding iteration.
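To illustrate the comparison with a rough calculation: a dichotomous iteration uses two new function evaluations and at best roughly halves the interval, so per function evaluation the interval shrinks by about a factor of sqrt(1/2) ≈ 0.707, whereas a golden section iteration uses only one new evaluation and shrinks the interval by the factor r ≈ 0.618. Since 0.618 < 0.707, the golden section method narrows the interval of uncertainty faster for the same number of function evaluations.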