Cheat Sheet (Updated)

Unconstrained NLP

For R^1:

Case 1: If f'(x_0) = 0, then:
1. If the first nonzero derivative at x_0 is of odd order, x_0 is neither a local minimum nor a local maximum.
2. If the first nonzero derivative at x_0 is of even order and positive, x_0 is a local minimum.
3. If the first nonzero derivative at x_0 is of even order and negative, x_0 is a local maximum.

Case 2: Points where f'(x) does not exist. Let x_1 < x_0 < x_2 (check both sides):
- f(x_0) > f(x_1) and f(x_0) < f(x_2), or f(x_0) < f(x_1) and f(x_0) > f(x_2): x_0 is not a local extremum.
- f(x_0) ≥ f(x_1) and f(x_0) ≥ f(x_2): x_0 is a local maximum.
- f(x_0) ≤ f(x_1) and f(x_0) ≤ f(x_2): x_0 is a local minimum.

Case 3: Endpoints a and b of [a, b]:
- f'(a) > 0: a is a local minimum. f'(a) < 0: a is a local maximum.
- f'(b) > 0: b is a local maximum. f'(b) < 0: b is a local minimum.

For R^n:

Hessian matrix H (n × n, where n = number of variables): the (i, j) entry is ∂²f/∂x_i ∂x_j.
The k-th leading principal minor is the determinant of the k × k submatrix obtained by deleting the last n - k rows and the last n - k columns of H.
Case 1: If all leading principal minors of H are positive, f is convex.
Case 2: If every nonzero k-th leading principal minor has the same sign as (-1)^k, f is concave.
(Sketches of both tests follow below.)

Golden Section Search Method (unimodal f, maximization) [α = 0.618] (sketch below)
1) Choose initial [a_1, b_1] and set tolerance ε > 0.
2) λ_k = α·a_k + (1 - α)·b_k, μ_k = (1 - α)·a_k + α·b_k.
3) If f(λ_k) < f(μ_k): set a_{k+1} = λ_k, λ_{k+1} = μ_k, b_{k+1} = b_k; compute μ_{k+1}. (The b side has the greater value, so move a.)
4) If f(λ_k) ≥ f(μ_k): set a_{k+1} = a_k, μ_{k+1} = λ_k, b_{k+1} = μ_k; compute λ_{k+1}. (The a side has the greater value, so move b.)
5) At each stage, if b_k - a_k ≤ ε, STOP; the answer is the midpoint of a_k and b_k.

Fibonacci Search Method (sketch below)
1) Choose initial [a_1, b_1] and set tolerance ε > 0.
2) Fibonacci numbers: F_n = 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, 610, … (n = 0, 1, 2, …).
3) Determine n as the first index with (b_1 - a_1)/F_n ≤ ε (n = total number of iterations k).
4) λ_k = a_k + (F_{n-k-1}/F_{n-k+1})·(b_k - a_k), μ_k = a_k + (F_{n-k}/F_{n-k+1})·(b_k - a_k).
5) Update as in Golden Section Search steps 3 and 4.
6) When k = n: set λ_n = λ_{n-1} and μ_n = λ_{n-1} + ε/10, then:
   a. If f(λ_n) < f(μ_n): a_n = λ_n, b_n = b_{n-1}.
   b. If f(λ_n) ≥ f(μ_n): a_n = a_{n-1}, b_n = μ_n.
   c. STOP; the maximizer estimate is the midpoint of a_n and b_n.
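The R^1 derivative test is mechanical enough to script. A minimal sketch using sympy; the function classify_critical_point, the sample inputs, and the max_order cap are illustrative assumptions, and the routine presumes f'(x_0) = 0 has already been checked (Case 1):

    import sympy as sp

    def classify_critical_point(f, x, x0, max_order=8):
        """Classify x0 via the first nonzero derivative there (Case 1).

        Assumes f'(x0) = 0 has already been verified.  Returns 'local min',
        'local max', or 'neither'; None if every derivative up to max_order
        vanishes (test inconclusive).
        """
        for n in range(2, max_order + 1):
            d = sp.diff(f, x, n).subs(x, x0)
            if d != 0:
                if n % 2 == 1:                 # first nonzero derivative: odd order
                    return "neither"
                return "local min" if d > 0 else "local max"
        return None

    x = sp.symbols("x")
    print(classify_critical_point(x**4, x, 0))  # local min  (4th derivative = 24 > 0)
    print(classify_critical_point(x**3, x, 0))  # neither    (3rd derivative = 6, odd)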
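The R^n minor test can be checked numerically. A minimal numpy sketch of a strict version of the test (zero minors are treated as inconclusive here, and the example Hessians are assumptions); note it classifies a single Hessian, so it certifies convexity or concavity only if the test holds at every point:

    import numpy as np

    def leading_principal_minors(H):
        """det of each top-left k x k block of H, k = 1 .. n."""
        H = np.asarray(H, dtype=float)
        return [np.linalg.det(H[:k, :k]) for k in range(1, H.shape[0] + 1)]

    def classify_hessian(H):
        minors = leading_principal_minors(H)
        if all(m > 0 for m in minors):
            return "convex (all leading principal minors positive)"
        if all(m * (-1) ** k > 0 for k, m in enumerate(minors, start=1)):
            return "concave (k-th minor has the sign of (-1)^k)"
        return "test inconclusive"

    print(classify_hessian([[2.0, 0.0], [0.0, 4.0]]))    # Hessian of x^2 + 2y^2: convex
    print(classify_hessian([[-2.0, 0.0], [0.0, -4.0]]))  # concave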
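A minimal pure-Python sketch of the golden section steps, written for maximization of a unimodal f; the sample objective is an assumption:

    def golden_section_max(f, a, b, eps=1e-5, alpha=0.618):
        """Golden section search for the maximizer of a unimodal f on [a, b]."""
        lam = alpha * a + (1 - alpha) * b
        mu = (1 - alpha) * a + alpha * b
        f_lam, f_mu = f(lam), f(mu)
        while b - a > eps:
            if f_lam < f_mu:                 # mu side is better: move a (step 3)
                a, lam, f_lam = lam, mu, f_mu
                mu = (1 - alpha) * a + alpha * b
                f_mu = f(mu)
            else:                            # lam side is better: move b (step 4)
                b, mu, f_mu = mu, lam, f_lam
                lam = alpha * a + (1 - alpha) * b
                f_lam = f(lam)
        return (a + b) / 2                   # midpoint of the final interval

    print(golden_section_max(lambda x: -(x - 2.0) ** 2, 0.0, 5.0))  # ~2.0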
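A minimal sketch of the Fibonacci search under the same maximization convention. For clarity it re-evaluates f at both λ and μ each pass (the classic method reuses one evaluation), and the sample objective is an assumption:

    def fibonacci_search_max(f, a, b, eps=1e-4):
        """Fibonacci search for the maximizer of a unimodal f on [a, b]."""
        F = [1, 1]                              # F_0 = F_1 = 1
        while F[-1] < (b - a) / eps:            # first n with (b - a)/F_n <= eps
            F.append(F[-1] + F[-2])
        n = len(F) - 1
        for k in range(1, n - 1):               # iterations k = 1 .. n-2
            lam = a + F[n - k - 1] / F[n - k + 1] * (b - a)
            mu  = a + F[n - k]     / F[n - k + 1] * (b - a)
            if f(lam) < f(mu):                  # as in golden section steps 3-4
                a = lam
            else:
                b = mu
        lam = (a + b) / 2                       # step 6: lam and mu coincide,
        mu = lam + eps / 10                     # so nudge mu by eps/10
        if f(lam) < f(mu):
            a = lam
        else:
            b = mu
        return (a + b) / 2

    print(fibonacci_search_max(lambda x: -(x - 2.0) ** 2, 0.0, 5.0))  # ~2.0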


Bisection Search Method (Bolzano Search Plan) (differentiable functions only) (sketch below)
1) Choose initial [a_1, b_1] and set tolerance ε > 0.
2) x_k = (a_k + b_k)/2.
3) Evaluate f'(x_k). If f'(x_k) = 0, STOP; the maximum has been found. Else go to step 4.
4) If f'(x_k) > 0, reset a_k = x_k. If f'(x_k) < 0, reset b_k = x_k. If b_k - a_k ≤ ε, STOP; else go to step 2.

Hooke and Jeeves Method (d_j = coordinate directions, n = number of dimensions) (sketch below)
1) Set tolerance ε > 0, step size Δ, acceleration factor α > 0, and starting point x_1; let y_1 = x_1, k = j = 1.
2) If f(y_j + Δ·d_j) > f(y_j): SUCCESS; let y_{j+1} = y_j + Δ·d_j and go to step 4. Else go to step 3. (For each direction d_j.)
3) If f(y_j - Δ·d_j) > f(y_j): SUCCESS; let y_{j+1} = y_j - Δ·d_j. Else let y_{j+1} = y_j. (For each direction d_j.)
4) If j < n, set j = j + 1 and go to step 2. Else, if f(y_{n+1}) > f(x_k), go to step 5; if f(y_{n+1}) ≤ f(x_k), go to step 6.
5) x_{k+1} = y_{n+1}, y_1 = x_{k+1} + α·(x_{k+1} - x_k), k = k + 1, j = 1. Go to step 2.
6) If Δ ≤ ε, STOP; x_k is the solution. Else set Δ = Δ/2, y_1 = x_k, x_{k+1} = x_k, k = k + 1, j = 1, and go to step 2.

Gradient Search Method (Steepest Ascent Method) (sketch below)
1) Set tolerance ε > 0, determine ∇f(x) = [∂f/∂x_1, ∂f/∂x_2], choose a starting point x, and go to step 4.
2) Find the value of t that solves max f(x + t·∇f(x)) s.t. t ≥ 0. (Use calculus: set the derivative with respect to t to zero.)
3) Reset x = x + t·∇f(x).
4) Evaluate ∇f(x). If |∂f/∂x_j| ≤ ε for every j (i.e., ‖∇f(x)‖ ≈ 0), STOP. Else go to step 2.

Constrained NLP

Feasible Direction Method (Frank-Wolfe) (sketch below)
1) Set tolerance ε > 0, determine ∇f(x) = [∂f/∂x_1, ∂f/∂x_2], choose a feasible starting point x, and go to step 5.
2) Find the direction d = (d_1, d_2) that solves max ∇f(x)·d subject to the constraints of the original problem (substitute the direction d for the variables x and y in the constraints).
3) Find the value of t that solves max f(x + t·(d - x)) s.t. t ≥ 0. (Use calculus; restricting t to [0, 1] keeps x feasible when the feasible region is convex.)
4) Reset x = x + t·(d - x).
5) Evaluate ∇f(x). If |∂f/∂x_j| ≤ ε for every j, STOP. Else go to step 2.
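A minimal sketch of the bisection plan, assuming the derivative df is supplied by hand and f is unimodal on [a, b]; the sample derivative is an assumption:

    def bisection_max(df, a, b, eps=1e-6):
        """Bisection on the derivative: shrink [a, b] toward a point with df = 0."""
        while b - a > eps:
            x = (a + b) / 2
            g = df(x)
            if g == 0:
                return x                 # stationary point found exactly (step 3)
            if g > 0:                    # f still increasing: max lies right of x
                a = x
            else:                        # f decreasing: max lies left of x
                b = x
        return (a + b) / 2

    # maximize f(x) = -(x - 2)^2, so df(x) = -2 (x - 2)
    print(bisection_max(lambda x: -2 * (x - 2.0), 0.0, 5.0))  # ~2.0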
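A minimal sketch of the Hooke and Jeeves steps for maximization, following the sheet's halve-on-failure rule; the default Δ, α, and the sample objective are assumptions:

    def hooke_jeeves_max(f, x0, delta=1.0, alpha=1.0, eps=1e-6):
        """Hooke-Jeeves pattern search for maximization, per the steps above."""
        n = len(x0)
        x = list(x0)                             # current best point x_k
        y = list(x)                              # exploration base y_1
        while True:
            for j in range(n):                   # steps 2-3: exploratory moves
                for step in (delta, -delta):     # try +delta d_j, then -delta d_j
                    trial = list(y)
                    trial[j] += step
                    if f(trial) > f(y):
                        y = trial
                        break
            if f(y) > f(x):                      # step 5: pattern (acceleration) move
                x, y = y, [y[j] + alpha * (y[j] - x[j]) for j in range(n)]
            else:                                # step 6: shrink the step size
                if delta <= eps:
                    return x
                delta /= 2
                y = list(x)

    print(hooke_jeeves_max(lambda p: -(p[0] - 1) ** 2 - (p[1] + 2) ** 2,
                           [0.0, 0.0]))          # ~[1, -2]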
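A minimal sketch of steepest ascent. The "use calculus" line search is replaced here by a numeric golden-section search over t in [0, t_max]; grad is supplied by hand, and t_max and the sample objective are assumptions:

    def steepest_ascent(f, grad, x, eps=1e-6, t_max=10.0, max_iter=500):
        """Steepest ascent; line search on t in [0, t_max] by golden section."""
        for _ in range(max_iter):
            g = grad(x)
            if max(abs(gj) for gj in g) <= eps:  # step 4: |df/dx_j| <= eps for all j
                return x
            def phi(t):                          # objective along the gradient ray
                return f([xi + t * gi for xi, gi in zip(x, g)])
            a, b = 0.0, t_max                    # step 2: numeric line search
            while b - a > 1e-8:
                lam, mu = 0.618 * a + 0.382 * b, 0.382 * a + 0.618 * b
                if phi(lam) < phi(mu):
                    a = lam
                else:
                    b = mu
            t = (a + b) / 2
            x = [xi + t * gi for xi, gi in zip(x, g)]   # step 3
        return x

    f = lambda p: -(p[0] - 1) ** 2 - 2 * (p[1] + 2) ** 2
    grad = lambda p: [-2 * (p[0] - 1), -4 * (p[1] + 2)]
    print(steepest_ascent(f, grad, [0.0, 0.0]))  # ~[1, -2]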
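A minimal sketch of the feasible-direction method, assuming linear constraints A·x ≤ b with x ≥ 0 so the direction step is a linear program (solved with scipy.optimize.linprog). The sheet's stopping rule checks |∂f/∂x_j| ≤ ε, but at a constrained optimum the gradient need not vanish, so the sketch uses the standard Frank-Wolfe gap ∇f(x)·(d - x) ≤ ε instead; the example problem is an assumption:

    import numpy as np
    from scipy.optimize import linprog

    def frank_wolfe_max(f, grad, A, b, x, eps=1e-5, max_iter=200):
        """Conditional-gradient ascent over {x : A x <= b, x >= 0}."""
        x = np.asarray(x, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            # step 2: maximize grad f(x) . d over the feasible set
            # (linprog minimizes, so negate the objective)
            d = linprog(-g, A_ub=A, b_ub=b, bounds=[(0, None)] * len(x)).x
            if g @ (d - x) <= eps:               # Frank-Wolfe gap: no ascent left
                return x
            a_t, b_t = 0.0, 1.0                  # step 3: line search on t in [0, 1]
            while b_t - a_t > 1e-8:
                lam = 0.618 * a_t + 0.382 * b_t
                mu = 0.382 * a_t + 0.618 * b_t
                if f(x + lam * (d - x)) < f(x + mu * (d - x)):
                    a_t = lam
                else:
                    b_t = mu
            x = x + (a_t + b_t) / 2 * (d - x)    # step 4
        return x

    # assumed example: maximize -(x-2)^2 - (y-2)^2  s.t.  x + y <= 3, x, y >= 0
    f = lambda p: -(p[0] - 2) ** 2 - (p[1] - 2) ** 2
    grad = lambda p: np.array([-2 * (p[0] - 2), -2 * (p[1] - 2)])
    print(frank_wolfe_max(f, grad, A=[[1, 1]], b=[3], x=[0.0, 0.0]))  # ~[1.5, 1.5]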