function E_GradientMethod
% In this example, we use a fixed-step gradient method for
% solving the L-smooth convex minimization problem
%   min_x F(x); for notational convenience we denote xs = argmin_x F(x).
%
% We show how to compute the worst-case value of F(xN)-F(xs) when xN is
% obtained by performing N steps of the gradient method, starting from an
% initial iterate satisfying ||x0-xs|| <= 1.
%
% Result to be compared with that of
%   [1] Yoel Drori. "Contributions to the Complexity Analysis of
%       Optimization Algorithms." PhD thesis, Tel-Aviv University, 2014.
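%
% For reference, the bound from [1] is tight and reads
%   F(xN) - F(xs) <= L*||x0-xs||^2 / (2*(2*N+1)),
% so with L = 1, ||x0-xs|| <= 1 and N = 10 below, the worst-case value
% should come out close to 1/42 ~ 0.0238.
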
% (0) Initialize an empty PEP
P = pep();

% (1) Set up the objective function
param.mu = 0; % Strong convexity parameter
param.L  = 1; % Smoothness parameter
F = P.DeclareFunction('SmoothStronglyConvex', param); % F is the objective function
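
% With param.mu = 0, the 'SmoothStronglyConvex' class reduces to the class
% of L-smooth convex functions, matching the problem described above.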

% (2) Set up the starting point and initial condition
x0      = P.StartingPoint();        % x0 is some starting point
[xs,fs] = F.OptimalPoint();         % xs is an optimal point, and fs = F(xs)
P.InitialCondition((x0-xs)^2 <= 1); % Initial condition ||x0-xs||^2 <= 1

% (3) Algorithm
h = 1/param.L; % step size
N = 10;        % number of iterations
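
% The loop below implements the fixed-step gradient update
%   x_{k+1} = x_k - (1/L)*grad F(x_k).
% Within the PEP framework, F.gradient(x) does not evaluate anything
% numerically: it introduces an oracle call whose output is effectively
% treated as a variable of the performance estimation problem.
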
x = x0;
for i = 1:N
    x = x - h*F.gradient(x);
end
xN = x;

% (4) Set up the performance measure
fN = F.value(xN);           % fN = F(xN)
P.PerformanceMetric(fN-fs); % Worst-case evaluated as F(xN)-F(xs)

% (5) Solve the PEP
P.solve()
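
% Note: PESTO casts the PEP as a semidefinite program, so P.solve()
% requires YALMIP and an installed SDP solver (e.g., SeDuMi or Mosek)
% to be available on the MATLAB path.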

% (6) Evaluate the output
double(fN-fs) % worst-case objective function accuracy

% The result should be (see [1])
%   param.L/2/(2*N+1)
end
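
% A minimal sketch of how one might compare the PEP value with the
% theoretical rate from [1] over several horizons N. It assumes the code
% above is refactored into a function wc = gradient_method_pep(N) (a
% hypothetical signature, shown for illustration only):
%
%   for N = [1 5 10 25]
%       wc    = gradient_method_pep(N); % hypothetical refactoring of this file
%       bound = 1/2/(2*N+1);            % param.L/2/(2*N+1) with L = 1
%       fprintf('N = %2d: PEP %.5f, bound %.5f\n', N, wc, bound);
%   end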