The Sine Cosine Algorithm (SCA) is a population-based optimization algorithm built around the sine and cosine functions. It was published in Knowledge-Based Systems, a journal ranked in Zone 1 of the Chinese Academy of Sciences journal partition. SCA is a stochastic optimization algorithm whose most distinctive feature is that it assembles the essential elements of an intelligent optimization algorithm in a concise, clear form: it relies solely on the oscillation and periodicity of the sine and cosine functions to design the operator that searches for and iterates toward the optimal solution. Compared with the genetic algorithm, particle swarm optimization, and many other intelligent optimization algorithms, SCA has fewer parameters, a simple structure, easy implementation, and fast convergence, and it performs well in practical applications.
SCA distills and absorbs the iterative strategies of several swarm intelligence optimization algorithms. It takes a set of randomly generated solutions as its initial solution set, repeatedly evaluates the fitness of each solution with the objective function, and stochastically updates the solution set according to a specific update strategy, finally returning the optimal solution or a satisfactory solution that meets the fitness requirement. Like most swarm intelligence algorithms, SCA relies on this iterative strategy to perform a random search of the solution space, so a single run cannot guarantee that the global optimum is found; however, when the initial solution set and the number of iterations are large enough, the probability of finding the optimum increases substantially.
SCA has a distinctive mathematical model and delivers very competitive results, converging quickly on many problems, particularly real-world cases. It has been widely applied, and the original paper has accumulated over 2,300 citations.
SCA organizes its iterative strategy into two phases: exploration (global search) and exploitation (local development). In the exploration phase, large random perturbations are applied to the solutions in the current set in order to probe unknown regions of the solution space. In the exploitation phase, weak random perturbations are applied so that the neighborhoods of the current solutions are searched thoroughly.
SCA exploits the periodic oscillation of the sine and cosine functions to build an iterative equation that implements both exploration and exploitation, and uses this concise update equation to perturb and update the solution set. The update comes in two forms, a sine update and a cosine update:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + r_1 \sin(r_2)\,\left| r_3\, p_j^{t} - x_{i,j}^{t} \right| \qquad \text{(sine update)}$$

$$x_{i,j}^{t+1} = x_{i,j}^{t} + r_1 \cos(r_2)\,\left| r_3\, p_j^{t} - x_{i,j}^{t} \right| \qquad \text{(cosine update)}$$

where $t$ denotes the current iteration, $x_{i,j}^{t}$ is the $j$-th component of the position of individual $i$ at iteration $t$, and $r_1$, $r_2$, $r_3$ are control parameters: $r_1$ is set by an update schedule, $r_2 \sim U[0, 2\pi]$, and $r_3 \sim U[0, 2]$. $p_j^{t}$ denotes the $j$-th component of the best candidate solution in the set at iteration $t$.
To eliminate any correlation between the iteration step size and its direction, the two updates are combined into a single iterative equation through a random parameter $r_4 \sim U[0, 1]$:

$$x_{i,j}^{t+1} = \begin{cases} x_{i,j}^{t} + r_1 \sin(r_2)\,\left| r_3\, p_j^{t} - x_{i,j}^{t} \right|, & r_4 < 0.5 \\ x_{i,j}^{t} + r_1 \cos(r_2)\,\left| r_3\, p_j^{t} - x_{i,j}^{t} \right|, & r_4 \ge 0.5 \end{cases}$$
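To make the combined update concrete, here is a minimal MATLAB sketch of a single per-component step; the names x_ij and p_j and their starting values are illustrative assumptions, and r1 is taken as given (its schedule is described below):

x_ij = 0.3; p_j = 1.0; r1 = 1.5;  % assumed values for demonstration only
r2 = 2*pi*rand();                 % r2 in [0, 2*pi]: direction and extent of the move
r3 = 2*rand();                    % r3 in [0, 2]: random weight on the best solution
r4 = rand();                      % r4 in [0, 1]: picks the sine or the cosine branch
if r4 < 0.5
    x_ij = x_ij + r1*sin(r2)*abs(r3*p_j - x_ij);  % sine update
else
    x_ij = x_ij + r1*cos(r2)*abs(r3*p_j - x_ij);  % cosine update
end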
r1 controls the transition of the algorithm from exploration to exploitation. When r1 is large, the algorithm favors global search; when r1 is small, it favors local development.
r2 determines the direction in which the current solution moves relative to the current best solution during an update, as well as the extreme step length that can be reached; it decides whether the updated solution lands in the region between the current solution and the best solution or outside that region.
r3 assigns a random weight to the best solution, randomly emphasizing (r3 > 1) or de-emphasizing (r3 < 1) its influence when the distance traveled by the candidate solution is computed.
r4 switches randomly between the sine and cosine updates, which removes any possible correlation between step sizes and directions.
Taking a two-dimensional solution as an example: when r1·sin(r2) or r1·cos(r2) falls within [-1, 1], the exploitation strategy is applied and the algorithm searches the solution space between the candidate solution and the current best solution, i.e., a neighborhood of the candidate. When r1·sin(r2) or r1·cos(r2) is greater than 1 or less than -1, the exploration strategy is applied and the search moves outside that region. This is how SCA realizes both global search and local development of the solution space.
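As a quick sanity check of this regime switch, the following sketch estimates, for an assumed value of r1, how often a random draw of r2 sends r1*sin(r2) outside [-1, 1] and thus triggers exploration:

r1 = 1.5;                           % assumed example value (early in a run with a = 2)
r2 = 2*pi*rand(1, 1e6);             % many independent draws of r2
p_explore = mean(abs(r1*sin(r2)) > 1);
fprintf('r1 = %.2f: exploration on %.1f%% of draws\n', r1, 100*p_explore);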
Considering the balance between exploration and exploitation, and the need for the algorithm to converge to the global optimum, r1 is adapted as the iterations proceed:

$$r_1 = a - t\,\frac{a}{T}$$

where a is a constant, t is the current iteration, and T is the maximum number of iterations. Because r1 decreases gradually over the iterations, the algorithm's exploration and exploitation abilities stay balanced. With a = 2, the oscillation amplitude of r1·sin(r2) and r1·cos(r2) decays as the iterations progress: while these values lie in (1, 2] or [-2, -1) the algorithm searches globally, and while they lie in [-1, 1] it develops locally.
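The schedule itself is a one-liner. The sketch below, with the paper's a = 2 and an assumed budget of T = 1000 iterations, computes r1 for every iteration and reports the point after which r1 < 1, i.e. where every update is confined to [-1, 1] and the algorithm only exploits:

a = 2; T = 1000;                    % T is an assumed iteration budget
t = 1:T;
r1 = a - t*(a/T);                   % linear decay from a toward 0
fprintf('Pure exploitation from iteration %d on.\n', find(r1 < 1, 1));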
The overall procedure of SCA is as follows:

1. Initialization: set the iteration counter t = 0, generate an initial candidate solution set of size m at random positions, and initialize r1, r2, and the other parameters of the iterative update equation.
2. Evaluation and update: compute the fitness of each candidate solution, determine and retain the current best candidate p(t), then update the candidate solution set with the iterative equation.
3. Parameter update: update r1, r2, and the other parameters of the equation according to the schedule and the probability distributions of the parameters.
4. Termination check: if the termination condition is met (the iteration budget is exhausted or the solution quality is sufficient), output p(t); otherwise return to step 2.
SCA's performance was benchmarked in three test phases. First, a set of well-known test cases comprising unimodal, multimodal, and composite functions was used to assess the exploration, exploitation, local optima avoidance, and convergence of SCA. Second, SCA's behavior on shifted two-dimensional test functions was observed qualitatively and confirmed with several performance metrics (search history, trajectory, average fitness of the solutions, and the best solution found during optimization). Finally, SCA was applied to a real-world case study: optimizing the cross-section of an aircraft wing.
% Sine Cosine Algorithm (SCA)
% Source codes demo version 1.0
% Developed in MATLAB R2011b (7.13)
% Author and programmer: Seyedali Mirjalili
% E-mail: [email protected]
% Homepage:
% Main paper:
% S. Mirjalili, SCA: A Sine Cosine Algorithm for solving optimization problems
% Knowledge-Based Systems, DOI:
% You can simply define your cost function in a separate file and load its handle to fobj.
% The initial parameters that you need are:
% fobj = @yourcostfunction
% dim = number of your variables
% max_iteration = maximum number of iterations
% searchagents_no = number of search agents
% lb=[lb1,lb2,...,lbn] where lbn is the lower bound of variable n
% ub=[ub1,ub2,...,ubn] where ubn is the upper bound of variable n
% If all the variables have equal lower bounds you can just
% define lb and ub as two single numbers
% To run SCA: [best_score,best_pos,cg_curve]=sca(searchagents_no,max_iteration,lb,ub,dim,fobj)
function [destination_fitness,destination_position,convergence_curve]=sca(n,max_iteration,lb,ub,dim,fobj)
display('sca is optimizing your problem');
% Initialize the set of random solutions
x=initialization(n,dim,ub,lb);
destination_position=zeros(1,dim);
destination_fitness=inf;
convergence_curve=zeros(1,max_iteration);
objective_values = zeros(1,size(x,1));
% Calculate the fitness of the first set and find the best one
for i=1:size(x,1)
    objective_values(1,i)=fobj(x(i,:));
    if i==1
        destination_position=x(i,:);
        destination_fitness=objective_values(1,i);
    elseif objective_values(1,i)<destination_fitness
        destination_position=x(i,:);
        destination_fitness=objective_values(1,i);
    end
    all_objective_values(1,i)=objective_values(1,i);
end

% Main loop
t=2; %start from the second iteration since the first iteration was dedicated to calculating the fitness
while t<=max_iteration
% Eq. (3.4)
a = 2;
r1=a-t*((a)/max_iteration); % r1 decreases linearly from a to 0
% Update the position of solutions with respect to destination
for i=1:size(x,1) % in i-th solution
    for j=1:size(x,2) % in j-th dimension

        % Update r2, r3, and r4 for Eq. (3.3)
        r2=(2*pi)*rand();
        r3=2*rand;
        r4=rand();

        % Eq. (3.3)
        if r4<0.5
            % Eq. (3.1)
            x(i,j)= x(i,j)+(r1*sin(r2)*abs(r3*destination_position(j)-x(i,j)));
        else
            % Eq. (3.2)
            x(i,j)= x(i,j)+(r1*cos(r2)*abs(r3*destination_position(j)-x(i,j)));
        end
    end
end

for i=1:size(x,1)
    % Check if solutions go outside the search space and bring them back
    flag4ub=x(i,:)>ub;
    flag4lb=x(i,:)<lb;
    x(i,:)=(x(i,:).*(~(flag4ub+flag4lb)))+ub.*flag4ub+lb.*flag4lb;

    % Calculate the objective values
    objective_values(1,i)=fobj(x(i,:));

    % Update the destination if there is a better solution
    if objective_values(1,i)<destination_fitness
        destination_position=x(i,:);
        destination_fitness=objective_values(1,i);
    end
end
convergence_curve(t)=destination_fitness;
% Display the iteration and best optimum obtained so far
if mod(t,50)==0
display(['at iteration ', num2str(t), ' the optimum is ', num2str(destination_fitness)])
end

% Increase the iteration counter
t=t+1;
end
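For completeness, a hypothetical driver script in the spirit of the header comments above; the sphere objective is an assumed test function, and the call assumes the companion initialization function from the original code package is on the MATLAB path:

fobj = @(x) sum(x.^2);              % assumed test objective, minimum 0 at the origin
dim = 10;                           % number of variables
lb = -100; ub = 100;                % equal bounds, so two scalars suffice
[best_score, best_pos, cg_curve] = sca(30, 500, lb, ub, dim, fobj);
disp(best_score);                   % best objective value found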