Article

Circle Search Algorithm: A Geometry-Based Metaheuristic Optimization Algorithm

1 Centre for Advances in Reliability and Safety, Hong Kong
2 Electrical Power and Machines Department, Faculty of Engineering, Ain Shams University, Cairo 11517, Egypt
3 Electrical Engineering Department, Faculty of Engineering and Technology, Future University in Egypt, Cairo 11835, Egypt
4 Electrical Engineering Department, College of Engineering, King Saud University, Riyadh 11421, Saudi Arabia
5 Department of Electrical Engineering, University of Jaén, 23700 Linares, Spain
* Author to whom correspondence should be addressed.
Mathematics 2022, 10(10), 1626; https://doi.org/10.3390/math10101626
Submission received: 18 April 2022 / Revised: 1 May 2022 / Accepted: 6 May 2022 / Published: 10 May 2022

Abstract

This paper presents a novel metaheuristic optimization algorithm inspired by the geometrical features of circles, called the circle search algorithm (CSA). The circle is the most well-known geometric object, with various features including the diameter, center, perimeter, and tangent lines. The ratio between the radius and the tangent line segment is the orthogonal (tangent) function of the angle opposite the orthogonal radius. This angle plays an important role in the exploration and exploitation behavior of the CSA. To evaluate the robustness of the CSA in comparison with other algorithms, many independent experiments employing 23 famous functions and 3 real engineering problems were carried out. The statistical results revealed that the CSA succeeded in achieving the minimum fitness values for 21 out of the 23 tested functions, and the p-value was less than 0.05. The results also show that the CSA converged to the minimum results faster than the comparative algorithms. Furthermore, high-dimensional functions were used to assess the CSA's robustness, with the statistical results revealing that the CSA is robust to high-dimensional problems. As a result, the proposed CSA is a promising algorithm that can be used to easily handle a wide range of optimization problems.

1. Introduction

Human life is becoming increasingly intelligent, and the systems surrounding it keep growing, resulting in higher complexity and cost. Optimization algorithms are important for addressing these complex systems and maintaining the trade-off between quality and cost. In past decades, conventional optimization algorithms, such as the Nelder–Mead algorithm [1] and the Hooke–Jeeves algorithm [2], were used to solve small nonlinear problems. However, these algorithms suffer from several disadvantages: (1) they depend on the initial conditions of the optimization problem; (2) their accuracy rests on the differential equation solvers available in existing tools; and (3) they may become stuck at a local minimum rather than the global minimum, especially when the problem under study is highly nonlinear or nonconvex. Accordingly, metaheuristic algorithms have emerged as a viable option for solving complex problems. These algorithms are population-based, stochastic methods built on random operators. Metaheuristic algorithms have significant potential to solve constrained and nonlinear problems because they treat the optimization problem as a black box. In addition, metaheuristic algorithms can be classified into three main categories on the basis of their source of inspiration: biology-based, nature-based, and physics-based.
Biology-based algorithms simulate the behavior of creatures during reproduction and evolution, swarming, or the search for food. The genetic algorithm (GA) [3] and the particle swarm optimization (PSO) method [4] were two of the most popular stochastic algorithms to emerge in recent decades [5]. The GA is based on organisms reproducing their best offspring, whereas PSO is based on the swarming of birds and fish. These two algorithms are regarded as the most commonly used ones. Following them, numerous algorithms emerged to mimic the evolution and swarming behavior of other creatures, including differential evolution (DE), which mimics species evolution [6]; the artificial bee colony (ABC), which simulates bees foraging [7,8]; and ant colony optimization (ACO), which simulates ants' bio-communication in finding the shortest path between the colony and food [9]. The bat algorithm (BA) mimics the echolocation behavior of bats [10], symbiotic organisms search (SOS) replicates organisms' symbiotic interaction techniques for survival and reproduction in an ecosystem [11], and the artificial butterfly optimizer (ABO) replicates the mate-finding approach of several butterfly species [12]. The salp swarm algorithm (SSA) simulates salps' swarming behavior while traversing and foraging in the seas [13], the artificial gorilla troops optimizer (GTO) mimics the society of silverback gorillas [14], the grasshopper optimizer (GOA) mimics grasshoppers' swarming behavior [15], and the moth–flame optimizer (MFO) simulates the flight of moths in a straight line toward the moon [16].
Other biology-based algorithms mimic the hunting behavior of wild creatures, including the hunting search algorithm (HuS), which mimics the hunting process of wild animals, such as lions and tigers, that surround prey and catch it [17]; grey wolf optimization (GWO), which mimics the hierarchical order of a grey wolf pack hunting prey [18]; the whale optimization algorithm (WOA), which mimics the spiral hunting of humpback whales [19]; and the Harris hawks optimizer (HHO), which mimics hawks' collaboration to pounce on prey [20]. Moreover, the coyote optimization algorithm (COA) mimics the social structure of coyotes [21], and the marine predators algorithm (MPA) mimics the Lévy and Brownian motions of oceanic predators as well as the ecological interaction between predators and prey [22]. There are other biology-based algorithms, such as dolphin echolocation [23], the social spider algorithm [24], the shuffled frog-leaping algorithm [25], the honey badger algorithm [26], the cuckoo search algorithm [27], the artificial butterfly optimizer [12], the bird mating algorithm [28], the tunicate swarm algorithm [29,30], the pity beetle algorithm [31], the spotted hyena optimizer [32], and golden eagle optimization [33].
Nature-based metaheuristic algorithms imitate the regular natural behavior of plants, water, and so on. The sunflower optimizer (SFO) simulates the sun-tracking behavior of sunflowers [34], the tree seed algorithm (TSA) imitates the relationship between trees and their seeds [35], and the flower pollination algorithm (FPA) simulates the pollination process of flowers [36]. The water cycle algorithm (WCA) simulates the water flow from rivers and waterways to the sea [37], the water evaporation optimizer (WEO) simulates the evaporation of small amounts of water on solid surfaces with varying wettability [38], and heat transfer search (HTS) mimics the interaction of heated molecules with each other and with their surroundings in order to reach thermal equilibrium [39]. There are other nature-based metaheuristic algorithms, such as farmland fertility [40], the water strider algorithm [41], and the rain optimization algorithm [42].
Physics-based metaheuristic algorithms mimic famous physical theories and phenomena [43]. The gravitational search algorithm (GSA) mimics the law of gravity and the interaction of masses [44,45], the Lichtenberg algorithm (LA) mimics Lichtenberg figure patterns [46], and Henry gas solubility optimization (HGSO) mimics Henry's law of dissolved gases [47,48,49]. The thermal exchange optimizer (TEO) mimics the heat exchange between objects and their surroundings [50], the colliding bodies optimizer (CBO) mimics the state of two moving objects before and after a collision [51], and the atom search optimizer (ASO) mimics the interaction forces between atoms [52]. Charged system search (CSS) simulates the electrostatic Coulomb law and Newtonian mechanics [53], the ray optimization algorithm (ROA) simulates light traveling from a bright medium to a dark medium [54], and the transient search optimizer (TSO) simulates the transient response of electrical circuits that include energy storage devices [55]. There are other physics-based metaheuristic algorithms, including galactic swarm optimization [56], the central force optimizer [57], the ions motion algorithm [58], the Newton metaheuristic algorithm [59], the gradient-based optimizer [60], atomic orbital search [61], chaos game optimization (CGO) [62], and the simulated annealing algorithm [63].
All metaheuristic algorithms compete to find the optimal solution; even a small improvement in the answer may result in substantial cost savings. All of these algorithms, however, started in their simplest form and were later enhanced or hybridized with other algorithms. Metaheuristic algorithms that can be applied easily and efficiently have attracted researchers to utilize them widely. PSO is one of the most utilized, hybridized, and modified algorithms for solving a broad range of optimization problems [64]. GWO, WOA, TSO, and SSA have been modified and applied to design optimal control systems for wind power plants [65,66,67,68,69]. TSO, HHO, COA, and SFO have been applied to estimate the electrical parameters of photovoltaic modules [70,71,72,73,74]. At the same time, the no-free-lunch theorem states that no algorithm can solve all problems [75]: one algorithm can efficiently solve a certain problem while failing on another. This encourages researchers to model new effective algorithms that can solve more optimization problems. Table 1 summarizes the most recently emerged metaheuristic algorithms, and it indicates that there is still a need for new, competitive, and easy-to-apply algorithms.
This article proposes a novel metaheuristic algorithm, called CSA, which can be categorized as a geometry-based metaheuristic algorithm. Within this category, the sine cosine algorithm (SCA) is a geometry-based method that emulates sinusoidal waveforms [86]. Geometry is the science that deals with the characteristics of figures in space, such as their dimensions, relative position, distance, form, and size. The circle is the most frequently used geometric figure because it possesses unique characteristics such as a diameter, a perimeter, a center point, and a tangent line. The radius that crosses the tangent point is perpendicular to the tangent line, and the orthogonal function is the ratio of the radius to the perpendicular tangent line. The orthogonal function varies significantly with a tiny angle change, which may speed up the CSA exploration phase.
The major contributions of this work are listed below:
  • Introducing a novel geometry-based optimization method, called CSA.
  • Presenting a mathematical model for the proposed CSA, including the states of exploration and exploitation processes.
  • Applying the proposed CSA and other comparative algorithms to determine the optimal solution of 23 well-known functions and three engineering design problems.
  • Applying the CSA to solve high-dimensional functions (100 and 1000 dimensions).
  • Testing the superiority and significance of the CSA in comparison with other algorithms, performed by using a variety of statistical tests, including the mean, standard deviation, rank test, and p-values.
The rest of the paper is structured as follows: Section 2 provides the specifics of the inspiration and the modeling of the CSA. Section 3 presents and contrasts the optimal results of the 23 standard functions obtained using the CSA and other established algorithms. Section 4 illustrates the use of the CSA to optimally design three well-known engineering problems. Section 5 provides a short conclusion regarding this study.

2. Circle Search Algorithm

2.1. Background

The geometrical circle is a fundamental closed curve whose points all lie at the same distance from the center. As shown in Figure 1, the diameter is the line connecting two points on the curve that passes through the center (xc). The radius (R) is the line that links any point on the circle to the center. The perimeter of a circle is the length of the curve that encloses it. As observed in Figure 1, the tangent line segment is a straight line that touches the circle at a single point (xt) and is perpendicular to the radius intersecting this point. In the right triangle formed at the tangent point, the orthogonal function (Tan) is the ratio between the radius and the perpendicular tangent line segment. The radius is the distance between xt and xc, whereas the tangent line segment is the distance between points xt and xp; the orthogonal function (Tan) is thus expressed as in the following equations:
$$\tan(\theta) = \frac{x_t - x_c}{x_p - x_t} \quad (1)$$

$$x_t - x_c = (x_p - x_t) \times \tan(\theta) \quad (2)$$

$$x_t = x_c + (x_p - x_t) \times \tan(\theta) \quad (3)$$
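Solving Equation (3) for $x_t$ makes the role of the angle explicit (a short worked rearrangement; the limiting values are illustrative):

$$x_t(1 + \tan\theta) = x_c + x_p \tan\theta \;\;\Rightarrow\;\; x_t = \frac{x_c + x_p \tan\theta}{1 + \tan\theta}$$

For $\theta = \pi/4$ ($\tan\theta = 1$), the tangent point lies at the midpoint $(x_c + x_p)/2$, and as $\theta \to 0$ it collapses onto the center $x_c$; this contraction toward the center is exactly what the CSA exploits below.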

2.2. CSA Formulation

The CSA looks for the optimum answer inside random circles in order to broaden the scope of the search area. Using the center of the circle as a target point, the angle between the tangent line and the radius at the touching point progressively decreases until the touching point approaches the center of the circle, as shown in Figure 2a. Because this circle may be trapped in a local solution, the angle at the touching point of the tangent line is altered at random, as shown in Figure 2b. The touching point Xt is considered the search agent of the CSA, and the center point Xc is assumed to be the best position found by the algorithm. As shown in Figure 2, the CSA updates the search agent by moving the touching point toward the center. Nevertheless, to prevent the CSA from getting stuck in a local solution, the touching point is updated by changing the angle in a random manner. The main steps of the CSA optimizer are explained below:
Step 1: Initialization: This step is important in the CSA: all dimensions of each search agent should be randomized uniformly, as depicted in Algorithm 1. Much previously published code randomizes the dimensions non-uniformly, which sometimes makes an algorithm obtain surprisingly good results. The search agents are initialized between the upper limit values (UB) and lower limit values (LB) of the search space as in Equation (4):
$$X_t = LB + r \times (UB - LB) \quad (4)$$
where r is a random vector whose elements lie in [0, 1].
Step 2: Update search agent position: the position of the search agents Xt is updated according to the evaluated best position Xc as shown in Equation (5):
$$X_t = X_c + (X_c - X_t) \times \tan(\theta) \quad (5)$$
where the angle θ plays an important role in the exploration and exploitation of the CSA and can be calculated as follows:
$$\theta = \begin{cases} w \times rand & Iter > (c \times Maxiter) \quad \text{(escape from local stagnation)} \\ w \times p & \text{otherwise} \end{cases} \quad (6)$$

$$w = a \times rand - a \quad (7)$$

$$a = \pi - \pi \times \left( \frac{Iter}{Maxiter} \right)^2 \quad (8)$$

$$p = 1 - 0.9 \times \left( \frac{Iter}{Maxiter} \right)^{0.5} \quad (9)$$
where rand is a random number that lies between 0 and 1, Iter stands for the iteration counter, Maxiter represents the maximum number of iterations, and c is a constant between 0 and 1 that represents a percentage of the maximum iterations. Equation (7) shows that the variable w lies in [−a, 0] and thus changes from −π to 0 as the number of iterations increases. The variable a decreases from π to 0 according to Equation (8). The variable p decreases from 1 to 0 as in Equation (9). Accordingly, the angle θ changes from −π to 0.
Two cases can arise in the CSA, as sketched in the code after this list:
  • Case 1: Iter > (c·Maxiter): in this case, the angle θ = w × rand for the remaining iterations, which improves the exploration process of the CSA and helps it escape local stagnation.
  • Case 2: Iter < (c·Maxiter): in this case, the angle θ = w × p, which improves the exploitation process of the CSA.
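A minimal Python sketch of the angle schedule in Equations (6)–(9) follows; it assumes the reconstructed form w = a × rand − a for Equation (7), and the function name and printed iterations are illustrative only:

```python
import numpy as np

def theta_schedule(it, max_iter, c=0.75, rng=None):
    """Angle theta at iteration `it`, per Equations (6)-(9); `c` as in Table 5."""
    rng = rng if rng is not None else np.random.default_rng()
    a = np.pi - np.pi * (it / max_iter) ** 2     # Eq. (8): decays from pi to 0
    p = 1.0 - 0.9 * (it / max_iter) ** 0.5       # Eq. (9): decays from 1 to 0
    w = a * rng.random() - a                     # Eq. (7): uniform in [-a, 0]
    if it > c * max_iter:                        # Eq. (6), case 1: exploration
        return w * rng.random()
    return w * p                                 # Eq. (6), case 2: exploitation

# theta stays in [-pi, 0] and shrinks toward 0 as the iterations advance:
rng = np.random.default_rng(0)
for it in (0, 200, 400, 499):
    print(it, round(theta_schedule(it, 500, rng=rng), 4))
```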
The pseudo-code of the CSA is summarized in Algorithm 2, and Figure 3 exhibits the flowchart of the CSA; a Python sketch of the complete procedure follows Algorithm 2.
Algorithm 1 Initialization of the CSA
Input LB and UB.
Do for all search agents
    r = random number between [0, 1].
    Use Equation (4) to initialize the search agent Xt.
End Do
Algorithm 2 Pseudo-code of the CSA
Initialize the search agents Xt using Algorithm 1
Input the constant value c, Iter = 0, and Maxiter
While Iter < Maxiter
    Use Equation (8) to find the value of a
    Do for all search agents
        Use Equation (7) to find the value of w
        Use Equation (9) to find the value of p
        Use Equation (6) to find the value of the angle θ
        Use Equation (5) to update the search agent Xt
        If the updated search agent is outside the boundaries, set it equal to the boundary values
        Find the fitness function f(Xt)
    End Do
    Compare f(Xt) with the stored best solution f(Xc)
    Update f(Xc) and Xc
    Iter = Iter + 1
End While
Output f(Xc) and Xc
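For concreteness, the following is a compact NumPy sketch of Algorithms 1 and 2. All names are my own, and the update follows the reconstructed Equations (4)–(9) above, so this is an illustrative sketch rather than the authors' reference code:

```python
import numpy as np

def csa(fitness, lb, ub, dim, n_agents=30, max_iter=500, c=0.75, seed=None):
    """Minimize `fitness` over [lb, ub]^dim with a sketch of the CSA."""
    rng = np.random.default_rng(seed)
    lb = np.full(dim, lb, dtype=float)
    ub = np.full(dim, ub, dtype=float)

    # Algorithm 1: every dimension of every agent drawn uniformly (Eq. (4)).
    X = lb + rng.random((n_agents, dim)) * (ub - lb)
    fit = np.array([fitness(x) for x in X])
    best = int(fit.argmin())
    xc, fc = X[best].copy(), float(fit[best])    # circle center = best-so-far

    for it in range(max_iter):
        a = np.pi - np.pi * (it / max_iter) ** 2         # Eq. (8)
        p = 1.0 - 0.9 * (it / max_iter) ** 0.5           # Eq. (9)
        for i in range(n_agents):
            w = a * rng.random() - a                     # Eq. (7): in [-a, 0]
            if it > c * max_iter:                        # Eq. (6), case 1
                theta = w * rng.random()                 #   exploration
            else:                                        # Eq. (6), case 2
                theta = w * p                            #   exploitation
            X[i] = xc + (xc - X[i]) * np.tan(theta)      # Eq. (5)
            X[i] = np.clip(X[i], lb, ub)                 # clamp to boundaries
            f = float(fitness(X[i]))
            if f < fc:                                   # keep the best center
                xc, fc = X[i].copy(), f
    return xc, fc

# Example: 30-dimensional sphere function (F1).
x_best, f_best = csa(lambda x: float(np.sum(x ** 2)), -100, 100, dim=30, seed=1)
print(f_best)
```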

3. Computational Complexity of the CSA

Big O notation can be used to quantify the runtime of the CSA. The initialization is based on one loop, as described in Algorithm 1, so it requires O(N), where N denotes the number of search agents. The computational complexity of the update process is O(Maxiter × N × D), where D is the problem dimension, and the computational complexity of the fitness evaluation is O(Maxiter × N). Therefore, the total complexity is O(N(1 + Maxiter × D + Maxiter)), which simplifies to an overall computational complexity of O(N × Maxiter × D).

4. Experimental Results and Discussion

In this article, the optimization processes were performed using MATLAB R2019b on a 64-bit Windows 10 PC with a Core i7 processor and 16 GB of RAM to find the best solutions of the 23 famous standard functions.

4.1. Standard Functions

These standard functions are well known and widely used as benchmark functions; they include unimodal, multimodal, and fixed-dimension multimodal functions. Table 2 lists the unimodal functions (F1–F7), which have a single optimal solution and were used to check the deep exploitation behavior of the optimization algorithms. Table 3 and Table 4 list the variable-dimension multimodal functions (F8–F13) and the fixed-dimension multimodal functions (F14–F23), which trap algorithms in many locally optimal solutions. These multimodal functions are used to examine the exploration of the algorithms and their ability to escape from local optima; minimal sketches of one function from each group follow.
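As an illustration, here are NumPy sketches of one unimodal and one multimodal entry from Tables 2 and 3 (F1 and F9); they follow the listed expressions directly:

```python
import numpy as np

def f1_sphere(x):
    """F1 (unimodal): a single global optimum at the origin."""
    return float(np.sum(x ** 2))

def f9_rastrigin(x):
    """F9 (multimodal): a grid of local optima that traps weak explorers."""
    return float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x) + 10))

print(f1_sphere(np.zeros(30)), f9_rastrigin(np.zeros(30)))  # both 0.0 at optimum
```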

4.2. Comparative Algorithms

The CSA's performance was compared with that of eight algorithms: PSO, GWO, SSA, SCA, WOA, HHO, CGO, and TSO. The PSO and GWO algorithms are the most used algorithms, so they can be considered benchmark algorithms for all new algorithms. The WOA, CGO, and TSO algorithms are recent algorithms that are applied widely due to their straightforward coding and application. The SSA and HHO are recent algorithms with a good capability to avoid getting stuck in local optima. The SCA is a geometry-based algorithm, and the proposed CSA is also classified as a geometry-based algorithm. The settings of the algorithms applied in this paper are displayed in Table 5; they were tuned on the basis of the related literature. All algorithms used the same maximum number of iterations (500) and 30 search agents.

4.3. Statistical Analysis

Statistical analysis was carried out over 30 independent runs for all applied algorithms. To obtain a fair comparison, the same initial population was used for all compared algorithms. The mean and standard deviation are the most used statistics for showing the superiority of an algorithm in reaching the best minimum across different runs. Table 6, Table 7 and Table 8 show the minimum, mean, standard deviation (std), and ranks for all applied algorithms and functions. The best achieved mean solution is shown in bold for visual inspection; the CSA obtained the best mean solution for 21 out of the 23 functions. Furthermore, the standard deviations of the results of the 30 independent runs were very small for all functions except the Schwefel function (F8), for which all algorithms obtained very large standard deviations. The sum of individual ranks showed that the CSA obtained the first rank, whereas the SSA obtained the second rank. The balance between the exploitation and exploration abilities of the algorithms can be tested using two contrasting functions (the Rosenbrock function (F5) and the Rastrigin function (F9)), and most of the algorithms failed to solve both of them. However, the proposed CSA successfully obtained the best minimum solution for both.
In addition, a null-hypothesis t-test with a 5% significance level was used to validate the significance of the results of the CSA versus the other algorithms. This was measured by comparing the results of the CSA over 30 independent runs with the results of each other algorithm, e.g., CSA vs. GWO. If the p-value of the t-test was less than 0.05, then there was a significant difference between the CSA and the compared algorithm; if the p-value was higher than 0.05, there was no significant difference. Table 7 shows the obtained p-values of the CSA versus the other algorithms for the 30 independent runs. Most of the p-values were less than 0.05, except for 8 out of the 92 values. A sketch of this test appears below.
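A minimal SciPy sketch of the test, assuming two hypothetical arrays of 30 best-fitness values per algorithm (the paper does not state whether equal variances were assumed):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
csa_runs = rng.normal(1e-6, 1e-7, 30)   # hypothetical 30-run results for CSA
gwo_runs = rng.normal(1e-2, 5e-3, 30)   # hypothetical 30-run results for GWO

# Two-sample t-test of the null hypothesis "equal means"; p < 0.05 is read
# here as a significant difference between CSA and the compared algorithm.
t_stat, p_value = stats.ttest_ind(csa_runs, gwo_runs)
print(p_value < 0.05)
```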

4.4. High-Dimensional Functions

To investigate the CSA's behavior on high-dimensional problems in further detail, the CSA was tested on benchmark functions (F1–F13) with 100 and 1000 dimensions. The statistical results of all algorithms are shown in Table 8, including the best minimum, mean, and standard deviation. It is obvious that the CSA was resilient and consistently produced the best results for all high-dimensional functions. Additionally, the standard deviation demonstrated the CSA's durability and stability over 30 independent experiments. In contrast, the compared algorithms were not resilient on high-dimensional problems and showed large standard deviations. The 1000-dimension tests further confirmed the suggested CSA's robustness. The results of all algorithms and functions are summarized in Table 9; the CSA is resilient and produces the best minimal results, with a very low standard deviation in comparison with the other algorithms.

4.5. Computational Time

In some applications, the computational speed and simplicity of the optimization algorithm are critical. Therefore, the CPU time of the proposed CSA and the other algorithms was measured during the optimization of the high-dimensional functions. As shown in Table 10, the CSA outperformed the PSO and required less CPU time than all other algorithms for all functions, whereas the GWO and SSA required the highest CPU times. Additionally, Table 11 compares the CPU time of each algorithm, revealing that the CSA ran very fast, with almost half the CPU time of the GWO. These results show that the CSA is more efficient and faster than the compared algorithms.

4.6. Convergence Speed

To obtain a fair comparison between the convergence speed of the CSA and that of the other algorithms, the same initial populations were used for all algorithms (a sketch of this setup follows this paragraph). Figure 4 shows that all convergence curves started from the same point, and the convergence curve (red line) of the proposed CSA converged to the minimum value faster than those of the other algorithms. This fast convergence speed of the CSA is due to the behavior of the tan function as the angle varies.
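One simple way to realize this shared starting point (a sketch; the seed and sizes are illustrative, not taken from the paper):

```python
import numpy as np

dim, n_agents, lb, ub = 30, 30, -100.0, 100.0
rng = np.random.default_rng(42)   # fixed illustrative seed

# Draw one population and hand an identical copy to every algorithm, so all
# convergence curves depart from the same initial best fitness.
X0 = lb + rng.random((n_agents, dim)) * (ub - lb)
pop_for_csa, pop_for_pso = X0.copy(), X0.copy()
print(np.array_equal(pop_for_csa, pop_for_pso))  # True
```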
Further statistical analysis was conducted using a box plot to show the minimum, maximum, and median of the results of 30 independent experiments. Figure 5 exhibits the statistical analysis for four selected functions (F5, F12, F16, and F22), where the deviation between the minimum and maximum for the proposed CSA was very small for all functions. However, other algorithms exhibited wide deviation between the minimum and maximum values for 30 independent runs.

5. Real-World Engineering Problems

The principal goal of this section is to evaluate the performance of the proposed CSA on constrained engineering problems, including the welded beam, pressure vessel, and tension spring designs. The CSA, PSO, SSA, SCA, and GWO algorithms were applied to obtain optimal designs for these problems. The population size of all algorithms was 100, and the number of iterations was 10,000. The statistical results were obtained over 30 independent runs.

5.1. Welded Beam

The CSA was used to optimally solve the famous welded beam design problem, which is shown in Figure 6. The objective is to reduce the manufacturing cost of the welded beam by finding its optimal design factors. The optimal design is subject to several restrictions, such as the shear stress (τ), bending stress (σ), buckling load (Pc), and deflection (δ). The formulation of the objective function and its constraints is stated in the equations below; a Python sketch of the penalized objective follows them. The design factors x = [x1, x2, x3, x4] = [h, l, t, b] refer to the thickness of the weld, the length, the height, and the thickness of the bar, respectively. Table 12 lists the optimal design factors achieved by the CSA and the other algorithms. It is obvious that the CSA obtained the lowest cost compared with the PSO, SSA, SCA, and GWO algorithms. Moreover, the standard deviation of the CSA results over 30 runs was low, so the CSA was stable across all 30 runs. Therefore, the CSA is considered robust enough to be used in real-world engineering problems.
$$
\begin{aligned}
&\mathbf{x} = [x_1\ x_2\ x_3\ x_4] = [h\ l\ t\ b]\\
&\min f(\mathbf{x}) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14.0 + x_2)\\
&g_1(\mathbf{x}) = \tau - \tau_{\max} \le 0 \;\;\text{and}\;\; g_2(\mathbf{x}) = \sigma - \sigma_{\max} \le 0\\
&g_3(\mathbf{x}) = \delta - \delta_{\max} \le 0 \;\;\text{and}\;\; g_4(\mathbf{x}) = x_1 - x_4 \le 0\\
&g_5(\mathbf{x}) = P - P_c \le 0 \;\;\text{and}\;\; g_6(\mathbf{x}) = 0.125 - x_1 \le 0\\
&g_7(\mathbf{x}) = 0.10471 x_1^2 + 0.04811 x_3 x_4 (14.0 + x_2) - 5.0 \le 0
\end{aligned}
$$
The range of variables is
$$0.1 \le x_1 \le 2, \quad 0.1 \le x_2 \le 10, \quad 0.1 \le x_3 \le 10, \quad 0.1 \le x_4 \le 2$$
where
$$
\begin{aligned}
&P = 6000\ \mathrm{lb};\quad L = 14\ \mathrm{in};\quad E = 30 \times 10^6\ \mathrm{psi};\quad G = 12 \times 10^6\ \mathrm{psi}\\
&\tau_{\max} = 13{,}600\ \mathrm{psi};\quad \sigma_{\max} = 30{,}000\ \mathrm{psi};\quad \delta_{\max} = 0.25\ \mathrm{in}\\
&M = P\left(L + \frac{x_2}{2}\right);\quad R = \sqrt{\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2};\quad \tau' = \frac{P}{\sqrt{2}\,x_1 x_2};\quad \tau'' = \frac{MR}{J}\\
&J = 2\sqrt{2}\,x_1 x_2 \left(\frac{x_2^2}{12} + \left(\frac{x_1 + x_3}{2}\right)^2\right);\quad \tau = \sqrt{\tau'^2 + \tau'\tau''\frac{x_2}{R} + \tau''^2}\\
&P_c = \frac{4.013 E \sqrt{x_3^2 x_4^6 / 36}}{L^2}\left(1 - \frac{x_3}{2L}\sqrt{\frac{E}{4G}}\right);\quad \sigma = \frac{6PL}{x_4 x_3^2};\quad \delta = \frac{4PL^3}{E x_3^3 x_4}
\end{aligned}
$$
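As an illustration, a sketch of the welded-beam objective in Python, folding the constraints into a static quadratic penalty; the paper does not state its constraint-handling scheme, so the penalty weight (1e6) and all names here are assumptions. The pressure vessel and tension spring problems in the next subsections can be evaluated with the same pattern:

```python
import numpy as np

def welded_beam_cost(x):
    """Penalized welded-beam objective (sketch). `x` = [h, l, t, b]."""
    h, l, t, b = x
    P, L, E, G = 6000.0, 14.0, 30e6, 12e6
    tau_max, sigma_max, delta_max = 13600.0, 30000.0, 0.25

    M = P * (L + l / 2)                                   # bending moment
    R = np.sqrt(l**2 / 4 + ((h + t) / 2) ** 2)
    J = 2 * np.sqrt(2) * h * l * (l**2 / 12 + ((h + t) / 2) ** 2)
    tau_p = P / (np.sqrt(2) * h * l)                      # primary shear
    tau_pp = M * R / J                                    # secondary shear
    tau = np.sqrt(tau_p**2 + tau_p * tau_pp * l / R + tau_pp**2)
    sigma = 6 * P * L / (b * t**2)                        # bending stress
    delta = 4 * P * L**3 / (E * t**3 * b)                 # deflection
    Pc = (4.013 * E * np.sqrt(t**2 * b**6 / 36) / L**2
          * (1 - t / (2 * L) * np.sqrt(E / (4 * G))))     # buckling load

    cost = 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)
    g = [tau - tau_max, sigma - sigma_max, delta - delta_max, h - b,
         P - Pc, 0.125 - h,
         0.10471 * h**2 + 0.04811 * t * b * (14.0 + l) - 5.0]
    # Static quadratic penalty: feasible designs incur no extra cost.
    return cost + 1e6 * sum(max(0.0, gi) ** 2 for gi in g)

# Usable directly as the fitness in the csa() sketch above
# (with the per-variable bounds of this problem handled appropriately).
```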

5.2. Pressure Vessel

In this application, the target was to find the cheapest, optimal design of a pressure vessel, as depicted in Figure 7. The formulation of the cost function and its constraints is shown in the equations below. The design factors of the vessel were the thicknesses of the shell and head (Ts and Th) and the inner radius and length of the cylindrical section (R and L), not including the heads. Table 13 lists the optimal design factors of the pressure vessel obtained by the CSA and the other algorithms, including the PSO, SSA, SCA, and GWO algorithms. The CSA competed with the other algorithms in finding the best minimum; however, the GWO algorithm obtained a better mean and standard deviation over the 30 independent runs.
$$
\begin{aligned}
&\mathbf{x} = [x_1\ x_2\ x_3\ x_4] = [T_s\ T_h\ R\ L]\\
&\min f(\mathbf{x}) = 0.6224 x_1 x_3 x_4 + 1.7781 x_2 x_3^2 + 3.1661 x_1^2 x_4 + 19.84 x_1^2 x_3\\
&g_1(\mathbf{x}) = -x_1 + 0.0193 x_3 \le 0\\
&g_2(\mathbf{x}) = -x_2 + 0.00954 x_3 \le 0\\
&g_3(\mathbf{x}) = -\pi x_3^2 x_4 - \tfrac{4}{3}\pi x_3^3 + 1{,}296{,}000 \le 0\\
&g_4(\mathbf{x}) = x_4 - 240 \le 0
\end{aligned}
$$
The range of variables
$$0 \le x_1 \le 99, \quad 0 \le x_2 \le 99, \quad 10 \le x_3 \le 200, \quad 10 \le x_4 \le 200$$

5.3. Tension Spring

The objective here was to optimally design a tension spring with the lowest weight, as displayed in Figure 8. The mathematical formulations of the weight function and the restriction functions are displayed in the equations below. The four main restriction functions were the deflection, shear stress, surge wave frequency, and outer diameter. The design factors were the wire diameter (d), the mean coil diameter (D), and the number of active coils (N). Table 14 lists the optimal design factors of the tension spring obtained by the CSA and the other algorithms, including the PSO, SSA, SCA, and GWO algorithms. The proposed CSA was competitive in obtaining the best minimum weight of the tension spring and a low average over the results of the 30 runs.
$$
\begin{aligned}
&\mathbf{x} = [x_1\ x_2\ x_3] = [d\ D\ N]\\
&\min f(\mathbf{x}) = x_1^2 x_2 (x_3 + 2)\\
&g_1(\mathbf{x}) = 1 - \frac{x_2^3 x_3}{71785 x_1^4} \le 0 \;\;\text{and}\;\;
 g_2(\mathbf{x}) = \frac{4x_2^2 - x_1 x_2}{12566 x_1^3 (x_2 - x_1)} + \frac{1}{5108 x_1^2} - 1 \le 0\\
&g_3(\mathbf{x}) = 1 - \frac{140.45 x_1}{x_2^2 x_3} \le 0 \;\;\text{and}\;\;
 g_4(\mathbf{x}) = \frac{x_1 + x_2}{1.5} - 1 \le 0
\end{aligned}
$$
The range of variables
$$0.05 \le x_1 \le 2, \quad 0.25 \le x_2 \le 1.3, \quad 2.00 \le x_3 \le 15$$

6. Conclusions and Future Work

This article introduced a novel geometry-based metaheuristic algorithm dubbed the circle search algorithm (CSA). This approach is inspired by the circle's distinctive features, particularly the relationship between the tangent line and the orthogonal radius, which is represented by the tan function of the angle opposite the radius. The CSA exploration is represented by numerous random circles that search for the optimal answer in a variety of directions. The CSA exploitation is represented by decreasing the radius and the opposing angle of the tangent point until it reaches the center point. The CSA's effectiveness was demonstrated using 23 renowned standard functions and compared with the performance of other well-known algorithms, including the PSO, SSA, SCA, and GWO algorithms. The unimodal functions demonstrated that the CSA is more capable of deep exploitation than the other methods. The multimodal functions validated the CSA's capacity to explore and escape from local optima. Convergence curves were used to visually inspect the CSA's convergence rate. Furthermore, the proposed algorithm was tested on high-dimensional problems, and its results outperformed those of the other compared algorithms. Moreover, the computational time of the CSA was the shortest among the algorithms. Numerous statistical tests were conducted to determine the suggested CSA's superiority and significance in comparison with the other algorithms. The mean, standard deviation, rank test, and null-hypothesis t-test were used to examine the results of the 30 independent runs. These statistical tests established the suggested CSA's resilience in comparison with the other algorithms, where more than 90% of the p-values were less than 0.05. For additional validation, the CSA was applied to three well-known, real-world engineering problems, namely, the welded beam, pressure vessel, and tension spring designs. In conclusion, the acquired results encourage researchers to apply the CSA to a variety of real-world engineering optimization problems. In future work, the CSA will be applied to renewable energy modeling, power system operation and control, control applications, microgrids, and smart grid applications.

Author Contributions

Conceptualization, M.H.Q.; Data curation, S.A.; Formal analysis, M.H.Q., H.M.H., R.A.T., S.A., M.T.-V. and F.J.; Investigation, H.M.H., R.A.T. and M.T.-V.; Methodology, M.H.Q.; Project administration, S.A.; Resources, F.J.; Software, M.H.Q.; Supervision, S.A. and F.J.; Visualization, H.M.H., R.A.T. and M.T.-V.; Writing—original draft, M.H.Q.; Writing—review & editing, H.M.H., M.T.-V. and F.J. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Researchers Supporting Project number (RSP-2021/307), King Saud University, Riyadh, Saudi Arabia.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

This work was supported by the Researchers Supporting Project number (RSP-2021/307), King Saud University, Riyadh, Saudi Arabia.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wang, P.C.; Shoup, T.E. Parameter sensitivity study of the Nelder-Mead Simplex Method. Adv. Eng. Softw. 2011, 42, 529–533.
  2. Altinoz, O.T.; Yilmaz, A.E. Multiobjective Hooke–Jeeves algorithm with a stochastic Newton–Raphson-like step-size method. Expert Syst. Appl. 2019, 117, 166–175.
  3. Leardi, R. Genetic Algorithms. Compr. Chemom. 2009, 1, 631–653.
  4. Clerc, M. Particle Swarm Optimization; ISTE: London, UK, 2006; Volume 4, ISBN 9780470612163.
  5. Qais, M.; Abdulwahid, Z. A new method for improving particle swarm optimization algorithm (TriPSO). In Proceedings of the 2013 5th International Conference on Modeling, Simulation and Applied Optimization, Hammamet, Tunisia, 28–30 April 2013.
  6. Storn, R.; Price, K. Differential Evolution—A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces. J. Glob. Optim. 1997, 11, 341–359.
  7. Karaboga, D.; Basturk, B. Artificial Bee Colony (ABC) optimization algorithm for solving constrained optimization problems. In Proceedings of the Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Cancun, Mexico, 18–21 June 2007; Melin, P., Castillo, O., Aguilar, L.T., Kacprzyk, J., Pedrycz, W., Eds.; Springer: Berlin/Heidelberg, Germany, 2007; Volume 4529, pp. 789–798.
  8. Ji, J.; Song, S.; Tang, C.; Gao, S.; Tang, Z.; Todo, Y. An artificial bee colony algorithm search guided by scale-free networks. Inf. Sci. 2019, 473, 142–165.
  9. Dorigo, M.; Blum, C. Ant colony optimization theory: A survey. Theor. Comput. Sci. 2005, 344, 243–278.
  10. Yang, X.S. A new metaheuristic Bat-inspired Algorithm. In Studies in Computational Intelligence; González, J.R., Pelta, D.A., Cruz, C., Terrazas, G., Krasnogor, N., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; Volume 284, pp. 65–74. ISBN 9783642125379.
  11. Cheng, M.Y.; Prayogo, D. Symbiotic Organisms Search: A new metaheuristic optimization algorithm. Comput. Struct. 2014, 139, 98–112.
  12. Qi, X.; Zhu, Y.; Zhang, H. A new meta-heuristic butterfly-inspired algorithm. J. Comput. Sci. 2017, 23, 226–239.
  13. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191.
  14. Abdollahzadeh, B.; Soleimanian Gharehchopogh, F.; Mirjalili, S. Artificial gorilla troops optimizer: A new nature-inspired metaheuristic algorithm for global optimization problems. Int. J. Intell. Syst. 2021, 36, 5887–5958.
  15. Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper Optimisation Algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47.
  16. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl. Based Syst. 2015, 89, 228–249.
  17. Oftadeh, R.; Mahjoob, M.J.; Shariatpanahi, M. A novel meta-heuristic optimization algorithm inspired by group hunting of animals: Hunting search. Comput. Math. Appl. 2010, 60, 2087–2098.
  18. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
  19. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
  20. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Futur. Gener. Comput. Syst. 2019, 97, 849–872.
  21. Pierezan, J.; Dos Santos Coelho, L. Coyote Optimization Algorithm: A New Metaheuristic for Global Optimization Problems. In Proceedings of the 2018 IEEE Congress on Evolutionary Computation, Rio de Janeiro, Brazil, 8–13 July 2018; pp. 1–8.
  22. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine Predators Algorithm: A nature-inspired metaheuristic. Expert Syst. Appl. 2020, 152, 113377.
  23. Kaveh, A.; Farhoudi, N. A new optimization method: Dolphin echolocation. Adv. Eng. Softw. 2013, 59, 53–70.
  24. Yu, J.J.Q.; Li, V.O.K. A social spider algorithm for global optimization. Appl. Soft Comput. J. 2015, 30, 614–627.
  25. Eusuff, M.; Lansey, K.; Pasha, F. Shuffled frog-leaping algorithm: A memetic meta-heuristic for discrete optimization. Eng. Optim. 2006, 38, 129–154.
  26. Hashim, F.A.; Houssein, E.H.; Hussain, K.; Mabrouk, M.S.; Al-Atabany, W. Honey Badger Algorithm: New metaheuristic algorithm for solving optimization problems. Math. Comput. Simul. 2022, 192, 84–110.
  27. Gandomi, A.H.; Yang, X.S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35.
  28. Askarzadeh, A. Bird mating optimizer: An optimization algorithm inspired by bird mating strategies. Commun. Nonlinear Sci. Numer. Simul. 2014, 19, 1213–1228.
  29. Kaur, S.; Awasthi, L.K.; Sangal, A.L.; Dhiman, G. Tunicate Swarm Algorithm: A new bio-inspired based metaheuristic paradigm for global optimization. Eng. Appl. Artif. Intell. 2020, 90, 103541.
  30. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S.; Elgendy, M.A. Output Power Smoothing of Grid-Tied PMSG-Based Variable Speed Wind Turbine Using Optimal Controlled SMES. In Proceedings of the 2019 54th International Universities Power Engineering Conference, Bucharest, Romania, 3–6 September 2019; pp. 1–6.
  31. Kallioras, N.A.; Lagaros, N.D.; Avtzis, D.N. Pity beetle algorithm—A new metaheuristic inspired by the behavior of bark beetles. Adv. Eng. Softw. 2018, 121, 147–166.
  32. Dhiman, G.; Kumar, V. Spotted hyena optimizer: A novel bio-inspired based metaheuristic technique for engineering applications. Adv. Eng. Softw. 2017, 114, 48–70.
  33. Mohammadi-Balani, A.; Dehghan Nayeri, M.; Azar, A.; Taghizadeh-Yazdi, M. Golden eagle optimizer: A nature-inspired metaheuristic algorithm. Comput. Ind. Eng. 2021, 152, 107050.
  34. Gomes, G.F.; da Cunha, S.S.; Ancelotti, A.C. A sunflower optimization (SFO) algorithm applied to damage identification on laminated composite plates. Eng. Comput. 2019, 35, 619–626.
  35. Kiran, M.S. TSA: Tree-seed algorithm for continuous optimization. Expert Syst. Appl. 2015, 42, 6686–6698.
  36. Yang, X.S. Flower pollination algorithm for global optimization. In Proceedings of the Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Orléan, France, 3–7 September 2012; Springer: Berlin/Heidelberg, Germany; Volume 7445, pp. 240–249.
  37. Eskandar, H.; Sadollah, A.; Bahreininejad, A.; Hamdi, M. Water cycle algorithm—A novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput. Struct. 2012, 110–111, 151–166.
  38. Kaveh, A.; Bakhshpoori, T. Water Evaporation Optimization: A novel physically inspired optimization algorithm. Comput. Struct. 2016, 167, 69–85.
  39. Patel, V.K.; Savsani, V.J. Heat transfer search (HTS): A novel optimization algorithm. Inf. Sci. 2015, 324, 217–246.
  40. Shayanfar, H.; Gharehchopogh, F.S. Farmland fertility: A new metaheuristic algorithm for solving continuous optimization problems. Appl. Soft Comput. J. 2018, 71, 728–746.
  41. Kaveh, A.; Dadras Eslamlou, A. Water strider algorithm: A new metaheuristic and applications. Structures 2020, 25, 520–541.
  42. Moazzeni, A.R.; Khamehchi, E. Rain optimization algorithm (ROA): A new metaheuristic method for drilling optimization solutions. J. Pet. Sci. Eng. 2020, 195, 107512.
  43. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S. Output power smoothing of grid-connected permanent-magnet synchronous generator driven directly by variable speed wind turbine: A review. J. Eng. 2017, 2017, 1755–1759.
  44. Rashedi, E.; Nezamabadi-pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248.
  45. Wang, Y.; Yu, Y.; Gao, S.; Pan, H.; Yang, G. A hierarchical gravitational search algorithm with an effective gravitational constant. Swarm Evol. Comput. 2019, 46, 118–139.
  46. Pereira, J.L.J.; Francisco, M.B.; Diniz, C.A.; Antônio Oliver, G.; Cunha, S.S.; Gomes, G.F. Lichtenberg algorithm: A novel hybrid physics-based meta-heuristic for global optimization. Expert Syst. Appl. 2021, 170, 114522.
  47. Hashim, F.A.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W.; Mirjalili, S. Henry gas solubility optimization: A novel physics-based algorithm. Futur. Gener. Comput. Syst. 2019, 101, 646–667.
  48. Qais, M.; Khaled, U.; Alghuwainem, S. Improved differential relay for bus bar protection scheme with saturated current transformers based on second order harmonics. J. King Saud Univ. Eng. Sci. 2018, 30, 320–329.
  49. Qais, M.; Khaled, U. Evaluation of V–t characteristics caused by lightning strokes at different locations along transmission lines. J. King Saud Univ. Eng. Sci. 2018, 30, 150–160.
  50. Kaveh, A.; Dadras, A. A novel meta-heuristic optimization algorithm: Thermal exchange optimization. Adv. Eng. Softw. 2017, 110, 69–84.
  51. Kaveh, A.; Mahdavi, V.R. Colliding bodies optimization: A novel meta-heuristic method. Comput. Struct. 2014, 139, 18–27.
  52. Zhao, W.; Wang, L.; Zhang, Z. Atom search optimization and its application to solve a hydrogeologic parameter estimation problem. Knowl. Based Syst. 2019, 163, 283–304.
  53. Kaveh, A.; Talatahari, S. A novel heuristic optimization method: Charged system search. Acta Mech. 2010, 213, 267–289.
  54. Kaveh, A.; Khayatazad, M. A new meta-heuristic method: Ray Optimization. Comput. Struct. 2012, 112–113, 283–294.
  55. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S. Transient search optimization: A new meta-heuristic optimization algorithm. Appl. Intell. 2020, 50, 3926–3941.
  56. Muthiah-Nakarajan, V.; Noel, M.M. Galactic Swarm Optimization: A new global optimization metaheuristic inspired by galactic motion. Appl. Soft Comput. J. 2016, 38, 771–787.
  57. Formato, R.A. Central force optimization: A new nature inspired computational framework for multidimensional search and optimization. Stud. Comput. Intell. 2008, 129, 221–238.
  58. Javidy, B.; Hatamlou, A.; Mirjalili, S. Ions motion algorithm for solving optimization problems. Appl. Soft Comput. J. 2015, 32, 72–79.
  59. Gholizadeh, S.; Danesh, M.; Gheyratmand, C. A new Newton metaheuristic algorithm for discrete performance-based design optimization of steel moment frames. Comput. Struct. 2020, 234, 106250.
  60. Ahmadianfar, I.; Bozorg-Haddad, O.; Chu, X. Gradient-based optimizer: A new metaheuristic optimization algorithm. Inf. Sci. 2020, 540, 131–159.
  61. Azizi, M. Atomic orbital search: A novel metaheuristic algorithm. Appl. Math. Model. 2021, 93, 657–683.
  62. Talatahari, S.; Azizi, M. Chaos Game Optimization: A novel metaheuristic algorithm. Artif. Intell. Rev. 2021, 54, 917–1004.
  63. van Laarhoven, P.J.M.; Aarts, E.H.L. (Eds.) Simulated Annealing: Theory and Applications; Springer: Dordrecht, The Netherlands, 1987; pp. 7–15. ISBN 978-94-015-7744-1.
  64. Hasanien, H.M.; Muyeen, S.M. Particle swarm optimization-based superconducting magnetic energy storage for low-voltage ride-through capability enhancement in wind energy conversion system. Electr. Power Compon. Syst. 2015, 43, 1278–1288.
  65. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S. Enhanced whale optimization algorithm for maximum power point tracking of variable-speed wind generators. Appl. Soft Comput. J. 2020, 86, 105937.
  66. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S. Augmented grey wolf optimizer for grid-connected PMSG-based wind energy conversion systems. Appl. Soft Comput. J. 2018, 69, 504–515.
  67. Qais, M.; Hasanien, H.M.; Alghuwainem, S. Salp swarm algorithm-based TS-FLCs for MPPT and fault ride-through capability enhancement of wind generators. ISA Trans. 2020, 101, 211–224.
  68. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S. A Grey Wolf Optimizer for Optimum Parameters of Multiple PI Controllers of a Grid-Connected PMSG Driven by Variable Speed Wind Turbine. IEEE Access 2018, 6, 44120–44128.
  69. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S. Whale optimization algorithm-based Sugeno fuzzy logic controller for fault ride-through improvement of grid-connected variable speed wind generators. Eng. Appl. Artif. Intell. 2020, 87, 103328.
  70. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S. Identification of electrical parameters for three-diode photovoltaic model using analytical and sunflower optimization algorithm. Appl. Energy 2019, 250, 109–117.
  71. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S.; Nouh, A.S. Coyote optimization algorithm for parameters extraction of three-diode photovoltaic models of photovoltaic modules. Energy 2019, 187, 116001.
  72. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S. Parameters extraction of three-diode photovoltaic model using computation and Harris Hawks optimization. Energy 2020, 195, 117040.
  73. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S. Transient search optimization for electrical parameters estimation of photovoltaic module based on datasheet values. Energy Convers. Manag. 2020, 214, 112904.
  74. Qais, M.H.; Hasanien, H.M.; Alghuwainem, S. Optimal transient search algorithm-based PI controllers for enhancing low voltage ride-through ability of grid-linked PMSG-based wind turbine. Electronics 2020, 9, 1807.
  75. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82.
  76. Jiang, Y.; Wu, Q.; Zhu, S.; Zhang, L. Orca predation algorithm: A novel bio-inspired algorithm for global optimization problems. Expert Syst. Appl. 2022, 188, 116026.
  77. Suyanto, S.; Ariyanto, A.A.; Ariyanto, A.F. Komodo Mlipir Algorithm. Appl. Soft Comput. 2022, 114, 108043.
  78. Li, C.; Chen, G.; Liang, G.; Luo, F.; Zhao, J.; Dong, Z.Y. Integrated optimization algorithm: A metaheuristic approach for complicated optimization. Inf. Sci. 2022, 586, 424–449.
  79. Abualigah, L.; Elaziz, M.A.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 2022, 191, 116158.
  80. Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. African vultures optimization algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems. Comput. Ind. Eng. 2021, 158, 107408.
  81. Jafari, M.; Salajegheh, E.; Salajegheh, J. Elephant clan optimization: A nature-inspired metaheuristic algorithm for the optimal design of structures. Appl. Soft Comput. 2021, 113, 107892.
  82. Feng, Z.; Niu, W.; Liu, S. Cooperation search algorithm: A novel metaheuristic evolutionary intelligence algorithm for numerical optimization and engineering optimization problems. Appl. Soft Comput. 2021, 98, 106734.
  83. Zhang, Y.; Jin, Z. Group teaching optimization algorithm: A novel metaheuristic method for solving global optimization problems. Expert Syst. Appl. 2020, 148, 113246.
  84. Zervoudakis, K.; Tsafarakis, S. A mayfly optimization algorithm. Comput. Ind. Eng. 2020, 145, 106559.
  85. Shabani, A.; Asgarian, B.; Salido, M.; Asil Gharebaghi, S. Search and rescue optimization algorithm: A new optimization method for solving constrained engineering optimization problems. Expert Syst. Appl. 2020, 161, 113698.
  86. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl. Based Syst. 2016, 96, 120–133.
Figure 1. Terminologies of the geometric circle.
Figure 2. The processes of the CSA algorithm: (a) exploitation; (b) exploration.
Figure 3. Flowchart of the CSA.
Figure 4. Convergence curves of the applied algorithms.
Figure 5. Statistical results using box plots for 30 independent runs.
Figure 6. Welded beam design.
Figure 7. Cylindrical vessel design.
Figure 8. Tension coil spring.
Table 1. State-of-the-art metaheuristic algorithms.

Algorithm | Name | Reference | Classification | Mimicking
Orca predation algorithm | OPA | [76] | Biology-based (hunters) | The orcas' hunting habit
Komodo mlipir algorithm | KMA | [77] | Biology-based | Komodo dragons' foraging and reproduction
Integrated optimization algorithm | IOA | [78] | Evolution-based | Follower search, leader search, wanderer search, crossover search, and role learning
Reptile search algorithm | RSA | [79] | Biology-based (hunters) | Crocodiles' hunting habit
African vultures optimization | AVOA | [80] | Biology-based | African vultures' feeding and navigational behaviors
Elephant clan optimization | ECO | [81] | Biology-based | Elephants' clan behavior
Cooperation search algorithm | CoSA | [82] | Human-learning-based | The behaviors of teamwork in contemporary business
Group teaching optimization | GTOA | [83] | Human-learning-based | The relationship between an instructor and their pupils
Mayfly algorithm | MA | [84] | Biology-based | Mayfly flying and mating behavior
Search and rescue optimization algorithm | SAR | [85] | Human-learning-based | Human behavior during search and rescue missions
Table 2. Unimodal standard famous functions.

Function | Expression | Dimension (d) | Solution Space | Best Solution
F1 | $f(x) = \sum_{i=1}^{d} x_i^2$ | 30, 100, 1000 | $[-100, 100]^d$ | 0
F2 | $f(x) = \sum_{i=1}^{d} |x_i| + \prod_{i=1}^{d} |x_i|$ | 30, 100, 1000 | $[-10, 10]^d$ | 0
F3 | $f(x) = \sum_{i=1}^{d} \big(\sum_{j=1}^{i} x_j\big)^2$ | 30, 100, 1000 | $[-100, 100]^d$ | 0
F4 | $f(x) = \max_i \{ |x_i|, 1 \le i \le d \}$ | 30, 100, 1000 | $[-100, 100]^d$ | 0
F5 | $f(x) = \sum_{i=1}^{d-1} [100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2]$ | 30, 100, 1000 | $[-30, 30]^d$ | 0
F6 | $f(x) = \sum_{i=1}^{d} (\lfloor x_i + 0.5 \rfloor)^2$ | 30, 100, 1000 | $[-100, 100]^d$ | 0
F7 | $f(x) = \sum_{i=1}^{d} i x_i^4 + \mathrm{random}[0, 1)$ | 30, 100, 1000 | $[-1.28, 1.28]^d$ | 0
Table 3. Multimodal standard famous functions.

Function | Expression | Dimension (d) | Solution Space | Best Solution
F8 | $f(x) = \sum_{i=1}^{d} -x_i \sin(\sqrt{|x_i|})$ | 30, 100, 1000 | $[-500, 500]^d$ | $-418.9829 \times d$
F9 | $f(x) = \sum_{i=1}^{d} [x_i^2 - 10\cos(2\pi x_i) + 10]$ | 30, 100, 1000 | $[-5.12, 5.12]^d$ | 0
F10 | $f(x) = -20 \exp\big(-0.2 \sqrt{\tfrac{1}{d} \sum_{j=1}^{d} x_j^2}\big) - \exp\big(\tfrac{1}{d} \sum_{j=1}^{d} \cos(2\pi x_j)\big) + 20 + e$ | 30, 100, 1000 | $[-32, 32]^d$ | 0
F11 | $f(x) = \tfrac{1}{4000} \sum_{i=1}^{d} x_i^2 - \prod_{i=1}^{d} \cos\big(\tfrac{x_i}{\sqrt{i}}\big) + 1$ | 30, 100, 1000 | $[-600, 600]^d$ | 0
F12 | $f(x) = \tfrac{\pi}{d} \{ 10\sin^2(\pi y_1) + \sum_{i=1}^{d-1} (y_i - 1)^2 [1 + 10\sin^2(\pi y_{i+1})] + (y_d - 1)^2 \} + \sum_{i=1}^{d} u(x_i, 10, 100, 4)$, with $y_i = 1 + \tfrac{x_i + 1}{4}$ and $u(x_i, a, k, m) = \begin{cases} k(x_i - a)^m & x_i > a \\ 0 & -a \le x_i \le a \\ k(-x_i - a)^m & x_i < -a \end{cases}$ | 30, 100, 1000 | $[-50, 50]^d$ | 0
F13 | $f(x) = 0.1 \{ 10\sin^2(3\pi x_1) + \sum_{i=1}^{d-1} (x_i - 1)^2 [1 + \sin^2(3\pi x_{i+1})] + (x_d - 1)^2 [1 + \sin^2(2\pi x_d)] \} + \sum_{i=1}^{d} u(x_i, 5, 100, 4)$ | 30, 100, 1000 | $[-50, 50]^d$ | 0
Table 4. Fixed-dimension multimodal standard famous functions.

Function | Expression | Dimension (d) | Solution Space | Best Solution
F14 | $f(x) = \big( \tfrac{1}{500} + \sum_{j=1}^{25} \tfrac{1}{j + \sum_{i=1}^{2} (x_i - a_{ij})^6} \big)^{-1}$ | 2 | $[-65, 65]^d$ | 1
F15 | $f(x) = \sum_{i=1}^{11} \big[ a_i - \tfrac{x_1 (b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4} \big]^2$ | 4 | $[-5, 5]^d$ | 0.00030
F16 | $f(x) = 4x_1^2 - 2.1x_1^4 + \tfrac{1}{3}x_1^6 + x_1 x_2 - 4x_2^2 + 4x_2^4$ | 2 | $[-5, 5]^d$ | −1.0316
F17 | $f(x) = \big( x_2 - \tfrac{5.1}{4\pi^2} x_1^2 + \tfrac{5}{\pi} x_1 - 6 \big)^2 + 10 \big( 1 - \tfrac{1}{8\pi} \big) \cos(x_1) + 10$ | 2 | $[-5, 5]^d$ | 0.398
F18 | $f(x) = [1 + (x_1 + x_2 + 1)^2 (19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1 x_2 + 3x_2^2)] \times [30 + (2x_1 - 3x_2)^2 (18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1 x_2 + 27x_2^2)]$ | 2 | $[-2, 2]^d$ | 3
F19 | $f(x) = -\sum_{i=1}^{4} c_i \exp\big( -\sum_{j=1}^{3} a_{ij} (x_j - p_{ij})^2 \big)$ | 3 | $[1, 3]^d$ | −3.86
F20 | $f(x) = -\sum_{i=1}^{4} c_i \exp\big( -\sum_{j=1}^{6} a_{ij} (x_j - p_{ij})^2 \big)$ | 6 | $[0, 1]^d$ | −3.32
F21 | $f(x) = -\sum_{i=1}^{5} [(X - a_i)(X - a_i)^T + c_i]^{-1}$ | 4 | $[0, 10]^d$ | −10.1532
F22 | $f(x) = -\sum_{i=1}^{7} [(X - a_i)(X - a_i)^T + c_i]^{-1}$ | 4 | $[0, 10]^d$ | −10.4028
F23 | $f(x) = -\sum_{i=1}^{10} [(X - a_i)(X - a_i)^T + c_i]^{-1}$ | 4 | $[0, 10]^d$ | −10.5363
Table 5. Parameters of the applied algorithms.

Algorithm | Parameters
Proposed CSA | w decreased from 1.5 to 0; constant c = 0.75 for F1–F13 and c = 0.3 for F14–F23
PSO | Inertia weight w decreased from 0.5 to 0.3; c1 = 2; c2 = 2
GWO | The parameter a changed from 2 to 0
SSA | Probability update was 0.5
SCA | Constant a = 2; probability update was 0.5
WOA | The parameter a changed from 2 to 0; b = 1
HHO | The decreasing energy E1 changed from 2 to 0
CGO | α, β, and γ were random numbers
TSO | The parameter a changed from 2 to 0
Table 6. Statistical results of the optimized standard functions with d = 30.
Table 6. Statistical results of the optimized standard functions with d = 30.
Function | Test | GWO | SCA | SSA | HHO | WOA | PSO | TSO | CGO | CSA
F1 | Avg. | 5.7838 × 10^−38 | 6.5806 × 10^−11 | 9.5464 × 10^−8 | 1.4759 × 10^−109 | 6.6351 × 10^−69 | 4.6281 × 10^−7 | 1.0826 × 10^−3 | 1.1934 × 10^−136 | 9.5326 × 10^−219
F1 | STD. | 9.1734 × 10^−38 | 2.7669 × 10^−10 | 5.2552 × 10^−8 | 8.0839 × 10^−109 | 3.6342 × 10^−68 | 1.0552 × 10^−6 | 3.9713 × 10^−3 | 4.7846 × 10^−136 | 0.0000 × 10^0
F1 | Min | 1.1944 × 10^−40 | 1.9641 × 10^−15 | 3.3386 × 10^−8 | 0.0000 × 10^0 | 6.6909 × 10^−86 | 2.3249 × 10^−10 | 8.9536 × 10^−8 | 0.0000 × 10^0 | 2.9648 × 10^−278
F2 | Avg. | 1.8357 × 10^−22 | 3.1710 × 10^−8 | 2.6003 × 10^−1 | 3.4043 × 10^−53 | 6.7368 × 10^−50 | 5.3551 × 10^−4 | 8.4374 × 10^−6 | 1.7566 × 10^−71 | 1.3380 × 10^−92
F2 | STD. | 3.6643 × 10^−22 | 4.5318 × 10^−8 | 1.8497 × 10^−1 | 1.2317 × 10^−52 | 3.3959 × 10^−49 | 8.9901 × 10^−4 | 1.2036 × 10^−5 | 7.9508 × 10^−71 | 7.3287 × 10^−92
F2 | Min | 1.1603 × 10^−23 | 1.1753 × 10^−9 | 4.5012 × 10^−3 | 9.7243 × 10^−172 | 1.1799 × 10^−57 | 2.8776 × 10^−6 | 6.5020 × 10^−7 | 0.0000 × 10^0 | 3.4777 × 10^−140
F3 | Avg. | 1.2328 × 10^−8 | 1.0265 × 10^−7 | 3.0191 × 10^2 | 5.3195 × 10^−78 | 1.7475 × 10^−1 | 1.0393 × 10^3 | 3.9684 × 10^2 | 1.0135 × 10^−98 | 3.5072 × 10^−192
F3 | STD. | 3.2665 × 10^−8 | 3.1441 × 10^−7 | 4.7355 × 10^2 | 2.0975 × 10^−77 | 4.9298 × 10^−1 | 9.4985 × 10^2 | 1.0181 × 10^3 | 2.6061 × 10^−98 | 0.0000 × 10^0
F3 | Min | 3.4277 × 10^−12 | 9.8383 × 10^−13 | 1.1474 × 10^0 | 4.6827 × 10^−103 | 7.1459 × 10^−8 | 3.8895 × 10^−2 | 4.5120 × 10^−4 | 0.0000 × 10^0 | 1.2646 × 10^−274
F4 | Avg. | 5.7254 × 10^−1 | 1.0814 × 10^0 | 1.0281 × 10^0 | 1.5750 × 10^−119 | 4.8675 × 10^−2 | 3.0927 × 10^0 | 7.2259 × 10^−1 | 6.5523 × 10^−59 | 1.2504 × 10^−98
F4 | STD. | 1.0996 × 10^0 | 2.5084 × 10^0 | 8.5943 × 10^−1 | 8.6266 × 10^−119 | 1.3524 × 10^−1 | 2.9911 × 10^0 | 6.7465 × 10^−1 | 1.2075 × 10^−58 | 6.8486 × 10^−98
F4 | Min | 4.2053 × 10^−8 | 9.9628 × 10^−5 | 2.8062 × 10^−2 | 8.0269 × 10^−181 | 6.5725 × 10^−6 | 2.8062 × 10^−2 | 6.0508 × 10^−3 | 0.0000 × 10^0 | 3.0109 × 10^−139
F5 | Avg. | 2.5381 × 10^1 | 7.0200 × 10^1 | 2.5675 × 10^1 | 1.8902 × 10^−6 | 8.5474 × 10^0 | 1.9573 × 10^1 | 1.1420 × 10^1 | 1.5351 × 10^1 | 0.0000 × 10^0
F5 | STD. | 1.5260 × 10^1 | 1.3659 × 10^2 | 2.4616 × 10^1 | 1.0192 × 10^−5 | 1.2854 × 10^1 | 2.3932 × 10^1 | 1.2831 × 10^1 | 5.4322 × 10^0 | 0.0000 × 10^0
F5 | Min | 7.5169 × 10^−2 | 7.5169 × 10^−2 | 9.3447 × 10^−4 | 0.0000 × 10^0 | 2.3611 × 10^−7 | 4.2949 × 10^−5 | 4.6622 × 10^−3 | 3.2114 × 10^−6 | 0.0000 × 10^0
F6 | Avg. | 5.4155 × 10^−1 | 5.3173 × 10^0 | 2.0211 × 10^−7 | 2.8436 × 10^−7 | 1.3848 × 10^−2 | 4.7315 × 10^−7 | 4.8451 × 10^−1 | 1.8244 × 10^−18 | 0.0000 × 10^0
F6 | STD. | 3.5419 × 10^−1 | 8.0594 × 10^−1 | 3.5045 × 10^−7 | 1.4406 × 10^−6 | 2.4111 × 10^−2 | 8.7532 × 10^−7 | 4.2675 × 10^−1 | 5.9259 × 10^−18 | 0.0000 × 10^0
F6 | Min | 2.8333 × 10^−4 | 1.2272 × 10^0 | 2.4022 × 10^−8 | 0.0000 × 10^0 | 7.5679 × 10^−8 | 5.0032 × 10^−9 | 3.5827 × 10^−2 | 6.2486 × 10^−24 | 0.0000 × 10^0
F7 | Avg. | 1.3202 × 10^−3 | 4.7389 × 10^−4 | 4.0651 × 10^−2 | 1.0839 × 10^−4 | 1.2714 × 10^−3 | 2.7046 × 10^−2 | 2.5284 × 10^−3 | 4.6345 × 10^−4 | 3.8180 × 10^−4
F7 | STD. | 6.4299 × 10^−4 | 4.4640 × 10^−4 | 4.0466 × 10^−2 | 1.0664 × 10^−4 | 2.4965 × 10^−3 | 2.5014 × 10^−2 | 2.8216 × 10^−3 | 3.4197 × 10^−4 | 6.6990 × 10^−4
F7 | Min | 2.8985 × 10^−4 | 4.5957 × 10^−6 | 2.1466 × 10^−3 | 1.3803 × 10^−5 | 7.2309 × 10^−7 | 7.7340 × 10^−4 | 1.1353 × 10^−4 | 7.3420 × 10^−5 | 2.6012 × 10^−5
F8 | Avg. | −1.1269 × 10^4 | −1.1117 × 10^4 | −1.2096 × 10^4 | −1.2569 × 10^4 | −1.2427 × 10^4 | −1.2071 × 10^4 | −1.2223 × 10^4 | −1.2451 × 10^4 | −1.2569 × 10^4
F8 | STD. | 1.6040 × 10^3 | 1.5893 × 10^3 | 1.2285 × 10^3 | 2.3541 × 10^−9 | 6.5709 × 10^2 | 1.2389 × 10^3 | 9.5254 × 10^2 | 6.4871 × 10^2 | 1.9404 × 10^−12
F8 | Min | −1.2557 × 10^4 | −1.2551 × 10^4 | −1.2569 × 10^4 | −1.2569 × 10^4 | −1.2569 × 10^4 | −1.2569 × 10^4 | −1.2569 × 10^4 | −1.2569 × 10^4 | −1.2569 × 10^4
F9 | Avg. | 4.1807 × 10^1 | 4.0559 × 10^1 | 9.9496 × 10^0 | 0.0000 × 10^0 | 1.8948 × 10^−15 | 3.0214 × 10^1 | 2.2908 × 10^0 | 0.0000 × 10^0 | 0.0000 × 10^0
F9 | STD. | 3.5269 × 10^1 | 4.2016 × 10^1 | 1.4311 × 10^1 | 0.0000 × 10^0 | 1.0378 × 10^−14 | 3.3687 × 10^1 | 1.2547 × 10^1 | 0.0000 × 10^0 | 0.0000 × 10^0
F9 | Min | 0.0000 × 10^0 | 0.0000 × 10^0 | 2.3251 × 10^−8 | 0.0000 × 10^0 | 0.0000 × 10^0 | 2.7281 × 10^−8 | 1.7390 × 10^−8 | 0.0000 × 10^0 | 0.0000 × 10^0
F10 | Avg. | 1.1978 × 10^−1 | 2.5187 × 10^−1 | 9.4409 × 10^−1 | 8.8818 × 10^−16 | 5.0330 × 10^−15 | 9.1797 × 10^−1 | 1.3331 × 10^−1 | 2.7830 × 10^−15 | 8.8818 × 10^−16
F10 | STD. | 6.5604 × 10^−1 | 9.5974 × 10^−1 | 1.2595 × 10^0 | 0.0000 × 10^0 | 2.9626 × 10^−15 | 1.9652 × 10^0 | 7.2493 × 10^−1 | 1.8027 × 10^−15 | 0.0000 × 10^0
F10 | Min | 7.9936 × 10^−15 | 6.1332 × 10^−9 | 4.1912 × 10^−5 | 8.8818 × 10^−16 | 8.8818 × 10^−16 | 8.9230 × 10^−6 | 7.4523 × 10^−6 | 8.8818 × 10^−16 | 8.8818 × 10^−16
F11 | Avg. | 3.7391 × 10^−3 | 2.1526 × 10^−7 | 1.8495 × 10^−2 | 0.0000 × 10^0 | 0.0000 × 10^0 | 1.1234 × 10^−2 | 6.6144 × 10^−2 | 0.0000 × 10^0 | 0.0000 × 10^0
F11 | STD. | 1.0062 × 10^−2 | 1.1396 × 10^−6 | 1.4792 × 10^−2 | 0.0000 × 10^0 | 0.0000 × 10^0 | 1.3511 × 10^−2 | 1.5859 × 10^−1 | 0.0000 × 10^0 | 0.0000 × 10^0
F11 | Min | 0.0000 × 10^0 | 3.2196 × 10^−15 | 4.8962 × 10^−4 | 0.0000 × 10^0 | 0.0000 × 10^0 | 2.9305 × 10^−9 | 1.8765 × 10^−8 | 0.0000 × 10^0 | 0.0000 × 10^0
F12 | Avg. | 1.2064 × 10^0 | 1.4563 × 10^0 | 1.9961 × 10^−2 | 1.0540 × 10^−8 | 3.7369 × 10^−4 | 7.9748 × 10^−1 | 7.4001 × 10^−3 | 1.3909 × 10^−20 | 1.5705 × 10^−32
F12 | STD. | 2.5890 × 10^0 | 2.4947 × 10^0 | 7.7080 × 10^−2 | 3.8589 × 10^−8 | 1.1984 × 10^−3 | 1.7667 × 10^0 | 1.2526 × 10^−2 | 6.6394 × 10^−20 | 5.5674 × 10^−48
F12 | Min | 1.6660 × 10^−5 | 2.1551 × 10^−4 | 4.8395 × 10^−7 | 6.3387 × 10^−21 | 2.1843 × 10^−9 | 3.6269 × 10^−13 | 1.3758 × 10^−6 | 1.0149 × 10^−25 | 1.5705 × 10^−32
F13 | Avg. | 2.3921 × 10^−1 | 1.8630 × 10^0 | 4.0527 × 10^−1 | 7.6694 × 10^−7 | 2.4754 × 10^−3 | 2.3928 × 10^−1 | 1.5123 × 10^−1 | 2.2315 × 10^−2 | 1.3498 × 10^−32
F13 | STD. | 2.3561 × 10^−1 | 9.1497 × 10^−1 | 1.5566 × 10^0 | 3.5053 × 10^−6 | 6.0994 × 10^−3 | 1.2677 × 10^0 | 1.5863 × 10^−1 | 5.2790 × 10^−2 | 5.5674 × 10^−48
F13 | Min | 5.3035 × 10^−4 | 3.4038 × 10^−2 | 7.9766 × 10^−6 | 1.3498 × 10^−32 | 2.8468 × 10^−9 | 1.1833 × 10^−10 | 6.0890 × 10^−5 | 8.1780 × 10^−24 | 1.3498 × 10^−32
F14 | Avg. | 3.2156 × 10^0 | 1.9345 × 10^0 | 1.0643 × 10^0 | 9.9800 × 10^−1 | 9.9800 × 10^−1 | 1.0311 × 10^0 | 9.9800 × 10^−1 | 9.9800 × 10^−1 | 9.9800 × 10^−1
F14 | STD. | 3.8889 × 10^0 | 9.9707 × 10^−1 | 2.5219 × 10^−1 | 1.5699 × 10^−10 | 1.3046 × 10^−8 | 1.8148 × 10^−1 | 1.7835 × 10^−7 | 0.0000 × 10^0 | 1.7494 × 10^−16
F14 | Min | 9.9800 × 10^−1 | 9.9800 × 10^−1 | 9.9800 × 10^−1 | 9.9800 × 10^−1 | 9.9800 × 10^−1 | 9.9800 × 10^−1 | 9.9800 × 10^−1 | 9.9800 × 10^−1 | 9.9800 × 10^−1
F15 | Avg. | 4.0459 × 10^−4 | 5.7433 × 10^−4 | 6.4506 × 10^−4 | 3.2725 × 10^−4 | 3.2225 × 10^−4 | 4.6375 × 10^−4 | 3.7770 × 10^−4 | 3.3801 × 10^−4 | 3.0806 × 10^−4
F15 | STD. | 1.1637 × 10^−4 | 2.5944 × 10^−4 | 3.3345 × 10^−4 | 2.2421 × 10^−5 | 2.5146 × 10^−5 | 2.4872 × 10^−4 | 1.7279 × 10^−4 | 1.6718 × 10^−4 | 7.8742 × 10^−7
F15 | Min | 3.0749 × 10^−4 | 3.3338 × 10^−4 | 3.0769 × 10^−4 | 3.0751 × 10^−4 | 3.0784 × 10^−4 | 3.0749 × 10^−4 | 3.0758 × 10^−4 | 3.0749 × 10^−4 | 3.0749 × 10^−4
F16 | Avg. | −1.0316 × 10^0 | −1.0315 × 10^0 | −1.0316 × 10^0 | −1.0316 × 10^0 | −1.0316 × 10^0 | −1.0316 × 10^0 | −1.0316 × 10^0 | −1.0316 × 10^0 | −1.0316 × 10^0
F16 | STD. | 7.0549 × 10^−8 | 8.7695 × 10^−5 | 4.0464 × 10^−14 | 3.0122 × 10^−9 | 3.6944 × 10^−9 | 6.6486 × 10^−16 | 1.5990 × 10^−5 | 6.7752 × 10^−16 | 4.6137 × 10^−9
F16 | Min | −1.0316 × 10^0 | −1.0316 × 10^0 | −1.0316 × 10^0 | −1.0316 × 10^0 | −1.0316 × 10^0 | −1.0316 × 10^0 | −1.0316 × 10^0 | −1.0316 × 10^0 | −1.0316 × 10^0
F17 | Avg. | 3.9789 × 10^−1 | 4.0100 × 10^−1 | 3.9789 × 10^−1 | 3.9789 × 10^−1 | 3.9790 × 10^−1 | 3.9789 × 10^−1 | 3.9791 × 10^−1 | 3.9789 × 10^−1 | 3.9789 × 10^−1
F17 | STD. | 5.3714 × 10^−6 | 3.2491 × 10^−3 | 7.3008 × 10^−15 | 5.2004 × 10^−6 | 1.7830 × 10^−5 | 0.0000 × 10^0 | 2.2182 × 10^−5 | 0.0000 × 10^0 | 5.2398 × 10^−8
F17 | Min | 3.9789 × 10^−1 | 3.9811 × 10^−1 | 3.9789 × 10^−1 | 3.9789 × 10^−1 | 3.9789 × 10^−1 | 3.9789 × 10^−1 | 3.9789 × 10^−1 | 3.9789 × 10^−1 | 3.9789 × 10^−1
F18 | Avg. | 3.0000 × 10^0 | 3.0000 × 10^0 | 3.0000 × 10^0 | 3.0000 × 10^0 | 3.0001 × 10^0 | 3.0000 × 10^0 | 2.9715 × 10^1 | 3.0000 × 10^0 | 3.0000 × 10^0
F18 | STD. | 4.1823 × 10^−5 | 3.5195 × 10^−5 | 1.8243 × 10^−13 | 6.0229 × 10^−7 | 4.6026 × 10^−4 | 9.9301 × 10^−16 | 5.0596 × 10^0 | 9.3663 × 10^−16 | 4.9599 × 10^−6
F18 | Min | 3.0000 × 10^0 | 3.0000 × 10^0 | 3.0000 × 10^0 | 3.0000 × 10^0 | 3.0000 × 10^0 | 3.0000 × 10^0 | 3.0032 × 10^0 | 3.0000 × 10^0 | 3.0000 × 10^0
F19 | Avg. | −3.8616 × 10^0 | −3.8507 × 10^0 | −3.8628 × 10^0 | −3.8583 × 10^0 | −3.8518 × 10^0 | −3.8628 × 10^0 | −3.8040 × 10^0 | −3.8628 × 10^0 | −3.8625 × 10^0
F19 | STD. | 2.3429 × 10^−3 | 7.7056 × 10^−3 | 1.6209 × 10^−11 | 6.6567 × 10^−3 | 1.7466 × 10^−2 | 2.6684 × 10^−15 | 2.0896 × 10^−1 | 2.7101 × 10^−15 | 1.4192 × 10^−3
F19 | Min | −3.8628 × 10^0 | −3.8605 × 10^0 | −3.8628 × 10^0 | −3.8628 × 10^0 | −3.8628 × 10^0 | −3.8628 × 10^0 | −3.8628 × 10^0 | −3.8628 × 10^0 | −3.8628 × 10^0
F20 | Avg. | −3.3219 × 10^0 | −2.6135 × 10^0 | −3.2161 × 10^0 | −3.0656 × 10^0 | −3.3118 × 10^0 | −3.2638 × 10^0 | −3.3133 × 10^0 | −3.2784 × 10^0 | −3.3059 × 10^0
F20 | STD. | 2.2933 × 10^−5 | 3.1389 × 10^−1 | 5.5503 × 10^−2 | 1.0806 × 10^−1 | 3.4871 × 10^−2 | 7.1334 × 10^−2 | 9.1886 × 10^−3 | 5.8273 × 10^−2 | 4.1813 × 10^−2
F20 | Min | −3.3220 × 10^0 | −3.0564 × 10^0 | −3.3220 × 10^0 | −3.1993 × 10^0 | −3.3217 × 10^0 | −3.3220 × 10^0 | −3.3211 × 10^0 | −3.3220 × 10^0 | −3.3220 × 10^0
F21 | Avg. | −8.3916 × 10^0 | −6.8959 × 10^0 | −1.0153 × 10^1 | −9.8130 × 10^0 | −9.8109 × 10^0 | −8.9697 × 10^0 | −1.0128 × 10^1 | −1.0153 × 10^1 | −1.0153 × 10^1
F21 | STD. | 2.1174 × 10^0 | 2.2821 × 10^0 | 5.5169 × 10^−11 | 1.2933 × 10^0 | 1.2928 × 10^0 | 2.1819 × 10^0 | 2.7575 × 10^−2 | 6.7923 × 10^−15 | 1.4067 × 10^−7
F21 | Min | −1.0152 × 10^1 | −1.0152 × 10^1 | −1.0153 × 10^1 | −1.0153 × 10^1 | −1.0153 × 10^1 | −1.0153 × 10^1 | −1.0153 × 10^1 | −1.0153 × 10^1 | −1.0153 × 10^1
F22 | Avg. | −8.9990 × 10^0 | −7.2308 × 10^0 | −1.0227 × 10^1 | −9.8713 × 10^0 | −9.8701 × 10^0 | −8.3816 × 10^0 | −1.0391 × 10^1 | −1.0403 × 10^1 | −1.0403 × 10^1
F22 | STD. | 2.2109 × 10^0 | 2.4640 × 10^0 | 9.6292 × 10^−1 | 1.6218 × 10^0 | 1.6214 × 10^0 | 2.7339 × 10^0 | 1.0467 × 10^−2 | 1.3601 × 10^−15 | 3.3477 × 10^−5
F22 | Min | −1.0402 × 10^1 | −1.0394 × 10^1 | −1.0403 × 10^1 | −1.0403 × 10^1 | −1.0403 × 10^1 | −1.0403 × 10^1 | −1.0403 × 10^1 | −1.0403 × 10^1 | −1.0403 × 10^1
F23 | Avg. | −8.6580 × 10^0 | −7.2350 × 10^0 | −9.8216 × 10^0 | −1.0175 × 10^1 | −1.0354 × 10^1 | −8.7432 × 10^0 | −1.0518 × 10^1 | −1.0358 × 10^1 | −1.0536 × 10^1
F23 | STD. | 2.5993 × 10^0 | 2.5826 × 10^0 | 1.8535 × 10^0 | 1.3719 × 10^0 | 9.8705 × 10^−1 | 2.5794 × 10^0 | 2.0234 × 10^−2 | 9.7874 × 10^−1 | 2.7028 × 10^−5
F23 | Min | −1.0536 × 10^1 | −1.0536 × 10^1 | −1.0536 × 10^1 | −1.0536 × 10^1 | −1.0536 × 10^1 | −1.0536 × 10^1 | −1.0536 × 10^1 | −1.0536 × 10^1 | −1.0536 × 10^1
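Each Avg./STD./Min entry above aggregates the best fitness found in the independent runs. A minimal sketch of the reported statistics follows; the sample standard deviation (ddof = 1) is an assumption, since the paper does not state its convention.

```python
import numpy as np

def summarize(best_fitness_per_run):
    # Aggregates one algorithm's per-run best fitness into the three
    # statistics reported in Tables 6, 8, and 9.
    f = np.asarray(best_fitness_per_run, dtype=float)
    return {"Avg.": f.mean(), "STD.": f.std(ddof=1), "Min": f.min()}

# Illustrative dummy values only:
print(summarize([3.1e-219, 9.5e-219, 1.2e-220]))
```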
Table 7. The p-values of the null hypothesis t-test of CSA versus other algorithms.
Function | GWO | SCA | SSA | HHO | WOA | PSO | TSO | CGO
F1 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 6.0350 × 10^−3 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.9209 × 10^−6
F2 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 5.0383 × 10^−1 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 2.3534 × 10^−6
F3 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 3.1817 × 10^−6
F4 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 3.7243 × 10^−5 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 2.3534 × 10^−6
F5 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.5625 × 10^−2 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6
F6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.3183 × 10^−4 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6
F7 | 5.3070 × 10^−5 | 6.5641 × 10^−2 | 1.7344 × 10^−6 | 1.7088 × 10^−3 | 3.3269 × 10^−2 | 1.7344 × 10^−6 | 5.7924 × 10^−5 | 1.3194 × 10^−2
F8 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 6.2500 × 10^−2 | 2.5631 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.2383 × 10^−6
F9 | 2.5631 × 10^−6 | 2.5596 × 10^−6 | 1.7344 × 10^−6 | 1.0000 × 10^0 | 1.0000 × 10^0 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.0000 × 10^0
F10 | 1.4383 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.0000 × 10^0 | 2.1912 × 10^−5 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 6.3342 × 10^−5
F11 | 1.2500 × 10^−1 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.0000 × 10^0 | 1.0000 × 10^0 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.0000 × 10^0
F12 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6
F13 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 2.7016 × 10^−5 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7300 × 10^−6
F14 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.0231 × 10^−5 | 1.8435 × 10^−4 | 1.7344 × 10^−6 | 1.0000 × 10^0 | 1.7344 × 10^−6 | 5.0000 × 10^−1
F15 | 2.8434 × 10^−5 | 1.7344 × 10^−6 | 1.9209 × 10^−6 | 6.3391 × 10^−6 | 2.8786 × 10^−6 | 3.3173 × 10^−4 | 3.5152 × 10^−6 | 3.1123 × 10^−5
F16 | 3.8822 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 8.9443 × 10^−4 | 4.1140 × 10^−3 | 1.7344 × 10^−6 | 2.1266 × 10^−6 | 1.7344 × 10^−6
F17 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 5.2165 × 10^−6 | 1.9209 × 10^−6 | 1.7344 × 10^−6 | 1.9209 × 10^−6 | 1.7344 × 10^−6
F18 | 3.0650 × 10^−4 | 1.2453 × 10^−2 | 1.7344 × 10^−6 | 2.1630 × 10^−5 | 1.1499 × 10^−4 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6
F19 | 1.4936 × 10^−5 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 9.3157 × 10^−6 | 1.1265 × 10^−5 | 1.7344 × 10^−6 | 2.3534 × 10^−6 | 1.7344 × 10^−6
F20 | 1.5658 × 10^−2 | 1.7344 × 10^−6 | 5.7924 × 10^−5 | 1.7344 × 10^−6 | 1.1748 × 10^−2 | 3.0861 × 10^−1 | 1.4795 × 10^−2 | 7.3433 × 10^−1
F21 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.3591 × 10^−1 | 2.6033 × 10^−6 | 6.6858 × 10^−1 | 1.7344 × 10^−6 | 6.0496 × 10^−7
F22 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 3.1123 × 10^−5 | 1.7988 × 10^−5 | 9.3157 × 10^−6 | 3.8202 × 10^−1 | 2.1266 × 10^−6 | 1.7300 × 10^−6
F23 | 1.7344 × 10^−6 | 1.7344 × 10^−6 | 1.4795 × 10^−2 | 1.9729 × 10^−5 | 3.1123 × 10^−5 | 6.4352 × 10^−1 | 1.7344 × 10^−6 | 3.1123 × 10^−5
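A hedged sketch of the per-function significance test behind Table 7 is given below; the caption names a null-hypothesis t-test, so a two-sample t-test is used here, with p < 0.05 read as a statistically significant difference.

```python
import numpy as np
from scipy import stats

def p_value_vs_csa(csa_runs, rival_runs):
    # Welch's two-sample t-test between CSA's per-run best fitness and a
    # rival algorithm's; the exact test variant used in the paper is not
    # stated, so this is an assumption.
    _, p = stats.ttest_ind(csa_runs, rival_runs, equal_var=False)
    return p

# Illustrative dummy per-run best-fitness samples:
rng = np.random.default_rng(0)
print(p_value_vs_csa(rng.normal(0.0, 1e-3, 30), rng.normal(0.5, 1e-1, 30)))
```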
Table 8. Statistical results of the optimized standard functions with d = 100.
Function | Test | CSA | PSO | SSA | SCA | GWO
F1 | Best | 4.05207 × 10^−66 | 1.40909 × 10^−2 | 8.41001 × 10^−2 | 7.65220 × 10^−5 | 1.15671 × 10^−16
F1 | Mean | 9.49793 × 10^−37 | 3.23392 × 10^1 | 1.98479 × 10^1 | 4.22900 × 10^−1 | 2.72480 × 10^−14
F1 | Std | 5.20184 × 10^−36 | 4.83756 × 10^1 | 2.80619 × 10^1 | 9.12128 × 10^−1 | 3.45607 × 10^−14
F2 | Best | 1.79982 × 10^−33 | 1.51649 × 10^−1 | 3.28652 × 10^−1 | 9.66793 × 10^−5 | 4.82345 × 10^−10
F2 | Mean | 2.92106 × 10^−22 | 3.89166 × 10^0 | 4.75249 × 10^0 | 3.31250 × 10^−3 | 6.43809 × 10^−9
F2 | Std | 1.53426 × 10^−21 | 4.88194 × 10^0 | 3.71882 × 10^0 | 3.35433 × 10^−3 | 4.85863 × 10^−9
F3 | Best | 6.77086 × 10^−66 | 2.43120 × 10^1 | 2.58477 × 10^2 | 3.09196 × 10^−4 | 5.83050 × 10^−1
F3 | Mean | 1.18525 × 10^−40 | 8.05289 × 10^4 | 2.33034 × 10^4 | 4.24039 × 10^0 | 1.82227 × 10^2
F3 | Std | 6.39505 × 10^−40 | 3.84529 × 10^4 | 3.16829 × 10^4 | 6.62382 × 10^0 | 2.88485 × 10^2
F4 | Best | 5.64112 × 10^−36 | 1.97453 × 10^−1 | 1.96642 × 10^−1 | 1.97453 × 10^−1 | 1.97453 × 10^−1
F4 | Mean | 3.81723 × 10^−22 | 2.62001 × 10^0 | 9.88436 × 10^−1 | 2.62001 × 10^0 | 2.59439 × 10^0
F4 | Std | 1.83964 × 10^−21 | 2.72100 × 10^0 | 7.22522 × 10^−1 | 2.72100 × 10^0 | 2.71903 × 10^0
F5 | Best | 0.00000 × 10^0 | 8.09021 × 10^−1 | 1.15830 × 10^0 | 1.80961 × 10^0 | 1.80961 × 10^0
F5 | Mean | 0.00000 × 10^0 | 3.38351 × 10^3 | 4.75416 × 10^2 | 4.89317 × 10^3 | 1.26269 × 10^2
F5 | Std | 0.00000 × 10^0 | 1.14975 × 10^4 | 9.22158 × 10^2 | 1.39892 × 10^4 | 1.65257 × 10^2
F6 | Best | 0.00000 × 10^0 | 9.17634 × 10^−5 | 3.55921 × 10^−4 | 1.76154 × 10^−3 | 8.62047 × 10^−4
F6 | Mean | 0.00000 × 10^0 | 3.30019 × 10^1 | 1.32821 × 10^1 | 2.93302 × 10^1 | 7.17494 × 10^0
F6 | Std | 0.00000 × 10^0 | 5.13931 × 10^1 | 1.86246 × 10^1 | 2.88694 × 10^1 | 3.74740 × 10^0
F7 | Best | 4.23203 × 10^−6 | 1.54553 × 10^−2 | 4.83797 × 10^−3 | 3.57837 × 10^−4 | 3.66995 × 10^−3
F7 | Mean | 4.00859 × 10^−4 | 1.86225 × 10^−1 | 1.41949 × 10^−1 | 1.01286 × 10^−1 | 8.42239 × 10^−3
F7 | Std | 4.78167 × 10^−4 | 1.75616 × 10^−1 | 1.17454 × 10^−1 | 2.46964 × 10^−1 | 4.68636 × 10^−3
F8 | Best | −4.18983 × 10^4 | −4.18952 × 10^4 | −4.18914 × 10^4 | −4.18464 × 10^4 | −4.18464 × 10^4
F8 | Mean | −4.18983 × 10^4 | −3.96372 × 10^4 | −4.03440 × 10^4 | −3.75470 × 10^4 | −3.82785 × 10^4
F8 | Std | 2.96014 × 10^−11 | 4.49162 × 10^3 | 3.63037 × 10^3 | 5.82693 × 10^3 | 5.27419 × 10^3
F9 | Best | 0.00000 × 10^0 | 3.27575 × 10^−3 | 8.23333 × 10^−3 | 3.73030 × 10^−5 | 3.25244 × 10^−6
F9 | Mean | 0.00000 × 10^0 | 8.27521 × 10^1 | 4.28287 × 10^1 | 1.24652 × 10^2 | 1.51656 × 10^2
F9 | Std | 0.00000 × 10^0 | 7.32871 × 10^1 | 4.87230 × 10^1 | 1.19664 × 10^2 | 1.11244 × 10^2
F10 | Best | 8.88178 × 10^−16 | 7.84248 × 10^−2 | 2.24259 × 10^−2 | 8.35089 × 10^−4 | 1.59905 × 10^−9
F10 | Mean | 8.88178 × 10^−16 | 2.88503 × 10^0 | 1.76283 × 10^0 | 1.31387 × 10^0 | 9.57690 × 10^−1
F10 | Std | 0.00000 × 10^0 | 2.62189 × 10^0 | 1.12465 × 10^0 | 2.18567 × 10^0 | 2.49990 × 10^0
F11 | Best | 0.00000 × 10^0 | 4.25764 × 10^−3 | 5.71780 × 10^−2 | 3.41024 × 10^−6 | 0.00000 × 10^0
F11 | Mean | 0.00000 × 10^0 | 1.52847 × 10^0 | 9.20741 × 10^−1 | 2.27770 × 10^−1 | 1.55201 × 10^−3
F11 | Std | 0.00000 × 10^0 | 1.66071 × 10^0 | 5.63135 × 10^−1 | 2.79365 × 10^−1 | 5.96126 × 10^−3
F12 | Best | 4.71163 × 10^−33 | 1.97180 × 10^−4 | 1.12509 × 10^−3 | 5.24202 × 10^−3 | 1.23122 × 10^−4
F12 | Mean | 4.71163 × 10^−33 | 1.01808 × 10^0 | 4.60484 × 10^−1 | 1.64094 × 10^0 | 1.44908 × 10^0
F12 | Std | 1.39185 × 10^−48 | 1.49538 × 10^0 | 1.11625 × 10^0 | 1.78266 × 10^0 | 2.82336 × 10^0
F13 | Best | 1.34978 × 10^−32 | 3.52700 × 10^−5 | 2.58636 × 10^−4 | 7.25952 × 10^−4 | 4.19563 × 10^−4
F13 | Mean | 1.34978 × 10^−32 | 1.45863 × 10^1 | 2.90656 × 10^0 | 2.47516 × 10^1 | 2.72461 × 10^0
F13 | Std | 5.56740 × 10^−48 | 2.21891 × 10^1 | 5.27444 × 10^0 | 4.78364 × 10^1 | 2.82642 × 10^0
Table 9. Statistical results of the optimized standard functions with d = 1000.
Function | Test | CSA | PSO | SSA | SCA | GWO
F1 | Best | 1.74114 × 10^−68 | 7.86330 × 10^−1 | 8.69625 × 10^−1 | 9.85563 × 10^−1 | 1.28824 × 10^−3
F1 | Mean | 6.75805 × 10^−43 | 2.57658 × 10^4 | 1.87823 × 10^3 | 3.13124 × 10^4 | 6.86880 × 10^1
F1 | Std | 3.69123 × 10^−42 | 5.49528 × 10^4 | 4.45580 × 10^3 | 6.67314 × 10^4 | 1.47024 × 10^2
F2 | Best | 4.94485 × 10^−35 | 1.70351 × 10^0 | 1.86340 × 10^0 | 2.11466 × 10^0 | 7.00593 × 10^−4
F2 | Mean | 2.44060 × 10^−21 | 2.85661 × 10^2 | 9.02285 × 10^1 | 4.69947 × 10^1 | 4.62960 × 10^−2
F2 | Std | 1.33304 × 10^−20 | 2.42764 × 10^2 | 7.96943 × 10^1 | 2.46857 × 10^1 | 3.27695 × 10^−2
F3 | Best | 7.28940 × 10^−62 | 1.44335 × 10^6 | 7.08473 × 10^3 | 3.52592 × 10^4 | 3.55999 × 10^5
F3 | Mean | 4.45675 × 10^−39 | 9.83821 × 10^6 | 5.17443 × 10^6 | 8.89991 × 10^5 | 1.07487 × 10^6
F3 | Std | 2.13852 × 10^−38 | 3.95888 × 10^6 | 4.61038 × 10^6 | 6.68057 × 10^5 | 3.87543 × 10^5
F4 | Best | 9.39244 × 10^−41 | 2.82903 × 10^−2 | 2.82903 × 10^−2 | 2.82903 × 10^−2 | 2.82903 × 10^−2
F4 | Mean | 4.97072 × 10^−23 | 3.13296 × 10^0 | 1.47575 × 10^0 | 3.13296 × 10^0 | 3.01621 × 10^0
F4 | Std | 2.35097 × 10^−22 | 2.36133 × 10^0 | 1.34765 × 10^0 | 2.36133 × 10^0 | 2.28900 × 10^0
F5 | Best | 0.00000 × 10^0 | 1.11144 × 10^2 | 1.10805 × 10^2 | 1.13580 × 10^2 | 1.13580 × 10^2
F5 | Mean | 0.00000 × 10^0 | 3.96596 × 10^6 | 1.47830 × 10^5 | 4.84064 × 10^6 | 1.08327 × 10^6
F5 | Std | 0.00000 × 10^0 | 1.44863 × 10^7 | 7.61203 × 10^5 | 1.75361 × 10^7 | 4.23453 × 10^6
F6 | Best | 0.00000 × 10^0 | 1.11552 × 10^0 | 1.18127 × 10^0 | 1.37640 × 10^0 | 1.16202 × 10^0
F6 | Mean | 0.00000 × 10^0 | 9.06200 × 10^3 | 1.26919 × 10^3 | 1.13358 × 10^4 | 1.43531 × 10^2
F6 | Std | 0.00000 × 10^0 | 1.17632 × 10^4 | 2.22354 × 10^3 | 1.46871 × 10^4 | 1.19839 × 10^2
F7 | Best | 1.34248 × 10^−5 | 2.95314 × 10^−2 | 3.36464 × 10^−3 | 8.28122 × 10^−3 | 5.86957 × 10^−3
F7 | Mean | 2.40186 × 10^−4 | 3.51020 × 10^1 | 7.03426 × 10^−1 | 3.98208 × 10^1 | 8.03855 × 10^0
F7 | Std | 2.09618 × 10^−4 | 1.65495 × 10^2 | 1.69178 × 10^0 | 1.89489 × 10^2 | 2.83190 × 10^1
F8 | Best | −4.18983 × 10^5 | −4.18978 × 10^5 | −4.18978 × 10^5 | −4.18977 × 10^5 | −4.18977 × 10^5
F8 | Mean | −4.18983 × 10^5 | −3.87824 × 10^5 | −3.88408 × 10^5 | −3.84621 × 10^5 | −3.84898 × 10^5
F8 | Std | 1.18405 × 10^−10 | 4.42563 × 10^4 | 4.34900 × 10^4 | 4.69810 × 10^4 | 4.68726 × 10^4
F9 | Best | 0.00000 × 10^0 | 4.15359 × 10^−1 | 4.65994 × 10^−1 | 5.27444 × 10^−1 | 5.27444 × 10^−1
F9 | Mean | 0.00000 × 10^0 | 1.34989 × 10^3 | 4.44170 × 10^2 | 1.47652 × 10^3 | 1.47580 × 10^3
F9 | Std | 0.00000 × 10^0 | 1.02977 × 10^3 | 4.18444 × 10^2 | 1.10934 × 10^3 | 1.10745 × 10^3
F10 | Best | 8.88178 × 10^−16 | 1.17690 × 10^−1 | 1.21822 × 10^−1 | 1.30832 × 10^−1 | 9.76292 × 10^−4
F10 | Mean | 8.88178 × 10^−16 | 4.07750 × 10^0 | 2.10393 × 10^0 | 4.24648 × 10^0 | 2.82129 × 10^0
F10 | Std | 0.00000 × 10^0 | 2.78347 × 10^0 | 1.56515 × 10^0 | 2.84110 × 10^0 | 3.32683 × 10^0
F11 | Best | 0.00000 × 10^0 | 1.54060 × 10^0 | 5.76886 × 10^−1 | 1.65931 × 10^0 | 4.45692 × 10^−3
F11 | Mean | 0.00000 × 10^0 | 1.40518 × 10^2 | 1.45067 × 10^1 | 1.74547 × 10^2 | 7.56728 × 10^−1
F11 | Std | 0.00000 × 10^0 | 1.93782 × 10^2 | 3.49974 × 10^1 | 2.45555 × 10^2 | 6.70865 × 10^−1
F12 | Best | 4.71163 × 10^−34 | 3.49707 × 10^−5 | 2.95159 × 10^−5 | 3.53593 × 10^−5 | 3.22334 × 10^−5
F12 | Mean | 4.71163 × 10^−34 | 4.61458 × 10^0 | 1.15674 × 10^0 | 4.71509 × 10^0 | 2.37655 × 10^0
F12 | Std | 8.69906 × 10^−50 | 7.79023 × 10^0 | 2.17639 × 10^0 | 8.25036 × 10^0 | 3.71613 × 10^0
F13 | Best | 1.34978 × 10^−32 | 1.24017 × 10^0 | 1.07560 × 10^0 | 1.37886 × 10^0 | 1.05795 × 10^0
F13 | Mean | 1.34978 × 10^−32 | 2.66755 × 10^2 | 2.86286 × 10^1 | 2.83343 × 10^2 | 2.02468 × 10^2
F13 | Std | 5.56740 × 10^−48 | 4.21627 × 10^2 | 2.54245 × 10^1 | 4.39341 × 10^2 | 3.96828 × 10^2
Table 10. CPU time for high-dimensional functions at d = 100.
Function | CSA | PSO | SSA | SCA | GWO
F1 | 0.178432 | 0.190542 | 0.302927 | 0.20064 | 0.282661
F2 | 0.12315 | 0.159538 | 0.228658 | 0.195575 | 0.239618
F3 | 0.568033 | 0.611411 | 0.721536 | 0.730838 | 0.698834
F4 | 0.12295 | 0.188939 | 0.239374 | 0.196382 | 0.251754
F5 | 0.156947 | 0.170969 | 0.235264 | 0.209225 | 0.246743
F6 | 0.115594 | 0.169406 | 0.215389 | 0.193195 | 0.227015
F7 | 0.275001 | 0.305304 | 0.393652 | 0.364141 | 0.403618
F8 | 0.14028 | 0.223961 | 0.268633 | 0.246461 | 0.294057
F9 | 0.122207 | 0.160593 | 0.262984 | 0.209103 | 0.27208
F10 | 0.133938 | 0.175045 | 0.255505 | 0.214336 | 0.264772
F11 | 0.13927 | 0.173938 | 0.255895 | 0.230421 | 0.256377
F12 | 0.526553 | 0.582185 | 0.667134 | 0.6323 | 0.678474
F13 | 0.512411 | 0.559926 | 0.652665 | 0.616762 | 0.688092
Sum | 3.114766 | 3.671757 | 4.699616 | 4.239379 | 4.804095
Rank | (1) | (2) | (4) | (3) | (5)
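The CPU-time comparison can be approximated with a simple timing harness; the evaluation count and sampling scheme below are assumptions for illustration, not the paper's measurement protocol.

```python
import time
import numpy as np

def cpu_time(objective, d, n_evals=10_000, bound=100.0):
    # Wall-clock cost of repeatedly evaluating one benchmark at dimension d.
    points = np.random.uniform(-bound, bound, size=(n_evals, d))
    start = time.perf_counter()
    for x in points:
        objective(x)
    return time.perf_counter() - start

# Example with the sphere function (F1); summing the 13 per-function times
# and ranking the sums mirrors the Sum/Rank rows of Table 10.
print(cpu_time(lambda x: np.sum(x ** 2), d=100))
```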
Table 11. CPU time for high-dimensional functions at d = 1000.
Function | CSA | PSO | SSA | SCA | GWO
F1 | 0.864606 | 0.920696 | 1.411428 | 1.526312 | 1.954178
F2 | 0.8211 | 0.927257 | 1.691217 | 1.690701 | 2.304578
F3 | 9.746041 | 9.914311 | 9.757791 | 9.416562 | 9.929068
F4 | 0.712832 | 0.857485 | 1.391238 | 1.644103 | 2.219137
F5 | 0.784051 | 0.858395 | 1.423493 | 1.550448 | 2.002662
F6 | 0.764518 | 0.883734 | 1.460421 | 1.610766 | 1.972805
F7 | 1.450708 | 1.453512 | 2.155236 | 2.387385 | 2.742038
F8 | 0.970483 | 1.17946 | 1.778078 | 2.079184 | 2.718625
F9 | 0.904283 | 1.135896 | 1.676209 | 1.895193 | 2.421654
F10 | 0.918339 | 1.132918 | 1.680022 | 1.861007 | 2.290501
F11 | 1.024792 | 1.171338 | 1.809926 | 2.044052 | 2.410413
F12 | 2.495781 | 2.802061 | 3.292436 | 3.421013 | 4.016006
F13 | 2.471004 | 2.700387 | 3.175498 | 3.381912 | 4.266181
Sum | 23.92854 | 25.93745 | 32.70299 | 34.50864 | 41.24785
Table 12. Optimal design of welded beam.
Variable | CSA | PSO | SSA | SCA | GWO
h | 0.205729639786 | 0.205729639786 | 0.205723211955 | 0.205811043402 | 0.205724311092
l | 3.470488665628 | 7.092414276557 | 7.092727008708 | 7.380674589109 | 7.092527090825
t | 9.036623910358 | 9.036623910358 | 9.036624222889 | 8.972002667140 | 9.036803373437
b | 0.205729639786 | 0.205729639786 | 0.205729638416 | 0.208874244972 | 0.205728938896
Minimum cost | 1.724852308597 | 2.218150861764 | 2.218172785211 | 2.273030395508 | 2.218180086668
Average cost | 1.724853828957 | 2.218150861764 | 2.244245629001 | 2.291353653772 | 2.218198907121
Std. | 0.000004807757 | 0.000000000000 | 0.052784172626 | 0.010496471904 | 0.000013807448
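As a sanity check on Table 12, the classical welded-beam fabrication-cost objective (standard coefficients from the design-optimization literature, assumed here to match the paper's formulation) can be evaluated at the CSA design.

```python
def welded_beam_cost(h, l, t, b):
    # Weld-metal cost plus bar-stock cost, in the classical formulation.
    return 1.10471 * h ** 2 * l + 0.04811 * t * b * (14.0 + l)

# The CSA design from Table 12 reproduces the reported minimum cost
# (~1.724852).
print(welded_beam_cost(0.205729639786, 3.470488665628,
                       9.036623910358, 0.205729639786))
```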
Table 13. Optimal design of pressure vessel.
Variable | CSA | PSO | SSA | SCA | GWO
Ts | 0.77816864138 | 0.7781686414 | 0.79357920102 | 0.78922547613 | 0.007781787
Th | 0.38464916263 | 0.3846491626 | 0.39226677976 | 0.40639993523 | 0.003846530
R | 40.3196187241 | 40.3196187241 | 41.1180752663 | 40.6926147876 | 40.319922618
L | 200.000000000 | 200.000000000 | 189.175939539 | 196.033601579 | 200.000000000
Minimum cost | 5885.33277362 | 5885.33277362 | 5912.20652171 | 6004.52071673 | 5885.46666087
Average cost | 6011.55334154 | 6013.40373404 | 6191.42556961 | 6198.38074830 | 5974.52840595
Std | 175.417988776 | 179.129462647 | 307.601967961 | 116.552624682 | 79.439547307
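Similarly for Table 13, the classical pressure-vessel cost model (standard coefficients, again assumed to match the exact formulation used) reproduces the CSA minimum cost.

```python
def pressure_vessel_cost(Ts, Th, R, L):
    # Material, forming, and welding cost terms in the classical model.
    return (0.6224 * Ts * R * L + 1.7781 * Th * R ** 2
            + 3.1661 * Ts ** 2 * L + 19.84 * Ts ** 2 * R)

# The CSA design from Table 13 reproduces the reported minimum cost
# (~5885.33).
print(pressure_vessel_cost(0.77816864138, 0.38464916263,
                           40.3196187241, 200.0))
```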
Table 14. Optimal design of tension coil spring.
Variable | CSA | PSO | SSA | SCA | GWO
D | 0.0517190259 | 0.0516975399 | 0.0500000000 | 0.0500000000 | 0.0517410542
d | 0.3574390430 | 0.3569217527 | 0.3174254133 | 0.3155229746 | 0.3579696634
N | 11.2468029380 | 11.2770151263 | 14.0277750624 | 14.4243340035 | 11.2159545387
Min. weight | 0.0126652492 | 0.0126652341 | 0.0127190578 | 0.0129556368 | 0.0126652949
Avg. weight | 0.0126789335 | 0.0133988758 | 0.0127190585 | 0.0131845009 | 0.0126662267
Std | 0.0000327544 | 0.0015508356 | 0.0000000011 | 0.0001295549 | 0.0000011609
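Finally, for Table 14, the standard spring-weight objective (N + 2) × coil diameter × wire diameter² reproduces the CSA minimum weight; note that under Table 14's column labels, D holds the small wire diameter and d the coil diameter, which is an interpretation of the tabulated magnitudes.

```python
def spring_weight(D, d, N):
    # Standard tension/compression spring weight, written with Table 14's
    # labels (D = wire diameter, d = coil diameter here).
    return (N + 2.0) * d * D ** 2

# The CSA design from Table 14 reproduces the reported minimum weight
# (~0.0126652).
print(spring_weight(0.0517190259, 0.3574390430, 11.2468029380))
```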
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Qais, M.H.; Hasanien, H.M.; Turky, R.A.; Alghuwainem, S.; Tostado-Véliz, M.; Jurado, F. Circle Search Algorithm: A Geometry-Based Metaheuristic Optimization Algorithm. Mathematics 2022, 10, 1626. https://doi.org/10.3390/math10101626

AMA Style

Qais MH, Hasanien HM, Turky RA, Alghuwainem S, Tostado-Véliz M, Jurado F. Circle Search Algorithm: A Geometry-Based Metaheuristic Optimization Algorithm. Mathematics. 2022; 10(10):1626. https://doi.org/10.3390/math10101626

Chicago/Turabian Style

Qais, Mohammed H., Hany M. Hasanien, Rania A. Turky, Saad Alghuwainem, Marcos Tostado-Véliz, and Francisco Jurado. 2022. "Circle Search Algorithm: A Geometry-Based Metaheuristic Optimization Algorithm" Mathematics 10, no. 10: 1626. https://doi.org/10.3390/math10101626

APA Style

Qais, M. H., Hasanien, H. M., Turky, R. A., Alghuwainem, S., Tostado-Véliz, M., & Jurado, F. (2022). Circle Search Algorithm: A Geometry-Based Metaheuristic Optimization Algorithm. Mathematics, 10(10), 1626. https://doi.org/10.3390/math10101626

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers. See further details here.

Article Metrics

Back to TopTop