
Department of Basic Engineering Science, El-Menoufia University, Shebin El-Kom, Egypt

**Corresponding Author:** Rizk M. Rizk-Allah

Department of Basic Engineering Science

El-Menoufia University, Shebin El-Kom, Egypt

**E-mail:** rizk_masoud@sh-eng.menofia.edu.eg

**Received** March 21, 2016; **Accepted** April 29, 2016; **Published** April 30, 2016

**Citation:** Allah RMR (2016) Hybridization of Fruit Fly Optimization Algorithm and Firefly Algorithm for Solving Nonlinear Programming Problems. Int J Swarm Intel Evol Comput 5: 134. doi: 10.4172/2090-4908.1000134

**Copyright:** © 2016 Allah RMR. This is an open-access article distributed under
the terms of the Creative Commons Attribution License, which permits unrestricted
use, distribution, and reproduction in any medium, provided the original author and
source are credited.


We propose a novel hybrid algorithm, named FOA-FA, to solve nonlinear programming problems (NLPPs). The main feature of the hybrid algorithm is to integrate the strength of the fruit fly optimization algorithm (FOA) in handling continuous optimization with the merit of the firefly algorithm (FA) in achieving robust exploration. The methodology of the proposed algorithm consists of two phases. The first phase employs a variation on the original FOA with a new adaptive radius mechanism (ARM) for exploring the whole scope around the fruit flies' locations, overcoming the drawback of the original FOA, whose search is confined to the nonnegative orthant. The second phase incorporates FA to update the previous best locations of the fruit flies and avoid premature convergence. The hybrid algorithm speeds up convergence and improves performance. The proposed FOA-FA algorithm is tested on several benchmark problems and two engineering applications, and the numerical comparisons demonstrate its effectiveness and efficiency.

**Keywords:** Firefly algorithm; Fruit fly optimization algorithm; Nonlinear programming problems

Traditional optimization methods can be classified into two distinct groups: direct and gradient-based methods. In direct search methods, only the objective function and constraint values are used to guide the search, whereas gradient-based methods use the first- and/or second-order derivatives of the objective function and/or constraints to guide the search process. Since derivative information is not used, direct search methods are usually slow, requiring many function evaluations for convergence. For the same reason, they can be applied to many problems without major changes to the algorithm. On the other hand, gradient-based methods converge quickly to an optimal solution but are not efficient on non-differentiable or discontinuous problems. In addition, if there is more than one local optimum in the problem, the result may depend on the selection of the initial point, and the obtained optimal solution may not necessarily be the global optimum. Furthermore, the gradient search may become difficult and unstable when the objective function and constraints have multiple or sharp peaks.

Therefore, to overcome these shortcomings, a lot of research has focused on meta-heuristic methods. Meta-heuristic methods have many advantages compared to traditional nonlinear programming techniques, among which the following three are the most important: (i) they can be applied to problems with discontinuous, non-differentiable and non-convex objective functions and/or constraints; (ii) they do not require the computation of the gradients of the cost function and the constraints; (iii) they can easily escape from local optima. Many meta-heuristic algorithms, such as the genetic algorithm (GA) [1], particle swarm optimization (PSO) [2], differential evolution [3] and ant colony optimization [4], have shown their efficacy in solving computationally intensive problems.

Among the existing meta-heuristic algorithms, a well-known branch is the firefly algorithm (FA), developed by Yang [5,6]. FA mimics some characteristics of tropical firefly swarms and their flashing behavior. A firefly tends to be attracted towards other fireflies with higher flash intensity. The main advantages of FA are intensification and diversification, or exploitation and exploration: as light intensity decreases with distance, the attraction among fireflies can be local or global, depending on the absorption coefficient, and thus all local modes as well as the global modes will be visited. Therefore, FA has captured much attention and has been successfully applied to several optimization problems [7-9].

The fruit fly optimization algorithm (FOA) is one of the latest meta-heuristic methods, proposed by Pan [10] for solving optimization problems. The main inspiration of FOA is that the fruit fly is superior to other species in sensing and perception, especially in osphresis and vision. Fruit flies feed chiefly on rotten fruit. In the process of finding food, their osphresis organs smell all kinds of scents in the air, and they fly towards the corresponding locations. When they get close to the food, they locate it using their vision and fly towards that direction. FOA has many advantages, such as a simple structure, ease of implementation and speed in acquiring solutions. Therefore, FOA has been successfully applied to several optimization problems [10-14].

Recently, hybridization has been recognized as an essential aspect of high-performing algorithms. Pure algorithms often cannot reach an optimal solution in a reasonable time, and premature convergence in pure algorithms is inevitable; thus pure algorithms are almost always inferior to hybridizations. To the best of our knowledge, this is the first time that a hybridization of FOA and FA has been proposed for solving NLPPs. This hybridization aims to improve the performance of the original FOA and eliminate its drawbacks.

In this paper, we propose a novel hybrid algorithm, named FOA-FA, for solving nonlinear programming problems and two engineering applications. The proposed FOA-FA algorithm employs the merits of both the fruit fly optimization algorithm (FOA) and the firefly algorithm (FA), and its methodology consists of two phases. The first phase employs a variation on the original FOA with a new adaptive radius mechanism for exploring the whole scope around the fruit flies' locations, overcoming the drawback of the original FOA, whose search is confined to the nonnegative orthant. In addition, several other improvements to the original FOA are considered, such as uniform initialization of the candidate solutions across the search space and a new modulation coefficient. Moreover, the premature convergence of the original FOA degrades its performance by reducing its search capability, leading to a higher probability of being trapped in a local optimum. Therefore, the second phase incorporates the FA to update the previous best locations of the fruit flies and force FOA to jump out of premature convergence, because of its strong searching ability. Consequently, the hybrid algorithm speeds up convergence and improves performance.

The rest of the paper is organized as follows. Section 2 describes some preliminaries of the nonlinear programming problem. Section 3 briefly introduces the basics of both the FOA and FA algorithms. The hybrid FOA-FA algorithm is proposed and explained in detail in Section 4. Numerical experiments are given in Section 5 to substantiate the superiority of the proposed algorithm. Finally, we summarize the paper with some comments.

The nonlinear programming problem can be defined as follows [15]:

Minimize *F(x)*, *x ∈ S ⊆ R^{n}*,

subject to *g_{k}(x) ≤ 0, k = 1, 2, …, K*, (1)

*h_{j}(x) = 0, j = 1, 2, …, J*,

where F(x) is the objective function defined on the search space S, x is the vector of decision variables and *g_{k}(x), h_{j}(x)* define the inequality and equality constraints, respectively. The subset of S satisfying all constraints is the feasible search space, and its complement in S is the infeasible search space. Usually, the search space S is defined as an n-dimensional rectangle in *R^{n}* as in (2), with the domains of the variables given by their lower and upper bounds:

*l_{i} ≤ x_{i} ≤ u_{i}, i = 1, 2, …, n.* (2)

Any point satisfying all the constraints is called a feasible solution; otherwise, x is an infeasible solution.
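To make the problem form (1)-(2) concrete, it can be encoded as follows. The objective and constraints below are hypothetical placeholders, and the quadratic-penalty handling of infeasibility is one common choice rather than the paper's prescribed method:

```python
def objective(x):
    # Hypothetical objective F(x): a shifted sphere function.
    return sum((xi - 1.0) ** 2 for xi in x)

def g(x):
    # Inequality constraints g_k(x) <= 0 (hypothetical example).
    return [x[0] + x[1] - 3.0]

def h(x):
    # Equality constraints h_j(x) = 0 (hypothetical example).
    return [x[0] - 2.0 * x[1]]

def penalized(x, mu=1e3):
    # Quadratic penalty: constraint violations are added to F(x);
    # feasible points (all g_k <= 0 and h_j = 0) incur no penalty.
    viol = sum(max(0.0, gk) ** 2 for gk in g(x))
    viol += sum(hj ** 2 for hj in h(x))
    return objective(x) + mu * viol
```

A feasible point keeps its original objective value, while an infeasible one is pushed away by the penalty term.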

**Fruit fly optimization algorithm (FOA)**

In this section, we describe the basic steps of the original fruit fly optimization algorithm (FOA) in detail, following [10].

**Step 1:** Set the main parameters of FOA and randomly give an initial location for the fruit fly swarm:

Init (*x-axis, y-axis*)

**Step 2:** Endow each individual fruit fly with a random direction for finding food:

*x_{i} = x-axis + random value,* (3)

*y_{i} = y-axis + random value,*

*i = 1, 2, …, m,*

where m is the swarm size of the fruit flies.

**Step 3:** Since the food location cannot be known, calculate the distance (*Dist_{i}*) of each fruit fly from the origin and find the smell concentration judgment value (*S_{i}*), which is the reciprocal of the distance:

*Dist_{i} = sqrt(x_{i}² + y_{i}²),* (4)

*S_{i} = 1/Dist_{i}.* (5)

**Step 4:** Calculate the smell concentration (*Smell_{i}*) of the individual fruit fly location by substituting the smell concentration judgment value (*S_{i}*) into the smell concentration judgment function (also called the fitness function):

*Smell_{i} = Function(S_{i}).* (6)

**Step 5:** Find the individual fruit fly with the maximal smell concentration (the maximal value of Smell) among the fruit fly swarm:

*[best_smell, best_index] = max(Smell).* (7)

**Step 6:** Keep the maximal smell concentration value and the location (x, y) of the best fruit fly. Then, the swarm flies towards that location:

*Smell_best = best_smell,*

*x-axis = x(best_index),* (8)

*y-axis = y(best_index).*

Enter iterative optimization, repeating Steps 2-6. When the smell concentration is no longer superior to that of the previous iteration, or the iteration number reaches the maximal iteration number, the loop stops.
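As a minimal sketch (not the paper's exact implementation), Steps 1-6 can be written as follows, here for minimization of a fitness defined on the judgment value S_i (the original keeps the maximal smell; the two differ only in the direction of the comparison):

```python
import random

def foa(f, m=20, T=100, seed=0):
    """Minimal sketch of the original FOA (Steps 1-6), minimising f(S_i).

    The swarm location is a 2-D point; S_i = 1/Dist_i as in Eqs. (4)-(5).
    """
    rng = random.Random(seed)
    # Step 1: random initial swarm location.
    x_axis, y_axis = rng.uniform(0, 1), rng.uniform(0, 1)
    best_smell = float("inf")
    for _ in range(T):
        smells, pos = [], []
        for _ in range(m):
            # Step 2: random search direction around the swarm location.
            xi = x_axis + rng.uniform(-1, 1)
            yi = y_axis + rng.uniform(-1, 1)
            # Step 3: distance to origin and judgment value S_i = 1/Dist_i.
            dist = (xi ** 2 + yi ** 2) ** 0.5 or 1e-12
            s = 1.0 / dist
            # Step 4: smell concentration via the fitness function.
            smells.append(f(s))
            pos.append((xi, yi))
        # Steps 5-6: keep the best fly and move the swarm toward it.
        i = min(range(m), key=smells.__getitem__)
        if smells[i] < best_smell:
            best_smell = smells[i]
            x_axis, y_axis = pos[i]
    return best_smell
```

For example, minimising f(s) = (s − 2)² drives the swarm toward points at distance 1/2 from the origin.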

**Firefly algorithm (FA)**

The firefly algorithm was developed by Xin-She Yang [6] and is based on the idealized behavior of the flashing characteristics of fireflies. For simplicity, we can summarize these flashing characteristics in the following three rules:

a) All fireflies are unisex, so that one firefly is attracted to other fireflies regardless of their sex.

b) Attractiveness is proportional to their brightness, thus for any two flashing fireflies, the less bright one will move towards the brighter one. The attractiveness is proportional to the brightness and they both decrease as their distance increases. If no one is brighter than a particular firefly, it will move randomly.

c) The brightness of a firefly is affected or determined by the landscape of the objective function to be optimized [6].

For simplicity, we can assume that the attractiveness of a firefly is determined by its brightness or light intensity, which in turn is associated with the encoded objective function. In the simplest case, for a minimization problem, the brightness *I* of a firefly at a particular position x can be chosen as *I(x) ∝ 1/f(x)*. However, the attractiveness is relative; it should vary with the distance between fireflies.

*Step 1: Attractiveness and light intensity*

In the firefly algorithm, there are two important issues: the variation of the light intensity and the formulation of the attractiveness. The light intensity I(*r*) varies with the distance r monotonically and exponentially, that is:

*I(r) = I_{0} e^{−γr²},* (9)

where *I_{0}* is the original light intensity and γ is the light absorption coefficient. As firefly attractiveness is proportional to the light intensity seen by adjacent fireflies, we can now define the attractiveness β of a firefly by Equation (10):

*β = β_{0} e^{−γr²},* (10)

where *r* is the distance between two fireflies and β_{0} is their attractiveness at r = 0, i.e., when two fireflies are found at the same point of the search space. The value of γ determines the variation of the attractiveness with increasing distance from the communicating firefly.

**Step 2: Distance**

The distance between any two fireflies *i* and *j*, at x_{i} and x_{j} respectively, is the Cartesian distance determined by Equation (11):

*r_{ij} = ||x_{i} − x_{j}|| = sqrt(Σ_{k=1}^{n} (x_{i,k} − x_{j,k})²),* (11)

where x_{i,k} is the *k*-th component of the spatial coordinate x_{i} of the *i*-th firefly.

*Step 3: Movement*

The movement of a firefly *i* attracted to another, more attractive (brighter) firefly *j* is determined by:

*x_{i} = x_{i} + β_{0} e^{−γr_{ij}²}(x_{j} − x_{i}) + α ε_{i},* (12)

where the second term is due to the attraction and the third term is randomization, with **α** being the randomization parameter and ε_{i} a vector of random numbers.
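The three ingredients above (Equations 9-12) can be sketched as one synchronous update step; the values of β_0, γ and α below are illustrative, not the paper's settings:

```python
import math
import random

def firefly_step(x, f, beta0=1.0, gamma=1.0, alpha=0.05, rng=None):
    """One FA iteration: every firefly moves toward each brighter one with
    attractiveness beta0*exp(-gamma*r^2) (Eq. 10), plus a small random walk
    scaled by alpha (Eq. 12). Minimisation: lower f means brighter."""
    rng = rng or random.Random(0)
    # Positions are compared against the swarm at the start of the step.
    new = [list(p) for p in x]
    for i in range(len(x)):
        for j in range(len(x)):
            if f(x[j]) < f(x[i]):  # firefly j is brighter than firefly i
                r2 = sum((a - b) ** 2 for a, b in zip(x[i], x[j]))
                beta = beta0 * math.exp(-gamma * r2)  # Eq. (10)
                for k in range(len(x[i])):
                    new[i][k] += (beta * (x[j][k] - x[i][k])
                                  + alpha * (rng.random() - 0.5))
    return new
```

With a two-fly swarm on the sphere function, the dimmer fly moves toward the brighter one while the brightest fly stays put (it may also move randomly in some FA variants).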

In this section, the proposed algorithm is presented in detail. The main reason for developing FOA-FA is to overcome the drawbacks of the original FOA, which cannot handle the negative domain; that is, the candidate solutions cannot be generated uniformly over the problem domain. Also, the original FOA easily falls into premature convergence, because the random term of Equation (3) picks small values within a radius equal to one around the best location.

The methodology of the proposed FOA-FA algorithm consists of two phases. The first phase employs the FOA, where a swarm of fruit flies moves in various directions using the ARM; these moves therefore follow a uniform distribution. The other phase incorporates the FA to update the previous best locations of the fruit flies and force FOA to jump out of premature convergence, because of its exploitation and exploration ability. Consequently, the hybrid algorithm speeds up convergence and improves performance. The main steps of the proposed algorithm are as follows:

*Step 1: Initialize the parameters*

The main parameters of the proposed algorithm are the maximum iteration number (T), the swarm size (m) and the distance zone of the fruit fly (*R*).

*Step 2: Initialize the swarm location*

The initial fruit fly swarm location is generated within the search space; that is, the algorithm assigns a random vector to the *i*-th fruit fly as in Equation (13):

*x_{i} = l + rand · (u − l), i = 1, 2, …, m,* (13)

where rand is a random value from the uniform distribution on the interval [0, 1] and l, u are the vectors of lower and upper bounds from (2).

*Step 3: Solution construction using osphresis*

The osphresis search introduced in the original FOA (see Equation 3) picks small values within a radius equal to one. So, we introduce the ARM to enhance the diversity of solutions and achieve an effective exploration of the whole scope around the fruit flies' locations, where the search radius decreases dynamically with the iteration number, as shown in **Figure 1** (i.e., R = 100, T = 100). Therefore, the process of solution construction using osphresis is implemented as in Equation (14), where R is the search radius that determines the extent of the search subspace we want to cover during the run and *ε* is a random number from [0, 1]. The random step Δ(*t, R*) returns a value in the range *[0, R]* such that the probability of Δ(*t, R*) being close to 0 increases as *t* increases. The solution construction using osphresis is demonstrated in Algorithm 1.

**Algorithm 1. The solution construction using osphresis**

1. Input :

2. set

3.

4. if set then

5. elseif set then

6. end if

7. *t = t +1*

8. *i = i +1*

9. Output: x_{i} ∈Ω
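One plausible realization of the random step Δ(t, R) is the non-uniform decay below; the exponent b and the exact decay law are assumptions, since the paper's Equation (14) is not reproduced here, but the sketch satisfies the stated property that values concentrate near 0 as t grows:

```python
import random

def adaptive_step(t, T, R, b=2.0, rng=None):
    """Candidate Δ(t, R): a value in [0, R] whose probability of being
    near 0 increases with the iteration count t, as the ARM requires.
    The exponent b (an assumption) controls how fast the radius shrinks."""
    rng = rng or random.Random()
    eps = rng.random()  # ε ∈ [0, 1]
    # At t = 0 the value is uniform over [0, R]; at t = T it collapses to 0.
    return R * (1.0 - eps ** ((1.0 - t / T) ** b))
```

Early in the run the step can reach anywhere within radius R; by the final iteration it has shrunk to zero, leaving only fine local search.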

*Step 4: Evaluation*

The smell concentration (*Smell_{i}*) of the individual fruit fly is related to the objective function (fitness function). In this step, the fitness function is evaluated directly on the decision vector x, as in Equation (15), instead of using the smell concentration judgment value (*S_{i}*):

*Smell_{i} = f(x_{i}), i = 1, 2, …, m.* (15)

*Step 5: Update the swarm location*

In this step, the swarm flies toward the best obtained location using Equation (16). If x_{best} is better than the current fruit fly swarm location x-axis, it replaces the swarm location and becomes the new one in the next iteration.

*Step 6: Crossover operator for updating the osphresis*

In this step, the crossover operator is adopted to enhance the osphresis between fruit flies as in Equation (18), where the coefficient of modulation, which decides the scope of modulation, is combined with a random number and an extending-modulation operator specified by the user. The idea of this step is to generate the new fruit flies around the best fruit fly, where the new solution is not restricted to lie between x_{i}(t) and x^{best}(t); the segment may be extended equally on both sides, as in **Figure 2**.

Therefore, the best food source with the lowest fitness x_{best} is also found as in Equation (16). If x_{best} is better than the current fruit fly swarm location x-axis, it replaces the swarm location and becomes the new one in the next iteration.

*Step 7: FA phase for updating the diversity*

The random term of Equation (3) makes the fruit fly swarm fluctuate around the best solution found so far; consequently, diversity loss occurs. Under this condition, the search might easily be trapped in a local optimum or premature convergence. FA is incorporated to update the previous best locations of the fruit flies and force FOA to jump out of premature convergence, because of its exploitation and exploration ability. Consequently, the hybrid algorithm speeds up convergence and improves performance. In the rest of this section, we discuss the implementation of FA in three steps as follows (**Figure 3**).

**Step 7.1: Initialization:** Initialize a swarm of fireflies with the obtained best fruit fly positions, where each firefly contains n variables (i.e., the position of each firefly is an n-dimensional vector).

*Step 7.2: Light intensity*

Calculate the light intensity *I_{i}(x_{i}), i = 1, 2, …, m* for each firefly, which in turn is associated with the encoded objective function; for a minimization problem, the brighter firefly represents the smaller objective value.

*Step 7.3: Attractiveness and movement*

In the firefly algorithm, the attractiveness of a firefly is determined by its light intensity, which in turn is associated with the encoded objective function, and by the distance *r_{ij}* between fireflies *i* and *j*.

Since each member of the swarm explores the problem space taking into account the results obtained by others, the randomization term may move a firefly away from its best location, so we introduce a modification of the randomization term that makes the fireflies approach the optimum. A further improvement of the convergence of the algorithm is to vary the randomization parameter α_{1} so that it decreases gradually as the optimum is approached; for example, a geometric reduction can be used, where *T_{c}* is the maximum number of generations for FA and θ ∈ [0, 1] is the randomness reduction constant.
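The gradual reduction of the randomization parameter can be sketched as a geometric decay, an assumption consistent with a reduction constant θ ∈ [0, 1] (θ = 0.96 in Table 2); the paper's exact schedule may differ:

```python
def reduce_alpha(alpha0, theta, t):
    # Geometric decay: after t generations the randomization parameter is
    # alpha0 * theta**t, so the random walk shrinks as the optimum is
    # approached (alpha0 = 0.95 and theta = 0.96 in Table 2).
    return alpha0 * theta ** t
```

With the Table 2 values, the random step starts at 0.95 and shrinks by 4% per generation.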

The basic steps of the proposed FOA-FA algorithm can be
summarized as the pseudo code shown in **Figure 4**.
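Putting the two phases together, the overall loop can be sketched as below. This is a simplified assumption of the hybrid (linear radius decay, crossover step omitted, γ = 1), not the pseudo code of Figure 4 itself:

```python
import math
import random

def foa_fa(f, lower, upper, m=30, T=50, Tc=10, seed=0):
    """High-level sketch of the hybrid loop: an FOA phase sampling around
    the best location within a shrinking radius, followed by an FA phase
    that pulls the swarm toward the brightest (best) fly."""
    rng = random.Random(seed)
    # Step 2: uniform initialisation over [lower, upper] (Eq. 13).
    pop = [[lo + rng.random() * (up - lo) for lo, up in zip(lower, upper)]
           for _ in range(m)]
    best = min(pop, key=f)
    R0 = max(up - lo for lo, up in zip(lower, upper))
    for t in range(T):
        R = R0 * (1.0 - t / T)  # simplified adaptive radius, shrinks with t
        # FOA phase: osphresis search around the best location.
        pop = [[b + rng.uniform(-R, R) for b in best] for _ in range(m)]
        cand = min(pop, key=f)
        if f(cand) < f(best):
            best = cand
        # FA phase: move every fly toward the current best (brightest).
        for _ in range(Tc):
            for i, x in enumerate(pop):
                r2 = sum((a - b) ** 2 for a, b in zip(x, best))
                beta = math.exp(-r2)  # attractiveness with gamma = 1
                pop[i] = [a + beta * (b - a) + 0.01 * (rng.random() - 0.5)
                          for a, b in zip(x, best)]
                if f(pop[i]) < f(best):
                    best = pop[i]
    return best
```

On a 2-D sphere function over [−5, 5]², the sketch converges close to the origin within the default budget.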

In this section, the performance of the proposed FOA-FA algorithm is tested on eleven benchmark problems known from the literature and reported in Young [16], and on two engineering applications [17,18]. The solutions of the proposed FOA-FA are compared with the prominent algorithms reported in Young [16-18]. The test problems are listed in **Table 1**. The algorithm is coded in MATLAB 7 and run on a machine with an Intel Core i5 (1.8 GHz) processor and 4 GB of RAM.

**Table 1:** Test problems.

On the other hand, extensive experimental tests were conducted
to see the effect of different values on the performance of the proposed
algorithm. Based upon these observations, the following parameters
have been set as in **Table 2**.

| Parameter | Value |
|---|---|
| Number of iterations (T) | 200 |
| The swarm size (m) | 100 |
| Initial attractiveness (β_0) | 1 |
| The light absorption coefficient (γ) | 1 |
| The randomness reduction constant (θ) | 0.96 |
| Randomization parameter (α) | 0.95 |

**Table 2:** Parameter settings.

**The performance evaluation metrics**

Our hybrid algorithm was tested on a set of test problems in the engineering domain. Each test problem was run thirty times independently, with different seeds. In order to evaluate the closeness between the obtained optimal solution and the theoretical optimal solution, as well as to measure the number of evaluations for the proposed algorithm and the existing algorithms, we used two different metrics.

**The speed:** The speed with which the optimum is obtained is a criterion for examining performance. For the proposed FOA-FA to be competitive, it is important to be able to compare its speed with the other existing algorithms. As time is a relative notion depending on the computer used, we preferred to measure speed by counting the number of evaluations of the tested problem. On the one hand, this criterion is independent of the type of computer used; on the other hand, the evaluation of the function is the operation taking the longest time in the program.

**The error:** As the test for closeness, we used the error between the result given by the algorithm and the exact value of the optimum for each test problem. The relative error was used whenever possible. When the optimum to reach was 0, this expression could no longer be used, so we calculated the absolute error instead.
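The two measures can be combined in one helper; using |optimum| in the denominator of the signed relative error is an assumption, chosen because it is consistent with the signs reported in Table 4:

```python
def solution_error(found, exact):
    # Absolute error |found - exact| when the theoretical optimum is 0,
    # signed relative error otherwise (a better-than-optimum result on a
    # minimisation problem then comes out negative, as in Table 4).
    if exact == 0:
        return abs(found - exact)
    return (found - exact) / abs(exact)
```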

**Table 3** compares the performance of the proposed FOA-FA, where the better values in the comparison are set in bold type. On the other hand, a comparison is made with prominent algorithms from the literature. The test problems were solved by FOA-FA 30 times, with the starting values of the variables selected randomly from the solution space for every run. The results found by FOA-FA, namely the best and average function values, the number of function evaluations (NFE) and the solution time in seconds, are recorded in **Table 3**, whereas for the other algorithms only the function value and the number of function evaluations are given, because the solution times and the best and average function values are not reported for some algorithms. First, we compare the optimum solutions of the given test problems for the proposed FOA-FA algorithm and the existing algorithms. Based on the results demonstrated in **Table 3**, we note that the proposed FOA-FA finds solutions for the test problems F2 and F4 that are better than both the theoretical optimum and those obtained by the other existing algorithms. The proposed FOA-FA algorithm also outperforms the existing algorithms for functions *F_{1}, F_{6}, F_{7}, F_{8}, F_{9}, F_{10}* and *F_{11}*.

F (x) |
FOA-FA | Compared algorithm | |||||||
---|---|---|---|---|---|---|---|---|---|

Function value | Number of function evaluation(NFE) | Time(s) | Name | Function value | NFE | ||||

Best | Average | Best | Average | Best | Average | ||||

F_{1} |
0 |
1.792475E-20 | 1500 | 1500 | 1500 | 0.629180 | SGA MGA SZGA |
0.44108E-5 0.40842E-5 0.29802E-7 |
4000 18182 4000 |

F_{2} |
- 16.09172000 6 |
-16.09172 | 2000 | 2000 | 2000 | 0.323088 | SGA MGA SZGA |
-16.09171 -16.09171 -16.09172 |
4000 8000 4000 |

F_{3} |
0.9980 | 0.9980 | 1500 | 1700 | 1500 | 0.766544 | SGA MGA SZGA |
0.9980 0.9980 0.9980 |
8000 2000 2000 |

F_{4} |
0.39785 |
0.3978873 | 2500 | 2500 | 2500 | 0.401413 | SGA MGA SZGA |
0.39789 0.39789 0.3979 |
8000 10000 4000 |

F_{5} |
-1.03163 | -1.0316284 | 1600 | 2000 | 1600 | 0.134047 | SGA MGA SZGA |
-1.03163 -1.03163 -1.03163 |
6000 3000 3000 |

F_{6} |
3 | 3 | 2500 | 2500 | 2500 | 0.716377 | SGA MGA SZGA |
3.0001 3.0001 3 |
6000 5000 4000 |

F_{7} |
-186.73091 | -186.730908 | 2300 | 2500 | 2300 | 4.8243 | SGA MGA SZGA |
-186.72744 -186.72825 -186.73091 |
6000 12000 3000 |

F_{8} |
8.01276 | 8.0137 | 10000 | 10000 | 10000 | 19.6132 | SGA MGA SZGA |
8.01359 8.01293 8.01276 |
200000 270270 20000 |

F_{9} |
3.6284E-7 |
3.6284E-7 | 60000 | 60000 | 60000 | 18.6979 | SGA MGA SZGA |
1.63594 0.44407E-2 0.13074E-5 |
200000 317460 175440 |

F_{10} |
0 |
7.5684E-17 | 25000 | 24500 | 25000 | 4.799961 | SGA MGA SZGA |
0.15029E-1 0.13565E-1 0.25422E-7 |
400000 320000 320000 |

F_{11} |
1.3022E-15 |
9.4984E-7 | 2500 | 2500 | 2500 | 15.0280 | SGA MGA SZGA |
0.89927 0.16851 0.23033E-3 |
400000 49855 70000 |

**Table 3:** The comparison of solution quality.

Problems | Absolute error | Relative error | ||||
---|---|---|---|---|---|---|

Proposed algorithm | Compared algorithms | Proposed algorithm | Compared algorithms | |||

F_{1} |
0 | SGA MGA SZGA |
0.44108E-5 0.40842E-5 0.00298E-5 |
- | SGA MGA SZGA |
- - - |

F_{2} |
0.6E-8 | SGA MGA SZGA |
0.99999E-5 0.99999E-5 0 |
-0.37286E-9 | SGA MGA SZGA |
-0.62143E-6 -0.62143E-6 0 |

F_{3} |
0 | SGA MGA SZGA |
0 0 0 |
0 | SGA MGA SZGA |
0 0 0 |

F_{4} |
0.4E-4 | SGA MGA SZGA |
0 0 0.99999E-5 |
0.10053E-3 | SGA MGA SZGA |
0 0 0.251325E-4 |

F_{5} |
0 | SGA MGA SZGA |
0 0 0 |
0 | SGA MGA SZGA |
0 0 0 |

F_{6} |
0 | SGA MGA SZGA |
0.1E-3 0.1E-3 0 |
0 | SGA MGA SZGA |
0.3333E-4 0.3333E-4 0 |

F_{7} |
0 | SGA MGA SZGA |
0.003469 0.002659 0 |
0 | SGA MGA SZGA |
-0.18582E-4 -0.14245E-4 0 |

F_{8} |
0 | SGA MGA SZGA |
0.83E-3 0.17E-3 0 |
0 | SGA MGA SZGA |
0.10358E-3 0.021216E-3 0 |

F_{9} |
0.36284E-6 | SGA MGA SZGA |
1.63594 0.44407E-2 0.13074E-5 |
- | SGA MGA SZGA |
- - - |

F_{10} |
0 | SGA MGA SZGA |
0.15029E-1 0.13565E-1 0.25422E-7 |
- | SGA MGA SZGA |
- - - |

F_{11} |
0.13022E-14 | SGA MGA SZGA |
0.89927 0.16851 0.23033E-3 |
- | SGA MGA SZGA |
- - - |

**Table 4:** The performance assessment.

Finally, the last performance metric is the relative or absolute error. As depicted in **Table 4**, the absolute error of the proposed algorithm is better than that of the other existing algorithms for all test problems except SZGA on test problem F4, and the relative error is likewise superior to that of the other existing algorithms for all test problems except SZGA on *F_{4}*. SZGA seems to be better than the proposed algorithm only for test problem F4.

**Design optimization problems**

Design optimization problems are a very important research area in engineering studies, because real-world design problems require the optimization of relevant engineering applications. In terms of robustness and efficiency, the available methods are still in need of improvement. Therefore, to meet the ever-increasing demands in the design of electrical devices, the proposed FOA-FA has substantiated its capability for solving these applications. Thus, the superiority of the proposed algorithm has been demonstrated on two engineering applications: the linear synchronous motor (LSM) and the air-cored solenoid [17,18].

**Design of LSM:** The linear synchronous motor (LSM) operates
on the same working principle as that of a permanent magnet rotary
D.C. motor [17]. As in a rotary motor there are two parts in a LSM,
one is the set of permanent magnets and the other is the armature that
has conductors carrying current. The permanent magnets produce a
magnetic flux perpendicular to the direction of motion. The flow of
current is in the direction perpendicular to both the direction of the
motion and the direction of the magnetic flux.

To validate the proposed algorithm, it is employed to optimize the geometrical design of the linear electric actuator problem shown in **Figure 5(a)** and **5(b)**. The objective is to maximize the force subject to constraints on heat, radius, saturation, demagnetization and maximum force. There are four design variables: the current in each slot, *x_{1}*, and the dimensions of the slot.

The obtained results for the design of the LSM are demonstrated in **Table 5**, giving the best, worst and mean values of the objective function. Also, **Figure 6** depicts the relation between the maximum force and the number of iterations for FOA before applying FA and after applying FA (i.e., the hybrid algorithm FOA-FA). The results of the proposed FOA-FA are superior to those obtained by Deshpande [17].

| Methods | FOA-FA | FOA | Deshpande [17] |
|---|---|---|---|
| Best objective function value | 17705 | 1.6239E4 | 16192 |
| Worst objective function value | 12139 | 15440 | - |
| Mean objective function value | 1.7043E4 | 1.6198E4 | - |
| Median objective function value | 17703 | 1.6239E4 | - |
| NFE | 4000 | 4500 | - |
| Time (second) | 3.449 | 0.3812 | - |

**Table 5:** Optimal results of the LSM problem.

**Design of an air-cored solenoid:** The optimization problem of a coreless solenoid with rectangular cross section *x_{1} × x_{2}* and a given mean radius is considered [18].

**Table 6** demonstrates the results for the design of the air-cored solenoid, giving the best, worst and mean values of the objective function. On the other hand, **Figure 7** depicts the maximum inductance against the number of iterations for FOA before applying FA and after applying FA (i.e., the hybrid FOA-FA). The result of the proposed FOA-FA is superior to those obtained by [18].

| Methods | FOA-FA | FOA | Barba [18] |
|---|---|---|---|
| Best objective function value | 258.0607 | 258.0291 | 257.197 |
| Worst objective function value | 250.0581 | 244.3131 | - |
| Mean objective function value | 257.7696 | 257.5935 | - |
| Median objective function value | 257.9143 | 258.0084 | - |
| NFE | 3300 | 4477 | - |
| Time (second) | 1.2345 | 0.4378 | - |

**Table 6:** Optimal results of the air-cored solenoid.

**Figure 8** shows the graphical representation of the optimum solutions for the LSM and the air-cored solenoid. It is obvious that the algorithm outperforms the other algorithms for both problems.

We conclude that integrating FOA with FA improves the quality of the found solutions and also guarantees faster convergence to the optimal solution. FOA is employed in the first stage to provide an initial solution to the optimization problem (as close to the optimal solution as possible), followed by FA to enhance the quality of the solution. However, because of its random behavior, FOA may suffer from slow convergence, so we introduced the neighborhood search via the adaptive radius mechanism *Δ(t, R)*, which returns a value in the range *[0, R]*. On the other hand, to reach a result quickly and closer to the optimal solution, and to improve the efficiency of FOA, hybridization between FOA and FA was implemented.

In this subsection, a comparative study has been carried out to assess the performance of the proposed FOA-FA algorithm concerning the hybridization, the closeness to the optimal solution and the computational time. On the one hand, pure algorithms struggle to reach an optimal solution in a reasonable time, and sinking into premature convergence is inevitable for them. Consequently, our hybrid algorithm has a twofold feature: solving NLPPs in a reasonable time and avoiding sinking into premature convergence. On the other hand, the proposed FOA-FA algorithm is highly competitive with the other existing methods in terms of the relative error, the absolute error and the number of evaluations. Accordingly, it provides the facility to search in both the negative and the nonnegative orthants through the adaptive radius mechanism. Finally, the feasibility of using the proposed algorithm to handle NLPPs has been empirically demonstrated.

A novel hybrid algorithm, named FOA-FA, for solving nonlinear programming problems and engineering applications is proposed. The proposed FOA-FA algorithm employs the merits of both the fruit fly optimization algorithm (FOA) and the firefly algorithm (FA) and consists of two phases. The first phase employs a variation on the original FOA with a new adaptive radius mechanism for exploring the whole scope around the fruit flies' locations, overcoming the drawback of the original FOA, whose search is confined to the nonnegative orthant. Moreover, the premature convergence of the original FOA degrades its performance by reducing its search capability, leading to a higher probability of being trapped in a local optimum. Therefore, the second phase incorporates the FA to update the previous best locations of the fruit flies and force FOA to jump out of premature convergence, because of its strong searching ability. The hybrid algorithm speeds up convergence and improves performance.

A careful observation reveals the following benefits of the proposed optimization algorithm.

1) It can efficiently safeguard the solution from sinking into premature convergence by incorporating the FA.

2) It can overcome the lack of exploration of the FOA, whose search roams only in the nonnegative orthant, by introducing the adaptive radius mechanism.

3) It promotes the diversity of solutions by incorporating the FA, which enriches the exploratory capabilities of the proposed algorithm.

4) It is competitive when compared with the other existing algorithms.

5) It can find the global minimum of the problems very efficiently.

6) It accelerates the convergence and improves the algorithm's performance with low computational time.
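Benefits (2) and (6) both hinge on how the scatter radius is adapted. As an illustrative sketch only (the exact adaptive radius mechanism is the one defined in the methodology section, not this formula), a simple schedule shrinks the radius as iterations proceed, trading early exploration for late exploitation:

```python
def adaptive_radius(r_max, r_min, t, t_max):
    """Illustrative linearly decreasing radius schedule (an assumption
    for exposition, not the paper's exact adaptive radius mechanism)."""
    return r_max - (r_max - r_min) * t / t_max

# Early iterations explore widely, later ones refine locally:
# adaptive_radius(1.0, 0.1, 0, 100)   -> 1.0
# adaptive_radius(1.0, 0.1, 100, 100) -> ~0.1
```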

The future work will focus on three directions: (i) the application of FOA-FA to real-world problems; (ii) the extension of the method to multi-objective problems; and (iii) further engineering applications.

