Solving Engineering Optimization Problems with the Simple Constrained Particle Swarm Optimizer

Leticia C. Cagnina and Susana C. Esquivel
LIDIC, Universidad Nacional de San Luis, San Luis, Argentina
E-mail: lcagnina@unsl.edu.ar

Carlos A. Coello Coello
CINVESTAV-IPN, Mexico D. F., Mexico

Keywords: constrained optimization, engineering problems, particle swarm optimization

Received: July 7, 2008

This paper introduces a particle swarm optimization algorithm to solve constrained engineering optimization problems. The proposed approach uses a relatively simple method to handle constraints and a different mechanism to update the velocity and position of each particle. The algorithm is validated on four standard engineering design problems reported in the specialized literature and is compared with algorithms representative of the state of the art in the area. Our results indicate that the proposed scheme is a promising alternative for this sort of problem, since it obtains good results with a low number of objective function evaluations.

Povzetek: The article introduces a swarm optimization algorithm for solving constrained engineering optimization problems.

1 Introduction

Engineering design optimization problems are normally adopted in the specialized literature to show the effectiveness of new constrained optimization algorithms. These nonlinear engineering problems have been investigated by many researchers using different methods: Branch and Bound with SQP [24], Recursive Quadratic Programming [9], a Sequential Linearization Algorithm [20], Integer-discrete-continuous Nonlinear Programming [11], Nonlinear Mixed-discrete Programming [19], Simulated Annealing [27], Genetic Algorithms [26], Evolutionary Programming [8] and Evolution Strategies [25], among many others. Problems of this type normally have mixed (e.g., continuous and discrete) design variables, nonlinear objective functions and nonlinear constraints, some of which may be active at the global optimum. Constraints are very important in engineering design problems, since they are normally imposed in the statement of the problem and are sometimes very hard to satisfy, which makes the search difficult and inefficient.

Particle Swarm Optimization (PSO) is a relatively recent bio-inspired metaheuristic that has been found to be highly competitive in a wide variety of optimization problems. However, its use in engineering optimization problems, and in constrained optimization problems in general, has not been as common as in other areas (e.g., for adjusting weights in a neural network). The approach described in this paper contains a constraint-handling technique as well as a mechanism to update the velocity and position of the particles that differs from the one adopted by the original PSO.

This paper is organized as follows. Section 2 briefly discusses previous related work. Section 3 describes our proposed approach in detail. Section 4 presents the experimental setup adopted and provides an analysis of the results obtained from our empirical study. Our conclusions and some possible paths for future research are provided in Section 5.

2 Literature review

Guo et al. presented a hybrid swarm intelligent algorithm with an improvement in global search reliability. They tested the algorithm with two of the problems adopted here (E02 and E04).
Although they claim that their algorithm is superior in finding the best solutions (in terms of quality and robustness), the solution they report for E02 is greater than its best known value, and the results for E04 are not comparable to ours because they used more constraints in the definition of that problem [13].

Akhtar et al. proposed a method based on a socio-behavioural simulation model. The idea behind this approach is that the leaders of all societies interact among themselves for the improvement of the society. They tested their algorithm on three of the problems adopted here (E01, E02 and E03). The best values reported for these three problems are close to the optimal known values. The number of fitness function evaluations was 19,259 for E01, 19,154 for E02 and 12,630 for E03 [1].

Mahdavi et al. developed an improved harmony search algorithm with a novel method for generating new solutions that enhances the accuracy and the convergence rate of harmony search. They used three of the problems adopted here (E01, E03 and E04) to validate their approach, performing 300,000, 200,000 and 50,000 evaluations, respectively. For E01 and E03, the best values reported are not the best known values because the ranges of some variables in E01 differ from those of the original description of the problem (x4 is out of range), which makes such a solution infeasible under the description adopted here. The value they report for E04 is very close to the best known value [21].

Bernardino et al. hybridized a genetic algorithm with an artificial immune system embedded into its search engine, in order to help move the population into the feasible region. The algorithm was used to solve four of the test problems adopted here (E01, E02, E03 and E04), using 320,000, 80,000, 36,000 and 36,000 evaluations of the objective function, respectively. The best values found for E01, E02 and E04 are close to the best known. For E03, the value reported is better than the best known because one of the decision variables is out of range (x5). The values are, in general, good, although the number of evaluations required to obtain them is higher than that required by other algorithms [4].

Hernandez Aguirre et al. proposed a PSO algorithm with two new perturbation operators aimed at preventing premature convergence, as well as a new neighborhood structure. They used an external file to store some particles and, in that way, extend their life after the adjustment of the tolerance of the constraints. The authors reference three algorithms that obtained good results for the problems adopted in their study: two PSO-based algorithms and a Differential Evolution (DE) algorithm. One of the PSO-based approaches [16] used three of the problems adopted here (E01, E02 and E04), performing 200,000 objective function evaluations. The other PSO-based approach [14] was tested on the same set of problems, and the best known values were reached for E02 and E04 after 30,000 objective function evaluations. The DE algorithm [22] reported good results with 30,000 evaluations for the four problems. This same number of evaluations was performed by the algorithm proposed by Hernandez Aguirre et al., and their results are the best reported so far for the aforementioned problems [15]. For that reason, we used these last two algorithms to compare the performance of our proposed approach.
The DE algorithm [22] will be referred to as "Mezura" and the PSO of [15] as "COPSO".

3 Our proposed approach: SiC-PSO

The particles in our proposed approach (called Simple Constrained Particle Swarm Optimizer, or SiC-PSO) are n-dimensional vectors of values (continuous, discrete or a combination of both), where n is the number of decision variables of the problem to be solved.

Our approach adopts one of the simplest constraint-handling methods currently available. Particles are compared in pairs: 1) if both particles are feasible, we choose the one with the better fitness value; 2) if both particles are infeasible, we choose the one with the lower infeasibility degree; 3) if one particle is feasible and the other is infeasible, we choose the feasible one. This strategy is used when the pbest, gbest and lbest particles are chosen. When an individual is found to be infeasible, the amount of violation (normalized with respect to the largest violation stored so far) is accumulated, so each particle keeps track of the infeasibility degree it has reached up to that moment.

As in the basic PSO [10], our proposed algorithm records the best position found so far by each particle (pbest value) and the best position reached by any particle in the swarm (gbest value); in other words, we adopt the gbest model. However, in previous work we found that the gbest model tends to converge to a local optimum very often [7]. Motivated by this, we proposed a formula to update the velocity that combines the gbest and lbest models [5]. That formula (Eq. 1) is adopted here as well. The lbest model is implemented using a ring topology [17] to compute the neighborhood of each particle. For a neighborhood size of three particles and a swarm of six particles (1,2,3,4,5,6), the neighborhoods considered are: (1,2,3), (2,3,4), (3,4,5), (4,5,6), (5,6,1) and (6,1,2). The formula for updating the particle positions is the same as in the basic PSO and is shown in Eq. 2.

v_{id} = w \left( v_{id} + c_1 r_1 (pb_{id} - p_{id}) + c_2 r_2 (pl_{id} - p_{id}) + c_3 r_3 (pg_d - p_{id}) \right)    (1)

p_{id} = p_{id} + v_{id}    (2)

where v_{id} is the velocity of particle i in dimension d; w is the inertia factor [10], whose goal is to balance global exploration and local exploitation; c_1 is the personal learning factor and c_2, c_3 are the social learning factors; r_1, r_2 and r_3 are three random numbers within the range [0,1]; pb_{id} is the best position reached by particle i; pl_{id} is the best position reached by any particle in the neighborhood of particle i; and pg_d is the best position reached by any particle in the swarm. Finally, p_{id} is the value of particle i in dimension d.

We empirically found that for some difficult functions a previous version of our algorithm could not find good values. The reason was an excessive diversification of solutions, which kept the approach from converging. In SiC-PSO we therefore replace the common position-update formula (Eq. 2) with the update equation presented by Kennedy [18]. In Kennedy's algorithm, the new position of each particle is randomly drawn from a Gaussian distribution whose mean is the average of the best position recorded for the particle and the best position in its neighborhood, and whose standard deviation is the difference between these two values. We adapted that formula by adding the global best (gbest) to the best position of the particle and the best in its neighborhood. We also changed the way in which the standard deviation is determined: we use the pbest and the gbest instead of the lbest proposed by Kennedy. We settled on these changes after several empirical tests with different parameters of the Gaussian random generator. Thus, the position is updated using the following equation:

p_i = N\left( \frac{pb_i + pl_i + pg}{3}, \; |pb_i - pg| \right)    (3)

where pb_i, pl_i and pg are defined as before, and N is the value returned by the Gaussian random generator. SiC-PSO uses both Eq. 3 and Eq. 2 to update the positions of the particles. We use a high probability (0.925) of selecting Eq. 3 over Eq. 2; this probability was chosen after conducting numerous experiments.
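For illustration, the following minimal Python sketch (our own, not the authors' code) puts the pieces of this section together: the feasibility rules used to compare particles, the velocity update of Eq. 1, and the choice between the Gaussian update of Eq. 3 (with probability 0.925) and the standard move of Eq. 2. Names such as `better` and `update_particle` are illustrative, and the bookkeeping that normalizes each violation by the largest one seen so far is simplified to a plain sum.

```python
import random

def violation(g_values):
    """Infeasibility degree: sum of violations of constraints g(x) <= 0.
    (The paper normalizes each violation by the largest one stored so far;
    that bookkeeping is omitted here.)"""
    return sum(max(0.0, g) for g in g_values)

def better(f_a, viol_a, f_b, viol_b):
    """Feasibility rules of Section 3: True if solution A is preferred to B."""
    if viol_a == 0.0 and viol_b == 0.0:
        return f_a < f_b           # both feasible: better objective wins
    if viol_a > 0.0 and viol_b > 0.0:
        return viol_a < viol_b     # both infeasible: lower violation wins
    return viol_a == 0.0           # feasible beats infeasible

def update_particle(p, v, pb, pl, pg,
                    w=0.8, c1=1.8, c2=1.8, c3=1.8, p_gauss=0.925):
    """One SiC-PSO move for a single particle (all arguments are lists)."""
    new_p, new_v = [], []
    for d in range(len(p)):
        r1, r2, r3 = random.random(), random.random(), random.random()
        vid = w * (v[d] + c1 * r1 * (pb[d] - p[d])
                        + c2 * r2 * (pl[d] - p[d])
                        + c3 * r3 * (pg[d] - p[d]))      # Eq. (1)
        if random.random() < p_gauss:
            mean = (pb[d] + pl[d] + pg[d]) / 3.0         # Eq. (3)
            sigma = abs(pb[d] - pg[d])
            pid = random.gauss(mean, sigma)
        else:
            pid = p[d] + vid                             # Eq. (2)
        new_v.append(vid)
        new_p.append(pid)
    return new_p, new_v
```

In the full algorithm these updates are applied to a swarm of eight particles with a ring neighborhood of size three (see Section 4), and the pbest, lbest and gbest records are themselves maintained with the `better` rule above; new positions would also have to be kept within the variable bounds of each problem, a detail the paper does not spell out.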
4 Parameter settings and analysis of results

A set of four engineering design optimization problems was chosen to evaluate the performance of our proposed algorithm. A detailed description of the test problems may be consulted in the appendix at the end of this paper. We performed 30 independent runs per problem, with a total of 24,000 objective function evaluations per run. We also tested the algorithm with 27,000 and 30,000 evaluations of the objective function, but no performance improvements were noticed in those cases. Our algorithm used the following parameters: swarm size = 8 particles, neighborhood size = 3, inertia factor w = 0.8, and personal learning factor c1 and social learning factors c2 and c3 all set to 1.8. These parameter settings were derived empirically after numerous preliminary experiments.

Our results were compared with the best results reported in the specialized literature, obtained by Hernandez Aguirre et al. [15] and Mezura et al. [22]. We reference those results in the tables that follow as "COPSO" and "Mezura", respectively. It is important to remark that the COPSO and Mezura algorithms reached their best values after 30,000 fitness function evaluations, a larger number than that required by our algorithm.

The best values are shown in Table 1, and the means and standard deviations over the 30 runs are shown in Table 2. The three algorithms reached the best known value for E01. For E02, SiC-PSO and COPSO reached the best known value, while Mezura reported a value with a precision of only 4 digits after the decimal point, so the exact value they reached is not known. For E03, SiC-PSO reached the best value, COPSO reached a value slightly worse than ours, and Mezura reached an infeasible value. SiC-PSO and COPSO reached the best value for E04, whereas Mezura reported a value worse than the best known. In general, COPSO obtained the best mean values, except for E03, for which the best mean was found by our algorithm. The lowest standard deviations for E01 and E04 were obtained by COPSO; for E02 and E03, our SiC-PSO obtained the minimum values. Tables 3, 4, 5 and 6 show the solution vector of the best solution reached by SiC-PSO, as well as the values of the constraints, for each of the problems tested.

Prob.   Optimal         SiC-PSO         COPSO           Mezura
E01     1.724852        1.724852        1.724852        1.724852
E02     6,059.714335    6,059.714335    6,059.714335    6,059.7143
E03     NA              2,996.348165    2,996.372448    2,996.348094*
E04     0.012665        0.012665        0.012665        0.012689
*Infeasible solution. NA: not available.
Table 1: Best results obtained by SiC-PSO, COPSO and Mezura.

                Mean                                        St. Dev.
Prob.   SiC-PSO       COPSO         Mezura          SiC-PSO    COPSO      Mezura
E01     2.0574        1.7248        1.7776          0.2154     1.2E-05    8.8E-02
E02     6,092.0498    6,071.0133    6,379.9380      12.1725    15.1011    210.0000
E03     2,996.3482    2,996.4085    2,996.3480*     0.0000     0.0286     0.0000*
E04     0.0131        0.0126        0.0131          4.1E-04    1.2E-06    3.9E-04
*Infeasible solution.
Table 2: Means and standard deviations of the results obtained.

x_1     0.205729
x_2     3.470488
x_3     9.036624
x_4     0.205729
g_1(x)  -1.819E-12
g_2(x)  -0.003721
g_3(x)   0.000000
g_4(x)  -3.432983
g_5(x)  -0.080729
g_6(x)  -0.235540
g_7(x)   0.000000
f(x)     1.724852
Table 3: SiC-PSO solution vector for E01 (welded beam).

x_1     0.812500
x_2     0.437500
x_3     42.098445
x_4     176.636595
g_1(x)  -4.500E-15
g_2(x)  -0.035880
g_3(x)  -1.164E-10
g_4(x)  -63.363404
f(x)     6,059.714335
Table 4: SiC-PSO solution vector for E02 (pressure vessel).

x_1     3.500000
x_2     0.700000
x_3     17
x_4     7.300000
x_5     7.800000
x_6     3.350214
x_7     5.286683
g_1(x)  -0.073915
g_2(x)  -0.197998
g_3(x)  -0.499172
g_4(x)  -0.901471
g_5(x)   0.000000
g_6(x)  -5.000E-16
g_7(x)  -0.702500
g_8(x)  -1.000E-16
g_9(x)  -0.583333
g_10(x) -0.051325
g_11(x) -0.010852
f(x)     2,996.348165
Table 5: SiC-PSO solution vector for E03 (speed reducer).

x_1     0.051583
x_2     0.354190
x_3     11.438675
g_1(x)  -2.000E-16
g_2(x)  -1.000E-16
g_3(x)  -4.048765
g_4(x)  -0.729483
f(x)     0.012665
Table 6: SiC-PSO solution vector for E04 (tension/compression spring).
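As a quick sanity check, the best objective values reported in Tables 3 to 6 can be reproduced by plugging the tabulated solution vectors into the objective functions given in the appendix. The short Python sketch below does exactly that; the function names are ours, and small deviations in the last digits are expected because the published vectors are rounded.

```python
def f_e01(x):   # welded beam
    x1, x2, x3, x4 = x
    return 1.10471 * x1**2 * x2 + 0.04811 * x3 * x4 * (14.0 + x2)

def f_e02(x):   # pressure vessel
    x1, x2, x3, x4 = x
    return (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3**2
            + 3.1661 * x1**2 * x4 + 19.84 * x1**2 * x3)

def f_e03(x):   # speed reducer
    x1, x2, x3, x4, x5, x6, x7 = x
    return (0.7854 * x1 * x2**2 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
            - 1.508 * x1 * (x6**2 + x7**2)
            + 7.4777 * (x6**3 + x7**3)
            + 0.7854 * (x4 * x6**2 + x5 * x7**2))

def f_e04(x):   # tension/compression spring
    x1, x2, x3 = x
    return (x3 + 2.0) * x2 * x1**2

print(f_e01([0.205729, 3.470488, 9.036624, 0.205729]))       # ~1.724852
print(f_e02([0.812500, 0.437500, 42.098445, 176.636595]))    # ~6059.714335
print(f_e03([3.5, 0.7, 17, 7.3, 7.8, 3.350214, 5.286683]))   # ~2996.348165
print(f_e04([0.051583, 0.354190, 11.438675]))                # ~0.012665
```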
5 Conclusions and Future Work

We have presented a simple PSO algorithm (SiC-PSO) for constrained optimization problems. The proposed approach uses a simple constraint-handling mechanism, a ring topology for implementing the lbest model, and a novel formula to update the positions of the particles. SiC-PSO showed very good performance when applied to several engineering design optimization problems. We compared our results with those obtained by two algorithms that had previously been found to perform well on the same problems; both are more sophisticated than our SiC-PSO. Our algorithm obtained the optimal values for each of the test problems studied, while performing a lower number of objective function evaluations. Also, the performance of our approach with respect to the mean and standard deviation is comparable with that of the other algorithms. Thus, we consider our approach to be a viable choice for solving constrained engineering optimization problems, due to its simplicity, speed and reliability. As part of our future work, we are interested in exploring other PSO models and in performing a more detailed statistical analysis of the performance of our proposed approach.

Appendix: Engineering problems

This appendix gives the formulation of the engineering design problems used to test the proposed algorithm.

E01: Welded beam design optimization problem

The problem is to design a welded beam for minimum cost, subject to some constraints [23]. Figure 1 shows the welded beam structure, which consists of a beam A and the weld required to hold it to member B. The objective is to find the minimum fabrication cost, considering four design variables x_1, x_2, x_3, x_4, and constraints on the shear stress \tau, the bending stress in the beam \sigma, the buckling load on the bar P_c, and the end deflection of the beam \delta. The optimization model is summarized as follows:

Minimize:
f(x) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14.0 + x_2)

subject to:
g_1(x) = \tau(x) - 13{,}600 \le 0
g_2(x) = \sigma(x) - 30{,}000 \le 0
g_3(x) = x_1 - x_4 \le 0
g_4(x) = 0.10471 x_1^2 + 0.04811 x_3 x_4 (14.0 + x_2) - 5.0 \le 0
g_5(x) = 0.125 - x_1 \le 0
g_6(x) = \delta(x) - 0.25 \le 0
g_7(x) = 6{,}000 - P_c(x) \le 0

with:
\tau(x) = \sqrt{(\tau')^2 + 2\tau'\tau'' \frac{x_2}{2R} + (\tau'')^2}
\tau' = \frac{6{,}000}{\sqrt{2}\, x_1 x_2}
\tau'' = \frac{M R}{J}
M = 6{,}000 \left(14 + \frac{x_2}{2}\right)
R = \sqrt{\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2}
J = 2\left\{\sqrt{2}\, x_1 x_2 \left[\frac{x_2^2}{12} + \left(\frac{x_1 + x_3}{2}\right)^2\right]\right\}
\sigma(x) = \frac{504{,}000}{x_4 x_3^2}
\delta(x) = \frac{65{,}856{,}000}{(30 \times 10^6)\, x_4 x_3^3}
P_c(x) = \frac{4.013 (30 \times 10^6) \sqrt{x_3^2 x_4^6 / 36}}{196} \left(1 - \frac{x_3}{28}\sqrt{\frac{30 \times 10^6}{4 (12 \times 10^6)}}\right)

with 0.1 \le x_1, x_4 \le 2.0 and 0.1 \le x_2, x_3 \le 10.0.

Best solution: x* = (0.205730, 3.470489, 9.036624, 0.205729), where f(x*) = 1.724852.

Figure 1: Welded beam.
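As a worked check of the formulation above, the following Python sketch (our own, not the authors' code) evaluates the welded-beam constraints at the reported best solution; the auxiliary expressions for \tau, \sigma, \delta and P_c follow the reconstruction given above, with the constants 6,000, 14, 30x10^6 and 12x10^6 that appear in those formulas.

```python
import math

P, L, E, G = 6000.0, 14.0, 30e6, 12e6   # load, beam length, Young's and shear moduli

def welded_beam_constraints(x):
    """Constraint values g_i(x), expected to satisfy g_i(x) <= 0 (E01)."""
    x1, x2, x3, x4 = x
    tau_p = P / (math.sqrt(2.0) * x1 * x2)                       # tau'
    M = P * (L + x2 / 2.0)
    R = math.sqrt(x2**2 / 4.0 + ((x1 + x3) / 2.0) ** 2)
    J = 2.0 * math.sqrt(2.0) * x1 * x2 * (x2**2 / 12.0 + ((x1 + x3) / 2.0) ** 2)
    tau_pp = M * R / J                                           # tau''
    tau = math.sqrt(tau_p**2 + 2.0 * tau_p * tau_pp * x2 / (2.0 * R) + tau_pp**2)
    sigma = 504000.0 / (x4 * x3**2)
    delta = 65856000.0 / (E * x4 * x3**3)
    Pc = (4.013 * E * math.sqrt(x3**2 * x4**6 / 36.0) / L**2
          * (1.0 - x3 / (2.0 * L) * math.sqrt(E / (4.0 * G))))
    return [tau - 13600.0,                                            # g1
            sigma - 30000.0,                                          # g2
            x1 - x4,                                                  # g3
            0.10471 * x1**2 + 0.04811 * x3 * x4 * (14.0 + x2) - 5.0,  # g4
            0.125 - x1,                                               # g5
            delta - 0.25,                                             # g6
            6000.0 - Pc]                                              # g7

# g1 (shear stress) and g7 (buckling load) come out essentially active,
# matching the near-zero entries of Table 3; tiny residues of either sign
# are due to the rounding of the published solution vector.
print(welded_beam_constraints([0.205729, 3.470488, 9.036624, 0.205729]))
```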
E02: Pressure vessel design optimization problem

A compressed air storage tank with a working pressure of 3,000 psi and a minimum volume of 750 ft^3 is to be designed. A cylindrical vessel is capped at both ends by hemispherical heads (see Fig. 2). Using rolled steel plate, the shell is made in two halves that are joined by two longitudinal welds to form a cylinder. The objective is to minimize the total cost, including the cost of the materials, forming and welding [24]. The design variables are: the thickness of the shell x_1, the thickness of the head x_2, the inner radius x_3, and the length of the cylindrical section of the vessel x_4. The variables x_1 and x_2 are discrete values which are integer multiples of 0.0625 inch. The formal statement is then:

Minimize:
f(x) = 0.6224 x_1 x_3 x_4 + 1.7781 x_2 x_3^2 + 3.1661 x_1^2 x_4 + 19.84 x_1^2 x_3

subject to:
g_1(x) = -x_1 + 0.0193 x_3 \le 0
g_2(x) = -x_2 + 0.00954 x_3 \le 0
g_3(x) = -\pi x_3^2 x_4 - \frac{4}{3}\pi x_3^3 + 1{,}296{,}000 \le 0
g_4(x) = x_4 - 240 \le 0

with 1 \times 0.0625 \le x_1, x_2 \le 99 \times 0.0625 and 10.0 \le x_3, x_4 \le 200.0.

Best solution: x* = (0.8125, 0.4375, 42.098446, 176.636596), where f(x*) = 6,059.714335.

Figure 2: Pressure vessel.
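The pressure-vessel formulation translates directly into code. The sketch below (our own names) evaluates the objective and the four constraints at the reported best solution, and also shows one simple way to keep x_1 and x_2 on the 0.0625-inch grid; that snapping helper is an implementation choice for illustration, not something the paper prescribes.

```python
import math

def pressure_vessel(x):
    """Objective value and constraints g_i(x) <= 0 for E02."""
    x1, x2, x3, x4 = x
    f = (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3**2
         + 3.1661 * x1**2 * x4 + 19.84 * x1**2 * x3)
    g = [-x1 + 0.0193 * x3,                                                  # g1
         -x2 + 0.00954 * x3,                                                 # g2
         -math.pi * x3**2 * x4 - (4.0 / 3.0) * math.pi * x3**3 + 1296000.0,  # g3
         x4 - 240.0]                                                         # g4
    return f, g

def snap_thickness(value):
    """Round a candidate x1 or x2 to the nearest allowed multiple of 0.0625 in.
    One simple way to enforce the discreteness inside a continuous optimizer;
    an illustrative choice, not taken from the paper."""
    return min(max(round(value / 0.0625), 1), 99) * 0.0625

f, g = pressure_vessel([0.8125, 0.4375, 42.098445, 176.636595])
print(f)   # ~6059.714335
print(g)   # g1 and g3 are essentially active (close to zero), cf. Table 4
print(snap_thickness(0.81))   # -> 0.8125
```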
E03: Speed reducer design optimization problem

The design of the speed reducer [12] shown in Fig. 3 is considered, with the face width x_1, the module of the teeth x_2, the number of teeth on the pinion x_3, the length of the first shaft between bearings x_4, the length of the second shaft between bearings x_5, the diameter of the first shaft x_6, and the diameter of the second shaft x_7 (all variables are continuous except x_3, which is integer). The weight of the speed reducer is to be minimized subject to constraints on the bending stress of the gear teeth, the surface stress, the transverse deflections of the shafts, and the stresses in the shafts. The problem is:

Minimize:
f(x) = 0.7854 x_1 x_2^2 (3.3333 x_3^2 + 14.9334 x_3 - 43.0934) - 1.508 x_1 (x_6^2 + x_7^2) + 7.4777 (x_6^3 + x_7^3) + 0.7854 (x_4 x_6^2 + x_5 x_7^2)

subject to:
g_1(x) = \frac{27}{x_1 x_2^2 x_3} - 1 \le 0
g_2(x) = \frac{397.5}{x_1 x_2^2 x_3^2} - 1 \le 0
g_3(x) = \frac{1.93 x_4^3}{x_2 x_3 x_6^4} - 1 \le 0
g_4(x) = \frac{1.93 x_5^3}{x_2 x_3 x_7^4} - 1 \le 0
g_5(x) = \frac{1}{110 x_6^3} \sqrt{\left(\frac{745.0 x_4}{x_2 x_3}\right)^2 + 16.9 \times 10^6} - 1 \le 0
g_6(x) = \frac{1}{85 x_7^3} \sqrt{\left(\frac{745.0 x_5}{x_2 x_3}\right)^2 + 157.5 \times 10^6} - 1 \le 0
g_7(x) = \frac{x_2 x_3}{40} - 1 \le 0
g_8(x) = \frac{5 x_2}{x_1} - 1 \le 0
g_9(x) = \frac{x_1}{12 x_2} - 1 \le 0
g_{10}(x) = \frac{1.5 x_6 + 1.9}{x_4} - 1 \le 0
g_{11}(x) = \frac{1.1 x_7 + 1.9}{x_5} - 1 \le 0

with 2.6 \le x_1 \le 3.6, 0.7 \le x_2 \le 0.8, 17 \le x_3 \le 28, 7.3 \le x_4 \le 8.3, 7.8 \le x_5 \le 8.3, 2.9 \le x_6 \le 3.9 and 5.0 \le x_7 \le 5.5.

Best solution: x* = (3.500000, 0.7, 17, 7.300000, 7.800000, 3.350214, 5.286683), where f(x*) = 2,996.348165.

Figure 3: Speed reducer.

E04: Tension/compression spring design optimization problem

This problem [2, 3] minimizes the weight of a tension/compression spring (Fig. 4), subject to constraints on the minimum deflection, the shear stress, the surge frequency, and limits on the outside diameter and on the design variables. There are three design variables: the wire diameter x_1, the mean coil diameter x_2, and the number of active coils x_3. The mathematical formulation of this problem is:

Minimize:
f(x) = (x_3 + 2) x_2 x_1^2

subject to:
g_1(x) = 1 - \frac{x_2^3 x_3}{71{,}785\, x_1^4} \le 0
g_2(x) = \frac{4 x_2^2 - x_1 x_2}{12{,}566 (x_2 x_1^3 - x_1^4)} + \frac{1}{5{,}108\, x_1^2} - 1 \le 0
g_3(x) = 1 - \frac{140.45 x_1}{x_2^2 x_3} \le 0
g_4(x) = \frac{x_2 + x_1}{1.5} - 1 \le 0

with 0.05 \le x_1 \le 2.0, 0.25 \le x_2 \le 1.3 and 2.0 \le x_3 \le 15.0.

Best solution: x* = (0.051690, 0.356750, 11.287126), where f(x*) = 0.012665.

Figure 4: Tension/compression spring.
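To tie the appendix back to the constraint-handling rules of Section 3, the sketch below (our own names and simplifications) evaluates the spring constraints and the resulting infeasibility degree for the reported best solution and for an arbitrarily chosen infeasible point; the normalization of the violations used by SiC-PSO is again omitted.

```python
def spring_constraints(x):
    """Constraints g_i(x) <= 0 for the tension/compression spring (E04)."""
    x1, x2, x3 = x
    return [1.0 - x2**3 * x3 / (71785.0 * x1**4),                          # g1
            (4.0 * x2**2 - x1 * x2) / (12566.0 * (x2 * x1**3 - x1**4))
            + 1.0 / (5108.0 * x1**2) - 1.0,                                # g2
            1.0 - 140.45 * x1 / (x2**2 * x3),                              # g3
            (x2 + x1) / 1.5 - 1.0]                                         # g4

def infeasibility(g_values):
    """Sum of positive violations, as in the feasibility rules of Section 3
    (the normalization by the largest violation seen so far is omitted)."""
    return sum(max(0.0, g) for g in g_values)

best = [0.051690, 0.356750, 11.287126]   # best solution reported above
other = [0.05, 0.30, 11.0]               # an arbitrary point used for contrast
print(infeasibility(spring_constraints(best)))    # 0.0: all constraints hold
print(infeasibility(spring_constraints(other)))   # positive: g1 is violated
```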
Acknowledgment

The first and second authors acknowledge support from ANPCyT (National Agency to Promote Science and Technology), PICT 2005, and the Universidad Nacional de San Luis. The third author acknowledges support from CONACyT project no. 45683-Y.

References

[1] S. Akhtar, K. Tai and T. Ray. A Socio-behavioural Simulation Model for Engineering Design Optimization. Eng. Optimiz., 34(4):341-354, 2002.
[2] J. Arora. Introduction to Optimum Design. McGraw-Hill, 1989.
[3] A. Belegundu. A Study of Mathematical Programming Methods for Structural Optimization. PhD thesis, Department of Civil and Environmental Engineering, University of Iowa, Iowa, 1982.
[4] H. Bernardino, H. Barbosa and A. Lemonge. A Hybrid Genetic Algorithm for Constrained Optimization Problems in Mechanical Engineering. In Proc. IEEE Congress on Evolutionary Computation (CEC 2007), Singapore, 2007, pages 646-653.
[5] L. Cagnina, S. Esquivel and C. Coello Coello. A Particle Swarm Optimizer for Constrained Numerical Optimization. In Proc. 9th International Conference on Parallel Problem Solving from Nature (PPSN IX), Reykjavik, Iceland, 2006, pages 910-919.
[6] L. Cagnina, S. Esquivel and C. Coello Coello. A Bi-population PSO with a Shake-Mechanism for Solving Constrained Numerical Optimization. In Proc. IEEE Congress on Evolutionary Computation (CEC 2007), Singapore, 2007, pages 670-676.
[7] L. Cagnina, S. Esquivel and R. Gallard. Particle Swarm Optimization for Sequencing Problems: a Case Study. In Proc. IEEE Congress on Evolutionary Computation (CEC 2004), Portland, Oregon, USA, 2004, pages 536-541.
[8] Y. Cao and Q. Wu. Mechanical Design Optimization by Mixed-variable Evolutionary Programming. In Proc. 1997 IEEE International Conference on Evolutionary Computation, Indianapolis, Indiana, USA, 1997, pages 443-446.
[9] J. Cha and R. Mayne. Optimization with Discrete Variables via Recursive Quadratic Programming: Part II. J. Mech. Transm.-T. ASME, 111(1):130-136, 1989.
[10] R. Eberhart and Y. Shi. A Modified Particle Swarm Optimizer. In Proc. IEEE International Conference on Evolutionary Computation, Anchorage, Alaska, USA, 1998, pages 69-73.
[11] J. Fu, R. Fenton and W. Cleghorn. A Mixed Integer-discrete-continuous Programming Method and its Applications to Engineering Design Optimization. Eng. Optimiz., 17(4):263-280, 1991.
[12] J. Golinski. An Adaptive Optimization System Applied to Machine Synthesis. Mech. Mach. Theory, 8(4):419-436, 1973.
[13] C. Guo, J. Hu, B. Ye and Y. Cao. Swarm Intelligence for Mixed-variable Design Optimization. J. Zhejiang University Science, 5(7):851-860, 2004.
[14] S. He, E. Prempain and Q. Wu. An Improved Particle Swarm Optimizer for Mechanical Design Optimization Problems. Eng. Optimiz., 36(5):585-605, 2004.
[15] A. Hernandez Aguirre, A. Muñoz Zavala, E. Villa Diharce and S. Botello Rionda. COPSO: Constrained Optimization via PSO Algorithm. Technical Report No. I-07-04/22-02-2007, Center for Research in Mathematics (CIMAT), 2007.
[16] X. Hu, R. Eberhart and Y. Shi. Engineering Optimization with Particle Swarm. In Proc. IEEE Swarm Intelligence Symposium, Indianapolis, Indiana, USA, 2003, pages 53-57.
[17] J. Kennedy. Small Worlds and Mega-Minds: Effects of Neighborhood Topology on Particle Swarm Performance. In Proc. IEEE Congress on Evolutionary Computation (CEC 1999), Washington, DC, USA, 1999, pages 1931-1938.
[18] J. Kennedy and R. Eberhart. Bare Bones Particle Swarms. In Proc. IEEE Swarm Intelligence Symposium, Indianapolis, Indiana, USA, 2003, pages 80-89.
[19] H. Li and T. Chou. A Global Approach of Nonlinear Mixed Discrete Programming in Design Optimization. Eng. Optimiz., 22(2):109-122, 1993.
[20] H. Loh and P. Papalambros. A Sequential Linearization Approach for Solving Mixed-discrete Nonlinear Design Optimization Problems. J. Mech. Des.-T. ASME, 113(3):325-334, 1991.
[21] M. Mahdavi, M. Fesanghary and E. Damangir. An Improved Harmony Search Algorithm for Solving Optimization Problems. Appl. Math. Comput., 188(2):1567-1579, 2007.
[22] E. Mezura and C. Coello. Useful Infeasible Solutions in Engineering Optimization with Evolutionary Algorithms. Lect. Notes Comput. Sc., 3789:652-662, 2005.
[23] K. Ragsdell and D. Phillips. Optimal Design of a Class of Welded Structures Using Geometric Programming. J. Eng. Ind., 98(3):1021-1025, 1976.
[24] E. Sandgren. Nonlinear Integer and Discrete Programming in Mechanical Design Optimization. J. Mech. Des.-T. ASME, 112(2):223-229, 1990.
[25] G. Thierauf and J. Cai. Evolution Strategies - Parallelization and Applications in Engineering Optimization. In Parallel and Distributed Processing for Computational Mechanics, B.H.V. Topping (ed.), Saxe-Coburg Publications, 2000, pages 329-349.
[26] S. Wu and T. Chou. Genetic Algorithms for Nonlinear Mixed Discrete-integer Optimization Problems via Meta-genetic Parameter Optimization. Eng. Optimiz., 24(2):137-159, 1995.
[27] C. Zhang and H. Wang. Mixed-discrete Nonlinear Optimization with Simulated Annealing. Eng. Optimiz., 21(4):277-291, 1993.