Paper received: 28.2.2008  Paper accepted: 15.5.2008

Coupling Functions Treatment in a Bi-Level Optimization Process

Benoît Guédas* - Philippe Dépincé
Ecole Centrale de Nantes, Institut de Recherche en Communication et Cybernétique de Nantes, France
*Corr. Author's Address: Ecole Centrale de Nantes, Institut de Recherche en Communication et Cybernétique de Nantes, 1, rue de la Noë - BP 92 101, 44321 Nantes CEDEX 3, France, benoit.guedas@irccyn.ec-nantes.fr

The optimization of complex engineering systems is often a mix of a multi-objective optimization process - each service or discipline has to fulfill several objectives - and a multidisciplinary optimization process - several disciplines are required. The disciplines are bound to each other: outputs of one discipline are used as inputs of other disciplines (the coupling variables), and a discipline has neither direct access to nor knowledge of the whole set of variables. An approximation of the coupling variables is therefore needed. The Collaborative Optimization Strategy for Multi-Objective Systems (COSMOS) has been developed at IRCCyN to perform multi-objective and multidisciplinary optimization simultaneously while guaranteeing discipline autonomy. It uses a simple method for the approximation of the coupling variables and assumes that the quality of this approximation increases as the algorithm converges towards optimum solutions. In this paper, experiments are carried out to verify whether this assumption holds. We show that satisfying results are obtained on some test problems, but the limits of the method are also pointed out.
© 2008 Journal of Mechanical Engineering. All rights reserved.
Keywords: multidisciplinary optimization, multi-objective optimization, genetic algorithms, coupled problems

1 INTRODUCTION

Nowadays, the designer has to face not only the continuously growing complexity of engineering problems but also an increasing economic competition, which has led to a specialization and a distribution of knowledge, expertise, tools and work sites. Consequently, multi-objective optimization (MOO) and multidisciplinary design optimization (MDO) are more and more widely used to provide one solution or an optimal set of solutions. While single-discipline optimization is mature, the design and optimization of complex systems (more than one discipline) is still quite young. Since the white papers published in 1991 and 1998 by the AIAA [18] and [12], a lot of research has been done in the multidisciplinary optimization domain: at first centered on the aerospace industry, these methods are now used in other kinds of enterprises (automotive, ship building, etc.), which expect from such a tool a way to improve their products and their organization. Alexandrov and Lewis [1] defined MDO as a "systematic approach to optimization of complex coupled engineering systems where "multidisciplinary" refers to the different aspects that must be included in the design problem".

A classical way to describe a multidisciplinary problem is presented in Figure 1. In a multidisciplinary problem, each sub-system (discipline) has its own design variables, objective and constraint functions. Some design variables, common to at least two sub-systems, are called common variables. Disciplinary outputs from one discipline may be needed to evaluate another sub-system: in this case there is a coupling between the two disciplines and these variables are called coupling variables [8] and [5]. The third variable type, state variables, are internal variables particular to one discipline: they represent conditions that have to be satisfied within the discipline. In each discipline an evaluation/analysis is conducted that computes the outputs: functions, constraints and, if needed, coupling variables.
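To make this variable taxonomy concrete, the sketch below models what a discipline exposes to the rest of the system: it receives the common variables, its own disciplinary variables and the coupling variables coming from the other disciplines, and it returns objectives, constraints and outgoing coupling variables. The class and field names are ours, introduced only for illustration; they are not part of any MDO framework discussed in this paper.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Sequence

@dataclass
class DisciplineOutput:
    objectives: Sequence[float]       # values of the disciplinary objective functions
    constraints: Sequence[float]      # constraint values (e.g. g <= 0 convention)
    coupling_out: Dict[str, float]    # coupling variables sent to other disciplines

@dataclass
class Discipline:
    name: str
    # The disciplinary analysis: it computes the state variables internally and
    # returns objectives, constraints and outgoing coupling variables.
    analysis: Callable[[Sequence[float], Sequence[float], Dict[str, float]], DisciplineOutput]

    def evaluate(self, x_common, x_local, coupling_in):
        return self.analysis(x_common, x_local, coupling_in)
```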
Frequently, complex systems are non-hierarchical, implying that there is no reason to process the optimization of one sub-system before another [5]. In the optimization process of such systems, the presence of coupling functions and their treatment constitute a real challenge for researchers. Several methods have been designed to deal with coupled problems (MDF, IDF, AAO, CO, CSSO, BLISS, etc. - Section 2), but they are not suited for the extended enterprise context, where disciplines and tools are distributed over multiple sites. The main drawbacks of these approaches are i) the unique solution given to the designer and ii) their mathematical formulation, which is not always adapted to the industrial context: most of these methods centralize the optimization at the system level while it should be handled at the sub-system level. The Collaborative Optimization Strategy for Multi-Objective Systems (COSMOS) [19] is a method aiming to fill the gap between classical MDO methods and industrial needs by i) taking advantage of multi-objective genetic algorithms to provide the designer with a set of optimum solutions and ii) giving more autonomy to the disciplines.

Fig. 1. A fully coupled disciplines system

The next part of this paper presents the MDO methods: the classical ones, based on exact mathematical approaches, and some that try to mimic an engineering process and are based on multi-objective genetic algorithms. The third part introduces the problems caused by the autonomy of the disciplines in the resolution process. Parts 4 and 5 describe the test examples and the results obtained with the COSMOS method on coupled problems. Finally, some conclusions and perspectives are given.

2 COUPLING FUNCTIONS IN MULTIDISCIPLINARY OPTIMIZATION METHODS

The ideal optimization process to solve a multidisciplinary optimization problem would consist in separating the analysis phase from the optimization phase: the multidisciplinary analysis (MDA) computes the set of feasible solutions, then the optimization selects the optima from this set. This approach is not possible in practice because of the high computational cost required to determine the whole set of feasible solutions. Moreover, in most problems the disciplines cannot easily exchange data with each other.

2.1 A Multi-Objective Coupled Problem

Let us consider a simplified model of two disciplines D1 and D2 (it can be generalized to n disciplines). Each discipline Di has a state equation Ei(xc, xi, yj, ui) = 0. We will consider that there is an implicit function ei : Xc × Xi × Yj → Ui, where xc ∈ Xc is the vector of design variables shared among the disciplines (common variables), xi ∈ Xi is the vector of disciplinary variables, yj ∈ Yj is a parameter given by the discipline Dj, and ui ∈ Ui is the vector of state variables given by ei. The coupling variables yi returned by discipline Di are computed by its coupling function li : yi = li(xc, xi, ui). This results in the following coupled problem:

D1: u1 = e1(xc, x1, y2),  y1 = l1(xc, x1, u1)
D2: u2 = e2(xc, x2, y1),  y2 = l2(xc, x2, u2)     (1).

Finding the feasible solutions to this problem is called multidisciplinary analysis (MDA). We call A the set of feasible solutions of the problem (the multidisciplinary feasible solutions):

A = {(xc, x1, x2, u1, u2) ∈ Xc × X1 × X2 × U1 × U2 | u1 = e1(xc, x1, l2(xc, x2, u2)) ∧ u2 = e2(xc, x2, l1(xc, x1, u1))}     (2).
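To illustrate what the MDA has to do, the sketch below solves a toy instance of the coupled problem (1)-(2) by fixed-point iteration. The coupling functions l1 and l2 are invented for the example, and the state equations are assumed to be folded into them (as in the simplified form (5) used later); this is an illustrative sketch, not the analysis used in the paper.

```python
# Toy multidisciplinary analysis (MDA) by fixed-point iteration: iterate the
# coupling equations until (y1, y2) is self-consistent. The coupling functions
# below are invented for illustration only.

def l1(xc, x1, y2):
    return 0.5 * xc + x1 + 0.2 * y2

def l2(xc, x2, y1):
    return xc - 0.3 * y1 + x2

def mda_fixed_point(xc, x1, x2, tol=1e-10, max_iter=100):
    y1, y2 = 0.0, 0.0                   # initial guess for the coupling variables
    for _ in range(max_iter):
        y1_new = l1(xc, x1, y2)
        y2_new = l2(xc, x2, y1_new)     # Gauss-Seidel style update
        if abs(y1_new - y1) < tol and abs(y2_new - y2) < tol:
            return y1_new, y2_new       # a multidisciplinary feasible point of A
        y1, y2 = y1_new, y2_new
    raise RuntimeError("fixed-point iteration did not converge")

print(mda_fixed_point(xc=1.0, x1=0.5, x2=-0.2))
```

As discussed in Section 2.2.1, such a fixed-point loop may fail to converge, or may miss solutions, when the coupling functions are less well behaved.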
Multidisciplinary optimization consists in adding an optimization problem on top of the satisfaction of the coupled problem. We will focus on the case where each discipline has its own optimization problem, which is to minimize an objective function fi : Xc × Xi × Ui → R^pi, with pi the number of objectives of discipline i. A standard formulation of a two-discipline multidisciplinary optimization problem is then:

D1: min over (xc, x1) of f1(xc, x1, u1)
    subject to u1 = e1(xc, x1, y2),  y1 = l1(xc, x1, u1)
D2: min over (xc, x2) of f2(xc, x2, u2)
    subject to u2 = e2(xc, x2, y1),  y2 = l2(xc, x2, u2)     (3).

The set S of solutions to the problem is then:

S = {s ∈ A | there is no s' ∈ A such that (f1(s'), f2(s')) p (f1(s), f2(s))}     (4),

where, for s = (xc, x1, x2, u1, u2) ∈ A, f1(s) stands for f1(xc, x1, u1) and f2(s) for f2(xc, x2, u2), and p is the order relation of Pareto dominance: given a = (a1, ..., an) and b = (b1, ..., bn),

a p b ⇔ (∀ i ∈ {1, ..., n}, ai ≤ bi) ∧ (∃ j ∈ {1, ..., n}, aj < bj).

Unfortunately, such a set of optimum solutions is intractable under the hypothesis that the variables are partitioned among the disciplines. Indeed, a discipline i cannot access the variables of another discipline j and does not know its coupling function. Most of the MDO methods reported in the literature have been developed specifically for single-objective problems with continuous variables and differentiable objectives. These MDO methods are classified into two groups: mono-level and bi-level. The single-level (mono-level) group implies optimization at the supervisor level only. The bi-level group allows each discipline to manage its own optimization with respect to its design variables.

Multidisciplinary problems are often written in a simpler form, where the state variables are directly given to the other discipline, so that they are also coupling variables:

D1: min over (xc, x1) of f1(xc, x1, y1)
    subject to y1 = l1(xc, x1, y2)
D2: min over (xc, x2) of f2(xc, x2, y2)
    subject to y2 = l2(xc, x2, y1)     (5).

This notation will be used in the next section to present some multidisciplinary optimization methods.

2.2 Mono-Level Approaches

The mono-level family contains three multidisciplinary methods: Multidisciplinary Feasible (MDF), All-At-Once (AAO) and Individual Discipline Feasible (IDF) [16], [1], [14] and [6]. These formulations handle the dependency introduced by the coupling functions in different ways. Dennis et al. [9] proposed an extension of all the above methodologies to the optimization of systems of systems.

2.2.1 Multidisciplinary Feasible (MDF)

MDF is the most widely used approach to solve an MDO problem. A complete multidisciplinary analysis is performed for each choice of the design variables made by the optimizer. This is conceptually very simple, and once all the disciplines are coupled to form a single multidisciplinary analysis module, the same techniques as in single-discipline optimization can be used. In this formulation (6), the optimization variables are the design variables xc, x1 and x2, the optimization is global, each iteration gives a feasible solution and the disciplines remain independent. The drawbacks are the computational effort and the lack of guarantee that the coupling variables converge to a feasible solution. At each optimization step, the set of feasible solutions - described as A in (2) - must be computed, so the system of coupling equations must be solved. A fixed-point iteration (FPI) algorithm, often used in this case, may not converge if the functions are not convex and may miss hidden solutions [3].

2.2.2 All at Once (AAO)

All the variables (design, coupling, state) are considered as design variables and the analysis equations of the system become constraints. Hence, CPU-consuming iterative analyses of the sub-systems are skipped, but the dimension of the design space increases. The problem formulation can be expressed as in (7); the optimization variables are xc, x1, x2, y1 and y2.
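A minimal sketch of the AAO idea on a toy instance is given below: the coupling variables become free optimization variables and the analysis equations become equality constraints. The objective and the coupling functions are invented for the example, and a single scalarized objective is used (the paper's setting is multi-objective); this is not the formulation used by the authors.

```python
import numpy as np
from scipy.optimize import minimize

def l1(xc, x1, y2):
    return 0.5 * xc + x1 + 0.2 * y2

def l2(xc, x2, y1):
    return xc - 0.3 * y1 + x2

# Decision vector v = (xc, x1, x2, y1, y2): y1 and y2 are optimized like design
# variables, and interdisciplinary consistency is enforced by the constraints.
def objective(v):
    xc, x1, x2, y1, y2 = v
    return (xc - 1.0) ** 2 + (x1 - y1) ** 2 + (x2 + y2) ** 2   # toy objective

constraints = [
    {"type": "eq", "fun": lambda v: v[3] - l1(v[0], v[1], v[4])},   # y1 = l1(xc, x1, y2)
    {"type": "eq", "fun": lambda v: v[4] - l2(v[0], v[2], v[3])},   # y2 = l2(xc, x2, y1)
]

result = minimize(objective, x0=np.zeros(5), method="SLSQP", constraints=constraints)
print(result.x)   # the point is multidisciplinary feasible only at convergence
```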
2.2.3 Individual Discipline Feasible (IDF)

In this methodology, the coupling variables are added to the design variables and some auxiliary variables, z, are introduced to decouple the disciplines. Equality constraints are added to enforce compatibility between the coupling and the auxiliary variables. This substitution relaxes the coupling between disciplines: at a given iteration, a point may not fulfill all the coupling relations. The evaluation effort within IDF is therefore a compromise between AAO and MDF: at each point, each discipline is feasible, but the whole system is only feasible at the end of the optimization. In formulation (8), the optimization variables are xc, x1, x2, z1 and z2.

In mono-level approaches, the optimization problem is seen as a single global problem and all the variables are accessible. Multi-level approaches give more autonomy to the disciplines by allowing them to solve their own optimization problem locally.

2.3 Multi-Level Approaches

In bi-level optimization methods, the original optimization problem is divided into optimizations at both the system and the sub-system levels. Coordination between the sub-systems is managed by an optimizer in charge of solving the inconsistencies between the disciplines. Several strategies have been developed; the most discussed are Collaborative Optimization (CO) [7] and Concurrent SubSpace Optimization (CSSO) [20]. Other methods such as Bi-Level Integrated System Synthesis (BLISS) [15], Analytical Target Cascading (ATC) [2] or Physical Programming (PP) [17] have been developed but will not be detailed in this paper. The first two belong to the Discipline Feasible Constraint (DFC) group. The primary features of these architectures are: i) the use of heterogeneous hardware or software, specific to each domain, to solve the subspace optimization problems; ii) the decomposition keeps the domain-specific constraint information in the sub-problems; iii) the system leaves most of the design decisions (selection of local variables) to the disciplinary groups that understand the local formulation.

2.3.1 Collaborative Optimization (CO)

In CO, subspace optimizers are integrated with each subsystem. Through sub-system optimization, each discipline can control its own set of local design variables and is in charge of satisfying its own domain-specific constraints. Explicit knowledge of the other groups' constraints or design variables is not required. The objective of each subsystem optimizer is to agree with the other groups upon the values of the interdisciplinary variables. A system-level optimizer is employed to coordinate this process while minimizing the overall objective. CO thus promotes disciplinary autonomy while achieving interdisciplinary compatibility. The subsystem optimizers, however, do not perform a disciplinary optimization but only try to reach consistency on the common and coupling variables; the optimization itself remains global, at the system level.

None of the methods described above is designed for multi-objective problems, and they give only a single solution to the designer.
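The multi-objective, multi-level approaches reviewed next rely on the Pareto dominance relation p introduced in Section 2.1. A minimal sketch of the dominance test and of the non-dominated filtering it induces is given below (illustrative code, not the implementation used in the methods discussed here; minimization is assumed).

```python
def dominates(a, b):
    """a p b: a is at least as good as b on every objective and strictly
    better on at least one of them (minimization)."""
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

def non_dominated(points):
    """Keep the objective vectors that no other vector dominates."""
    return [p for p in points if not any(dominates(q, p) for q in points if q is not p)]

print(non_dominated([(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]))
# -> [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]; (3.0, 3.0) is dominated by (2.0, 2.0)
```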
2.4 Multi-Objective Multi-Level Approaches

In our view, the main advantage of MDO methods lies in their ability to decompose a multidisciplinary problem into several sub-problems of manageable size that can be solved simultaneously. Given the complexity of current systems and the antagonistic objectives to be achieved, such a method should also be able to provide a set of solutions (not only a single one relying on an a priori choice of the designers), and it should be adapted to the structure of the enterprise and to the way systems involving several disciplines are designed. The three methods presented hereafter are a first answer to these specifications. They can solve MDO problems that are decomposed into a hierarchy of several subsystem-level problems, each of which has multiple objectives and constraints. Among the different optimization algorithms that can be used for solving the subsystem problems, genetic algorithms (GAs) are used in the three methodologies. Using a population-based optimization approach at both levels (i.e., system and subsystem levels) implies that a compromise has to be found at the system level to map the fitness of solutions from multiple Pareto sets to a single system-level candidate solution.

2.4.1 Multidisciplinary Optimization and Robust Design Approaches Applied to Concurrent Engineering (MORDACE)

The MORDACE method [11] is based on a robust design approach: finding solutions that are robust with respect to changes in variable values due to discipline interactions. The MORDACE approach performs the different disciplinary optimizations independently. Each discipline aims at finding optimum solutions with respect to its own design variables thanks to a Multi-Objective Genetic Algorithm (MOGA), in order to obtain for each discipline the Pareto frontier as the set of best designs. When the independent optimization processes are finished, the designer has to find a compromise on the common variable values. Changing the common variable values in order to find the best trade-off may worsen the performance levels. This difficulty is addressed by adding, to the set of design objectives fi, a function that minimizes the effect of variations of the common variable values. As the disciplines simultaneously minimize the objective functions fi and this sensitivity function, they are always multi-objective. Among the available designs, the procedure keeps the Pareto designs plus all the individuals that dominate the original one with regard to the different disciplines. Then, it defines all the possible couples made up of solutions proposed by disciplines 1 and 2, respectively. At this stage, the calculation of a distance parameter allows efficient solutions to be sorted out from the very large set of all possible couples, so that a limited number of couples is automatically chosen. Those solutions show small differences between the common variable values of disciplines 1 and 2, and they are robust with regard to changes in those values. Then, the performances and the coupling functions of the compromise designs defined by the new vectors of variables have to be verified. Within the MORDACE method, the designer thus relies on a compromise procedure whose cost is limited by the number of evaluations of potential solutions the designer allows. MORDACE uses approximations (response surfaces) of the coupling functions in each discipline.

2.4.2 Entropy-Based Multi-Level Multi-Objective Genetic Algorithm (E-MMGA)

This method relies on a decomposition of the optimization at the disciplinary level. A first proposal was given in [13], but it did not take the coupling functions into account and was limited to hierarchical systems. Each multi-objective GA at the sub-problem level operates on its own population of (xc, xi). The population size, n, is kept the same for each sub-problem.
In addition, E-MMGA maintains two populations external to the sub-problems: the grand population and the grand pool. Both are populations of complete design variable vectors (xc, x1, ..., xd). The grand population is an estimate of the solution set of the overall optimization problem. The grand pool is an archive of the union of the solutions generated by the sub-problems. The size of the grand population is the same as the sub-problem population size, n. The size of the grand pool is d times the size of the subsystem population (d being the number of sub-problems or disciplines). The grand population is used as the initial population of each sub-problem. Since each sub-problem multi-objective GA operates on its own variables (xc, xi), only the chromosomes corresponding to xc and xi are used in the ith sub-problem. After each run of the sub-problem multi-objective GAs there are d populations of n individuals each. As each of the d populations contains only the chromosomes of (xc, xi), i = 1, ..., d, they are completed using the rest of the chromosome sequence (x1, ..., xi-1, xi+1, ..., xd) from the grand pool. After the chromosomes in all d populations have been reconstituted to form complete design variable vectors, they are added to the grand pool. Then, based on an entropy index that preserves the diversity of the solution set, n individuals are chosen within the grand pool and replace the n individuals of the grand population. An important drawback is that the size of the grand pool increases very quickly with the number of disciplines and individuals. A variant has been proposed in [4]: in each subsystem only one solution is selected on the Pareto frontier and its objective and constraint values are used to assign a fitness value to the system-level individual. The so-called best solution of each discipline is chosen by an algorithm according to the designer's or decision maker's preferences. The coupling functions are taken into account thanks to supplementary constraints - added both at the system and subsystem levels - and auxiliary variables. Note that the shared variables can be treated as parameters in the subsystems, which reduces the dimensionality of the subsystem-level optimization problems. In this last case, the coupling variable values are not passed to the system level.

2.4.3 Collaborative Optimization Strategy for Multi-Objective Systems (COSMOS)

Two variants (COSMOS-G and COSMOS-L [19]) have been proposed; the fundamental difference between them resides in a different treatment of the common design variables. In the following, COSMOS refers to COSMOS-G. Let n be the size of the population and d the number of disciplines. During the initialization, the supervisor creates a population of common design variables Xc = (xc,1, ..., xc,n), where xc,k is the kth element of the n-tuple. Each discipline i also creates a population of disciplinary variables Xi = (xi,1, ..., xi,n). In order to get a fully determined population, the supervisor sends the n-tuple of common design variables Xc to each discipline. Each discipline i then builds a disciplinary population Popi = {(xc,k, xi,k) | 1 ≤ k ≤ n} for which it can evaluate its objective and constraint functions. An initial population can thus be created by the aggregation of the common and disciplinary design variables and saved as Popmemorized.
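A minimal sketch of this initialization step is given below (function and variable names are ours): the supervisor draws the common variables, each discipline draws its disciplinary variables, and the populations are paired index by index.

```python
import random

def initialise(n, d, n_common, n_local):
    # Supervisor: population of common design variables Xc = (xc_1, ..., xc_n)
    Xc = [[random.uniform(0.0, 1.0) for _ in range(n_common)] for _ in range(n)]
    # Each discipline i: population of disciplinary variables Xi = (xi_1, ..., xi_n)
    X = [[[random.uniform(0.0, 1.0) for _ in range(n_local)] for _ in range(n)]
         for _ in range(d)]
    # Disciplinary populations Pop_i = {(xc_k, xi_k) | 1 <= k <= n}
    pops = [[(Xc[k], X[i][k]) for k in range(n)] for i in range(d)]
    # Aggregated individuals (xc_k, x1_k, ..., xd_k), kept as Pop_memorized
    pop_memorized = [(Xc[k], *[X[i][k] for i in range(d)]) for k in range(n)]
    return Xc, pops, pop_memorized

Xc, pops, pop_memorized = initialise(n=8, d=2, n_common=3, n_local=2)
print(len(pops[0]), len(pop_memorized))   # 8 8
```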
Optimization at the sub-system level: the supervisor provides a set of common design variables Xc to the disciplines. Each discipline i optimizes the disciplinary design variables of a population of individuals (xc,k, xi,k), 1 ≤ k ≤ n. The n-tuple Xc is kept fixed in order to keep the disciplinary population coherent with the other disciplinary populations. At the end of the disciplinary optimizations, each discipline sends its vector of optimized disciplinary design variables Xi* to the supervisor. Since the vector of common design variables has not been modified in the disciplines, a global population can be built and is naturally coherent: Popcurrent = {(xc,k, x1,k, ..., xd,k) | 1 ≤ k ≤ n}.

Optimization at the system level: the goal is to propose new and better common design variables (in order to improve the population). The current population, Popcurrent, is assembled with the memorized population, Popmemorized, in order to provide a double-sized population, Popdouble. This population is ranked by Fonseca and Fleming's criterion (notion of Pareto domination) according to all the objective functions of the problem. The best individuals are selected to build a normal-sized population, Popcurrent, which will be sent to the disciplines. In parallel, cross-over and mutation are applied to the common design variables of the population. This new population is saved as Popmemorized; it will be evaluated once by the disciplines in order to determine its objective and constraint functions.

Coupling function treatment: all the values of the coupling variables yi computed in discipline i at the end of its sub-system optimization are stored in an array. This array contains all the couples (xc,k, yi,k), where xc,k is the vector of common design variables of the kth individual of the population, and yi,k is the vector of coupling variables computed by the coupling function li in discipline i from the common variables xc,k and the disciplinary variables xi,k obtained at the end of the sub-system optimization. When another discipline j needs the value of yi for one of its xc,k, it searches for the corresponding yi,k - when it exists - in the array. If the array does not contain the desired xc,k, the closest entry is picked. In other words, for each discipline i there is a table Ti of couples (xc, yi) such that Ti ⊂ Xc × Yi. Thus, Ti is the graph of a relation (Xc, Yi, Ti) which is given to discipline j.

Our main objective in this paper is to study the behavior of COSMOS on coupled problems and, more precisely, the evolution of the approximation error on the coupling variables introduced by using this relation instead of the function li. We will first introduce the problems caused by the separation of the design problem into more autonomous sub-problems.

3 PROBLEMS CAUSED BY DISCIPLINE AUTONOMY

In order to study the treatment of the coupling functions, we need to point out the types of difficulties that coupling functions and the division of work into subsystems may introduce in multidisciplinary problems. From a global point of view, a multi-objective multidisciplinary problem is composed of two steps: analysis and optimization. Within the natural organization - set A in Figure 2 - the analysis could be performed on the whole system aside from the optimization, but the distribution of work among multiple disciplines implies splitting the problem in a way that suits the industrial context - set B in Figure 2. This raises new difficulties in problem solving because optimization and analysis are no longer fully solved at each step: they are split into sub-problems in which optimization and analysis are only partially performed for each discipline.
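Before turning to the test problems, the table-based coupling treatment of Section 2.4.3 - whose approximation error is studied in the remainder of the paper - can be sketched as follows. The class and method names are ours, and the "closest" entry is assumed here to be selected by a Euclidean distance on xc, which is one possible choice.

```python
import math

class CouplingTable:
    """Stores the couples (xc, yi) produced by discipline i at the end of its
    sub-system optimization, and answers queries from another discipline."""

    def __init__(self):
        self.entries = []                        # list of (xc, yi) couples

    def store(self, xc, yi):
        self.entries.append((tuple(xc), yi))

    def lookup(self, xc):
        """Return the yi associated with the stored xc closest to the query."""
        xc = tuple(xc)
        _, best_yi = min(self.entries, key=lambda e: math.dist(e[0], xc))
        return best_yi

table = CouplingTable()
table.store([0.0, 0.0], 1.5)
table.store([1.0, 0.5], 2.0)
print(table.lookup([0.9, 0.4]))   # -> 2.0, the approximated coupling value
```

The quality of this approximation depends on how densely the stored values of xc cover the region visited by the other disciplines, which is what the experiments of this paper investigate.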
D1: min u1 := e1(xc, x1, y2)
    y1 = l1(xc, x1, u1)
D2: min u2 := e2(xc, x2, y1)
    y2 = l2(xc, x2, u2)