1. English original text:

Application of Genetic Programming to Nonlinear Modeling

Introduction

Identification of nonlinear models which are based, at least in part, on the underlying physics of the real system presents many problems, since both the structure and the parameters of the model may need to be determined. Many methods exist for the estimation of parameters from measured response data, but structural identification is more difficult. Often a trial-and-error approach involving a combination of expert knowledge and experimental investigation is adopted to choose between a number of candidate models. Possible structures are deduced from engineering knowledge of the system, and the parameters of these models are estimated from available experimental data. This procedure is time consuming and sub-optimal. Automation of this process would mean that a much larger range of potential model structures could be investigated more quickly.

Genetic programming (GP) is an optimization method which can be used to optimize the nonlinear structure of a dynamic system by automatically selecting model structure elements from a database and combining them optimally to form a complete mathematical model. Genetic programming works by emulating natural evolution to generate a model structure that maximizes (or minimizes) some objective function involving an appropriate measure of the level of agreement between the model and system response. A population of model structures evolves through many generations towards a solution using certain evolutionary operators and a "survival-of-the-fittest" selection scheme. The parameters of these models may be estimated in a separate and more conventional phase of the complete identification process.
Application

Genetic programming is an established technique which has been applied to several nonlinear modeling tasks, including the development of signal processing algorithms and the identification of chemical processes. In the identification of continuous-time system models, the application of a block-diagram-oriented simulation approach to GP optimization is discussed by Marenbach, Bettenhausen and Gray, and the issues involved in the application of GP to nonlinear system identification are discussed in another paper by Gray.

In this paper, genetic programming is applied to the identification of model structures from experimental data. The systems under investigation are to be represented as nonlinear time-domain continuous dynamic models. The model structure evolves as the GP algorithm minimizes some objective function involving an appropriate measure of the level of agreement between the model and system responses. One example is

J = \sum_{i=1}^{N} e_i^2        (1)

where e_i is the error between the model output and the experimental data at each of the N data points. The GP algorithm constructs and reconstructs model structures from the function library. The parameters of each candidate model are estimated using the Nelder simplex and simulated annealing methods, and the fitness of that model is evaluated using a fitness function such as that in Eq. (1). The general fitness of the population improves until the GP eventually converges to a model description of the system.
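As a rough, hypothetical sketch (not taken from the paper), the fitness measure of Eq. (1) could be computed from a simulated response and the measured data as follows; the function and argument names are illustrative only.

    import numpy as np

    def sum_squared_error(model_output, measured_output):
        """Eq. (1): J is the sum over the N data points of the squared
        error e_i between the model response and the experimental data."""
        e = np.asarray(model_output, dtype=float) - np.asarray(measured_output, dtype=float)
        return float(np.sum(e ** 2))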
The Genetic programming algorithm

For this research, a steady-state genetic programming algorithm was used. At each generation, two parents are selected from the population, and the offspring resulting from their crossover operation replaces an existing member of the same population. The number of crossover operations is equal to the size of the population, i.e. the crossover rate is 100%. The crossover algorithm used was a subtree crossover with a limit on the depth of the resulting tree. Genetic programming parameters such as mutation rate and population size varied according to the application. More difficult problems, where the expected model structure is complex or where the data are noisy, generally require larger population sizes. Mutation rate did not appear to have a significant effect for the systems investigated during this research. Typically, a value of about 2% was chosen.
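The loop below is a minimal sketch of the steady-state scheme just described, under stated assumptions: selection, crossover and mutation are supplied as functions, and replacing the current worst member is an assumption, since the text only says that an existing member is replaced. All names are hypothetical.

    import random

    def steady_state_gp(population, fitness, crossover, mutate,
                        generations, mutation_rate=0.02, max_depth=8):
        """Steady-state GP: per 'generation' the number of crossover
        operations equals the population size (a 100% crossover rate),
        and each offspring replaces an existing member of the population."""
        scores = [fitness(ind) for ind in population]
        for _ in range(generations):
            for _ in range(len(population)):
                p1 = tournament_select(population, scores)
                p2 = tournament_select(population, scores)
                child = crossover(p1, p2, max_depth)   # subtree crossover, depth-limited
                if random.random() < mutation_rate:    # mutation rate of about 2%
                    child = mutate(child)
                worst = max(range(len(scores)), key=scores.__getitem__)
                population[worst] = child              # assumed replacement policy
                scores[worst] = fitness(child)
        best = min(range(len(scores)), key=scores.__getitem__)
        return population[best]

    def tournament_select(population, scores):
        """Binary tournament ('survival of the fittest'); lower J is better."""
        a, b = random.sample(range(len(population)), 2)
        return population[a] if scores[a] < scores[b] else population[b]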
The function library varied according to the application and the type of nonlinearity expected in the system being identified. A core of linear blocks was always available. It was found that a specific nonlinearity, such as a look-up table representing a physical phenomenon, would only be selected by the Genetic Programming algorithm if that nonlinearity actually existed in the dynamic system. This allows the system to be tested for specific nonlinearities.
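A hypothetical sketch of such a function library is given below, purely to illustrate the idea of a core of linear blocks plus optional nonlinear blocks such as a look-up table; the block names and representations are assumptions, not those used in the paper.

    import numpy as np

    # Hypothetical library of model-building blocks. Each entry is a factory
    # that a GP algorithm could draw on when assembling a candidate model.
    FUNCTION_LIBRARY = {
        # core of linear blocks, always available
        "gain":        lambda k: (lambda u: k * u),
        "first_order": lambda k, tau: {"num": [k], "den": [tau, 1.0]},  # k / (tau*s + 1)
        "integrator":  lambda: {"num": [1.0], "den": [1.0, 0.0]},       # 1 / s
        "time_delay":  lambda T: {"delay": T},
        # nonlinear blocks, included only when such effects are suspected
        "lookup_table": lambda xp, yp: (lambda u: np.interp(u, xp, yp)),
        "saturation":   lambda lo, hi: (lambda u: np.clip(u, lo, hi)),
    }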
Model structure identification

Each member of the Genetic Programming population represents a candidate model for the system. It is necessary to evaluate each model and assign to it some fitness value. Each candidate is integrated using a numerical integration routine to produce a time response. This simulated time response is compared with experimental data to give a fitness value for that model. A sum-of-squared-error function (Eq. (1)) is used in all the work described in this paper, although many other fitness functions could be used. The simulation routine must be robust. Inevitably, some of the candidate models will be unstable, and therefore the simulation program must protect against overflow errors. Also, all systems must return a fitness value if the GP algorithm is to work properly, even if those systems are unstable.
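A rough sketch of such a robust evaluation step is shown below, assuming SciPy's solve_ivp as the numerical integration routine; the penalty value, the model interface and the assumption that the first state is the measured output are illustrative choices, not details from the paper.

    import numpy as np
    from scipy.integrate import solve_ivp

    PENALTY = 1e12  # large but finite fitness for unstable or failed candidates

    def evaluate_candidate(model_ode, t_data, y_data, x0):
        """Integrate a candidate model over the experimental time base and
        score it with the sum-of-squared-errors fitness of Eq. (1). Unstable
        models still return a (penalty) fitness so the GP can continue."""
        try:
            with np.errstate(over="raise", invalid="raise"):  # trap numerical overflow
                sol = solve_ivp(model_ode, (t_data[0], t_data[-1]), x0,
                                t_eval=t_data, method="RK45")
            if not sol.success or not np.all(np.isfinite(sol.y)):
                return PENALTY
            y_model = sol.y[0]                 # assume the first state is the output
            return float(np.sum((y_model - y_data) ** 2))
        except (FloatingPointError, OverflowError, ValueError):
            return PENALTY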
Parameter estimation

Many of the nodes of the GP trees contain numerical parameters. These could be the coefficients of the transfer functions, a gain value or, in the case of a time delay, the delay itself. It is necessary to identify the numerical parameters of each nonlinear model before evaluating its fitness. The models are randomly generated and can therefore contain linearly dependent parameters and parameters which have no effect on the output. Because of this, gradient-based methods cannot be used. Genetic Programming can itself be used to identify numerical parameters, but it is less efficient than other methods. The approach chosen involves a combination of the Nelder simplex and simulated annealing methods.

Simulated annealing optimizes by a method which is analogous to the cooling process of a metal. As a metal cools, the atoms organize themselves into an ordered minimum-energy structure. The amount of vibration or movement of the atoms depends on temperature. As the temperature decreases, the movement, though still random, becomes smaller in amplitude, and as long as the temperature decreases slowly enough, the atoms order themselves into the minimum-energy structure. In simulated annealing, the parameters start off at some random value and are allowed to change their values within the search space by an amount related to a quantity defined as the system temperature. If a parameter change improves the overall fitness, it is accepted; if it reduces the fitness, it is accepted with a certain probability. The temperature decreases
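As a minimal, hypothetical sketch of the acceptance rule just described (covering only the simulated-annealing half of the combined approach, and assuming the common exp(-dJ/T) probability for accepting a worse parameter set):

    import math
    import random

    def anneal_parameters(J, theta0, step=0.1, T0=1.0, cooling=0.95,
                          moves_per_temp=50, T_min=1e-4):
        """Perturb the parameter vector by an amount tied to the temperature T;
        accept any improvement in the fitness J, and accept a deterioration dJ
        with probability exp(-dJ / T). T is reduced slowly between sweeps."""
        theta = list(theta0)
        J_cur = J(theta)
        best, J_best = list(theta), J_cur
        T = T0
        while T > T_min:
            for _ in range(moves_per_temp):
                candidate = [p + random.gauss(0.0, step * T) for p in theta]
                dJ = J(candidate) - J_cur
                if dJ <= 0 or random.random() < math.exp(-dJ / T):
                    theta, J_cur = candidate, J_cur + dJ
                    if J_cur < J_best:
                        best, J_best = list(theta), J_cur
            T *= cooling  # cool slowly, shrinking the random moves
        return best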