
Optimization_by_a_new_improved_Contraction-Expansion_algorithm_and_application.pdf

MengLuo
March 13, 2016


Transcript

  1. Optimization by a new improved contraction-expansion algorithm and its application
    Meng Luo, Shiliang Gu
    Yangzhou University
    March 13, 2016, Annual Report 2015
  2. PART ONE: Curve and surface fitting using the Contraction-Expansion optimization algorithm
    PART TWO: A new regression method: Screening Stepwise Regression
    PART THREE: The Traveling Salesman Problem (TSP)
  3. OUTLINE
    Introduction
    General methods for curve and surface fitting
    Contraction-Expansion algorithm
    Instance verification (experimental part)
    Discussion and conclusion
    Future work
  4. INTRODUCTION: FITTING
    • Fitting a curve or surface, or fitting a statistical/mathematical model to data, finds application in almost all empirical sciences: physics, chemistry, zoology, botany, the environmental sciences, economics, and so on. Processing the data produced by experimental methods such as UV spectroscopy, X-ray analysis, IR spectrometry, or chromatographic techniques is a key part of any research work, and in recent years curve and surface fitting has become a standard tool for it.
    • Fitting has four objectives: first, to describe an observed (or experimentally obtained) dataset by a statistical/mathematical formula, which requires selecting an appropriate model; second, to estimate the parameters of the formula and interpret them so that the interpretation is consistent with the generally accepted principles of the discipline concerned; third, to predict, interpolate, or extrapolate expected values of the dependent variable using the estimated formula; and last, to use the formula for design, control, or planning.
    • There are many principles of curve and surface fitting, such as Least Squares (of errors), Least Absolute Errors, and the Generalized Method of Moments.
  5. OPTIMIZATION
    • The least sum of squared deviations from regression (RSS, or Q) is usually used as the objective function of nonlinear regression. Note that a nonlinear least squares problem is fundamentally a problem in the optimization of nonlinear functions. Elementary optimization mainly comprises univariate optimization, multivariate optimization, and constrained optimization. Initially, optimization of nonlinear functions was methodologically based on the Lagrange-Leibniz-Newton principles and therefore could not easily escape local optima.
    • Accordingly, methods of parameter estimation can be divided into two categories. One is the analytic approach, characterized by solving the simultaneous equations of Q's partial derivatives; it mainly includes the Gradient method, the Newton method, the Gauss-Newton method, and so on. The other is the direct approach, which estimates the parameters iteratively from the nonlinear equations themselves; it mainly includes the Genetic Algorithm (GA), Simulated Annealing (SA), Evolutionary Algorithms (EA), Particle Swarm Optimization (PSO), geometric fitting algorithms, etc.
  6. Objectives of the Present Work
    • The objective of the present work is to evaluate the performance of the Contraction-Expansion algorithm at nonlinear curve and surface fitting.
    • For this purpose, we have collected problems (models and datasets) mostly from two main sources: first, the website of NIST [National Institute of Standards and Technology (NIST), US Department of Commerce, USA] at http://www.itl.nist.gov/div898/strd/nls/nls_m; and second, the website of the CPC-X Software (makers of the Auto2Fit software) at http://www.geocities.com/neuralpower, now at http://www.7d-soft.com/en/auto2fit.htm (the software is now named 1stOpt). Some models (and datasets) have also been obtained from other sources (www.MathWorks.com).
    • It may also be noted that the difficulty level of a least squares curve and surface fitting problem depends on: (Ⅰ) the (statistical) model, (Ⅱ) the dataset, (Ⅲ) the algorithm used for optimization, and (Ⅳ) the guessed range of the parameters (the starting points of the search, which may be arbitrary).
  7. • According to the level of difficulty, the problems can be classified into four categories: Lower, Average, Higher, and Extra Hard. The problems dealt with in the present study are categorized in Table 1.
    Table 1: Classification of problems according to difficulty level
    Lower: Misra1a, Misra1b, Chwirut1, Chwirut2, Lanczos3, Gauss1, Gauss2, DanWood (source: NIST; classified by NIST); Judge (source: Goffe; classified by author)
    Average: ENSO, Gauss3, Hahn, Kirby, Lanczos1, Lanczos2, MGH17, Misra1c, Misra1d, Nelson, Roszman1 (source: NIST; classified by NIST)
    Higher: Bennett, BoxBOD, Eckerle4, MGH09, MGH10, Ratkowsky42, Ratkowsky43, Thurber (source: NIST; classified by NIST); Hougen (source: MathWorks.com; classified by author)
    Extra Hard: 1stOpt problems (all 9 functions), Mount, SinCos (source: 1stOpt (Auto2Fit); classified by 1stOpt (Auto2Fit))
  8. GENERAL METHODS FOR CURVE AND SURFACE FITTING
    Objective function for fitting multi-variable data $(X_{i1}, X_{i2}, \ldots, X_{ik}, \ldots, X_{im};\, Y_i)$ with a nonlinear function:
    $\hat{Y}_i = f(X_{i1}, X_{i2}, \ldots, X_{ik}, \ldots, X_{im}; b) = f(X_i; b), \quad i = 1, 2, \ldots, n;\ k = 1, 2, \ldots, m$  (1)
    where $\hat{Y}_i$ is the regression value of the nonlinear function and $b = (b_1, b_2, \ldots, b_j, \ldots, b_p)'$ is the vector of $p$ nonzero nonlinear regression parameters. The sum of squared deviations from regression (Q) is
    $Q = \sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2 = \sum_{i=1}^{n} \big(Y_i - f(X_i; b)\big)^2 = F(b)$  (2)
    The so-called optimal fit is the estimate $b^{opt} = (b_1^{opt}, b_2^{opt}, \ldots, b_j^{opt}, \ldots, b_p^{opt})$ that satisfies
    $Q = \sum_{i=1}^{n} \big(Y_i - f(X_i; b^{opt})\big)^2 = F(b^{opt}) \to \min$
    i.e., $Q$ is minimized over the $p$-dimensional parameter space.
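    As a concrete illustration of objective (2), here is a minimal Python helper (not from the original slides; the Eckerle4 model of Table 2 is used purely as an example):

```python
import numpy as np

def eckerle4(X, b):
    # NIST Eckerle4 model: Y_hat = (b1/b2) * exp(-(X - b3)^2 / (2 b2^2))
    b1, b2, b3 = b
    return (b1 / b2) * np.exp(-((X - b3) ** 2) / (2.0 * b2 ** 2))

def rss(b, X, Y, model=eckerle4):
    # Q = F(b) = sum_i (Y_i - f(X_i; b))^2, eq. (2)
    return float(np.sum((Y - model(X, b)) ** 2))
```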
  9. CONTRACTION-EXPANSION ALGORITHM
    Contraction stage (C stage). The contraction stage sets up an initial search domain around the initial values and shrinks the search range repeatedly to home in on the optimal parameters. For the $j$-th parameter ($j = 1, 2, \ldots, p$), with $b_j^{(0)}$ the center of the initial values and $l_j^{(1)}$ the initial step length (a small positive number), five equally spaced search points are generated:
    $b_{kj}^{(1)} = b_j^{(0)} + k\, l_j^{(1)}, \quad k = -2, -1, 0, 1, 2$
    The search points are evaluated in turn, and the point with the minimum Q-value becomes the next search center $b_j^{(1)}$.
  10. If there are $p$ parameters, the number of search points in each cycle is $5p$. The search points are evaluated in turn, and the point with the minimum Q-value becomes the center of the next cycle. From the optimal points $b^{(t-1)} = (b_1^{(t-1)}, b_2^{(t-1)}, \ldots, b_p^{(t-1)})$ of the $(t-1)$-th cycle, the $t$-th cycle searches the same way with a reduced step length:
    $b_{kj}^{(t)} = b_j^{(t-1)} + k\, l_j^{(t)}, \quad k = -2, -1, 0, 1, 2; \qquad l_j^{(t)} = l_j^{(t-1)} / 2.05, \quad j = 1, 2, \ldots, p$
    Note: denote the resulting objective value by $Q_{\min}^{(C)}$ and the optimal parameters by $b_{opt}^{(C)}$. Because the step length is roughly halved after each cycle, the search domain shrinks quickly as $t$ increases, and the value of the objective function keeps decreasing. But because Q is usually not a monotone function of $b$, the contraction stage alone usually cannot find the optimal parameters, and it typically ends with $Q_{\min}^{(C)} > Q_{\min}$.
  11. Expansion stage (E stage). The expansion stage still adopts the five-point method; it starts from the optimal points found by the contraction stage and expands the search area in order to find better parameters in the nearby regions. From the optimal points $b^{(t'-1)}$ of the $(t'-1)$-th cycle, the $t'$-th cycle of search points is
    $b_{kj}^{(t')} = b_j^{(t'-1)} + k\, l_j^{(t')}, \quad k = -2, -1, 0, 1, 2; \qquad l_j^{(t')} = v\, l_j^{(t'-1)}, \quad v = 4\text{ to }6, \quad j = 1, 2, \ldots, p$
    Note: the search area expands quickly because the step length grows $v$-fold after each cycle. If the previous optimal parameter points were only local optima, the E stage can skip out of the local pitfalls and reach the global optimum. Set a critical value $D$ and let $\delta = Q_{new}^{(t')} - Q_{old}^{(t')}$: if $\delta \le D$, the current points can become the center points of the next cycle; otherwise they cannot. A sketch of both stages is given below.
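    A minimal Python sketch of the two stages, under stated assumptions: the slides give the step-length updates but not the stopping rules, so the cycle counts and the acceptance test against D below are illustrative choices; the immediate recentering in the inner loop anticipates the "spring point" improvement of the next slide.

```python
import numpy as np

def ce_optimize(F, b0, step0, offsets=(-2, -1, 0, 1, 2),
                contract=2.05, v=5.0, n_cycles=60, D=0.0):
    # Contraction-Expansion search for min F(b).
    # offsets: five-point method; use (-1, 0, 1) for the three-point variant.
    b = np.asarray(b0, dtype=float)
    q = F(b)

    def sweep(center, q_center, step):
        # One cycle: coordinate-wise search over len(offsets)*p points,
        # recentering as soon as a better ("spring") point appears.
        best, q_best = center.copy(), q_center
        for j in range(len(best)):
            for k in offsets:
                trial = best.copy()
                trial[j] = best[j] + k * step[j]
                qt = F(trial)
                if qt < q_best:
                    best, q_best = trial, qt
        return best, q_best

    # C stage: step length shrinks by a factor of 2.05 per cycle.
    step = np.asarray(step0, dtype=float)
    for _ in range(n_cycles):
        b, q = sweep(b, q, step)
        step = step / contract

    # E stage: step length expands v-fold (v = 4 to 6) per cycle; a cycle's
    # points become the next center only if delta = Q_new - Q_old <= D.
    step = np.asarray(step0, dtype=float)
    for _ in range(n_cycles):
        b_new, q_new = sweep(b, q, step)
        if q_new - q <= D:
            b, q = b_new, q_new
        step = step * v
    return b, q
```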
  12. IMPROVED CONTRACTION-EXPANSION ALGORITHM
    Improvement of the step-point method. With $p$ parameters, the five-point method evaluates $5p$ points per cycle (more calculation), while the three-point method evaluates $3p$ points per cycle (less calculation). Note: when the parameters are few, it is still better to use the five-point method.
    Confirming the center point of the next cycle. During the search, a parameter point can become the center point of the current cycle immediately, as soon as a new optimal point (a "spring point") appears.
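    With the sketch above, the two step-point variants differ only in the offsets argument (a hypothetical call; X, Y, b0, and step0 stand in for a real dataset and starting values):

```python
# Five-point method: 5p evaluations per cycle (preferred when p is small)
# b5, q5 = ce_optimize(lambda b: rss(b, X, Y), b0, step0, offsets=(-2, -1, 0, 1, 2))
# Three-point method: 3p evaluations per cycle, less calculation
# b3, q3 = ce_optimize(lambda b: rss(b, X, Y), b0, step0, offsets=(-1, 0, 1))
```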
  13. COMBINING NUMERICAL DERIVATIVES WITH THE CONTRACTION-EXPANSION ALGORITHM
    Numerical differentiation. It is often hard to obtain the partial derivative functions analytically, so requiring them is not a suitable precondition for curve and surface fitting. Using numerical differentiation, we can estimate the value of the partial derivative at $b$. Expressed as a formula over two nearby points:
    $f'(b) = \frac{df(b)}{db} = \lim_{\Delta b \to 0} \frac{f(b + \Delta b) - f(b)}{\Delta b}$
    If $\Delta b$ is a small number, the derivative at the point can be approximated by
    $f'(b) \approx \frac{f(b + \Delta b) - f(b)}{\Delta b}$
  14. When the surface and curve fitting involves multiple parameters, the approximate partial derivative of the model with respect to the $j$-th parameter at the parameter point $b^{(0)}$ is
    $\frac{\partial f_i}{\partial b_j}\bigg|_{b^{(0)}} \approx \frac{f_i(b^{(0)} + \delta_j) - f_i(b^{(0)})}{\delta_j}$
    where $\delta_j = (0, 0, \ldots, \delta_j, \ldots, 0)'$ is a small increment in the $j$-th parameter only, which can be taken directly as the step length left after the C stage.
    Improved Gauss-Newton step: $\Delta = A^{-1} K$, with the update $b = b^{(0)} + \Delta$. When $b$ differs from $b^{(0)}$, replace $b^{(0)}$ by $b$ and compute a new $\Delta$, iterating until convergence.
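    A sketch of this combination, assuming the usual Gauss-Newton normal equations (A = J'J and K = J'(Y - Y_hat), which the slide's Delta = A^{-1}K notation appears to abbreviate); the forward-difference steps delta can reuse the step lengths left after the C stage:

```python
import numpy as np

def numerical_jacobian(model, X, b, delta):
    # J[i, j] ~ (f_i(b + delta_j) - f_i(b)) / delta_j, forward differences
    f0 = model(X, b)
    J = np.empty((f0.size, b.size))
    for j in range(b.size):
        bj = b.copy()
        bj[j] += delta[j]          # perturb the j-th parameter only
        J[:, j] = (model(X, bj) - f0) / delta[j]
    return J

def gauss_newton(model, X, Y, b0, delta, n_iter=50, tol=1e-12):
    # Gauss-Newton refinement, typically started from the C-E optimum b0.
    b = np.asarray(b0, dtype=float)
    delta = np.asarray(delta, dtype=float)
    for _ in range(n_iter):
        r = Y - model(X, b)                        # residuals Y - Y_hat
        J = numerical_jacobian(model, X, b, delta)
        step = np.linalg.solve(J.T @ J, J.T @ r)   # Delta = A^{-1} K
        b = b + step                               # replace b(0) by b, repeat
        if np.max(np.abs(step)) < tol:
            break
    return b
```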
  15. Instance verification (experimental part)
    In what follows, we present our findings on the performance of the Contraction-Expansion algorithm at optimizing these least squares problems. The datasets and the models are available at the sources (NIST, Auto2Fit).
    Table 2: 8 datasets with a higher level of difficulty
    Thurber: $\hat{Y} = \dfrac{b_1 + b_2 X + b_3 X^2 + b_4 X^3}{1 + b_5 X + b_6 X^2 + b_7 X^3}$, p = 7, n = 37, High
    MGH09: $\hat{Y} = \dfrac{b_1 (X^2 + b_2 X)}{X^2 + b_3 X + b_4}$, p = 4, n = 11, High
    BoxBOD: $\hat{Y} = b_1 (1 - \exp(-b_2 X))$, p = 2, n = 6, High
    Rat42: $\hat{Y} = \dfrac{b_1}{1 + \exp(b_2 - b_3 X)}$, p = 3, n = 9, High
    MGH10: $\hat{Y} = b_1 \exp\!\left(\dfrac{b_2}{X + b_3}\right)$, p = 3, n = 16, High
    Eckerle4: $\hat{Y} = \dfrac{b_1}{b_2} \exp\!\left(-\dfrac{(X - b_3)^2}{2 b_2^2}\right)$, p = 3, n = 35, High
    Rat43: $\hat{Y} = \dfrac{b_1}{\big(1 + \exp(b_2 - b_3 X)\big)^{1/b_4}}$, p = 4, n = 15, High
    Bennett5: $\hat{Y} = b_1 (b_2 + X)^{-1/b_3}$, p = 3, n = 154, High
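    The Table 2 models drop straight into the helpers above; BoxBOD is shown as an example (the data must be taken from the NIST StRD page, and the starting values and step lengths here are arbitrary illustrations):

```python
import numpy as np

def boxbod(X, b):
    # NIST BoxBOD model: Y_hat = b1 * (1 - exp(-b2 * X))
    return b[0] * (1.0 - np.exp(-b[1] * X))

# X, Y = ...  # load the BoxBOD data from the NIST page
# b_ce, q_ce = ce_optimize(lambda b: rss(b, X, Y, model=boxbod),
#                          b0=[100.0, 1.0], step0=[10.0, 0.1])
# b_fit = gauss_newton(boxbod, X, Y, b_ce, delta=np.full(2, 1e-7))
```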
  16. Table 3: The certified results of NIST and the estimates of the improved C-E algorithm for 7 datasets
    MGH09: NIST b1 = 1.928069346e-1, b2 = 1.912823287e-1, b3 = 1.230565069e-1, b4 = 1.360623307e-1, RSS = 3.075056039e-4; C-E b1 = 1.928069348e-1, b2 = 1.912823214e-1, b3 = 1.230565037e-1, b4 = 1.360623277e-1, RSS = 3.075056039e-4
    BoxBOD: NIST b1 = 2.138094089e+2, b2 = 5.472374854e-1, RSS = 1.168008877e+3; C-E b1 = 2.138094092e+2, b2 = 5.472374813e-1, RSS = 1.168008877e+3
    Rat42: NIST b1 = 7.246223758e+1, b2 = 2.618076840e+0, b3 = 6.735920007e-2, RSS = 8.056522934e+0; C-E b1 = 7.246223748e+1, b2 = 2.618076843e+0, b3 = 6.735920014e-2, RSS = 8.056522934e+0
    MGH10: NIST b1 = 5.609636471e-3, b2 = 6.181346346e+3, b3 = 3.452236346e+2, RSS = 8.484585517e+1; C-E b1 = 5.609636614e-3, b2 = 6.181346326e+3, b3 = 3.452236340e+2, RSS = 8.484585517e+1
    Eckerle4: NIST b1 = 1.554382718e+0, b2 = 4.088832175e+0, b3 = 4.515412184e+2, RSS = 1.463588749e-3; C-E b1 = 1.554382716e+0, b2 = 4.088832156e+0, b3 = 4.515412184e+2, RSS = 1.463588749e-3
    Rat43: NIST b1 = 6.996415127e+2, b2 = 5.277125303e+0, b3 = 7.596293833e-1, b4 = 1.279248386e+0, RSS = 8.786404908e+3; C-E b1 = 6.996415123e+2, b2 = 5.277125330e+0, b3 = 7.596293876e-1, b4 = 1.279248391e+0, RSS = 8.786404908e+3
    Bennett5: NIST b1 = -2.52350580e+3, b2 = 4.673656464e+1, b3 = 9.321848319e-1, RSS = 5.240474407e-4; C-E b1 = -2.52196804e+3, b2 = 4.673018418e+1, b3 = 9.322900378e-1, RSS = 5.240432639e-4
    • Except for the last dataset, Bennett5 (NIST, 1994), the improved C-E algorithm reached objective-function values equal to the published NIST certified results on every problem (Table 3); the present algorithm thus achieved the optimal fit on all of these questions. For Bennett5 it reached a slightly smaller RSS than the certified value. This also verifies that the majority of the published NIST results are correct.
  17. Fig. 3: Example curve-fitting plots (data and fit) for MGH10, Thurber, and Eckerle4.
  18. Analysis of the 1stOpt datasets
    Table 4: The 1stOpt test problems and results (R²)
    Test 1: R² = 0.996780, m = 1, p = 5
    Test 2: R² = 0.934642, m = 4, p = 9, $\hat{Y} = \dfrac{b_1 + b_2 X_1 + b_3 X_2 + b_4 X_3 + b_5 X_4}{1 + b_6 X_1 + b_7 X_2 + b_8 X_3 + b_9 X_4}$
    Test 3: R² = 0.969930, m = 1, p = 3
    Test 4: R² = 0.805143, m = 4, p = 9 (same rational form as Test 2)
    Test 5: R² = 0.994633, m = 2, p = 8, $\hat{Y} = b_1 + b_2 X_1^{b_3} + b_4 X_2^{b_5} + b_6 X_1^{b_7} X_2^{b_8}$
    Test 6: R² = 0.999644, m = 1, p = 7, $\hat{Y} = b_1 + b_2 X^{b_3} + b_4 X^{b_5} + b_6 X^{b_7}$
    Test 7: R² = 0.971547, m = 2, p = 7, $\hat{Y} = \dfrac{b_1 + b_2 X_1 + b_3 X_2 + b_4 X_1 X_2}{1 + b_5 X_1 + b_6 X_2 + b_7 X_1 X_2}$
    Test 8: R² = 0.995372, m = 3, p = 6
    Test 9: R² = 0.970475, m = 1, p = 4
    Whether the C-E algorithm, like curve- and surface-fitting software systems and other nonlinear fitting tools, can reach the global optimum was tested on the 1stOpt test problems. These are genuinely challenging problems, yet with the C-E algorithm the optimal fit was achieved from arbitrary initial values, so the global optimal solution was essentially attained. On the nine test problems the results indicate that C-E passes the tests, and most results are consistent with those of 1stOpt.
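    The bilinear rational model of test 7, whose global and local optima are examined on the next slides, is equally direct to code (the form is the reconstruction given in Table 4; data loading and starting values are placeholders):

```python
import numpy as np

def test7(X, b):
    # 1stOpt test 7 (as reconstructed above): bilinear rational in X1 and X2
    X1, X2 = X
    num = b[0] + b[1] * X1 + b[2] * X2 + b[3] * X1 * X2
    den = 1.0 + b[4] * X1 + b[5] * X2 + b[6] * X1 * X2
    return num / den

# X = (X1, X2); Y = ...  # load the 1stOpt test-7 data
# b_ce, q = ce_optimize(lambda b: rss(b, X, Y, model=test7),
#                       b0=np.zeros(7), step0=np.full(7, 1.0))
```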
  19. Table 5: The fitting results of the 1stOpt test problems with the improved C-E algorithm (compared with the 1stOpt results)
    Test 1: b1 = 1.77400378E-04, b2 = 7.11350731E-33, b3 = 16.7253406, b4 = 1.21591936E-03, b5 = 3.04365188; RSS = 1939218.994, SSY = 586570000, R² = 0.996694
    Test 2: b1 = 4.583533, b2 = 0.262201E-03, b3 = -0.795377E-05, b4 = -0.270534E-01, b5 = 0.331803E-01, b6 = 0.954423E-04, b7 = -0.30464E-05, b8 = -0.66982E-02, b9 = 0.006681996; RSS = 3.479215815, SSY = 25.2536, R² = 0.862229
    Test 3: b1 = -101.078842, b2 = -1258.50245, b3 = -170.113551; RSS = 7.68651757, SSY = 255.5656, R² = 0.969924
    Test 4: b1 = 674.679627, b2 = 227.745913, b3 = 2120.32839, b4 = 1.64255958, b5 = -176.051256, b6 = 0.572582863, b7 = 5.55642598, b8 = 0.033438616, b9 = -0.560015938; RSS = 53118.24153, SSY = 272600, R² = 0.805143
    Test 5: b1 = 1.02849313, b2 = -2.75083053E-14, b3 = 4.04558865, b4 = -1.36709592E-03, b5 = 1.62322193, b6 = 2.67993125E-3, b7 = 0.253301683, b8 = 1.26806961; RSS = 1.753875407, SSY = 326.7796, R² = 0.994633
    Test 6: b1 = -2.43500445, b2 = 8.88851139, b3 = -1.16451411, b4 = -5.5959852E-04, b5 = 4.02126762, b6 = 1.74691786, b7 = 0.818860686; RSS = 0.004610731, SSY = 12.9610, R² = 0.999644
    Test 7: b1 = 92.0801585, b2 = -2.67381750e-2, b3 = -2.72093319, b4 = 7.44537711e-4, b5 = -3.84596257e-4, b6 = -3.03915462e-2, b7 = 1.07051864e-5; RSS = 21.26377223, SSY = 747.3331, R² = 0.971547
    Test 8: b1 = 178962.386, b2 = 3672.73182, b3 = 0.530213052, b4 = 27.8050105, b5 = 195.346878, b6 = -2.59652299; RSS = 0.010560486, SSY = 2.2822, R² = 0.995373
    Test 9: b1 = 19.1581777, b2 = -0.362592753, b3 = -29.8159227, b4 = 2.29795107; RSS = 14.66642182, SSY = 488.5455, R² = 0.969979
  20. Test 7
    Table 6: Global and local optima for problem (7)
    7-1 (global): b1 = 92.0801585, b2 = -2.67382e-2, b3 = -2.7209332, b4 = 7.445377e-4, b5 = -3.84596e-4, b6 = -3.03916e-2, b7 = 1.070517e-5; RSS = 21.263772, R² = 0.971547
    7-2 (local): b1 = 66.313876, b2 = -0.013405, b3 = -2.0433683, b4 = 3.60357e-4, b5 = -2.1008e-4, b6 = -3.2448e-2, b7 = 5.78165e-6; RSS = 34.325346, R² = 0.954070
    7-3 (local): b1 = -7190171.8, b2 = 4480.2531, b3 = 459000.36, b4 = -235.14145, b5 = 59.44377, b6 = 1914.3001, b7 = -3.20806; RSS = 49.945287, R² = 0.933169
    Fig. 4: The surface-fitting plots for results 7-1 (center), 7-2 (left), and 7-3 (right).
  21. Test 8
    Fitted parameters for test 8: b1 = 178962.39, b2 = 3672.7318, b3 = 0.5302131, b4 = 27.805011, b5 = 195.34688, b6 = -2.596523; RSS = 0.0105605, R² = 0.995373
    Fig. 5: The surface-fitting plot of result 8.
  22. Mount function: $\hat{Y} = (b_1 / b_2)\exp\!\big(-0.5\,((X - b_3)/b_2)^2\big)$
    Table 7: Comparison of different optimization algorithms on the Mount function
    Maximum Inherit Optimization: b1 = 1.5413091, b2 = 4.0173585, b3 = 450.89195; RSS = 0.005159, R² = 0.997148464
    Contraction-Expansion Optimization: b1 = 1.5412806, b2 = 4.0172844, b3 = 450.89201; RSS = 0.005159, R² = 0.99715
    Fig. 7: The curve-fitting plot of the Mount function.
  23. SinCos function
    Table 8: Comparison of different optimization algorithms on the SinCos function
    Maximum Inherit Optimization: b1 = 0.493369, b2 = 2.9375208, b3 = 10.999959, b4 = 5.837468; RSS = 0.03861956, R² = 0.9974046
    Contraction-Expansion Optimization: b1 = 0.4934612, b2 = 2.9390784, b3 = 10.999962, b4 = 5.8368422; RSS = 0.038619065, R² = 0.9974
    Fig. 7: The surface-fitting plot of the SinCos function.
  24. Simulated data analysis
    The simulated surface is an 11-parameter generalization of the MATLAB peaks function, $z = 3(1-x)^2 e^{-x^2-(y+1)^2} - 10(x/5 - x^3 - y^5)\, e^{-x^2-y^2} - \tfrac{1}{3} e^{-(x+1)^2-y^2}$, with free coefficients $b_1, \ldots, b_{11}$ in place of the fixed constants.
    Fitted values: b' = (5.004019, 3.0379797, 10.007446, 0.19857432, 1.0006819, -0.35678459, 0.70202144, 1.0002495, 0.99617730, 0.99930820, 0.96377287), Q = 0.02780356233, R² = 0.999923.
    Fig. 7: The surface-fitting plot of the MathWorks peaks function simulation data.
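    A sketch of how such simulation data can be generated: the grid, the noise level, and the constant offset are illustrative assumptions (the slide reports only the fitted parameters), while the peaks formula itself is the standard MATLAB one.

```python
import numpy as np

def peaks(x1, x2):
    # MATLAB peaks surface
    return (3.0 * (1.0 - x1) ** 2 * np.exp(-x1 ** 2 - (x2 + 1.0) ** 2)
            - 10.0 * (x1 / 5.0 - x1 ** 3 - x2 ** 5) * np.exp(-x1 ** 2 - x2 ** 2)
            - (1.0 / 3.0) * np.exp(-(x1 + 1.0) ** 2 - x2 ** 2))

rng = np.random.default_rng(0)
g = np.linspace(-3.0, 3.0, 25)
X1, X2 = np.meshgrid(g, g)
Y = 5.0 + peaks(X1, X2) + rng.normal(scale=0.01, size=X1.shape)  # offset + noise
```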
  25. CONCLUSION AND FUTURE WORK
    • The contraction-expansion (C-E) algorithm, applied to fitting functions to the datasets given by NIST and others, has performed strongly. It succeeded on all problems, of various degrees of difficulty, from NIST and from the 1stOpt (formerly Auto2Fit) test questions.
    • With the contraction-expansion part alone, most of the problems can still be solved to global optimality, but more cycles are needed and the computation takes a long time. With only the numerical differentiation and improved Gauss-Newton part, the global optimum is reached faster, but starting from an arbitrary, poorly chosen initial value the search easily falls into the pitfalls of a local optimum, and the estimated parameters are then generally not optimal. Therefore, this work combines the contraction-expansion algorithm with numerical differentiation and the improved Gauss-Newton method. This not only overcomes the shortcomings of each part, but also lets the curve and surface fitting reach the optimum.
  26. • We may conclude that its success rate is appreciably high and that it may be used for solving nonlinear curve and surface fitting problems with a good degree of reliability and dependability. It may be noted that there cannot be any "sure success" method that solves every nonlinear least squares curve and surface fitting problem.
    • The 1stOpt documentation notes: "Some of those test data are very hard, and may never get right answers without using 1stOpt (Auto2Fit). Even for 1stOpt (Auto2Fit), it does not ensure every run will be successful. ... In some cases, you may try to change the control parameter of 'Population Size'". They suggest that to solve these problems one should use the Global Levenberg-Marquardt or Global BFGS method.
    • The new contraction-expansion algorithm has been implemented as a practical program on the Matlab (License No. 271282) software platform. One-dimensional (curve) and two-dimensional (surface) fitting processes are displayed intuitively and dynamically, so users can easily apply it to curve and surface fitting.
    • Apart from the improvements needed to make our algorithm more robust, the next step could be to find new applications for our method, e.g., Traveling Salesman Problems (TSP).
  27. REFERENCES
    1. Whitley D, Rana S, Dzubera J, Mathias KE. Evaluating evolutionary algorithms. Artificial Intelligence 1996,85:245-276.
    2. Whitley D. An overview of evolutionary algorithms: practical issues and common pitfalls. Information and Software Technology 2001,43:817-831.
    3. Lórenz-Fonfría VA, Padrós E. Curve-fitting of Fourier manipulated spectra comprising apodization, smoothing, derivation and deconvolution. Spectrochimica Acta Part A: Molecular and Biomolecular Spectroscopy 2004,60:2703-2710.
    4. Romanenko SV, Stromberg AG, Selivanova EV, Romanenko ES. Resolution of the overlapping peaks in the case of linear sweep anodic stripping voltammetry via curve fitting. Chemometrics and Intelligent Laboratory Systems 2004,73:7-13.
    5. Hu Y, Liu J, Li W. Resolution of overlapping spectra by curve-fitting. Analytica Chimica Acta 2005,538:383-389.
    6. Chen K, Li T, Cao T. Tribe-PSO: A novel global optimization algorithm and its application in molecular docking. Chemometrics and Intelligent Laboratory Systems 2006,82:248-259.
    7. Mizrach B. SIMANN: A Global Optimization Algorithm using Simulated Annealing. Studies in Nonlinear Dynamics & Econometrics 2007,1:169-176.
    8. Lange K. Elementary Optimization. In: Optimization. Edited by Lange K. New York, NY: Springer New York; 2013. pp. 1-21.
    9. Wang L, Xu L, Feng S, Meng MQH, Wang K. Multi-Gaussian fitting for pulse waveform using Weighted Least Squares and multi-criteria decision making method. Computers in Biology and Medicine 2013,43:1661-1672.
    10. Sulthana A, Latha KC, Imran M, Rathan R, Sridhar R, Balasubramanian S. Non-linear modeling using fuzzy principal component regression for Vidyaranyapuram sewage treatment plant, Mysore - India. Water Sci Technol 2014,70:1040-1047.
  28. 11. Li X, Zhou GH, Chen YJ, Xu XL, Xu BC, Li CB. A New Method for Characterizing Mechanical Properties of Meat Product under Stress-Relaxation Based on Gaussian Curve-Fitting. International Journal of Food Properties 2015,18:2571-2583.
    12. Sai-Ut S, Benjakul S, Kraithong S, Rawdkuen S. Optimization of antioxidants and tyrosinase inhibitory activity in mango peels using response surface methodology. Lwt-Food Science and Technology 2015,64:742-749.
    13. Li MM, Verma B. Nonlinear curve fitting to stopping power data using RBF neural networks. Expert Systems with Applications 2016,45:161-171.
    14. Chaharsooghi SK, Meimand Kermani AH. An effective ant colony optimization algorithm (ACO) for multi-objective resource allocation problem (MORAP). Applied Mathematics and Computation 2008,200:167-177.
    15. Paine CET, Marthews TR, Vogt DR, Purves D, Rees M, Hector A, et al. How to fit nonlinear plant growth models and calculate growth rates: an update for ecologists. Methods in Ecology and Evolution 2012,3:245-256.
    16. Perez-Rodriguez P, Gianola D, Gonzalez-Camacho JM, Crossa J, Manes Y, Dreisigacker S. Comparison between linear and non-parametric regression models for genome-enabled prediction in wheat. G3 (Bethesda) 2012,2:1595-1605.
    17. Botbol N, Buse L, Chardin M. Fitting ideals and multiple points of surface parameterizations. Journal of Algebra 2014,420:486-508.
    18. Reis RM, Cecon PR, Puiatti M, Finger FL, Nascimento M, Silva FF, et al. Nonlinear regression models applied to clusters of garlic accessions. Horticultura Brasileira 2014,32:178-183.
    19. Wang J, Yu Z. Quality mesh smoothing via local surface fitting and optimum projection. Graphical Models 2011,73:127-139.
    20. Polo-Corpa MJ, Salcedo-Sanz S, Perez-Bellido AM, Lopez-Espi P, Benavente R, Perez E. Curve fitting using heuristics and bio-inspired optimization algorithms for experimental data processing in chemistry. Chemometrics and Intelligent Laboratory Systems 2009,96:34-42.