<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Neurosci.</journal-id>
<journal-title>Frontiers in Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-453X</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnins.2022.854685</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Feature Selection in High Dimensional Biomedical Data Based on BF-SFLA</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Dai</surname> <given-names>Yongqiang</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="corresp" rid="c001"><sup>&#x002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/1635982/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Niu</surname> <given-names>Lili</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Wei</surname> <given-names>Linjing</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Tang</surname> <given-names>Jie</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>School of Information Science and Technology, Gansu Agricultural University</institution>, <addr-line>Lanzhou</addr-line>, <country>China</country></aff>
<aff id="aff2"><sup>2</sup><institution>School of Food Science and Engineering, Gansu Agricultural University</institution>, <addr-line>Lanzhou</addr-line>, <country>China</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Hanshu Cai, Lanzhou University, China</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Bo Wu, Tokyo University of Technology, Japan; Chengxi Li, University of Kentucky, United States</p></fn>
<corresp id="c001">&#x002A;Correspondence: Yongqiang Dai, <email>dyq@gsau.edu.cn</email></corresp>
<fn fn-type="other" id="fn004"><p>This article was submitted to Neuroprosthetics, a section of the journal Frontiers in Neuroscience</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>18</day>
<month>04</month>
<year>2022</year>
</pub-date>
<pub-date pub-type="collection">
<year>2022</year>
</pub-date>
<volume>16</volume>
<elocation-id>854685</elocation-id>
<history>
<date date-type="received">
<day>14</day>
<month>01</month>
<year>2022</year>
</date>
<date date-type="accepted">
<day>23</day>
<month>02</month>
<year>2022</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x00A9; 2022 Dai, Niu, Wei and Tang.</copyright-statement>
<copyright-year>2022</copyright-year>
<copyright-holder>Dai, Niu, Wei and Tang</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract>
<p>High-dimensional biomedical data contain many irrelevant or weakly correlated features, which reduce the efficiency of disease diagnosis. This manuscript presents a feature selection method for high-dimensional biomedical data based on the bacterial foraging&#x2013;shuffled frog leaping algorithm (BF-SFLA). The performance of the BF-SFLA-based feature selection method was further improved by introducing a chemotaxis operation and a balanced grouping strategy into the shuffled frog leaping algorithm, which maintain the balance between global and local optimization and reduce the possibility of the algorithm falling into a local optimum. To evaluate the proposed method&#x2019;s effectiveness, we employed the K-NN (k-nearest neighbor) and C4.5 decision tree classification algorithms in a comparative analysis against an improved genetic algorithm, an improved particle swarm optimization, and the basic shuffled frog leaping algorithm. Experimental results showed that the feature selection method based on BF-SFLA obtained a better feature subset, improved classification accuracy, and shortened classification time.</p>
</abstract>
<kwd-group>
<kwd>feature selection</kwd>
<kwd>shuffled frog leaping algorithm</kwd>
<kwd>classification accuracy</kwd>
<kwd>bacterial foraging algorithm</kwd>
<kwd>biomedical data</kwd>
</kwd-group>
<counts>
<fig-count count="13"/>
<table-count count="9"/>
<equation-count count="9"/>
<ref-count count="29"/>
<page-count count="13"/>
<word-count count="8845"/>
</counts>
</article-meta>
</front>
<body>
<sec id="S1" sec-type="intro">
<title>Introduction</title>
<p>Biomedical datasets provide the basis for medical diagnostics and scientific research, and feature subset selection is an important data mining method in many application areas (<xref ref-type="bibr" rid="B15">Lu and Han, 2003</xref>). Such datasets are generally characterized by high dimensionality, multiple classes, and a very large number of features, many of which are redundant or only weakly correlated with the corresponding diagnostic or research problems (<xref ref-type="bibr" rid="B16">Misra et al., 2002</xref>). Identifying an optimal feature subset can eliminate redundant information and reduce the computational cost required for data mining while improving classification accuracy (<xref ref-type="bibr" rid="B24">Vergara and Est&#x00E9;vez, 2014</xref>). Feature selection thus enhances classification accuracy and decreases the computational complexity of classification. The selected feature subset should be necessary and sufficient to describe the target concept while representing the original features with suitably high precision.</p>
<p>Identifying and selecting candidate subsets requires an efficient search method and a suitable learning algorithm. However, developing such search methods and learning algorithms to identify optimal subsets remains an open research issue. This manuscript proposes a method for feature selection from high-dimensional biomedical data based on the Bacterial Foraging&#x2013;Shuffled Frog Leaping Algorithm (BF-SFLA).</p>
<p>The BF-SFLA was developed by introducing the chemotaxis operation of the Bacterial Foraging Algorithm (BFA) into the Shuffled Frog Leaping Algorithm (SFLA), as discussed in detail in later sections of this manuscript.</p>
<p>We used the <italic>K</italic>-<italic>NN</italic> and <italic>C</italic>4.5 decision tree classification methods on high-dimensional biomedical data to evaluate the BF-SFLA, including a comparative analysis against an improved Genetic Algorithm (IGA), an improved Particle Swarm Optimization (IPSO), and the basic SFLA. The experimental results showed that feature selection based on BF-SFLA identifies relevant subsets with higher classification accuracy than the alternative methods.</p>
<p>The structure of this manuscript is as follows: related research is reviewed in Section II. The BF-SFLA is presented in Section III, with an analysis of the improvement strategy in Section IV. Section V discusses the application to feature selection. The manuscript ends with Section VI, in which we provide concluding comments.</p>
</sec>
<sec id="S2">
<title>Related Research</title>
<p>There are many feature selection algorithms documented in the literature (<xref ref-type="bibr" rid="B26">Wang et al., 2007</xref>). A memetic feature selection algorithm was proposed in <xref ref-type="bibr" rid="B13">Lee and Kim (2015)</xref> for multi-label classification, preventing premature convergence and improving efficiency. The method employs a memetic procedure to refine the feature subsets obtained by a genetic search, which improves multi-label classification performance. Empirical studies using a variety of tests indicate that the proposed method is superior to conventional multi-label feature selection methods.</p>
<p>A novel algorithm based on information theory, called the Semi-supervised Representatives Feature Selection (SRFS) algorithm, was proposed in <xref ref-type="bibr" rid="B27">Wang et al. (2017)</xref>. The SRFS is independent of any classification learning algorithm and can quickly and effectively identify and remove irrelevant and redundant features. More critically, unlabeled data are used like labeled data in the Markov blanket through the correlation gain. The results on several benchmark datasets show that SRFS can significantly improve existing supervised and semi-supervised algorithms.</p>
<p><xref ref-type="bibr" rid="B14">Li et al. (2015)</xref> aimed to introduce a new approach to stable feature selection. The experiments used open-source &#x201C;actual microarray data,&#x201D; which pose challenging high-dimensional, small-sample problems. The reported results indicate that the proposed integrated FREE method is stable and has better (or at least comparable) accuracy relative to other common stable feature weighting methods.</p>
<p><xref ref-type="bibr" rid="B23">Tabakhi et al. (2014)</xref> proposed an unsupervised feature selection method based on ant colony optimization, called UFSACO. In this method, the optimal feature subset is found through multiple iterations without using any learning algorithm, so UFSACO can be classified as a filter-based multivariate approach. The method has low computational complexity and can therefore be applied to high-dimensional datasets. The performance of UFSACO was compared with 11 well-known univariate and multivariate feature selection methods using different classifiers (support vector machine, decision tree, and Bayes); the experimental results on several commonly used datasets show the efficiency and effectiveness of the UFSACO method and its improvement over earlier approaches.</p>
<p><xref ref-type="bibr" rid="B1">AbdEl-Fattah Sayed et al. (2016)</xref> proposed a new hybrid algorithm, which combines the Clonal Selection Algorithm (CSA) with the Flower Pollination Algorithm (FPA) to form the Binary Clonal Flower Pollination Algorithm (BCFA), to solve the feature selection problem. The Optimum-Path Forest (OPF) classification accuracy was taken as the objective function. Experimental testing was carried out on three public datasets. The reported results demonstrate that the proposed hybrid algorithm achieved striking results compared with other well-known algorithms, such as the Binary Cuckoo Search Algorithm (BCSA), the Binary Bat Algorithm (BBA), the Binary Differential Evolution Algorithm (BDEA), and the Binary Flower Pollination Algorithm (BFPA).</p>
<p><xref ref-type="bibr" rid="B21">Shrivastava et al. (2017)</xref> compared and analyzed various nature-inspired algorithms to select the optimal features required to help in the classification of affected patients from the population. The reported experimental results show that the BBA outperformed traditional techniques such as Particle Swarm Optimization (PSO), Genetic Algorithms (GA), and the Modified Cuckoo Search Algorithm (MCSA) with a competitive recognition rate for the selected features dataset.</p>
<p><xref ref-type="bibr" rid="B29">Zhang et al. (2015)</xref> suggested a new method using bare-bones particle swarm optimization to find the optimal feature subset, termed the binary BPSO. In this algorithm, a reinforcement memory strategy was designed to update the local &#x201C;leaders&#x201D; of particles to avoid the degradation of excellent genes in particles, and a uniform combination was proposed to balance the local exploitation and global exploration of the algorithm. In addition, the 1-nearest-neighbor method was used as a classifier to evaluate the classification accuracy of particles. The proposed algorithm was evaluated on several international standard datasets; experimental testing shows that it is strongly competitive in classification accuracy and computational performance.</p>
<p>Based on the concept of decomposition and fusion, a practical feature selection method for large-scale hybrid datasets was proposed by <xref ref-type="bibr" rid="B25">Wang and Liang (2016)</xref> to identify an effective feature subset in a short time. By using two common classifiers as evaluation functions, experiments have been performed on 12 UCI data sets. The result of the experiment showed that the proposed method was effective and efficient.</p>
<p><xref ref-type="bibr" rid="B3">Cai et al. (2020</xref>, <xref ref-type="bibr" rid="B4">2021)</xref> aimed to construct a novel multimodal model by fusing electroencephalogram (EEG) data recorded under neutral, negative, and positive audio stimulation to discriminate between depressed patients and normal controls. Linear and nonlinear features were then extracted and selected from the EEG signals of each modality. The reported results showed that the fused modalities achieved a higher depression recognition accuracy than the individual modality schemes.</p>
<p>The related research shows that nature-inspired algorithms represent a practical basis for feature selection. In this manuscript, we apply a nature-inspired method, our new extended SFLA (the BF-SFLA), to feature selection in high-dimensional biomedical data.</p>
</sec>
<sec id="S3">
<title>The Proposed Bacterial Foraging&#x2013;Shuffled Frog Leaping Algorithm</title>
<sec id="S3.SS1">
<title>The Shuffled Frog Leaping Algorithm</title>
<p>The biological characteristics of the SFLA are shown in <xref ref-type="fig" rid="F1">Figure 1</xref>. A large number of individual frogs are distributed in the search space, which contains several food-dense areas (extremal points of the function). The individuals are sorted by fitness (from best to worst) and assigned to several subgroups. The update strategy is given in Equations (1) and (2), in which the worst individual (<italic>P</italic><sub><italic>w</italic></sub>) of a subgroup learns from the best individual (<italic>P</italic><sub><italic>b</italic></sub>) of that subgroup. If no progress is made, (<italic>P</italic><sub><italic>w</italic></sub>) learns instead from the global best individual (<italic>P</italic><sub><italic>g</italic></sub>). If there is still no progress, (<italic>P</italic><sub><italic>w</italic></sub>) is replaced by a random individual. Here (<italic>t</italic>) denotes the iteration number, <italic>P</italic><sub><italic>w</italic></sub>(<italic>t</italic>+1) is the new individual generated by the updating strategy, <italic>D</italic>(<italic>t</italic>+1) is the length of the moving step, and <italic>R</italic> is a random number in the range [0, 1].</p>
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption><p>The simulation diagram of biological characteristics of SFLA.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-16-854685-g001.tif"/>
</fig>
<disp-formula id="S3.E1">
<label>(1)</label>
<mml:math id="M1">
<mml:mrow>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mo>&#x2062;</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>t</mml:mi>
<mml:mo>+</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mo rspace="5.8pt">)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo rspace="5.8pt">=</mml:mo>
<mml:mrow>
<mml:mpadded width="+3.3pt">
<mml:mi>R</mml:mi>
</mml:mpadded>
<mml:mo rspace="5.8pt">&#x00D7;</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>P</mml:mi>
<mml:mi>b</mml:mi>
</mml:msub>
<mml:mo>-</mml:mo>
<mml:msub>
<mml:mi>P</mml:mi>
<mml:mi>w</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="S3.E2">
<label>(2)</label>
<mml:math id="M2">
<mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>P</mml:mi>
<mml:mi>w</mml:mi>
</mml:msub>
<mml:mo>&#x2062;</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>t</mml:mi>
<mml:mo>+</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mo rspace="5.8pt">)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo rspace="5.8pt">=</mml:mo>
<mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>P</mml:mi>
<mml:mi>w</mml:mi>
</mml:msub>
<mml:mo>&#x2062;</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>t</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mo>&#x2062;</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>t</mml:mi>
<mml:mo>+</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
<p>Following the update, if the newly generated <italic>P</italic><sub><italic>w</italic></sub>(<italic>t</italic>+1) is better than the old <italic>P</italic><sub><italic>w</italic></sub>(<italic>t</italic>), it replaces <italic>P</italic><sub><italic>w</italic></sub>(<italic>t</italic>). Otherwise, (<italic>P</italic><sub><italic>b</italic></sub>) in Equation (1) is replaced by (<italic>P</italic><sub><italic>g</italic></sub>) and the update is repeated. If (<italic>P</italic><sub><italic>w</italic></sub>) still does not improve, it is replaced by a new random individual. This process is iterated a number of times equal to the number of individuals in the subgroup. When all subgroups have been processed, all individuals are mixed, re-sorted, and regrouped into new subgroups. The process repeats until the pre-determined termination conditions are satisfied.</p>
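<p>As a concrete illustration, the update strategy of Equations (1) and (2), together with the replacement rules above, can be sketched as follows. This is a minimal sketch assuming a minimization problem and real-valued individuals; the function names, the search bounds, and the per-leap scalar random number are our illustrative assumptions, not part of the manuscript.</p>

```python
import random

def update_worst(P_w, P_b, P_g, fitness, bounds=(-5.12, 5.12)):
    """One SFLA update of the worst individual P_w (minimization).

    Tries Equations (1)-(2) against the subgroup best P_b; if that
    brings no progress, retries against the global best P_g; if there
    is still no progress, P_w is replaced by a random individual.
    """
    def leap(target):
        R = random.random()                              # R in [0, 1]
        D = [R * (t - w) for t, w in zip(target, P_w)]   # Eq. (1): D(t+1) = R x (P_b - P_w)
        return [w + d for w, d in zip(P_w, D)]           # Eq. (2): P_w(t) + D(t+1)

    for target in (P_b, P_g):            # learn from P_b first, then from P_g
        candidate = leap(target)
        if fitness(candidate) < fitness(P_w):
            return candidate             # keep the improved individual
    lo, hi = bounds                      # no progress: random replacement
    return [random.uniform(lo, hi) for _ in P_w]
```

<p>In a full SFLA implementation this update would be applied repeatedly within each subgroup before the whole population is shuffled and regrouped.</p>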
<p>The SFLA is one of many nature-inspired algorithms based on swarm intelligence (<xref ref-type="bibr" rid="B8">Eusuff and Lansey, 2003</xref>). It has the following characteristics: (1) a simple concept, (2) few parameters, (3) strong optimization performance, (4) fast calculation speed, and (5) easy implementation. It has been widely used in many fields, such as pattern recognition problems (<xref ref-type="bibr" rid="B20">Shahriari-kahkeshi and Askari, 2011</xref>; <xref ref-type="bibr" rid="B10">Hasanien, 2015</xref>), scheduling problems (<xref ref-type="bibr" rid="B17">Pan et al., 2011</xref>; <xref ref-type="bibr" rid="B2">Alghazi et al., 2012</xref>), parameter optimization problems (<xref ref-type="bibr" rid="B19">Perez et al., 2013</xref>), the traveling salesman problem (<xref ref-type="bibr" rid="B21">Shrivastava et al., 2017</xref>), the unit commitment problem (<xref ref-type="bibr" rid="B7">Ebrahimi et al., 2012</xref>), distribution problems (<xref ref-type="bibr" rid="B9">Gomez Gonzalez et al., 2013</xref>), and controller problems (<xref ref-type="bibr" rid="B12">Huynh and Nguyen, 2009</xref>).</p>
</sec>
<sec id="S3.SS2">
<title>The Bacterial Foraging Algorithm</title>
<p>The Bacterial Foraging Algorithm (BFA) (<xref ref-type="bibr" rid="B18">Passino, 2002</xref>) was proposed in 2002 by Passino by simulating how <italic>E. coli</italic> forages for food in the human intestinal tract. Because the BFA has shown strong optimization performance, it has attracted significant attention from researchers in the field. The BFA includes three steps: the Chemotaxis Operation (CO), the Propagation Operation (PO), and the Dissipation Operation (DO), of which the CO is the core step.</p>
<p>The (CO) corresponds to the direction-selection strategy adopted by bacteria when searching for food, and it plays a significant role in the algorithm&#x2019;s convergence. During the (CO), the motion of a bacterium can be in one of two states: rotation or forward movement. In the rotation state, the bacterium changes direction and moves one unit step. In the forward state, if the rotation improved the quality of the solution, the bacterium continues to move in the same direction until the fitness value no longer improves or the predetermined number of moving steps is reached.</p>
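<p>The rotation/forward pattern described above can be sketched as follows. This is a sketch for a minimization problem; the step size, the forward-step budget, and all identifiers are illustrative assumptions rather than values from the manuscript.</p>

```python
import random

def chemotaxis(x, fitness, step=0.1, max_forward=5):
    """Sketch of the chemotaxis operation (CO) for minimization.

    Rotation: pick a random direction and move one unit step.
    Forward: if that move improved the solution, keep moving in the
    same direction until fitness stops improving or the budget of
    max_forward steps is exhausted.
    """
    # Rotation: choose a random unit direction
    d = [random.uniform(-1.0, 1.0) for _ in x]
    norm = sum(v * v for v in d) ** 0.5 or 1.0
    d = [v / norm for v in d]

    best, best_f = list(x), fitness(x)
    for _ in range(max_forward):
        trial = [b + step * v for b, v in zip(best, d)]
        trial_f = fitness(trial)
        if trial_f < best_f:          # quality improved: keep swimming
            best, best_f = trial, trial_f
        else:                         # fitness no longer improves: stop
            break
    return best
```

<p>Because only improving trial steps are accepted, the returned point is never worse than the starting point.</p>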
</sec>
<sec id="S3.SS3">
<title>The Shuffled Frog Leaping Algorithm Based on Chemotactic Operation</title>
<sec id="S3.SS3.SSS1">
<title>Proposed Improvements</title>
<p>In the SFLA, the worst individual (<italic>P</italic><sub><italic>w</italic></sub>) of a subgroup iteratively learns from the optimal individual (<italic>P</italic><sub><italic>b</italic></sub>) of the same subgroup or from the global optimal individual (<italic>P</italic><sub><italic>g</italic></sub>). If the fitness is not improved in this process, a randomly generated new individual replaces the existing (<italic>P</italic><sub><italic>w</italic></sub>); while this maintains population diversity, it may fail to identify potentially better solutions. Even when learning from (<italic>P</italic><sub><italic>b</italic></sub>) or (<italic>P</italic><sub><italic>g</italic></sub>) yields no improvement, better solutions may still exist in the neighborhood of (<italic>P</italic><sub><italic>w</italic></sub>); replacing (<italic>P</italic><sub><italic>w</italic></sub>) with a new random individual discards this neighborhood, and the possibility of finding a better solution is lost to the SFLA. Inspired by the (CO) of the BFA, this manuscript introduces the (CO) into the SFLA to guide (<italic>P</italic><sub><italic>w</italic></sub>) to refine the search in its neighborhood and find better solutions.</p>
</sec>
<sec id="S3.SS3.SSS2">
<title>Proposed Updating Strategy</title>
<p>Section (III B) described the rotation and forward movements. In our updating strategy, (<italic>P</italic><sub><italic>w</italic></sub>) moves one step in a random direction in the solution space, completing the rotation operation. If the fitness is improved, (<italic>P</italic><sub><italic>w</italic></sub>) moves forward repeatedly in the same direction until the fitness no longer improves, at which point (<italic>P</italic><sub><italic>w</italic></sub>) is replaced by a random individual in the solution space. The chemotaxis operation strategy is used as a secondary process to increase the granularity of the solution-space exploration. This secondary process aims to search for potential optimal solutions in the neighborhood of (<italic>P</italic><sub><italic>w</italic></sub>), expand the individual search level, improve the local search ability, and improve the search accuracy of the algorithm while maintaining population diversity.</p>
<p>Of course, when (<italic>P</italic><sub><italic>w</italic></sub>) learns from (<italic>P</italic><sub><italic>b</italic></sub>) and (<italic>P</italic><sub><italic>g</italic></sub>) without progress, the (CO) is not performed on every iteration. In the early stage of the iterations, the algorithm must keep sufficient diversity to strengthen its space-exploration ability, so the (CO) is used with low probability; in the middle and later stages, the algorithm must improve its local search ability to strengthen the mining density in the neighborhood of the optimum. To balance exploration and exploitation, a curve-change formula is introduced to calculate the probability of performing the (CO).</p>
<disp-formula id="S3.E3">
<label>(3)</label>
<mml:math id="M3">
<mml:mrow>
<mml:mpadded width="+3.3pt">
<mml:mi>a</mml:mi>
</mml:mpadded>
<mml:mo rspace="5.8pt">=</mml:mo>
<mml:mrow>
<mml:mtext>exp</mml:mtext>
<mml:mo>&#x2062;</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mo>-</mml:mo>
<mml:mrow>
<mml:mpadded width="+3.3pt">
<mml:mn>30</mml:mn>
</mml:mpadded>
<mml:mo rspace="5.8pt">&#x00D7;</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mfrac>
<mml:mi>g</mml:mi>
<mml:mi>G</mml:mi>
</mml:mfrac>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mi>s</mml:mi>
</mml:msup>
</mml:mrow>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="S3.E4">
<label>(4)</label>
<mml:math id="M4">
<mml:mrow>
<mml:mpadded width="+3.3pt">
<mml:mi>C</mml:mi>
</mml:mpadded>
<mml:mo rspace="5.8pt">=</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mtable displaystyle="true" rowspacing="0pt">
<mml:mtr>
<mml:mtd columnalign="center">
<mml:mrow>
<mml:mpadded width="+5pt">
<mml:mn>1</mml:mn>
</mml:mpadded>
<mml:mi>i</mml:mi>
<mml:mpadded width="+5pt">
<mml:mi>f</mml:mi>
</mml:mpadded>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mpadded width="+3.3pt">
<mml:mi>a</mml:mi>
</mml:mpadded>
<mml:mo rspace="5.8pt">&lt;</mml:mo>
<mml:mi>R</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd columnalign="center">
<mml:mrow>
<mml:mpadded width="+5pt">
<mml:mn>0</mml:mn>
</mml:mpadded>
<mml:mi>i</mml:mi>
<mml:mpadded width="+5pt">
<mml:mi>f</mml:mi>
</mml:mpadded>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mpadded width="+3.3pt">
<mml:mi>a</mml:mi>
</mml:mpadded>
<mml:mo rspace="5.8pt">&#x2265;</mml:mo>
<mml:mi>R</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
<mml:mi/>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
<p>The value (a) is calculated by Equation (3), where (g) is the current iteration number and (G) is the total number of iterations. <xref ref-type="fig" rid="F2">Figure 2</xref> shows the curve of function a when (s) is equal to 3, 5, and 8, respectively. To balance exploration and exploitation, (s) was set to 5 in the subsequent experiments. (R) is a random number in [0, 1]. C is the decision factor in Equation (4): if <italic>C</italic> is 1, the (CO) is performed; if <italic>C</italic> is 0, it is not.</p>
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption><p>The curve of function <italic>a.</italic></p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-16-854685-g002.tif"/>
</fig>
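<p>Equations (3) and (4) can be written directly as a short script (a sketch; the function names are our own, with <italic>s</italic> = 5 as in the experiments). Early in the run, a is close to 1, so a &lt; R rarely holds and the (CO) is seldom performed; late in the run, a decays toward 0 and the (CO) is performed with high probability.</p>

```python
import math
import random

def chemotaxis_probability(g, G, s=5):
    """Equation (3): a = exp(-30 * (g / G) ** s)."""
    return math.exp(-30.0 * (g / G) ** s)

def perform_chemotaxis(g, G, s=5):
    """Equation (4): C = 1 (perform the CO) iff a < R for random R in [0, 1]."""
    a = chemotaxis_probability(g, G, s)
    R = random.random()
    return 1 if a < R else 0
```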
</sec>
</sec>
<sec id="S3.SS4">
<title>The Improvement of Grouping Strategy</title>
<p>The grouping strategy of the SFLA is as follows: suppose that <italic>P</italic> individuals are sorted according to the quality of the solution (function evaluation value) and divided into <italic>m</italic> groups of <italic>n</italic> individuals each, where <italic>P</italic> = <italic>m</italic>&#x002A;<italic>n</italic>. The 1st, (<italic>m</italic>+1)th, &#x2026;, ((<italic>n</italic>&#x2013;1)&#x002A;<italic>m</italic>+1)th individuals are assigned to the first group; the 2nd, (<italic>m</italic>+2)th, &#x2026;, ((<italic>n</italic>&#x2013;1)&#x002A;<italic>m</italic>+2)th individuals are assigned to the second group; and so on, until the <italic>m</italic>th, 2<italic>m</italic>th, &#x2026;, (<italic>n</italic>&#x002A;<italic>m</italic>)th individuals are assigned to the <italic>m</italic>th group and all individuals have been grouped. This grouping strategy is called the Classic Grouping Strategy (CGS).</p>
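<p>With illustrative names (none taken from the manuscript), the Classic Grouping Strategy (sort by solution quality, then deal individuals into <italic>m</italic> subgroups in round-robin order) can be sketched as:</p>

```python
def classic_grouping(population, fitness, m):
    """Classic Grouping Strategy (CGS): sort P = m * n individuals by
    fitness, then assign the 1st, (m+1)th, (2m+1)th, ... to subgroup 1,
    the 2nd, (m+2)th, ... to subgroup 2, and so on (round-robin deal)."""
    ranked = sorted(population, key=fitness)      # best solutions first (minimization)
    return [ranked[k::m] for k in range(m)]       # subgroup k takes every m-th individual

# Example: six scalar "individuals" dealt into m = 2 subgroups of n = 3.
# Sorted order is [1, 2, 3, 5, 7, 9], giving groups [[1, 3, 7], [2, 5, 9]].
groups = classic_grouping([5, 2, 9, 1, 7, 3], fitness=abs, m=2)
```

<p>The round-robin deal ensures that each subgroup receives a mix of good and poor solutions rather than a contiguous block of the fitness ranking.</p>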
<p>To verify the contribution of the CGS to the global optimal solution <italic>P</italic><sub><italic>g</italic></sub>, 15 standard test functions were used in a simulation experiment. The parameters and target accuracies of the test functions are shown in <xref ref-type="table" rid="T1">Table 1</xref>. Each algorithm was run independently 30 times, and the average value was used as the experimental result. The algorithm parameters were set as follows: total population, 200; number of groups, 10; individuals per subgroup, 20; number of update-and-evolution steps within a subgroup, 20; number of iterations, 500. The experiments were run on the Windows 10 operating system with an 8-core 64-bit processor and 8 GB of memory, using MATLAB 2012a.</p>
<table-wrap position="float" id="T1">
<label>TABLE 1</label>
<caption><p>Parameters of the benchmark function.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left">Function</td>
<td valign="top" align="center">Dimensions(n)</td>
<td valign="top" align="center">Scope</td>
<td valign="top" align="center">Optimal value</td>
<td valign="top" align="center">Accuracy</td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ1"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>1</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mtext>x</mml:mtext><mml:mo rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x2211;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:msubsup><mml:msubsup><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center">30/60/90</td>
<td valign="top" align="center">[&#x2013;5.12,5.12]</td>
<td valign="top" align="center">0</td>
<td valign="top" align="center">| Actual Value &#x2013;0| &#x003C; 1 &#x00D7; 10<sup>&#x2013;16</sup></td>
</tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ2"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>2</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mtext>x</mml:mtext><mml:mo rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x2211;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi mathvariant="normal">n</mml:mi><mml:mo>-</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msubsup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mrow><mml:mn>100</mml:mn><mml:mo>&#x2062;</mml:mo><mml:msup><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msubsup><mml:mtext>x</mml:mtext><mml:mrow><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mn>2</mml:mn></mml:msubsup><mml:mo>-</mml:mo><mml:msub><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:msup></mml:mrow><mml:mo>+</mml:mo><mml:msup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mn>1</mml:mn><mml:mo>-</mml:mo><mml:msub><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:msup></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center">30/60/90</td>
<td valign="top" align="center">[&#x2013;30,30]</td>
<td valign="top" align="center">0</td>
<td valign="top" align="center">| Actual Value &#x2013;0| &#x003C; 1 &#x00D7; 10<sup>1</sup></td>
</tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ3"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>3</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mtext>x</mml:mtext><mml:mo rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x2211;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:msubsup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mrow><mml:msubsup><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow><mml:mn>2</mml:mn></mml:msubsup><mml:mo>-</mml:mo><mml:mrow><mml:mn>10</mml:mn><mml:mo>&#x2062;</mml:mo><mml:mi mathvariant="normal">c</mml:mi><mml:mo>&#x2062;</mml:mo><mml:mi mathvariant="normal">o</mml:mi><mml:mo>&#x2062;</mml:mo><mml:mi mathvariant="normal">s</mml:mi><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mn>2</mml:mn><mml:mo>&#x2062;</mml:mo><mml:msub><mml:mrow><mml:mtext mathvariant="italic">&#x03C0;</mml:mtext><mml:mtext> x</mml:mtext></mml:mrow><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:mrow><mml:mo>+</mml:mo><mml:mn>10</mml:mn></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center">30/60/90</td>
<td valign="top" align="center">[&#x2013;5.12,5.12]</td>
<td valign="top" align="center">0</td>
<td valign="top" align="center">| Actual Value &#x2013;0| &#x003C; 1 &#x00D7; 10<sup>1</sup></td>
</tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ4"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>4</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo maxsize="120%" minsize="120%">(</mml:mo><mml:mtext>x</mml:mtext><mml:mo maxsize="120%" minsize="120%" rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:mrow><mml:mrow><mml:mfrac><mml:mn>1</mml:mn><mml:mn>4000</mml:mn></mml:mfrac><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x2211;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:msubsup><mml:msubsup><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:mrow><mml:mo>-</mml:mo><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x220F;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:msubsup><mml:mrow><mml:mi>cos</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mrow><mml:mo maxsize="120%" minsize="120%">(</mml:mo><mml:mfrac><mml:msub><mml:mrow><mml:mtext>x</mml:mtext></mml:mrow><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msub><mml:msqrt><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msqrt></mml:mfrac><mml:mo maxsize="120%" minsize="120%">)</mml:mo></mml:mrow></mml:mrow></mml:mrow></mml:mrow><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center">30/60/90</td>
<td valign="top" align="center">[&#x2013;600,600]</td>
<td valign="top" align="center">0</td>
<td valign="top" align="center">| Actual Value &#x2013;0| &#x003C; 1 &#x00D7; 10<sup>&#x2013;2</sup></td>
</tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ5"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>5</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo maxsize="120%" minsize="120%">(</mml:mo><mml:mtext>x</mml:mtext><mml:mo maxsize="120%" minsize="120%" rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:mrow><mml:mrow><mml:mo>-</mml:mo><mml:mrow><mml:mn>20</mml:mn><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mi>exp</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mrow><mml:mo maxsize="120%" minsize="120%">(</mml:mo><mml:mrow><mml:mo>-</mml:mo><mml:mrow><mml:mn>0.2</mml:mn><mml:mo>&#x2062;</mml:mo><mml:msqrt><mml:mrow><mml:mfrac><mml:mn>1</mml:mn><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:mfrac><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x2211;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:msubsup><mml:msubsup><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:mrow></mml:msqrt></mml:mrow></mml:mrow><mml:mo maxsize="120%" minsize="120%">)</mml:mo></mml:mrow></mml:mrow></mml:mrow></mml:mrow><mml:mo>-</mml:mo><mml:mrow><mml:mi>exp</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mrow><mml:mo maxsize="120%" minsize="120%">(</mml:mo><mml:mrow><mml:mfrac><mml:mn>1</mml:mn><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:mfrac><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x2211;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo 
rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:msubsup><mml:mrow><mml:mi>cos2</mml:mi><mml:mo>&#x2062;</mml:mo><mml:mi>&#x03C0;</mml:mi><mml:mo>&#x2062;</mml:mo><mml:msub><mml:mi mathvariant="normal">x</mml:mi><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msub></mml:mrow></mml:mrow></mml:mrow><mml:mo maxsize="120%" minsize="120%">)</mml:mo></mml:mrow></mml:mrow></mml:mrow><mml:mo>+</mml:mo><mml:mn>20</mml:mn><mml:mo>+</mml:mo><mml:mtext>e</mml:mtext></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center">30/60/90</td>
<td valign="top" align="center">[&#x2013;32,32]</td>
<td valign="top" align="center">0</td>
<td valign="top" align="center">| Actual Value &#x2013;0| &#x003C; 1 &#x00D7; 10<sup>&#x2013;7</sup></td>
</tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ6"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>6</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mtext>x</mml:mtext><mml:mo rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x2211;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi mathvariant="normal">n</mml:mi><mml:mo>-</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msubsup><mml:mrow><mml:msup><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msubsup><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow><mml:mn>2</mml:mn></mml:msubsup><mml:mo>+</mml:mo><mml:msubsup><mml:mtext>x</mml:mtext><mml:mrow><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mn>2</mml:mn></mml:msubsup></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mn>0.25</mml:mn></mml:msup><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mrow><mml:msup><mml:mi>sin</mml:mi><mml:mn>2</mml:mn></mml:msup><mml:mo>&#x2061;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mn>50</mml:mn><mml:mo>&#x2062;</mml:mo><mml:msup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msubsup><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow><mml:mn>2</mml:mn></mml:msubsup><mml:mo>+</mml:mo><mml:msubsup><mml:mtext>x</mml:mtext><mml:mrow><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mn>2</mml:mn></mml:msubsup></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mn>0.1</mml:mn></mml:msup></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:mrow></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center">30/60/90</td>
<td valign="top" align="center">[&#x2013;100,100]</td>
<td valign="top" align="center">0</td>
<td valign="top" align="center">| Actual Value &#x2013;0| &#x003C; 1 &#x00D7; 10<sup>0</sup></td>
</tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ7"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>7</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mtext>x</mml:mtext><mml:mo rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x2211;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:msubsup><mml:mrow><mml:mo>|</mml:mo><mml:msub><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msub><mml:mo>|</mml:mo></mml:mrow></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x220F;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:msubsup><mml:mrow><mml:mo stretchy="false">|</mml:mo><mml:msub><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msub><mml:mo stretchy="false">|</mml:mo></mml:mrow></mml:mrow></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center">30/60/90</td>
<td valign="top" align="center">[&#x2013;10,10]</td>
<td valign="top" align="center">0</td>
<td valign="top" align="center">| Actual Value &#x2013;0| &#x003C; 1 &#x00D7; 10<sup>&#x2013;16</sup></td>
</tr>
<tr>
<td valign="top" align="left">f<sub>8</sub>(x) = Max{|x<sub>i</sub>|}</td>
<td valign="top" align="center">30/60/90</td>
<td valign="top" align="center">[&#x2013;100,100]</td>
<td valign="top" align="center">0</td>
<td valign="top" align="center">| Actual Value &#x2013;0| &#x003C; 1 &#x00D7; 10<sup>&#x2013;2</sup></td>
</tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ9"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>9</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mtext>x</mml:mtext><mml:mo rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x2211;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:msubsup><mml:mrow><mml:mi>int</mml:mi><mml:mo>&#x2062;</mml:mo><mml:msup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mn>0.5</mml:mn></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:msup></mml:mrow></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center">30/60/90</td>
<td valign="top" align="center">[&#x2013;100,100]</td>
<td valign="top" align="center">0</td>
<td valign="top" align="center">| Actual Value &#x2013;0| &#x003C; 1 &#x00D7; 10<sup>&#x2013;16</sup></td>
</tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ10"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>10</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mtext>x</mml:mtext><mml:mo rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x2211;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:msubsup><mml:msubsup><mml:mtext>ix</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow><mml:mn>4</mml:mn></mml:msubsup></mml:mrow><mml:mo>+</mml:mo><mml:mtext>r</mml:mtext></mml:mrow></mml:mrow></mml:math></inline-formula><italic>andom</italic>(0,1]</td>
<td valign="top" align="center">30/60/90</td>
<td valign="top" align="center">[&#x2013;1.28,1.28]</td>
<td valign="top" align="center">0</td>
<td valign="top" align="center">| Actual Value &#x2013;0| &#x003C; 1 &#x00D7; 10<sup>&#x2013;3</sup></td>
</tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ12"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>11</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mtext>x</mml:mtext><mml:mo rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:mo>-</mml:mo><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x2211;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:msubsup><mml:mrow><mml:msub><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mtext>sin</mml:mtext><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msqrt><mml:mrow><mml:mo stretchy="false">|</mml:mo><mml:msub><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msub><mml:mo stretchy="false">|</mml:mo></mml:mrow></mml:msqrt><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:mrow></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center">30/60/90</td>
<td valign="top" align="center">[&#x2013;500,500]</td>
<td valign="top" align="center">&#x2013;418.9829<italic>n</italic></td>
<td valign="top" align="center">| Actual Value &#x2013;(&#x2013;418.9829<italic>n</italic>) | &#x003C; 1 &#x00D7; 10<sup>2</sup></td>
</tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ13"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>12</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo maxsize="120%" minsize="120%">(</mml:mo><mml:mtext>x</mml:mtext><mml:mo maxsize="120%" minsize="120%" rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:mrow><mml:mfrac><mml:mi mathvariant="normal">&#x03A0;</mml:mi><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:mfrac><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo maxsize="120%" minsize="120%">{</mml:mo><mml:mrow><mml:mrow><mml:mn>10</mml:mn><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:msup><mml:mi>sin</mml:mi><mml:mn>2</mml:mn></mml:msup><mml:mo>&#x2061;</mml:mo><mml:mrow><mml:mo maxsize="120%" minsize="120%">(</mml:mo><mml:mrow><mml:mi mathvariant="normal">&#x03A0;</mml:mi><mml:mo>&#x2062;</mml:mo><mml:msub><mml:mi mathvariant="normal">y</mml:mi><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msub></mml:mrow><mml:mo maxsize="120%" minsize="120%">)</mml:mo></mml:mrow></mml:mrow></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x2211;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi mathvariant="normal">n</mml:mi><mml:mo>-</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msubsup><mml:mrow><mml:msup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mtext>y</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:msup><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo maxsize="120%" 
minsize="120%">[</mml:mo><mml:mrow><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mrow><mml:mn>10</mml:mn><mml:mo>&#x2062;</mml:mo><mml:msup><mml:mtext>sin</mml:mtext><mml:mn>2</mml:mn></mml:msup><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo maxsize="120%" minsize="120%">(</mml:mo><mml:mrow><mml:mi mathvariant="normal">&#x03A0;</mml:mi><mml:mo>&#x2062;</mml:mo><mml:msub><mml:mi mathvariant="normal">y</mml:mi><mml:mrow><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:mrow><mml:mo maxsize="120%" minsize="120%">)</mml:mo></mml:mrow></mml:mrow></mml:mrow><mml:mo maxsize="120%" minsize="120%">]</mml:mo></mml:mrow></mml:mrow></mml:mrow><mml:mo>+</mml:mo><mml:msup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mtext>y</mml:mtext><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:msup></mml:mrow><mml:mo maxsize="120%" minsize="120%">}</mml:mo></mml:mrow></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:msubsup><mml:mo largeop="true" symmetric="true">&#x2211;</mml:mo><mml:mrow><mml:mpadded width="+3.3pt"><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:mpadded><mml:mo rspace="5.8pt">=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mtext>n</mml:mtext></mml:mrow></mml:msubsup><mml:mrow><mml:mtext>u</mml:mtext><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo maxsize="120%" minsize="120%">(</mml:mo><mml:msub><mml:mtext>x</mml:mtext><mml:mrow><mml:mtext>i</mml:mtext></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mn>10</mml:mn><mml:mo>,</mml:mo><mml:mn>100</mml:mn><mml:mo>,</mml:mo><mml:mn>4</mml:mn><mml:mo maxsize="120%" minsize="120%">)</mml:mo></mml:mrow></mml:mrow></mml:mrow></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center"/><td valign="top" align="center"/><td valign="top" align="center"/><td valign="top" align="center"/></tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ14"><mml:mrow><mml:msub><mml:mtext>y</mml:mtext><mml:mtext>i</mml:mtext></mml:msub><mml:mtext>&#x00A0;=&#x00A0;</mml:mtext><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mtext>x</mml:mtext><mml:mtext>i</mml:mtext></mml:msub><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mn>4</mml:mn></mml:mfrac><mml:mo>,</mml:mo><mml:mtext>&#x00A0;u</mml:mtext><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mtext>x</mml:mtext><mml:mtext>i</mml:mtext></mml:msub><mml:mo>,</mml:mo><mml:mtext>a</mml:mtext><mml:mo>,</mml:mo><mml:mtext>k</mml:mtext><mml:mo>,</mml:mo><mml:mtext>m</mml:mtext></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mtext>&#x00A0;=&#x00A0;</mml:mtext><mml:mrow><mml:mo>{</mml:mo> <mml:mrow><mml:mtable><mml:mtr><mml:mtd><mml:mrow><mml:mtext>&#x00A0;k</mml:mtext><mml:msup><mml:mrow><mml:mo stretchy='false'>(</mml:mo><mml:msub><mml:mtext>x</mml:mtext><mml:mtext>i</mml:mtext></mml:msub><mml:mo>&#x2212;</mml:mo><mml:mtext>a</mml:mtext><mml:mo stretchy='false'>)</mml:mo></mml:mrow><mml:mtext>m</mml:mtext></mml:msup><mml:msub><mml:mrow><mml:mtext>&#x00A0;x</mml:mtext></mml:mrow><mml:mtext>i</mml:mtext></mml:msub><mml:mtext>&#x00A0;&#x003E;&#x00A0;a</mml:mtext></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mrow><mml:mn>0</mml:mn><mml:mtext>&#x00A0;&#x00A0;&#x00A0;&#x00A0;&#x00A0;</mml:mtext><mml:mo>&#x2212;</mml:mo><mml:mtext>a&#x00A0;&#x00A0;</mml:mtext><mml:mo>&#x2264;</mml:mo><mml:msub><mml:mtext>x</mml:mtext><mml:mtext>i</mml:mtext></mml:msub><mml:mtext>&#x00A0;&#x00A0;</mml:mtext><mml:mo>&#x2264;</mml:mo><mml:mtext>a</mml:mtext></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mrow><mml:mtext>k</mml:mtext><mml:msup><mml:mrow><mml:mo stretchy='false'>(</mml:mo><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mtext>x</mml:mtext><mml:mtext>i</mml:mtext></mml:msub><mml:mo>&#x2212;</mml:mo><mml:mtext>a</mml:mtext><mml:mo 
stretchy='false'>)</mml:mo></mml:mrow><mml:mtext>m</mml:mtext></mml:msup><mml:msub><mml:mrow><mml:mtext>&#x00A0;&#x00A0;&#x00A0;&#x00A0;x</mml:mtext></mml:mrow><mml:mtext>i</mml:mtext></mml:msub><mml:mtext>&#x00A0;&#x003C;&#x00A0;</mml:mtext><mml:mo>&#x2212;</mml:mo><mml:mtext>a</mml:mtext></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:mrow> <mml:mo>}</mml:mo></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center">30/60/90</td>
<td valign="top" align="center">[&#x2013;50,50]</td>
<td valign="top" align="center">0</td>
<td valign="top" align="center">| Actual Value &#x2013;0| &#x003C; 1 &#x00D7; 10<sup>&#x2013;15</sup></td>
</tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ16"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>13</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mtext>x</mml:mtext><mml:mo rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:mrow><mml:mrow><mml:mrow><mml:mrow><mml:mn>4</mml:mn><mml:mo>&#x2062;</mml:mo><mml:msubsup><mml:mi mathvariant="normal">x</mml:mi><mml:mn>1</mml:mn><mml:mn>2</mml:mn></mml:msubsup></mml:mrow><mml:mo>-</mml:mo><mml:mrow><mml:mn>2.1</mml:mn><mml:mo>&#x2062;</mml:mo><mml:msubsup><mml:mtext>x</mml:mtext><mml:mn>1</mml:mn><mml:mn>4</mml:mn></mml:msubsup></mml:mrow></mml:mrow><mml:mo>+</mml:mo><mml:mfrac><mml:msubsup><mml:mrow><mml:mtext>x</mml:mtext></mml:mrow><mml:mn>1</mml:mn><mml:mn>6</mml:mn></mml:msubsup><mml:mn>3</mml:mn></mml:mfrac><mml:mo>+</mml:mo><mml:mrow><mml:msub><mml:mtext>x</mml:mtext><mml:mn>1</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:msub><mml:mtext>x</mml:mtext><mml:mn>2</mml:mn></mml:msub></mml:mrow></mml:mrow><mml:mo>-</mml:mo><mml:mrow><mml:mn>4</mml:mn><mml:mo>&#x2062;</mml:mo><mml:msubsup><mml:mi mathvariant="normal">x</mml:mi><mml:mn>2</mml:mn><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:mn>4</mml:mn><mml:mo>&#x2062;</mml:mo><mml:msubsup><mml:mi mathvariant="normal">x</mml:mi><mml:mn>2</mml:mn><mml:mn>4</mml:mn></mml:msubsup></mml:mrow></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center">2</td>
<td valign="top" align="center">[&#x2013;5,5]</td>
<td valign="top" align="center">&#x2013;1.0316285</td>
<td valign="top" align="center">| Actual Value &#x2013;(&#x2013;1.0316285)| &#x003C; 1 &#x00D7; 10<sup>&#x2013;3</sup></td>
</tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ17"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>14</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mtext>x</mml:mtext><mml:mo rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:msup><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mrow><mml:mrow><mml:msub><mml:mtext>x</mml:mtext><mml:mn>2</mml:mn></mml:msub><mml:mo>-</mml:mo><mml:mrow><mml:mfrac><mml:mn>5.1</mml:mn><mml:mrow><mml:mn>4</mml:mn><mml:mo>&#x2062;</mml:mo><mml:msup><mml:mi>&#x03C0;</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mrow></mml:mfrac><mml:mo>&#x2062;</mml:mo><mml:msubsup><mml:mtext>x</mml:mtext><mml:mn>1</mml:mn><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:mfrac><mml:mn>5</mml:mn><mml:mi>&#x03C0;</mml:mi></mml:mfrac><mml:mo>&#x2062;</mml:mo><mml:msub><mml:mtext>x</mml:mtext><mml:mn>1</mml:mn></mml:msub></mml:mrow></mml:mrow><mml:mo>-</mml:mo><mml:mn>6</mml:mn></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:msup><mml:mo>+</mml:mo><mml:mrow><mml:mn>10</mml:mn><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mn>1</mml:mn><mml:mo>-</mml:mo><mml:mfrac><mml:mn>1</mml:mn><mml:mn>8</mml:mn></mml:mfrac></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2062;</mml:mo><mml:msub><mml:mtext>cosx</mml:mtext><mml:mn>1</mml:mn></mml:msub></mml:mrow><mml:mo>+</mml:mo><mml:mn>10</mml:mn></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center">2</td>
<td valign="top" align="center">[&#x2013;15,15]</td>
<td valign="top" align="center">0.398</td>
<td valign="top" align="center">| Actual Value &#x2013;(0.398)| &#x003C; 1 &#x00D7; 10<sup>&#x2013;2</sup></td>
</tr>
<tr>
<td valign="top" align="left"><inline-formula><mml:math id="INEQ18"><mml:mrow><mml:mrow><mml:msub><mml:mtext>f</mml:mtext><mml:mn>15</mml:mn></mml:msub><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mtext>x</mml:mtext><mml:mo rspace="5.8pt">)</mml:mo></mml:mrow></mml:mrow><mml:mo rspace="5.8pt">=</mml:mo><mml:mrow><mml:mfrac><mml:mrow><mml:mrow><mml:msup><mml:mrow><mml:mtext>sin</mml:mtext></mml:mrow><mml:mn>2</mml:mn></mml:msup><mml:mo>&#x2062;</mml:mo><mml:msqrt><mml:mrow><mml:msubsup><mml:mrow><mml:mtext>x</mml:mtext></mml:mrow><mml:mn>1</mml:mn><mml:mn>2</mml:mn></mml:msubsup><mml:mo>+</mml:mo><mml:msubsup><mml:mrow><mml:mtext>x</mml:mtext></mml:mrow><mml:mn>2</mml:mn><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:msqrt></mml:mrow><mml:mo>-</mml:mo><mml:mn>0.5</mml:mn></mml:mrow><mml:msup><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mrow><mml:mn>0.001</mml:mn><mml:mo>&#x2062;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msubsup><mml:mrow><mml:mtext>x</mml:mtext></mml:mrow><mml:mn>1</mml:mn><mml:mn>2</mml:mn></mml:msubsup><mml:mo>+</mml:mo><mml:msubsup><mml:mrow><mml:mtext>x</mml:mtext></mml:mrow><mml:mn>2</mml:mn><mml:mn>2</mml:mn></mml:msubsup></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow></mml:mrow><mml:mo>]</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:msup></mml:mfrac><mml:mo>-</mml:mo><mml:mn>0.5</mml:mn></mml:mrow></mml:mrow></mml:math></inline-formula></td>
<td valign="top" align="center">2</td>
<td valign="top" align="center">[&#x2013;100,100]</td>
<td valign="top" align="center">&#x2013;1</td>
<td valign="top" align="center">| Actual Value &#x2013;(&#x2013;1)| &#x003C; 1 &#x00D7; 10<sup>&#x2013;4</sup></td>
</tr>
</tbody>
</table></table-wrap>
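The benchmarks in the table are standard test functions; for reference, a minimal sketch of four of them in their conventional textbook forms (f1 Sphere, f2 Rosenbrock, f3 Rastrigin, f5 Ackley), which may differ in minor typesetting details from the MathML above:

```python
import math

def sphere(x):                      # f1: global minimum 0 at x = (0, ..., 0)
    return sum(xi ** 2 for xi in x)

def rosenbrock(x):                  # f2: global minimum 0 at x = (1, ..., 1)
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (1 - x[i]) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):                   # f3: global minimum 0 at x = (0, ..., 0)
    return sum(xi ** 2 - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

def ackley(x):                      # f5: global minimum 0 at x = (0, ..., 0)
    n = len(x)
    s1 = sum(xi ** 2 for xi in x) / n
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x) / n
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e
```

Each function evaluates to its listed theoretical optimum at the known minimizer, matching the convergence criteria in the last column of the table.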
<p>The experimental results were shown in <xref ref-type="fig" rid="F3">Figure 3</xref>. In the figure, the abscissa represented the group number, and the ordinate represented the average contribution rate of each group to updating <italic>P</italic><sub><italic>g</italic></sub>. It could be seen from the figure that, compared with the other groups, groups 1 to 5 obtained a higher average update contribution rate to <italic>P</italic><sub><italic>g</italic></sub>; among them, group 1 obtained the highest contribution rate (14.11%), and the total contribution rate of the five groups was 43.00%.</p>
<fig id="F3" position="float">
<label>FIGURE 3</label>
<caption><p>The average contribution rate of each group updating <italic>P</italic><sub><italic>g</italic></sub>.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-16-854685-g003.tif"/>
</fig>
<p>According to the CGS grouping strategy, the individuals with higher solution quality were first assigned to the groups with smaller numbers: the smaller the group number, the higher the quality of the solutions assigned to it, so the individual quality of groups with smaller numbers was better than that of groups with larger numbers. In the process of algorithm operation, if these groups once fell into a local optimum, then because the update of <italic>P</italic><sub><italic>g</italic></sub> was highly dependent on them, it would be difficult for the other groups, with their low contribution rate to <italic>P</italic><sub><italic>g</italic></sub>, to guide the algorithm out of the local optimum, thus increasing the probability of the algorithm as a whole falling into a local optimum. To avoid this situation, it was necessary to balance the contribution proportion of each group to <italic>P</italic><sub><italic>g</italic></sub>, reduce the dependence of the <italic>P</italic><sub><italic>g</italic></sub> update on specific groups, and improve the algorithm&#x2019;s ability to escape after falling into a local optimum.</p>
<sec id="S3.SS4.SSS1">
<title>Improved Grouping Strategy</title>
<p>Individuals 1 to <italic>m</italic> were assigned to the groups in order, one per group; individuals <italic>m</italic>+1 to 2<italic>m</italic> were then assigned to the groups in reverse order, one per group; individuals 2<italic>m</italic>+1 to 3<italic>m</italic> were again assigned in order, one per group; individuals 3<italic>m</italic>+1 to 4<italic>m</italic> in reverse order, one per group; and so on, until all individuals were grouped.</p>
<p>The improved grouping strategy could effectively avoid the individuals with better quality of solutions into the same group and guarantee the average solution quality of individuals in each group. In this way, the proportion of each group&#x2019;s contribution to the optimal global solution could be effectively balanced, thus reducing the possibility of the algorithm falling into the local optimal. This grouping strategy was called Balance Grouping Strategy (BGS).</p>
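The two grouping strategies can be contrasted in a minimal sketch (assuming the population is already sorted best-first, and taking the text's description of CGS as a block assignment):

```python
def cgs_group(sorted_pop, m):
    """CGS as described above: fill group 0 with the best block of the
    sorted population, then group 1, and so on, which concentrates the
    best solutions in the low-numbered groups."""
    size = len(sorted_pop) // m
    return [sorted_pop[g * size:(g + 1) * size] for g in range(m)]

def bgs_group(sorted_pop, m):
    """BGS: deal the sorted individuals across the m groups in a
    serpentine order (forward pass, then reverse pass), one per group."""
    groups = [[] for _ in range(m)]
    for start in range(0, len(sorted_pop), m):
        chunk = sorted_pop[start:start + m]
        if (start // m) % 2 == 1:      # every other pass runs in reverse
            chunk = chunk[::-1]
        for g, ind in enumerate(chunk):
            groups[g].append(ind)
    return groups
```

With 12 individuals ranked 0 (best) to 11 and m = 3, CGS yields [[0,1,2,3], [4,5,6,7], [8,9,10,11]], whereas BGS yields [[0,5,6,11], [1,4,7,10], [2,3,8,9]] — each BGS group has the same rank sum, which is the balancing effect the text describes.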
</sec>
</sec>
</sec>
<sec id="S4">
<title>The Analysis of Improvement Strategy</title>
<p>After CO was introduced into the SFLA, the balance between exploratory search in the early stage and refined search in the later stage of the algorithm iteration was well handled; the SFLA with the single introduction of CO was named SFLA1. The contribution of BGS was to balance the update contribution rate of the groups for the global best individual (<italic>P</italic><sub><italic>g</italic></sub>) and to prevent the SFLA from falling into local optima. The SFLA with the single introduction of BGS was named SFLA2.</p>
<p>CO and BGS were two improvement strategies for the SFLA: the former improved the updating method for the worst individuals, and the latter optimized the grouping method of the algorithm. Although a single improvement strategy could improve the optimization performance of the algorithm to a certain extent, the improvement was limited; the performance gain was more evident when the two strategies were combined. CO and BGS were therefore both introduced into the SFLA simultaneously, and the improved algorithm was named the Bacterial Foraging-Shuffled Frog Leaping Algorithm, referred to as BF-SFLA.</p>
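The combined loop can be outlined as follows. This is a schematic sketch only, not the authors' implementation: the worst-frog update below is a simplified leap toward the group best (falling back to the global best, then a random reset, as in the basic SFLA), standing in for the paper's chemotaxis operator, and all parameter values are illustrative.

```python
import random

def bf_sfla(fitness, dim, bounds, n_groups, group_size,
            inner_iters, outer_iters):
    """Schematic BF-SFLA loop: sort the population, partition it with the
    serpentine BGS, update the worst frog of each group, then shuffle."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_groups * group_size)]
    for _ in range(outer_iters):
        pop.sort(key=fitness)                      # best first
        best = pop[0]
        # BGS: deal sorted frogs into groups, alternating direction
        groups = [[] for _ in range(n_groups)]
        for start in range(0, len(pop), n_groups):
            chunk = pop[start:start + n_groups]
            if (start // n_groups) % 2 == 1:
                chunk = chunk[::-1]
            for g, frog in enumerate(chunk):
                groups[g].append(frog)
        for grp in groups:
            for _ in range(inner_iters):
                grp.sort(key=fitness)
                worst = grp[-1]
                # leap toward the group best, else the global best,
                # else replace the worst frog with a random one
                for target in (grp[0], best):
                    cand = [min(max(w + random.random() * (t - w), lo), hi)
                            for w, t in zip(worst, target)]
                    if fitness(cand) < fitness(worst):
                        grp[-1] = cand
                        break
                else:
                    grp[-1] = [random.uniform(lo, hi) for _ in range(dim)]
        pop = [frog for grp in groups for frog in grp]   # shuffle
    return min(pop, key=fitness)

random.seed(0)
# illustrative run on the Sphere function with small, made-up parameters
sol = bf_sfla(lambda x: sum(v * v for v in x), dim=5, bounds=(-5.12, 5.12),
              n_groups=10, group_size=5, inner_iters=5, outer_iters=40)
```

The experiments below use much larger settings (400 frogs, 40 subgroups, 500 evolutions); the small values here only keep the sketch fast to run.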
<p>To verify the actual optimization performance of SFLA1, SFLA2, and BF-SFLA, 15 standard test functions were selected for verification experiments. The parameter settings of the test functions were shown in <xref ref-type="table" rid="T1">Table 1</xref>. The algorithm parameters were set as follows: the total population was 400; the number of subgroups was 40; the number of individuals in each subgroup was 10; the number of updating evolutions within every subgroup was 10; and the number of algorithm evolutions was 500. The experimental results were shown in <xref ref-type="table" rid="T2">Table 2</xref>. The operating environment was Windows 10, an 8-core 64-bit operating system with 8 GB of memory, and the running software was MATLAB 2012a.</p>
<table-wrap position="float" id="T2">
<label>TABLE 2</label>
<caption><p>The experimental results under fixed iteration number.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left"><italic>Function</italic></td>
<td valign="top" align="center" colspan="2">SFLA<hr/></td>
<td valign="top" align="center" colspan="2">SFLA1<hr/></td>
<td valign="top" align="center" colspan="2">SFLA2<hr/></td>
<td valign="top" align="center" colspan="2">SFLA<sup>[25]</sup><hr/></td>
<td valign="top" align="center" colspan="2">SFLA<sup>[26]</sup><hr/></td>
<td valign="top" align="center" colspan="2">BF-SFLA<hr/></td>
</tr>
<tr>
<td/>
<td valign="top" align="center">Ave</td>
<td valign="top" align="center">Std</td>
<td valign="top" align="center">Ave</td>
<td valign="top" align="center">Std</td>
<td valign="top" align="center">Ave</td>
<td valign="top" align="center">Std</td>
<td valign="top" align="center">Ave</td>
<td valign="top" align="center">Std</td>
<td valign="top" align="center">Ave</td>
<td valign="top" align="center">Std</td>
<td valign="top" align="center">Ave</td>
<td valign="top" align="center">Std</td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">f<sub>1</sub></td>
<td valign="top" align="center">9.36E&#x2013;01</td>
<td valign="top" align="center">8.66E&#x2013;02</td>
<td valign="top" align="center">1.47E&#x2013;33</td>
<td valign="top" align="center">4.92E&#x2013;20</td>
<td valign="top" align="center">9.05E&#x2013;01</td>
<td valign="top" align="center">6.68E&#x2013;02</td>
<td valign="top" align="center">6.45E&#x2013;03</td>
<td valign="top" align="center">3.12E&#x2013;03</td>
<td valign="top" align="center">5.22E&#x2013;03</td>
<td valign="top" align="center">7.32E&#x2013;33</td>
<td valign="top" align="center">3.21E&#x2013;18</td>
<td valign="top" align="center">5.02E&#x2013;33</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>2</sub></td>
<td valign="top" align="center">1.46E+02</td>
<td valign="top" align="center">6.59E+01</td>
<td valign="top" align="center">2.54E+01</td>
<td valign="top" align="center">1.71E+01</td>
<td valign="top" align="center">1.01E+02</td>
<td valign="top" align="center">6.08E+01</td>
<td valign="top" align="center">2.67E+02</td>
<td valign="top" align="center">5.28E+01</td>
<td valign="top" align="center">1.29E+02</td>
<td valign="top" align="center">3.05E&#x2013;01</td>
<td valign="top" align="center">2.57E+01</td>
<td valign="top" align="center">4.63E&#x2013;01</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>3</sub></td>
<td valign="top" align="center">1.59E+01</td>
<td valign="top" align="center">4.39E+00</td>
<td valign="top" align="center">1.03E+00</td>
<td valign="top" align="center">3.19E+00</td>
<td valign="top" align="center">1.30E+01</td>
<td valign="top" align="center">4.11E+00</td>
<td valign="top" align="center">1.95E+01</td>
<td valign="top" align="center">7.07E+00</td>
<td valign="top" align="center">1.16E+01</td>
<td valign="top" align="center">1.56E+00</td>
<td valign="top" align="center">8.73E+00</td>
<td valign="top" align="center">2.03E+00</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>4</sub></td>
<td valign="top" align="center">1.09E+00</td>
<td valign="top" align="center">4.57E&#x2013;02</td>
<td valign="top" align="center">1.00E+00</td>
<td valign="top" align="center">1.60E&#x2013;16</td>
<td valign="top" align="center">1.04E+00</td>
<td valign="top" align="center">3.11E&#x2013;02</td>
<td valign="top" align="center">1.00E+00</td>
<td valign="top" align="center">2.14E&#x2013;04</td>
<td valign="top" align="center">1.00E+00</td>
<td valign="top" align="center">1.93E&#x2013;16</td>
<td valign="top" align="center">1.00E+00</td>
<td valign="top" align="center">2.14E&#x2013;16</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>5</sub></td>
<td valign="top" align="center">1.41E+00</td>
<td valign="top" align="center">5.68E&#x2013;01</td>
<td valign="top" align="center">1.06E&#x2013;14</td>
<td valign="top" align="center">2.62E&#x2013;12</td>
<td valign="top" align="center">1.07E+00</td>
<td valign="top" align="center">5.26E&#x2013;01</td>
<td valign="top" align="center">1.09E+00</td>
<td valign="top" align="center">6.62E&#x2013;01</td>
<td valign="top" align="center">7.50E&#x2013;01</td>
<td valign="top" align="center">3.18E&#x2013;15</td>
<td valign="top" align="center">1.12E&#x2013;12</td>
<td valign="top" align="center">7.68E&#x2013;15</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>6</sub></td>
<td valign="top" align="center">2.44E+01</td>
<td valign="top" align="center">7.33E+00</td>
<td valign="top" align="center">1.91E&#x2013;01</td>
<td valign="top" align="center">3.37E+00</td>
<td valign="top" align="center">2.27E+01</td>
<td valign="top" align="center">8.54E+00</td>
<td valign="top" align="center">1.91E+01</td>
<td valign="top" align="center">4.46E+00</td>
<td valign="top" align="center">1.69E+01</td>
<td valign="top" align="center">2.88E&#x2013;01</td>
<td valign="top" align="center">6.05E+00</td>
<td valign="top" align="center">3.88E&#x2013;01</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>7</sub></td>
<td valign="top" align="center">1.01E+00</td>
<td valign="top" align="center">1.08E&#x2013;01</td>
<td valign="top" align="center">1.14E&#x2013;17</td>
<td valign="top" align="center">6.22E&#x2013;35</td>
<td valign="top" align="center">9.68E&#x2013;01</td>
<td valign="top" align="center">3.66E&#x2013;02</td>
<td valign="top" align="center">5.99E&#x2013;01</td>
<td valign="top" align="center">1.50E&#x2013;01</td>
<td valign="top" align="center">3.11E&#x2013;01</td>
<td valign="top" align="center">2.66E&#x2013;17</td>
<td valign="top" align="center">1.14E&#x2013;35</td>
<td valign="top" align="center">2.77E&#x2013;18</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>8</sub></td>
<td valign="top" align="center">6.62E+00</td>
<td valign="top" align="center">9.98E&#x2013;01</td>
<td valign="top" align="center">3.32E&#x2013;04</td>
<td valign="top" align="center">3.53E&#x2013;01</td>
<td valign="top" align="center">4.06E+00</td>
<td valign="top" align="center">9.44E&#x2013;01</td>
<td valign="top" align="center">5.01E+00</td>
<td valign="top" align="center">7.40E&#x2013;01</td>
<td valign="top" align="center">4.32E+00</td>
<td valign="top" align="center">2.54E&#x2013;04</td>
<td valign="top" align="center">1.08E+00</td>
<td valign="top" align="center">4.47E&#x2013;04</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>9</sub></td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">0.00E+00</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>10</sub></td>
<td valign="top" align="center">5.18E&#x2013;01</td>
<td valign="top" align="center">1.57E&#x2013;01</td>
<td valign="top" align="center">1.02E&#x2013;03</td>
<td valign="top" align="center">8.29E&#x2013;04</td>
<td valign="top" align="center">5.02E&#x2013;01</td>
<td valign="top" align="center">9.63E&#x2013;02</td>
<td valign="top" align="center">2.16E&#x2013;03</td>
<td valign="top" align="center">8.03E&#x2013;04</td>
<td valign="top" align="center">2.90E&#x2013;03</td>
<td valign="top" align="center">3.30E&#x2013;04</td>
<td valign="top" align="center">2.41E&#x2013;03</td>
<td valign="top" align="center">3.99E&#x2013;04</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>11</sub></td>
<td valign="top" align="center">&#x2013;3.05E+03</td>
<td valign="top" align="center">4.01E+02</td>
<td valign="top" align="center">&#x2013;4.61E+03</td>
<td valign="top" align="center">6.48E+02</td>
<td valign="top" align="center">&#x2013;3.01E+03</td>
<td valign="top" align="center">4.20E+02</td>
<td valign="top" align="center">&#x2013;5.09E+03</td>
<td valign="top" align="center">5.67E+02</td>
<td valign="top" align="center">&#x2013;4.77E+03</td>
<td valign="top" align="center">3.55E+02</td>
<td valign="top" align="center">&#x2013;4.94E+03</td>
<td valign="top" align="center">2.48E+02</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>12</sub></td>
<td valign="top" align="center">9.30E&#x2013;01</td>
<td valign="top" align="center">6.60E&#x2013;02</td>
<td valign="top" align="center">1.92E&#x2013;32</td>
<td valign="top" align="center">7.25E&#x2013;15</td>
<td valign="top" align="center">8.09E&#x2013;01</td>
<td valign="top" align="center">8.40E&#x2013;02</td>
<td valign="top" align="center">4.90E&#x2013;02</td>
<td valign="top" align="center">7.51E&#x2013;02</td>
<td valign="top" align="center">5.74E&#x2013;02</td>
<td valign="top" align="center">2.06E&#x2013;33</td>
<td valign="top" align="center">1.11E&#x2013;17</td>
<td valign="top" align="center">1.40E&#x2013;32</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>13</sub></td>
<td valign="top" align="center">&#x2013;7.68E&#x2013;01</td>
<td valign="top" align="center">2.01E&#x2013;01</td>
<td valign="top" align="center">&#x2013;1.03E+00</td>
<td valign="top" align="center">2.51E&#x2013;04</td>
<td valign="top" align="center">&#x2013;7.87E&#x2013;01</td>
<td valign="top" align="center">2.53E&#x2013;01</td>
<td valign="top" align="center">&#x2013;1.03E+00</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">&#x2013;1.03E+00</td>
<td valign="top" align="center">1.05E&#x2013;03</td>
<td valign="top" align="center">&#x2013;1.03E+00</td>
<td valign="top" align="center">9.72E&#x2013;04</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>14</sub></td>
<td valign="top" align="center">3.98E&#x2013;01</td>
<td valign="top" align="center">1.70E&#x2013;01</td>
<td valign="top" align="center">3.98E&#x2013;01</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">3.98E&#x2013;01</td>
<td valign="top" align="center">1.78E&#x2013;01</td>
<td valign="top" align="center">3.98E&#x2013;01</td>
<td valign="top" align="center">3.98E&#x2013;01</td>
<td valign="top" align="center">3.98E&#x2013;01</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">3.98E&#x2013;01</td>
<td valign="top" align="center">0.00E+00</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>15</sub></td>
<td valign="top" align="center">&#x2013;8.77E&#x2013;01</td>
<td valign="top" align="center">6.40E&#x2013;02</td>
<td valign="top" align="center">&#x2013;1.00E+00</td>
<td valign="top" align="center">3.36E&#x2013;03</td>
<td valign="top" align="center">&#x2013;8.63E&#x2013;01</td>
<td valign="top" align="center">7.23E&#x2013;02</td>
<td valign="top" align="center">&#x2013;1.00E+00</td>
<td valign="top" align="center">0.00E+00</td>
<td valign="top" align="center">&#x2013;9.98E&#x2013;01</td>
<td valign="top" align="center">2.69E&#x2013;04</td>
<td valign="top" align="center">&#x2013;1.00E+00</td>
<td valign="top" align="center">7.35E&#x2013;04</td>
</tr>
</tbody>
</table></table-wrap>
<p>Two modes were used to evaluate the optimization performance of the algorithms: (1) analysis of optimization accuracy under a fixed number of iterations, and (2) analysis of the number of iterations required under a fixed optimization accuracy.</p>
<p>(1) Optimization accuracy analysis under a fixed number of iterations</p>
<p>The experimental results under a fixed number of iterations are shown in <xref ref-type="table" rid="T2">Table 2</xref>, where (Ave) denotes the average optimal value over 30 runs of the algorithm, (Std) denotes the standard deviation, and (AvgT(s)) denotes the average running time per run, in seconds (s). The following results can be obtained from <xref ref-type="table" rid="T2">Table 2</xref>:</p>
<p>(1) For all test functions (f<sub>1</sub> to f<sub>15</sub>), SFLA1 and SFLA2 obtained better (Ave) and (Std) values than SFLA to varying degrees, indicating that each of the two improvement strategies played a role in improving the performance of the algorithm. Compared with SFLA, the (Ave) of SFLA1 and SFLA2 improved by up to 10 orders of magnitude, and the (Std) was reduced by up to 20 orders of magnitude, indicating that the improvement strategies in SFLA1 and SFLA2 had pronounced effects on the optimization accuracy and stability of the algorithm.</p>
<p>(2) For all test functions, BF-SFLA obtained smaller (Ave) and (Std) values than SFLA1 and SFLA2 to varying degrees, indicating that the optimization accuracy and stability of the algorithm with the combined improvement strategies were better than with either single strategy. SFLA1 and SFLA2 were obtained from SFLA by introducing (CO) and (BGS), respectively: (CO) improved the updating method for (<italic>P</italic><sub><italic>w</italic></sub>), while (BGS) optimized the grouping method of the algorithm. Although a single improvement strategy could improve the optimization performance to a certain extent, the room for improvement was limited; combining the two strategies, which improve the algorithm from different perspectives, produced a more obvious performance gain. Compared with the improved algorithms in the literature (<xref ref-type="bibr" rid="B22">Sun et al., 2008</xref>; <xref ref-type="bibr" rid="B6">Dai and Wang, 2012</xref>), BF-SFLA obtained a better (Ave) for almost all test functions (except f<sub>10</sub>). On the whole, BF-SFLA showed better optimization accuracy and performance.</p>
<p>(2) Analysis of the number of iterations under a fixed optimization accuracy</p>
<p>The six algorithms SFLA, SFLA1, SFLA2, the improved SFLAs in the literature (<xref ref-type="bibr" rid="B22">Sun et al., 2008</xref>; <xref ref-type="bibr" rid="B6">Dai and Wang, 2012</xref>), and BF-SFLA were each executed independently 30 times on the test functions (with a maximum of 500 iterations) to verify whether they met the accuracy requirements in <xref ref-type="table" rid="T1">Table 1</xref>. The relevant information is shown in <xref ref-type="table" rid="T3">Table 3</xref>, where (Ave(%)) denotes the success rate (the percentage of runs in which the algorithm achieved the required accuracy, out of the total number of runs) and (AveN) denotes the average number of iterations needed to reach the required accuracy. The following results can be obtained from <xref ref-type="table" rid="T3">Table 3</xref>.</p>
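<p>The (Ave(%)) and (AveN) indexes defined above can be computed from per-run records as in the following sketch; the <monospace>success_stats</monospace> helper and the sample records are illustrative assumptions, not code or data from the paper:</p>

```python
import statistics

def success_stats(iteration_records, max_iter=500):
    """iteration_records holds, for each independent run, the iteration at
    which the required accuracy was first reached, or None if it was never
    reached within max_iter. Returns (Ave(%), AveN) as reported in Table 3;
    AveN is None (the "-" entries) when no run succeeded."""
    hits = [n for n in iteration_records if n is not None and n <= max_iter]
    rate = 100.0 * len(hits) / len(iteration_records)
    ave_n = statistics.mean(hits) if hits else None
    return rate, ave_n

# Illustrative records for one test function (hypothetical numbers):
rate, ave_n = success_stats([120, 250, None, 180] + [200] * 26)
```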
<table-wrap position="float" id="T3">
<label>TABLE 3</label>
<caption><p>The experimental results under fixed optimization accuracy.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left"><italic>Function</italic></td>
<td valign="top" align="center" colspan="2">SFLA<hr/></td>
<td valign="top" align="center" colspan="2">SFLA1<hr/></td>
<td valign="top" align="center" colspan="2">SFLA2<hr/></td>
<td valign="top" align="center" colspan="2">SFLA<sup>[25]</sup><hr/></td>
<td valign="top" align="center" colspan="2">SFLA<sup>[26]</sup><hr/></td>
<td valign="top" align="center" colspan="2">BF-SFLA<hr/></td>
</tr>
<tr>
<td/>
<td valign="top" align="center">Ave(%)</td>
<td valign="top" align="center">AveN</td>
<td valign="top" align="center">Ave(%)</td>
<td valign="top" align="center">AveN</td>
<td valign="top" align="center">Ave(%)</td>
<td valign="top" align="center">AveN</td>
<td valign="top" align="center">Ave(%)</td>
<td valign="top" align="center">AveN</td>
<td valign="top" align="center">Ave(%)</td>
<td valign="top" align="center">AveN</td>
<td valign="top" align="center">Ave(%)</td>
<td valign="top" align="center">AveN</td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">f<sub>1</sub></td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">23%</td>
<td valign="top" align="center">407</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">261</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">283</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">248</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>2</sub></td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">93%</td>
<td valign="top" align="center">298</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">121</td>
<td valign="top" align="center">97%</td>
<td valign="top" align="center">260</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">94</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>3</sub></td>
<td valign="top" align="center">23%</td>
<td valign="top" align="center">385</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">145</td>
<td valign="top" align="center">43%</td>
<td valign="top" align="center">306</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">140</td>
<td valign="top" align="center">47%</td>
<td valign="top" align="center">295</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">126</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>4</sub></td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">90%</td>
<td valign="top" align="center">201</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">97%</td>
<td valign="top" align="center">149</td>
<td valign="top" align="center">80%</td>
<td valign="top" align="center">339</td>
<td valign="top" align="center">93%</td>
<td valign="top" align="center">138</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>5</sub></td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">267</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">266</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">249</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>6</sub></td>
<td valign="top" align="center">3%</td>
<td valign="top" align="center">482</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">208</td>
<td valign="top" align="center">7%</td>
<td valign="top" align="center">437</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">192</td>
<td valign="top" align="center">7%</td>
<td valign="top" align="center">424</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">182</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>7</sub></td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">344</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">342</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>8</sub></td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">120</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">120</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>9</sub></td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">128</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">23</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">127</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">65</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">76</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">20</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>10</sub></td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">93</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">32</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">72</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">23</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">119</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">26</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>11</sub></td>
<td valign="top" align="center">63%</td>
<td valign="top" align="center">231</td>
<td valign="top" align="center">93%</td>
<td valign="top" align="center">66</td>
<td valign="top" align="center">73%</td>
<td valign="top" align="center">220</td>
<td valign="top" align="center">63%</td>
<td valign="top" align="center">220</td>
<td valign="top" align="center">70%</td>
<td valign="top" align="center">231</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">170</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>12</sub></td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">93%</td>
<td valign="top" align="center">232</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">0%</td>
<td valign="top" align="center">&#x2013;</td>
<td valign="top" align="center">97%</td>
<td valign="top" align="center">192</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>13</sub></td>
<td valign="top" align="center">70%</td>
<td valign="top" align="center">148</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">31</td>
<td valign="top" align="center">40%</td>
<td valign="top" align="center">144</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">72</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">59</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">77</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>14</sub></td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">20</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">16</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">20</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">19</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">16</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">15</td>
</tr>
<tr>
<td valign="top" align="left">f<sub>15</sub></td>
<td valign="top" align="center">77%</td>
<td valign="top" align="center">220</td>
<td valign="top" align="center">80%</td>
<td valign="top" align="center">133</td>
<td valign="top" align="center">87%</td>
<td valign="top" align="center">217</td>
<td valign="top" align="center">87%</td>
<td valign="top" align="center">182</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">117</td>
<td valign="top" align="center">100%</td>
<td valign="top" align="center">74</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn><p><italic>The symbol &#x201C;&#x2013;&#x201D; indicates that the required optimization accuracy could not be achieved within the maximum of 500 iterations.</italic></p></fn>
</table-wrap-foot>
</table-wrap>
<p>(1) SFLA had a success rate of 0% for test functions f<sub>1</sub>, f<sub>2</sub>, f<sub>4</sub>, f<sub>5</sub>, f<sub>7</sub>, f<sub>8</sub>, and f<sub>12</sub>, failing to achieve the required optimization accuracy within the fixed number of iterations (500), which indicates that SFLA had a slow convergence speed and low convergence accuracy. Compared with SFLA, SFLA1 achieved a nonzero success rate for all test functions, and SFLA2 raised the success rate on several of them, indicating that even a single improvement strategy increased the convergence accuracy of the algorithm to a certain extent.</p>
<p>(2) BF-SFLA achieved a success rate of 93&#x2013;100% for all test functions, significantly higher than the other five algorithms, showing that BF-SFLA had better search precision and stability. In terms of the (AveN) index under fixed optimization accuracy, BF-SFLA was smaller than the other five algorithms on the whole. These results show that BF-SFLA converged faster, reaching the same optimization precision with fewer iterations.</p>
<p><xref ref-type="table" rid="T4">Table 4</xref> gives the index means under a fixed number of iterations, where AVE(Ave) and AVE(Std) are, respectively, the means of (Ave) and (Std) over all test functions in <xref ref-type="table" rid="T2">Table 2</xref>. Compared with SFLA, SFLA1, SFLA2, and the literature (<xref ref-type="bibr" rid="B22">Sun et al., 2008</xref>; <xref ref-type="bibr" rid="B6">Dai and Wang, 2012</xref>), BF-SFLA achieved the smaller AVE(Ave) and AVE(Std), and therefore the better optimization performance. <xref ref-type="table" rid="T5">Table 5</xref> gives the index means under fixed optimization accuracy, where AVE(Ave(%)) and AVE(AveN) are, respectively, the means of (Ave(%)) and (AveN) over all test functions in <xref ref-type="table" rid="T3">Table 3</xref>. Compared with the same five algorithms, BF-SFLA achieved the larger AVE(Ave(%)) and the smaller AVE(AveN), again indicating the better optimization performance.</p>
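<p>The column means in Tables 4 and 5 are simple per-algorithm averages over the 15 test functions; a minimal sketch follows, where the helper name is an assumption and the two sample rows reuse the SFLA (Ave, Std) values for f<sub>1</sub> and f<sub>2</sub> from Table 2:</p>

```python
import statistics

def column_means(table):
    """table: one (Ave, Std) pair per test function for a single algorithm.
    Returns (AVE(Ave), AVE(Std)), the per-column means reported in Table 4."""
    ave_col, std_col = zip(*table)
    return statistics.mean(ave_col), statistics.mean(std_col)

# Two-row example using the SFLA f1 and f2 entries of Table 2:
ave_mean, std_mean = column_means([(9.36e-01, 8.66e-02), (1.46e+02, 6.59e+01)])
```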
<table-wrap position="float" id="T4">
<label>TABLE 4</label>
<caption><p>The index mean of fixed iteration times.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left">Attribute</td>
<td valign="top" align="center">SFLA</td>
<td valign="top" align="center">SFLA1</td>
<td valign="top" align="center">SFLA2</td>
<td valign="top" align="center">SFLA<sup>[25]</sup></td>
<td valign="top" align="center">SFLA<sup>[26]</sup></td>
<td valign="top" align="center">BF-SFLA</td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">AVE(Ave)</td>
<td valign="top" align="center">6.48E+02</td>
<td valign="top" align="center">5.32E+02</td>
<td valign="top" align="center">6.47E+02</td>
<td valign="top" align="center">5.20E+02</td>
<td valign="top" align="center">5.31E+02</td>
<td valign="top" align="center"><bold>5.11E+02</bold></td>
</tr>
<tr>
<td valign="top" align="left">AVE(Std)</td>
<td valign="top" align="center">3.21E+01</td>
<td valign="top" align="center">4.48E+01</td>
<td valign="top" align="center">3.30E+01</td>
<td valign="top" align="center">4.22E+01</td>
<td valign="top" align="center">2.38E+01</td>
<td valign="top" align="center"><bold>1.67E+01</bold></td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn><p><italic>The best value is in bold.</italic></p></fn>
</table-wrap-foot>
</table-wrap>
<table-wrap position="float" id="T5">
<label>TABLE 5</label>
<caption><p>The index mean value under fixed optimization accuracy.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left">Attribute</td>
<td valign="top" align="center">SFLA</td>
<td valign="top" align="center">SFLA1</td>
<td valign="top" align="center">SFLA2</td>
<td valign="top" align="center">SFLA<sup>[25]</sup></td>
<td valign="top" align="center">SFLA<sup>[26]</sup></td>
<td valign="top" align="center">BF-SFLA</td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">AVE(Ave(%))</td>
<td valign="top" align="center">35.73%</td>
<td valign="top" align="center">91.47%</td>
<td valign="top" align="center">36.67%</td>
<td valign="top" align="center">76.47%</td>
<td valign="top" align="center">57.21%</td>
<td valign="top" align="center"><bold>99.33%</bold></td>
</tr>
<tr>
<td valign="top" align="left">AVE(AveN)</td>
<td valign="top" align="center">323.62</td>
<td valign="top" align="center">139.85</td>
<td valign="top" align="center">311.00</td>
<td valign="top" align="center">217.54</td>
<td valign="top" align="center">282.77</td>
<td valign="top" align="center"><bold>133.15</bold></td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn><p><italic>The best value is in bold.</italic></p></fn>
</table-wrap-foot>
</table-wrap>
</sec>
<sec id="S5">
<title>The Application of Feature Selection Based on the BF-SFLA Algorithm</title>
<sec id="S5.SS1">
<title>Discretization of the Shuffled Frog Leaping Algorithm</title>
<p>To represent a feature subset, the SFLA must be converted to a binary SFLA. Assuming that one solution of the algorithm was (0, 1, 0, 1, 0, 0, 1, 0, 0, 1), the dimension of the solution was 10, and the matching feature subset consisted of four of the ten features (the 2nd, 4th, 7th, and 10th). The transformation formula discussed in <xref ref-type="bibr" rid="B11">Hu and Dai (2018)</xref> was given in formulas (3) and (4), and the new <italic>P</italic><sub><italic>w</italic></sub> was converted into a binary vector with values in {0, 1} by Equations (5) and (6):</p>
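<p>The mapping from a binary solution to its feature subset can be sketched as follows; the <monospace>selected_features</monospace> helper name is an illustrative assumption:</p>

```python
def selected_features(solution):
    """Map a binary SFLA solution to the 1-based indices of the
    selected features."""
    return [i + 1 for i, bit in enumerate(solution) if bit == 1]

subset = selected_features([0, 1, 0, 1, 0, 0, 1, 0, 0, 1])
# subset -> [2, 4, 7, 10]: the 2nd, 4th, 7th, and 10th features
```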
<disp-formula id="S5.E5">
<label>(5)</label>
<mml:math id="M5">
<mml:mrow>
<mml:mrow>
<mml:mi>s</mml:mi>
<mml:mo>&#x2062;</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo>&#x2062;</mml:mo>
<mml:mi>g</mml:mi>
<mml:mo>&#x2062;</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>D</mml:mi>
<mml:mo rspace="5.8pt" stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo rspace="5.8pt">=</mml:mo>
<mml:mfrac>
<mml:mn>1</mml:mn>
<mml:mrow>
<mml:mn>1</mml:mn>
<mml:mo>+</mml:mo>
<mml:msup>
<mml:mi>e</mml:mi>
<mml:mrow>
<mml:mo>-</mml:mo>
<mml:mrow>
<mml:mpadded width="+3.3pt">
<mml:mi>A</mml:mi>
</mml:mpadded>
<mml:mo rspace="5.8pt">&#x00D7;</mml:mo>
<mml:mi>D</mml:mi>
</mml:mrow>
</mml:mrow>
</mml:msup>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="S5.Ex1">
<mml:math id="M6">
<mml:mrow>
<mml:mpadded width="+3.3pt">
<mml:mi>A</mml:mi>
</mml:mpadded>
<mml:mo rspace="5.8pt">=</mml:mo>
<mml:mrow>
<mml:mrow>
<mml:mfrac>
<mml:mi>g</mml:mi>
<mml:mi>G</mml:mi>
</mml:mfrac>
<mml:mo>&#x2062;</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>F</mml:mi>
<mml:mn>1</mml:mn>
</mml:msub>
<mml:mo>-</mml:mo>
<mml:msub>
<mml:mi>F</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi>F</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="S5.E6">
<label>(6)</label>
<mml:math id="M7">
<mml:mrow>
<mml:mpadded width="+3.3pt">
<mml:msub>
<mml:mi>P</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mpadded>
<mml:mo rspace="5.8pt">=</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mtable displaystyle="true" rowspacing="0pt">
<mml:mtr>
<mml:mtd columnalign="center">
<mml:mrow>
<mml:mpadded width="+5pt">
<mml:mn>1</mml:mn>
</mml:mpadded>
<mml:mi>i</mml:mi>
<mml:mi>f</mml:mi>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>s</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>g</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>D</mml:mi>
<mml:mo rspace="5.8pt">)</mml:mo>
</mml:mrow>
<mml:mo rspace="5.8pt">&gt;</mml:mo>
<mml:mi>R</mml:mi>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd columnalign="center">
<mml:mrow>
<mml:mpadded width="+5pt">
<mml:mn>0</mml:mn>
</mml:mpadded>
<mml:mi>i</mml:mi>
<mml:mi>f</mml:mi>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>s</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>g</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>D</mml:mi>
<mml:mo rspace="5.8pt">)</mml:mo>
</mml:mrow>
<mml:mo rspace="5.8pt">&#x2264;</mml:mo>
<mml:mi>R</mml:mi>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
<mml:mi/>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
<p>(<italic>P</italic><sub><italic>i</italic></sub>) was the value of the <italic>i</italic>-th dimension after the individual was discretized, (<italic>D</italic>) was the step size of the individual, (<italic>R</italic>) was a random number in [0, 1], and <italic>A</italic> was the adjustment coefficient, reflecting the degree of certainty with which the individual&#x2019;s continuous solution was converted to a discrete solution. As the value of (<italic>A</italic>) decreased, this certainty weakened and the diversity of individuals increased; meanwhile, the global exploration ability of individuals weakened and the local exploitation ability strengthened. Therefore, the value of <italic>A</italic> should be neither too large nor too small. It was determined by four parameters: (<italic>g</italic>) (the current iteration number), (<italic>G</italic>) (the total iteration number), (<italic>F</italic><sub>1</sub>) (the start control parameter), and (<italic>F</italic><sub>2</sub>) (the end control parameter). At the beginning of the iteration, (<italic>A</italic>) should take a large value so that the algorithm could traverse the solution space globally; in the later iteration stages, (<italic>A</italic>) should take a small value to enhance the algorithm&#x2019;s local refinement search ability. Therefore, the value range of (<italic>F</italic><sub>1</sub>) was set to [0.90, 0.95], and the value range of (<italic>F</italic><sub>2</sub>) was set to [1.05, 1.1].</p>
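The discretization described by Equations (5) and (6) can be sketched as follows; this is a minimal illustration, where the default values of `F1` and `F2` and the example `step` vector are taken from the ranges given above, not from the authors' code.

```python
import math
import random

def discretize(D, g, G, F1=0.95, F2=1.05, rng=random):
    """Map a continuous step D to a binary value per Equations (5)-(6).

    The adjustment coefficient A is interpolated from F2 at the start of
    the run (g = 0) toward F1 as g approaches the total iterations G.
    """
    A = (g / G) * (F1 - F2) + F2            # adjustment coefficient
    sig = 1.0 / (1.0 + math.exp(-A * D))    # Equation (5)
    return 1 if sig > rng.random() else 0   # Equation (6): compare with R

# Discretize a whole step vector into a candidate feature mask.
step = [0.8, -1.2, 2.5, 0.1]                # illustrative continuous steps
mask = [discretize(d, g=10, G=500) for d in step]
```

A large positive step drives the sigmoid toward 1 and almost surely selects the feature; a large negative step almost surely deselects it.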
<p>The addition and subtraction of discrete binary solutions was essentially standard binary addition and subtraction; the difference was that a borrow or carry beyond the highest bit was discarded, so that the number of elements in the solution vector remained consistent with the original number of features. The specific operation was shown in <xref ref-type="table" rid="T6">Table 6</xref>.</p>
<table-wrap position="float" id="T6">
<label>TABLE 6</label>
<caption><p>Addition and subtraction of discrete binary solutions.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left">X<sup>1</sup></td>
<td valign="top" align="center">X<sup>2</sup></td>
<td valign="top" align="center">X<sup>1</sup>-X<sup>2</sup></td>
<td valign="top" align="center">X<sup>1</sup>+X<sup>2</sup></td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">(1, 0, 1, 0)</td>
<td valign="top" align="center">(0, 1, 0, 0)</td>
<td valign="top" align="center">(0, 1, 1, 0)</td>
<td valign="top" align="center">(1, 1, 1, 0)</td>
</tr>
</tbody>
</table></table-wrap>
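The truncated-carry arithmetic of Table 6 can be reproduced with a short sketch that treats each solution vector as a binary number and drops any carry or borrow past the highest bit (function names here are illustrative):

```python
def vec_to_int(v):
    """Interpret a binary tuple, most significant bit first, as an integer."""
    return int("".join(map(str, v)), 2)

def int_to_vec(x, width):
    """Convert back to a fixed-width tuple, discarding overflow past the MSB."""
    x %= 1 << width  # drop any carry/borrow beyond the highest bit
    return tuple(int(b) for b in format(x, f"0{width}b"))

def add(x1, x2):
    return int_to_vec(vec_to_int(x1) + vec_to_int(x2), len(x1))

def sub(x1, x2):
    return int_to_vec(vec_to_int(x1) - vec_to_int(x2), len(x1))
```

With X1 = (1, 0, 1, 0) and X2 = (0, 1, 0, 0), `sub` and `add` reproduce the rows of Table 6, and the solution length always matches the original feature count.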
</sec>
<sec id="S5.SS2">
<title>Algorithm Flow</title>
<p>The algorithm flow of the feature selection application based on BF-SFLA was as follows.</p>
<p>Step 1: Set the relevant parameters: (i) randomly generate (<italic>L</italic>) frogs within the scope of the domain, (ii) the number of subgroups was (<italic>A</italic>), (iii) the number of frogs in each subgroup was (<italic>B</italic>), (iv) the number of global information exchanges was <italic>C1</italic>, and (v) the number of local searches was <italic>C2</italic>.</p>
<p>Step 2: Calculate the fitness value for each frog. Rank and group all frogs according to the objective function value.</p>
<p>Step 3: <italic>IF</italic> (<italic>P</italic><sub><italic>w</italic></sub>) had not improved after learning from (<italic>P</italic><sub><italic>b</italic></sub>) or (<italic>P</italic><sub><italic>g</italic></sub>), the (CO) operator was applied. <italic>IF</italic> there was still no improvement, (<italic>P</italic><sub><italic>w</italic></sub>) was replaced by an individual randomly generated in the solution space.</p>
<p>Step 4: Reorder each subgroup and update (<italic>P</italic><sub><italic>w</italic></sub>), (<italic>P</italic><sub><italic>b</italic></sub>), and (<italic>P</italic><sub><italic>g</italic></sub>) in each subgroup.</p>
<p>Step 5: Determine whether the number of local search iterations had reached <italic>C2</italic>; <italic>IF NOT</italic>, return to step 3 and continue.</p>
<p>Step 6: Determine whether the number of global information exchange iterations had reached <italic>C1</italic> or (<italic>P</italic><sub><italic>g</italic></sub>) had achieved the required convergence precision. <italic>IF NOT</italic>, return to step 2 and continue. <italic>IF</italic> the termination condition was reached, output (<italic>P</italic><sub><italic>g</italic></sub>).</p>
<p>The details of the process used for enabling feature selection with BF-SFLA were shown in <xref ref-type="fig" rid="F4">Figure 4</xref>; (<italic>L</italic>) was the number of times the algorithm was executed in each experiment, (<italic>D</italic><sub><italic>max</italic></sub>) was the upper limit of the number of feature subsets, and (<italic>L</italic><sub><italic>max</italic></sub>) was the maximum number of experiments.</p>
<fig id="F4" position="float">
<label>FIGURE 4</label>
<caption><p>The feature selection flow chart.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-16-854685-g004.tif"/>
</fig>
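Steps 1 through 6 above can be condensed into a skeleton loop. This is a simplified sketch, not the authors' implementation: the worst frog learns by copying random bits from the memeplex best (Pb) or the global best (Pg), and a failed update is replaced by a random frog; the paper's BF and CO operators are not reproduced here.

```python
import random

def bf_sfla(fitness, n_features, L=20, A=4, C1=10, C2=5, rng=None):
    """Skeleton of the Step 1-6 loop; `fitness` scores a binary mask."""
    rng = rng or random.Random(0)
    frogs = [[rng.randint(0, 1) for _ in range(n_features)]
             for _ in range(L)]                          # Step 1: L frogs
    for _ in range(C1):                                  # global exchanges
        frogs.sort(key=fitness, reverse=True)            # Step 2: rank
        Pg = frogs[0]
        memeplexes = [frogs[i::A] for i in range(A)]     # Step 2: group
        for mem in memeplexes:
            for _ in range(C2):                          # Step 5: local loop
                mem.sort(key=fitness, reverse=True)
                Pb, Pw = mem[0], mem[-1]
                for guide in (Pb, Pg):                   # Step 3: learn
                    cand = [g if rng.random() < 0.5 else w
                            for g, w in zip(guide, Pw)]
                    if fitness(cand) > fitness(Pw):
                        mem[-1] = cand
                        break
                else:                                    # Step 3: random reset
                    mem[-1] = [rng.randint(0, 1)
                               for _ in range(n_features)]
        frogs = [f for mem in memeplexes for f in mem]   # Step 4: regroup
    return max(frogs, key=fitness)                       # Step 6: output Pg
```

For instance, with `fitness=sum` (select as many features as possible) the loop steadily improves the worst members of each memeplex toward the all-ones mask.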
<p>The classification accuracy and the number of selected features were two critical indexes for designing the evaluation function. The classification accuracy was usually obtained by a classification algorithm; without loss of generality, the <italic>K</italic>-<italic>NN</italic> (k-nearest neighbor) and <italic>C</italic>4.5 decision tree classification algorithms were used to classify and evaluate the feature subsets.</p>
<p>The K-nearest neighbor method was a non-parametric classification technique based on analogical learning. It was very effective in pattern recognition based on statistics and could achieve high classification accuracy on unknown and non-normally distributed data, with the advantages of robustness and conceptual clarity. The main idea of the <italic>K</italic>-<italic>NN</italic> classification algorithm was as follows: first, calculate the distance or similarity between the sample to be classified and the training samples of known categories (Euclidean distance was usually used to measure similarity), and find the (<italic>K</italic>) nearest neighbors of the sample to be classified. The category of the sample was then judged from the categories of these neighbors: if all (<italic>K</italic>) neighbors belonged to the same category, the sample to be classified was assigned to that category; otherwise, each candidate category was scored according to some rule to determine the category of the sample (<xref ref-type="bibr" rid="B3">Cai et al., 2020</xref>).</p>
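The K-NN procedure just described reduces to a few lines; the toy training points and labels below are illustrative, not from the paper's datasets.

```python
import math
from collections import Counter

def knn_predict(train, labels, x, k=5):
    """Classify x by majority vote among its k nearest training samples,
    using Euclidean distance as the similarity measure."""
    nearest = sorted(range(len(train)),
                     key=lambda i: math.dist(train[i], x))[:k]
    votes = Counter(labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Two well-separated toy clusters.
train = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
labels = ["a", "a", "a", "b", "b", "b"]
```

A query near a cluster inherits that cluster's label; ties among the k votes fall to whichever label `Counter` encounters first.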
<p>The <italic>C</italic>4.5 decision tree classification algorithm was a greedy algorithm that adopted a top-down, divide-and-conquer construction. It induced classification rules, represented as a decision tree, from a group of unordered and irregular cases, and was an inductive, example-based learning method. The decision tree classifier was one of the most widely used classification algorithms; its advantages were a simple description, fast classification speed, and easy-to-understand classification rules.</p>
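At each node, C4.5 chooses the split with the highest gain ratio (information gain normalized by split information). A minimal sketch of that criterion, with illustrative function names:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(feature_values, labels):
    """C4.5 split criterion: information gain / split information."""
    n = len(labels)
    groups = {}
    for v, y in zip(feature_values, labels):
        groups.setdefault(v, []).append(y)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    gain = entropy(labels) - remainder
    split_info = entropy(feature_values)  # entropy of the partition itself
    return gain / split_info if split_info else 0.0
```

A feature that separates the classes perfectly scores 1.0; a feature independent of the class scores 0.0.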
<p>In our proposed method, the classification accuracy and the number of selected features were the two indicators used to design the evaluation function as defined in <xref ref-type="bibr" rid="B5">Chuang et al. (2008)</xref>:</p>
<disp-formula id="S5.E7">
<label>(7)</label>
<mml:math id="M8">
<mml:mrow>
<mml:mtext>fitness</mml:mtext>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mtext>W</mml:mtext>
<mml:mn>1</mml:mn>
</mml:msub>
<mml:mo>&#x00D7;</mml:mo>
<mml:mtext>acc</mml:mtext>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mtext>W</mml:mtext>
<mml:mn>2</mml:mn>
</mml:msub>
<mml:mo>&#x00D7;</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mn>1</mml:mn>
<mml:mo>-</mml:mo>
<mml:mfrac>
<mml:mtext>n</mml:mtext>
<mml:mtext>N</mml:mtext>
</mml:mfrac>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
<p>The fitness function defined by Equation (7) had two predefined weights: (<italic>W</italic><sub>1</sub>) for the classification accuracy and (<italic>W</italic><sub>2</sub>) for the selected features. If accuracy was the most critical factor, its weight could be set to a high value. In this manuscript, the values for (<italic>W</italic><sub>1</sub>) and (<italic>W</italic><sub>2</sub>) were (<xref ref-type="bibr" rid="B15">Lu and Han, 2003</xref>) and 0.1, respectively. Assuming that an individual with a high fitness value had a high probability of influencing the positions of other individuals in the next iteration, the weights (<italic>W</italic><sub>1</sub>) and (<italic>W</italic><sub>2</sub>) had to be adequately defined; (<italic>acc</italic>) was the classification accuracy, (<italic>n</italic>) was the number of selected features, and (<italic>N</italic>) was the total number of features.</p>
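Assuming Equation (7) is the weighted sum of the accuracy term and the feature-reduction term, the evaluation function can be sketched as below; the default weight W1 = 0.9 is an illustrative choice (the manuscript fixes W2 = 0.1), not a value confirmed by the source.

```python
def fitness(acc, n, N, w1=0.9, w2=0.1):
    """Equation (7): accuracy traded against the fraction of features kept.

    acc: classification accuracy in [0, 1]
    n:   number of selected features
    N:   total number of features
    """
    return w1 * acc + w2 * (1 - n / N)

def accuracy(num_c, num_i):
    """Equation (8): fraction of correctly classified examples."""
    return num_c / (num_c + num_i)
```

At equal accuracy, a subset with fewer features scores a strictly higher fitness, which is what drives the search toward compact subsets.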
<p>In the fitness definition, (<italic>acc</italic>) represented the percentage of correctly classified examples, as computed by Equation (8), where the numbers of correctly and incorrectly classified examples were denoted by (<italic>num</italic><sub><italic>c</italic></sub>) and (<italic>num</italic><sub><italic>i</italic></sub>), respectively:</p>
<disp-formula id="S5.E8">
<label>(8)</label>
<mml:math id="M9">
<mml:mrow>
<mml:mrow>
<mml:mi>a</mml:mi>
<mml:mo>&#x2062;</mml:mo>
<mml:mi>c</mml:mi>
<mml:mo>&#x2062;</mml:mo>
<mml:mpadded width="+3.3pt">
<mml:mi>c</mml:mi>
</mml:mpadded>
</mml:mrow>
<mml:mo rspace="5.8pt">=</mml:mo>
<mml:mrow>
<mml:mpadded width="+3.3pt">
<mml:mfrac>
<mml:mrow>
<mml:mi>n</mml:mi>
<mml:mo>&#x2062;</mml:mo>
<mml:mi>u</mml:mi>
<mml:mo>&#x2062;</mml:mo>
<mml:msub>
<mml:mi>m</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:mrow>
<mml:mi>n</mml:mi>
<mml:mo>&#x2062;</mml:mo>
<mml:mi>u</mml:mi>
<mml:mo>&#x2062;</mml:mo>
<mml:msub>
<mml:mi>m</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mi>n</mml:mi>
<mml:mo>&#x2062;</mml:mo>
<mml:mi>u</mml:mi>
<mml:mo>&#x2062;</mml:mo>
<mml:msub>
<mml:mi>m</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:mrow>
</mml:mfrac>
</mml:mpadded>
<mml:mo rspace="5.8pt">&#x00D7;</mml:mo>
<mml:mrow>
<mml:mn>100</mml:mn>
<mml:mo>%</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
</sec>
<sec id="S5.SS3">
<title>Results and Discussion</title>
<p>We used the evaluation function defined in Equation (7). The assessment used several well-known and recognized biomedical datasets (<xref ref-type="bibr" rid="B11">Hu and Dai, 2018</xref>), including ColonTumor and DLBCL-Outcome, which provide gene expression, protein profiling, and genomic sequence data for disease classification and diagnosis. All the datasets were high-dimensional, contained few instances, and included irrelevant or weakly correlated features; the dimensionality ranged from 2,000 to 12,600, and the format of the datasets was shown in <xref ref-type="table" rid="T7">Table 7</xref>.</p>
<table-wrap position="float" id="T7">
<label>TABLE 7</label>
<caption><p>The format of datasets.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left">Data set</td>
<td valign="top" align="center">Instances</td>
<td valign="top" align="center">Attributes</td>
<td valign="top" align="center">Classes</td>
<td valign="top" align="center">K-NN (k = 5)</td>
<td valign="top" align="center">C4.5</td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">ColonTumor</td>
<td valign="top" align="center">62</td>
<td valign="top" align="center">2,000</td>
<td valign="top" align="center">2</td>
<td valign="top" align="center">73.87 (0.24)</td>
<td valign="top" align="left">73.87 (0.24)</td>
</tr>
<tr>
<td valign="top" align="left">DLBCL-Outcome</td>
<td valign="top" align="center">58</td>
<td valign="top" align="center">7,129</td>
<td valign="top" align="center">2</td>
<td valign="top" align="center">47.46 (0.51)</td>
<td valign="top" align="left">47.46 (0.51)</td>
</tr>
<tr>
<td valign="top" align="left">ALL-AML-Leukemia</td>
<td valign="top" align="center">106</td>
<td valign="top" align="center">7,130</td>
<td valign="top" align="center">2</td>
<td valign="top" align="center">88.39 (0.13)</td>
<td valign="top" align="left">88.39 (0.13)</td>
</tr>
<tr>
<td valign="top" align="left">Lung cancer-Ontario</td>
<td valign="top" align="center">39</td>
<td valign="top" align="center">2,880</td>
<td valign="top" align="center">2</td>
<td valign="top" align="center">56.38 (0.34)</td>
<td valign="top" align="left">56.38 (0.34)</td>
</tr>
<tr>
<td valign="top" align="left">DLBCL-Stanford</td>
<td valign="top" align="center">47</td>
<td valign="top" align="center">4,026</td>
<td valign="top" align="center">2</td>
<td valign="top" align="center">75.51 (0.26)</td>
<td valign="top" align="left">75.51 (0.26)</td>
</tr>
<tr>
<td valign="top" align="left">Lung cancer-Harvard2</td>
<td valign="top" align="center">181</td>
<td valign="top" align="center">12,534</td>
<td valign="top" align="center">2</td>
<td valign="top" align="center">94.38 (0.04)</td>
<td valign="top" align="left">94.38 (0.04)</td>
</tr>
<tr>
<td valign="top" align="left">Nervous-System</td>
<td valign="top" align="center">60</td>
<td valign="top" align="center">7,129</td>
<td valign="top" align="center">2</td>
<td valign="top" align="center">54.63 (0.42)</td>
<td valign="top" align="left">54.63 (0.42)</td>
</tr>
<tr>
<td valign="top" align="left">Lung cancer-Harvard1</td>
<td valign="top" align="center">203</td>
<td valign="top" align="center">12,600</td>
<td valign="top" align="center">5</td>
<td valign="top" align="center">87.56 (0.09)</td>
<td valign="top" align="left">87.56 (0.09)</td>
</tr>
<tr>
<td valign="top" align="left">DLBCL-NIH</td>
<td valign="top" align="center">160</td>
<td valign="top" align="center">7,400</td>
<td valign="top" align="center">2</td>
<td valign="top" align="center">47.23 (0.46)</td>
<td valign="top" align="left">47.23 (0.46)</td>
</tr>
</tbody>
</table></table-wrap>
<p>To evaluate the performance of our proposed BF-SFLA algorithm, the SFLA, the improved GA (IGA) (<xref ref-type="bibr" rid="B28">Yang et al., 2008</xref>), and the improved PSO (IPSO) (<xref ref-type="bibr" rid="B5">Chuang et al., 2008</xref>) were selected for comparison. In the experiments, consistent conditions and parameters were used in the comparative analysis, where the population size was 200 and the number of iterations was 500; the classification accuracy of feature subsets was evaluated using <italic>K</italic>-<italic>NN</italic> and <italic>C</italic>4.5 classification algorithms. In the BF-SFLA and the SFLA, (m) and (n) values were set to 5 and 5, respectively.</p>
<p>The training and test samples should be independent to demonstrate generalization capability. In the experimentation, we used 10-fold cross-validation to estimate the classification rate for each dataset: the data were divided into 10 folds, and in each round nine folds constituted the training set while the remaining fold was used as the test set.</p>
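The 10-fold protocol can be sketched as a generator of disjoint train/test index splits; the seed and helper name are illustrative:

```python
import random

def k_fold_indices(n, k=10, seed=0):
    """Split sample indices 0..n-1 into k disjoint folds; each fold serves
    once as the test set while the remaining k-1 folds form the training set."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)       # fixed seed for reproducibility
    folds = [idx[i::k] for i in range(k)]  # k near-equal folds
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test
```

For the ColonTumor dataset (62 instances), this yields ten splits whose test folds together cover every sample exactly once.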
<p>To avoid deviation, all results were the average of 30 independent executions of the algorithm. The aims were to reduce the number of selected features to fewer than 100 and to improve the classification accuracy of the datasets. Nine typical high-dimensional biomedical data sets were selected, as shown in <xref ref-type="table" rid="T7">Table 7</xref>. The columns titled <italic>K-NN</italic> and C4.5 reported the classification accuracy on the original data sets, with the average absolute error in parentheses. In <xref ref-type="table" rid="T8">Table 8</xref>, nine datasets and four comparison algorithms were listed. Each algorithm had six attributes: (i) the average fitness (Ave%), (ii) the highest fitness (Max%), (iii) the lowest fitness (Min%), (iv) the standard deviation (Std), (v) the average number of selected features (AveN), and (vi) the number of algorithm executions in each experiment (S).</p>
<table-wrap position="float" id="T8">
<label>TABLE 8</label>
<caption><p>The running result for four algorithms.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left">Data set</td>
<td valign="top" align="left">Algorithm</td>
<td valign="top" align="left">Ave(%)</td>
<td valign="top" align="left">Max(%)</td>
<td valign="top" align="left">Min(%)</td>
<td valign="top" align="left">Std</td>
<td valign="top" align="left">AveN</td>
<td valign="top" align="left">S</td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">ColonTumor</td>
<td valign="top" align="left"><bold>BF-SFLA</bold></td>
<td valign="top" align="left"><bold>93.12</bold></td>
<td valign="top" align="left"><bold>95.66</bold></td>
<td valign="top" align="left">90.23</td>
<td valign="top" align="left"><bold>2.67</bold></td>
<td valign="top" align="left"><bold>33.12</bold></td>
<td valign="top" align="left">6</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">SFLA</td>
<td valign="top" align="left">89.02</td>
<td valign="top" align="left">91.66</td>
<td valign="top" align="left"><bold>85.02</bold></td>
<td valign="top" align="left">2.69</td>
<td valign="top" align="left">36.16</td>
<td valign="top" align="left">6</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IGA</td>
<td valign="top" align="left">86.67</td>
<td valign="top" align="left">88.33</td>
<td valign="top" align="left">83.33</td>
<td valign="top" align="left">2.36</td>
<td valign="top" align="left">38.24</td>
<td valign="top" align="left">6</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IPSO</td>
<td valign="top" align="left">87.67</td>
<td valign="top" align="left">91.67</td>
<td valign="top" align="left">85.01</td>
<td valign="top" align="left">3.65</td>
<td valign="top" align="left">49.40</td>
<td valign="top" align="left">6</td>
</tr>
<tr>
<td valign="top" align="left">DLBCL-outcome</td>
<td valign="top" align="left"><bold>BF-SFLA</bold></td>
<td valign="top" align="left"><bold>74.23</bold></td>
<td valign="top" align="left"><bold>77.63</bold></td>
<td valign="top" align="left"><bold>67.21</bold></td>
<td valign="top" align="left"><bold>3.26</bold></td>
<td valign="top" align="left"><bold>26.25</bold></td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">SFLA</td>
<td valign="top" align="left">69.21</td>
<td valign="top" align="left">75.20</td>
<td valign="top" align="left">65.33</td>
<td valign="top" align="left">3.84</td>
<td valign="top" align="left">51.43</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IGA</td>
<td valign="top" align="left">64.33</td>
<td valign="top" align="left">70.06</td>
<td valign="top" align="left">60.00</td>
<td valign="top" align="left">5.21</td>
<td valign="top" align="left">27.62</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IPSO</td>
<td valign="top" align="left">71.11</td>
<td valign="top" align="left">76.67</td>
<td valign="top" align="left">63.33</td>
<td valign="top" align="left">5.34</td>
<td valign="top" align="left">51.24</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left">ALL-AML-leukemia</td>
<td valign="top" align="left"><bold>BF-SFLA</bold></td>
<td valign="top" align="left"><bold>98.42</bold></td>
<td valign="top" align="left"><bold>100.00</bold></td>
<td valign="top" align="left"><bold>98.02</bold></td>
<td valign="top" align="left"><bold>0.86</bold></td>
<td valign="top" align="left"><bold>29.23</bold></td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">SFLA</td>
<td valign="top" align="left">97.27</td>
<td valign="top" align="left">99.09</td>
<td valign="top" align="left">94.52</td>
<td valign="top" align="left">1.93</td>
<td valign="top" align="left">45.65</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IGA</td>
<td valign="top" align="left">95.09</td>
<td valign="top" align="left">97.27</td>
<td valign="top" align="left">92.73</td>
<td valign="top" align="left">1.65</td>
<td valign="top" align="left">30.63</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IPSO</td>
<td valign="top" align="left">99.01</td>
<td valign="top" align="left">100.00</td>
<td valign="top" align="left">98.18</td>
<td valign="top" align="left">1.04</td>
<td valign="top" align="left">113.5</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left">LungCancer-ontario</td>
<td valign="top" align="left"><bold>BF-SFLA</bold></td>
<td valign="top" align="left"><bold>75.55</bold></td>
<td valign="top" align="left"><bold>80.12</bold></td>
<td valign="top" align="left"><bold>71.67</bold></td>
<td valign="top" align="left"><bold>3.24</bold></td>
<td valign="top" align="left"><bold>14.65</bold></td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">SFLA</td>
<td valign="top" align="left">70.22</td>
<td valign="top" align="left">85.12</td>
<td valign="top" align="left">62.54</td>
<td valign="top" align="left">4.84</td>
<td valign="top" align="left">18.46</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IGA</td>
<td valign="top" align="left">65.51</td>
<td valign="top" align="left">75.21</td>
<td valign="top" align="left">57.52</td>
<td valign="top" align="left">4.18</td>
<td valign="top" align="left">10.22</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IPSO</td>
<td valign="top" align="left">70.00</td>
<td valign="top" align="left">77.50</td>
<td valign="top" align="left">57.50</td>
<td valign="top" align="left">4.89</td>
<td valign="top" align="left">56.25</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left">DLBCL-stanford</td>
<td valign="top" align="left"><bold>BF-SFLA</bold></td>
<td valign="top" align="left"><bold>82.44</bold></td>
<td valign="top" align="left"><bold>83.26</bold></td>
<td valign="top" align="left"><bold>78.13</bold></td>
<td valign="top" align="left"><bold>2.24</bold></td>
<td valign="top" align="left"><bold>15.87</bold></td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">SFLA</td>
<td valign="top" align="left">80.01</td>
<td valign="top" align="left">82.01</td>
<td valign="top" align="left">78.04</td>
<td valign="top" align="left">2.06</td>
<td valign="top" align="left">25.67</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IGA</td>
<td valign="top" align="left">78.80</td>
<td valign="top" align="left">84.02</td>
<td valign="top" align="left">72.02</td>
<td valign="top" align="left">4.83</td>
<td valign="top" align="left">18.43</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IPSO</td>
<td valign="top" align="left">78.10</td>
<td valign="top" align="left">80.02</td>
<td valign="top" align="left">74.11</td>
<td valign="top" align="left">3.19</td>
<td valign="top" align="left">49.50</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left">LungCancer-Harvard2</td>
<td valign="top" align="left"><bold>BF-SFLA</bold></td>
<td valign="top" align="left"><bold>98.94</bold></td>
<td valign="top" align="left"><bold>99.65</bold></td>
<td valign="top" align="left"><bold>97.45</bold></td>
<td valign="top" align="left"><bold>0.98</bold></td>
<td valign="top" align="left"><bold>51.87</bold></td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">SFLA</td>
<td valign="top" align="left">98.02</td>
<td valign="top" align="left">98.81</td>
<td valign="top" align="left">96.66</td>
<td valign="top" align="left">1.06</td>
<td valign="top" align="left">75.25</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IGA</td>
<td valign="top" align="left">96.67</td>
<td valign="top" align="left">98.33</td>
<td valign="top" align="left">95.56</td>
<td valign="top" align="left">1.11</td>
<td valign="top" align="left">52.80</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IPSO</td>
<td valign="top" align="left">96.36</td>
<td valign="top" align="left">99.98</td>
<td valign="top" align="left">93.34</td>
<td valign="top" align="left">2.33</td>
<td valign="top" align="left">98.31</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left">Nervous-system</td>
<td valign="top" align="left"><bold>BF-SFLA</bold></td>
<td valign="top" align="left"><bold>81.75</bold></td>
<td valign="top" align="left"><bold>85.26</bold></td>
<td valign="top" align="left"><bold>78.13</bold></td>
<td valign="top" align="left"><bold>3.34</bold></td>
<td valign="top" align="left"><bold>32.24</bold></td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">SFLA</td>
<td valign="top" align="left">76.08</td>
<td valign="top" align="left">80.05</td>
<td valign="top" align="left">71.67</td>
<td valign="top" align="left">3.64</td>
<td valign="top" align="left">57.86</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IGA</td>
<td valign="top" align="left">71.67</td>
<td valign="top" align="left">81.67</td>
<td valign="top" align="left">61.67</td>
<td valign="top" align="left">7.16</td>
<td valign="top" align="left">30.25</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IPSO</td>
<td valign="top" align="left">72.67</td>
<td valign="top" align="left">78.33</td>
<td valign="top" align="left">63.33</td>
<td valign="top" align="left">6.07</td>
<td valign="top" align="left">45.03</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left">LungCancer-harvard1</td>
<td valign="top" align="left"><bold>BF-SFLA</bold></td>
<td valign="top" align="left">90.03</td>
<td valign="top" align="left">91.12</td>
<td valign="top" align="left">88.49</td>
<td valign="top" align="left"><bold>1.11</bold></td>
<td valign="top" align="left"><bold>28.24</bold></td>
<td valign="top" align="left">9</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">SFLA</td>
<td valign="top" align="left"><bold>91.21</bold></td>
<td valign="top" align="left"><bold>92.24</bold></td>
<td valign="top" align="left"><bold>89.22</bold></td>
<td valign="top" align="left">1.23</td>
<td valign="top" align="left">54.71</td>
<td valign="top" align="left">9</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IGA</td>
<td valign="top" align="left">85.90</td>
<td valign="top" align="left">87.50</td>
<td valign="top" align="left">84.09</td>
<td valign="top" align="left">1.29</td>
<td valign="top" align="left">31.81</td>
<td valign="top" align="left">9</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IPSO</td>
<td valign="top" align="left">91.90</td>
<td valign="top" align="left">94.14</td>
<td valign="top" align="left">90.04</td>
<td valign="top" align="left">1.51</td>
<td valign="top" align="left">44.20</td>
<td valign="top" align="left">9</td>
</tr>
<tr>
<td valign="top" align="left">DLBCL-NIH</td>
<td valign="top" align="left"><bold>BF-SFLA</bold></td>
<td valign="top" align="left"><bold>55.36</bold></td>
<td valign="top" align="left">56.84</td>
<td valign="top" align="left"><bold>52.52</bold></td>
<td valign="top" align="left"><bold>2.09</bold></td>
<td valign="top" align="left"><bold>28.31</bold></td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">SFLA</td>
<td valign="top" align="left">54.16</td>
<td valign="top" align="left"><bold>58.12</bold></td>
<td valign="top" align="left">50.63</td>
<td valign="top" align="left">3.13</td>
<td valign="top" align="left">30.75</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IGA</td>
<td valign="top" align="left">56.02</td>
<td valign="top" align="left">61.24</td>
<td valign="top" align="left">51.78</td>
<td valign="top" align="left">3.66</td>
<td valign="top" align="left">32.12</td>
<td valign="top" align="left">8</td>
</tr>
<tr>
<td valign="top" align="left"/><td valign="top" align="left">IPSO</td>
<td valign="top" align="left">55.11</td>
<td valign="top" align="left">65.02</td>
<td valign="top" align="left">47.51</td>
<td valign="top" align="left">9.01</td>
<td valign="top" align="left">35.11</td>
<td valign="top" align="left">8</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn><p><italic>The best value is in bold.</italic></p></fn>
</table-wrap-foot>
</table-wrap>
<p>As shown in <xref ref-type="table" rid="T8">Table 8</xref>, the BF-SFLA achieved the best average accuracy (Ave%) among the four algorithms on eight of the nine datasets and the second best on the remaining one. The (Ave%) results obtained by the BF-SFLA for ColonTumor, DLBCL-Outcome, ALL-AML-Leukemia, LungCancer-Ontario, DLBCL-Stanford, LungCancer-Harvard2, Nervous-System, and DLBCL-NIH were 93.12, 74.23, 98.42, 75.55, 82.44, 98.94, 81.75, and 55.36%, respectively. On the LungCancer-Harvard1 dataset, the (Ave%) of the BF-SFLA was 90.03%, while the SFLA obtained the best (Ave%) at 91.21%; however, the (AveN) of the SFLA on this dataset was 54.71, much larger than that of the BF-SFLA.</p>
<p>In terms of the (AveN), the BF-SFLA obtained the minimum value on all datasets compared with the SFLA, IGA, and IPSO algorithms. It could also be observed that the standard deviation (Std) obtained by the BF-SFLA was smaller than those of the other three algorithms on five of the nine datasets. The best result for each attribute is shown in bold in <xref ref-type="table" rid="T8">Table 8</xref>.</p>
<p><xref ref-type="table" rid="T9">Table 9</xref> showed the average values of the three attributes, AVE(Ave), AVE(Std), and AVE(AveN), over the nine datasets for the four algorithms. Comparative analysis showed that the BF-SFLA improved classification accuracy and stability while using smaller subsets of relevant features than the SFLA, IGA, and IPSO. It could also be observed that, owing to the proposed improvement and updating strategy, the BF-SFLA effectively explored the space of possible feature subsets to obtain a set of features that maximized predictive accuracy and minimized irrelevant features in high-dimensional biomedical data.</p>
<table-wrap position="float" id="T9">
<label>TABLE 9</label>
<caption><p>The average attributes value for nine datasets.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left">Attributes</td>
<td valign="top" align="center">BF-SFLA</td>
<td valign="top" align="center">SFLA</td>
<td valign="top" align="center">IGA</td>
<td valign="top" align="center">IPSO</td>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">AVE(Ave)</td>
<td valign="top" align="center"><bold>83.31</bold></td>
<td valign="top" align="center">80.57</td>
<td valign="top" align="center">77.55</td>
<td valign="top" align="center">80.21</td>
</tr>
<tr>
<td valign="top" align="left">AVE(Std)</td>
<td valign="top" align="center"><bold>2.19</bold></td>
<td valign="top" align="center">2.60</td>
<td valign="top" align="center">3.49</td>
<td valign="top" align="center">4.11</td>
</tr>
<tr>
<td valign="top" align="left">AVE(AveN)</td>
<td valign="top" align="center"><bold>28.86</bold></td>
<td valign="top" align="center">43.99</td>
<td valign="top" align="center">30.23</td>
<td valign="top" align="center">60.28</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn><p><italic>The best value is in bold.</italic></p></fn>
</table-wrap-foot>
</table-wrap>
<p>The process of reducing the average size of the feature subsets was shown in <xref ref-type="fig" rid="F5">Figures 5</xref>&#x2013;<xref ref-type="fig" rid="F13">13</xref>. In each graph, the abscissa represented the number of selected features, and the ordinate represented the average classification accuracy over 30 independent runs of each algorithm. <xref ref-type="fig" rid="F5">Figures 5</xref>&#x2013;<xref ref-type="fig" rid="F13">13</xref> presented a performance comparison between the BF-SFLA and the SFLA, IGA, and IPSO methods. <xref ref-type="fig" rid="F5">Figures 5</xref>, <xref ref-type="fig" rid="F6">6</xref>, <xref ref-type="fig" rid="F9">9</xref>, <xref ref-type="fig" rid="F13">13</xref> showed that, although the BF-SFLA had no apparent advantage in the early-to-middle stages, it identified smaller feature subsets with higher classification accuracy and better performance in the later stages. Considering <xref ref-type="fig" rid="F5">Figures 5</xref>&#x2013;<xref ref-type="fig" rid="F13">13</xref> together with <xref ref-type="table" rid="T8">Tables 8</xref>, <xref ref-type="table" rid="T9">9</xref>, we found that the proposed improvement and updating strategy played a vital role in the feature selection performance of the BF-SFLA. It was worth noting that the purpose of feature selection was to remove non-productive features without reducing predictive accuracy; otherwise, even a small feature subset might degrade performance. For example, in <xref ref-type="fig" rid="F7">Figures 7</xref>, <xref ref-type="fig" rid="F10">10</xref>, <xref ref-type="fig" rid="F12">12</xref>, the average classification accuracy decreased gradually as the number of features was reduced. Therefore, in &#x201C;real-world&#x201D; applications, the relationship between classification accuracy and the number of selected features must be balanced so that biological datasets can play a more critical role in disease diagnosis and improve its effectiveness (<xref ref-type="bibr" rid="B24">Vergara and Est&#x00E9;vez, 2014</xref>).</p>
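<p>The trade-off between accuracy and subset size can be made concrete as a weighted wrapper fitness. The sketch below is illustrative only; the 0.9/0.1 weighting is a hypothetical assumption, not the weighting used in this study.</p>

```python
# Illustrative wrapper fitness: rewards classification accuracy and
# penalizes the fraction of features kept. The 0.9/0.1 split between
# the two terms is a hypothetical choice for illustration.
def fitness(accuracy, n_selected, n_total, w_acc=0.9):
    w_size = 1.0 - w_acc
    return w_acc * accuracy + w_size * (1.0 - n_selected / n_total)

# Under this weighting, a 93.12% accurate model built on 28 of 2,000
# genes outranks a 91.21% model built on 55 genes.
print(fitness(0.9312, 28, 2000) > fitness(0.9121, 55, 2000))  # True
```

<p>Such a formulation lets a search algorithm prefer smaller subsets only when the accuracy cost is modest, which is the balance discussed above.</p>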
<fig id="F5" position="float">
<label>FIGURE 5</label>
<caption><p>The variation trend of classification accuracy and feature subset of ColonTumor.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-16-854685-g005.tif"/>
</fig>
<fig id="F6" position="float">
<label>FIGURE 6</label>
<caption><p>The variation trend of classification accuracy and feature subset of DLBCL-Outcome.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-16-854685-g006.tif"/>
</fig>
<fig id="F7" position="float">
<label>FIGURE 7</label>
<caption><p>The variation trend of classification accuracy and feature subset of ALL-AML-Leukemia.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-16-854685-g007.tif"/>
</fig>
<fig id="F8" position="float">
<label>FIGURE 8</label>
<caption><p>The variation trend of classification accuracy and feature subset of LungCancer-Ontario.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-16-854685-g008.tif"/>
</fig>
<fig id="F9" position="float">
<label>FIGURE 9</label>
<caption><p>The variation trend of classification accuracy and feature subset of DLBCL-Stanford.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-16-854685-g009.tif"/>
</fig>
<fig id="F10" position="float">
<label>FIGURE 10</label>
<caption><p>The variation trend of classification accuracy and feature subset of LungCancer-Harvard2.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-16-854685-g010.tif"/>
</fig>
<fig id="F11" position="float">
<label>FIGURE 11</label>
<caption><p>The variation trend of classification accuracy and feature subset of Nervous-System.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-16-854685-g011.tif"/>
</fig>
<fig id="F12" position="float">
<label>FIGURE 12</label>
<caption><p>The variation trend of classification accuracy and feature subset of LungCancer-Harvard1.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-16-854685-g012.tif"/>
</fig>
<fig id="F13" position="float">
<label>FIGURE 13</label>
<caption><p>The variation trend of classification accuracy and feature subset of DLBCL-NIH.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-16-854685-g013.tif"/>
</fig>
</sec>
</sec>
<sec id="S6" sec-type="conclusion">
<title>Conclusion</title>
<p>Feature subset selection is an essential technique in many application fields, and different evolutionary algorithms have been developed for different feature subset selection problems. In this manuscript, the BF-SFLA algorithm was used to solve the problem of feature selection. By introducing the chemotaxis factor of the BF, a new ISFLA (termed the BF-SFLA) was adopted to solve the problem of feature selection in high-dimensional biomedical data, and the <italic>K</italic>-<italic>NN</italic> and <italic>C</italic>4.5 classifiers were used to evaluate the proposed algorithm.</p>
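<p>As a minimal sketch of how a K-NN classifier can act as the wrapper evaluator of a binary feature mask, the pure-Python leave-one-out example below scores a candidate subset on toy data. It is an illustration of the evaluator role only, not the experimental code of this study.</p>

```python
import math

def knn_accuracy(X, y, mask, k=1):
    """Leave-one-out accuracy of k-NN using only the features whose
    mask entry is 1. A toy stand-in for the K-NN wrapper evaluator."""
    idx = [j for j, m in enumerate(mask) if m]

    def dist(a, b):
        # Euclidean distance restricted to the selected features.
        return math.sqrt(sum((a[j] - b[j]) ** 2 for j in idx))

    correct = 0
    for i in range(len(X)):
        # Rank all other samples by distance and take the k nearest.
        neighbours = sorted(
            (j for j in range(len(X)) if j != i),
            key=lambda j: dist(X[i], X[j]))[:k]
        votes = [y[j] for j in neighbours]
        pred = max(set(votes), key=votes.count)  # majority vote
        correct += pred == y[i]
    return correct / len(X)

# Toy data: feature 0 separates the classes; feature 1 is noise.
X = [[0.0, 5.1], [0.1, 0.2], [0.2, 9.3], [1.0, 4.4], [1.1, 0.5], [1.2, 8.6]]
y = [0, 0, 0, 1, 1, 1]
print(knn_accuracy(X, y, [1, 0]))  # 1.0 with the informative feature alone
```

<p>A search algorithm such as the BF-SFLA repeatedly calls an evaluator of this kind on candidate masks, steering the population toward subsets that keep accuracy high.</p>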
<p>The experimental results showed that this method could effectively reduce the number of dataset features and simultaneously achieve higher classification accuracy. The proposed method could be used as an ideal pre-processing tool to optimize the feature selection process of high-dimensional biomedical data, better explore the function of biological datasets in the medical field, and improve the efficiency of medical diagnostics.</p>
</sec>
<sec id="S7" sec-type="data-availability">
<title>Data Availability Statement</title>
<p>The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding author.</p>
</sec>
<sec id="S8">
<title>Author Contributions</title>
<p>YD completed the overall experiment and wrote the first draft. LN normalized the data. LW and JT made grammatical modifications to the manuscript. All authors contributed to the article and approved the submitted version.</p>
</sec>
<sec id="conf1" sec-type="COI-statement">
<title>Conflict of Interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec id="pudiscl1" sec-type="disclaimer">
<title>Publisher&#x2019;s Note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
</body>
<back>
<sec id="S9" sec-type="funding-information">
<title>Funding</title>
<p>This work was supported by the Youth Mentor Fund of Gansu Agricultural University (GAU-QDFC-2019-02), The Innovation Capacity Improvement Project of Colleges and Universities in Gansu Province (2019A-056), Graduate Education Research Project of Gansu Agricultural University (2020-19), and Lanzhou Talents Innovation and Entrepreneurship Project (2021-RC-47).</p>
</sec>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>AbdEl-Fattah Sayed</surname> <given-names>S.</given-names></name> <name><surname>Nabil</surname> <given-names>E.</given-names></name> <name><surname>Badr</surname> <given-names>A.</given-names></name></person-group> (<year>2016</year>). <article-title>A binary clonal flower pollination algorithm for feature selection.</article-title> <source><italic>Pattern Recognit. Lett.</italic></source> <volume>77</volume> <fpage>21</fpage>&#x2013;<lpage>27</lpage>. <pub-id pub-id-type="doi">10.1016/j.patrec.2016.03.014</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Alghazi</surname> <given-names>A.</given-names></name> <name><surname>Selim</surname> <given-names>S. Z.</given-names></name> <name><surname>Elazouni</surname> <given-names>A.</given-names></name></person-group> (<year>2012</year>). <article-title>Performance of shuffled frog-leaping algorithm in finance-based scheduling.</article-title> <source><italic>J. Comput. Civ. Eng.</italic></source> <volume>26</volume> <fpage>396</fpage>&#x2013;<lpage>408</lpage>. <pub-id pub-id-type="doi">10.1061/(asce)cp.1943-5487.0000157</pub-id> <pub-id pub-id-type="pmid">29515898</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cai</surname> <given-names>H. S.</given-names></name> <name><surname>Qu</surname> <given-names>Z. D.</given-names></name> <name><surname>Li</surname> <given-names>Z.</given-names></name> <name><surname>Zhang</surname> <given-names>Y.</given-names></name> <name><surname>Hu</surname> <given-names>X. P.</given-names></name> <name><surname>Hu</surname> <given-names>B.</given-names></name></person-group> (<year>2020</year>). <article-title>Feature-level fusion approaches based on multimodal EEG data for depression recognition.</article-title> <source><italic>Inf. Fusion</italic></source> <volume>59</volume> <fpage>127</fpage>&#x2013;<lpage>138</lpage>. <pub-id pub-id-type="doi">10.1016/j.inffus.2020.01.008</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cai</surname> <given-names>H. S.</given-names></name> <name><surname>Zhang</surname> <given-names>Y.</given-names></name> <name><surname>Xiao</surname> <given-names>H.</given-names></name> <name><surname>Zhang</surname> <given-names>J.</given-names></name> <name><surname>Hu</surname> <given-names>B.</given-names></name> <name><surname>Hu</surname> <given-names>X. P.</given-names></name></person-group> (<year>2021</year>). <article-title>An adaptive neurofeedback method for attention regulation based on the internet of things.</article-title> <source><italic>IEEE Internet Things J.</italic></source> <volume>21</volume> <fpage>15829</fpage>&#x2013;<lpage>15838</lpage>. <pub-id pub-id-type="doi">10.1109/jiot.2021.3083745</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chuang</surname> <given-names>L. Y.</given-names></name> <name><surname>Chang</surname> <given-names>H. W.</given-names></name> <name><surname>Tu</surname> <given-names>C. J.</given-names></name> <name><surname>Yang</surname> <given-names>C. H.</given-names></name></person-group> (<year>2008</year>). <article-title>Improved binary PSO for feature selection using gene expression data.</article-title> <source><italic>Comput. Biol. Chem.</italic></source> <volume>32</volume> <fpage>29</fpage>&#x2013;<lpage>38</lpage>. <pub-id pub-id-type="doi">10.1016/j.compbiolchem.2007.09.005</pub-id> <pub-id pub-id-type="pmid">18023261</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dai</surname> <given-names>Y. Q.</given-names></name> <name><surname>Wang</surname> <given-names>L. G.</given-names></name></person-group> (<year>2012</year>). <article-title>Performance analysis of improved SFLA and the application in economic dispatch of power system.</article-title> <source><italic>Power Syst. Prot. Control</italic></source> <volume>40</volume> <fpage>77</fpage>&#x2013;<lpage>83</lpage>.</citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ebrahimi</surname> <given-names>J.</given-names></name> <name><surname>Hosseinian</surname> <given-names>S. H.</given-names></name> <name><surname>Gharehpetian</surname> <given-names>G. B.</given-names></name></person-group> (<year>2012</year>). <article-title>Unit commitment problem solution using shuffled frog leaping algorithm.</article-title> <source><italic>IEEE Appl. Math. Comput.</italic></source> <volume>218</volume> <fpage>9353</fpage>&#x2013;<lpage>9371</lpage>.</citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eusuff</surname> <given-names>M.</given-names></name> <name><surname>Lansey</surname> <given-names>K. E.</given-names></name></person-group> (<year>2003</year>). <article-title>Optimization of water distribution network design using the shuffled frog leaping algorithm.</article-title> <source><italic>Water Resour. Plan. Manag.</italic></source> <volume>3</volume> <fpage>210</fpage>&#x2013;<lpage>225</lpage>. <pub-id pub-id-type="doi">10.1061/(asce)0733-9496(2003)129:3(210)</pub-id></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gomez Gonzalez</surname> <given-names>M.</given-names></name> <name><surname>Ruiz Rodriguez</surname> <given-names>F. J.</given-names></name> <name><surname>Jurado</surname> <given-names>F.</given-names></name></person-group> (<year>2013</year>). <article-title>A binary SFLA for probabilistic three-phase load flow in unbalanced distribution systems with technical constraints.</article-title> <source><italic>Electr. Power Energy Syst.</italic></source> <volume>48</volume> <fpage>48</fpage>&#x2013;<lpage>57</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijepes.2012.11.030</pub-id></citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hasanien</surname> <given-names>H. M.</given-names></name></person-group> (<year>2015</year>). <article-title>Shuffled frog leaping algorithm for photovoltaic model identification.</article-title> <source><italic>IEEE Trans. Sustain. Energy</italic></source> <volume>6</volume> <fpage>509</fpage>&#x2013;<lpage>515</lpage>. <pub-id pub-id-type="doi">10.1109/tste.2015.2389858</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hu</surname> <given-names>B.</given-names></name> <name><surname>Dai</surname> <given-names>Y. Q.</given-names></name></person-group> (<year>2018</year>). <article-title>Feature selection for optimized high-dimensional biomedical data using the improved shuffled frog leaping algorithm.</article-title> <source><italic>IEEE/ACM Trans. Comput. Biol. Bioinform.</italic></source> <volume>15</volume> <fpage>1765</fpage>&#x2013;<lpage>1773</lpage>. <pub-id pub-id-type="doi">10.1109/TCBB.2016.2602263</pub-id> <pub-id pub-id-type="pmid">28113635</pub-id></citation></ref>
<ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Huynh</surname> <given-names>T. H.</given-names></name> <name><surname>Nguyen</surname> <given-names>D. H.</given-names></name></person-group> (<year>2009</year>). &#x201C;<article-title>Fuzzy controller design using a new shuffled frog leaping algorithm</article-title>,&#x201D; in <source><italic>Proceedings of the IEEE International Conference on Industrial Technology</italic></source>, <publisher-loc>Churchill, VIC</publisher-loc>, <fpage>1</fpage>&#x2013;<lpage>6</lpage>.</citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lee</surname> <given-names>J.</given-names></name> <name><surname>Kim</surname> <given-names>D. W.</given-names></name></person-group> (<year>2015</year>). <article-title>Memetic feature selection algorithm for multi-label classification.</article-title> <source><italic>Inf. Sci.</italic></source> <volume>293</volume> <fpage>80</fpage>&#x2013;<lpage>96</lpage>. <pub-id pub-id-type="doi">10.1016/j.ins.2014.09.020</pub-id></citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Li</surname> <given-names>Y.</given-names></name> <name><surname>Si</surname> <given-names>J. N.</given-names></name> <name><surname>Zhou</surname> <given-names>G. J.</given-names></name> <name><surname>Huang</surname> <given-names>S. S.</given-names></name> <name><surname>Chen</surname> <given-names>S. C.</given-names></name></person-group> (<year>2015</year>). <article-title>FREL: a Stable Feature Selection Algorithm.</article-title> <source><italic>Trans. Neural Netw. Learn. Syst.</italic></source> <volume>26</volume> <fpage>1388</fpage>&#x2013;<lpage>1402</lpage>. <pub-id pub-id-type="doi">10.1109/TNNLS.2014.2341627</pub-id> <pub-id pub-id-type="pmid">25134091</pub-id></citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lu</surname> <given-names>Y.</given-names></name> <name><surname>Han</surname> <given-names>J.</given-names></name></person-group> (<year>2003</year>). <article-title>Cancer classification using gene expression data.</article-title> <source><italic>Inf. Syst.</italic></source> <volume>28</volume> <fpage>243</fpage>&#x2013;<lpage>268</lpage>. <pub-id pub-id-type="doi">10.1016/s0306-4379(02)00072-8</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Misra</surname> <given-names>J.</given-names></name> <name><surname>Schmitt</surname> <given-names>W.</given-names></name> <name><surname>Hwang</surname> <given-names>D.</given-names></name> <name><surname>Hsiao</surname> <given-names>L.</given-names></name> <name><surname>Gullans</surname> <given-names>S.</given-names></name> <name><surname>Stephanopoulos</surname> <given-names>G.</given-names></name></person-group> (<year>2002</year>). <article-title>Interactive exploration of microarray gene expression patterns in a reduced dimensional space.</article-title> <source><italic>Genome Res.</italic></source> <volume>2</volume> <fpage>1112</fpage>&#x2013;<lpage>1120</lpage>. <pub-id pub-id-type="doi">10.1101/gr.225302</pub-id> <pub-id pub-id-type="pmid">12097349</pub-id></citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pan</surname> <given-names>Q. K.</given-names></name> <name><surname>Wang</surname> <given-names>L.</given-names></name> <name><surname>Gao</surname> <given-names>L.</given-names></name> <name><surname>Li</surname> <given-names>J. Q.</given-names></name></person-group> (<year>2011</year>). <article-title>An effective shuffled frog-leaping algorithm for lot-streaming flow shop scheduling problem.</article-title> <source><italic>Int. J. Adv. Manuf. Technol.</italic></source> <volume>52</volume> <fpage>699</fpage>&#x2013;<lpage>713</lpage>. <pub-id pub-id-type="doi">10.1007/s00170-010-2775-3</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Passino</surname> <given-names>K. M.</given-names></name></person-group> (<year>2002</year>). <article-title>Biomimicry of bacterial foraging for distributed optimization and control.</article-title> <source><italic>IEEE Control Syst.</italic></source> <volume>22</volume> <fpage>52</fpage>&#x2013;<lpage>67</lpage>. <pub-id pub-id-type="doi">10.1016/j.biosystems.2007.08.009</pub-id> <pub-id pub-id-type="pmid">17923256</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Perez</surname> <given-names>I.</given-names></name> <name><surname>Gomez Gonzalez</surname> <given-names>M.</given-names></name> <name><surname>Jurado</surname> <given-names>F.</given-names></name></person-group> (<year>2013</year>). <article-title>Estimation of induction motor parameters using shuffled frog-leaping algorithm.</article-title> <source><italic>Electr. Eng.</italic></source> <volume>95</volume> <fpage>267</fpage>&#x2013;<lpage>275</lpage>. <pub-id pub-id-type="doi">10.1007/s00202-012-0261-7</pub-id></citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shahriari-kahkeshi</surname> <given-names>M.</given-names></name> <name><surname>Askari</surname> <given-names>J.</given-names></name></person-group> (<year>2011</year>). &#x201C;<article-title>Nonlinear continuous stirred tank reactor (cstr) identification and control using recurrent neural network trained shuffled frog leaping algorithm</article-title>,&#x201D; in <source><italic>Proceedings of the 2nd International Conference on Control, Instrumentation and Automation</italic></source>, <publisher-loc>Piscataway, NJ</publisher-loc>, <fpage>485</fpage>&#x2013;<lpage>489</lpage>.</citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shrivastava</surname> <given-names>P.</given-names></name> <name><surname>Shukla</surname> <given-names>A.</given-names></name> <name><surname>Vepakomma</surname> <given-names>P.</given-names></name> <name><surname>Bhansali</surname> <given-names>N.</given-names></name> <name><surname>Verma</surname> <given-names>K.</given-names></name></person-group> (<year>2017</year>). <article-title>A survey of nature-inspired algorithms for feature selection to identify Parkinson&#x2019;s disease.</article-title> <source><italic>Comput. Methods Programs Biomed.</italic></source> <volume>139</volume> <fpage>171</fpage>&#x2013;<lpage>179</lpage>. <pub-id pub-id-type="doi">10.1016/j.cmpb.2016.07.029</pub-id> <pub-id pub-id-type="pmid">28187888</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sun</surname> <given-names>X.</given-names></name> <name><surname>Wang</surname> <given-names>Z.</given-names></name> <name><surname>Zhang</surname> <given-names>D.</given-names></name></person-group> (<year>2008</year>). &#x201C;<article-title>A web document classification method based on shuffled frog leaping algorithm</article-title>,&#x201D; in <source><italic>Proceedings of the 2nd International Conference on Genetic and Evolutionary Computing</italic></source>, <publisher-loc>Jinzhou</publisher-loc>, <fpage>205</fpage>&#x2013;<lpage>208</lpage>.</citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tabakhi</surname> <given-names>S.</given-names></name> <name><surname>Moradi</surname> <given-names>P.</given-names></name> <name><surname>Akhlaghian</surname> <given-names>F.</given-names></name></person-group> (<year>2014</year>). <article-title>An unsupervised feature selection algorithm based on ant colony optimization.</article-title> <source><italic>Eng. Applic. Artificial Intell.</italic></source> <volume>32</volume> <fpage>112</fpage>&#x2013;<lpage>123</lpage>. <pub-id pub-id-type="doi">10.1016/j.engappai.2014.03.007</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vergara</surname> <given-names>J. R.</given-names></name> <name><surname>Est&#x00E9;vez</surname> <given-names>P. A.</given-names></name></person-group> (<year>2014</year>). <article-title>A review of feature selection methods based on mutual information.</article-title> <source><italic>Neural Comput. Applic.</italic></source> <volume>24</volume> <fpage>175</fpage>&#x2013;<lpage>186</lpage>. <pub-id pub-id-type="doi">10.1007/s00521-013-1368-0</pub-id></citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>F.</given-names></name> <name><surname>Liang</surname> <given-names>J. Y.</given-names></name></person-group> (<year>2016</year>). <article-title>An efficient feature selection algorithm for hybrid data.</article-title> <source><italic>Neurocomputing</italic></source> <volume>193</volume> <fpage>33</fpage>&#x2013;<lpage>41</lpage>. <pub-id pub-id-type="doi">10.1016/j.neucom.2016.01.056</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>X. Y.</given-names></name> <name><surname>Yang</surname> <given-names>J.</given-names></name> <name><surname>Teng</surname> <given-names>X. L.</given-names></name> <name><surname>Xia</surname> <given-names>W. J.</given-names></name> <name><surname>Jensen</surname> <given-names>R.</given-names></name></person-group> (<year>2007</year>). <article-title>Feature selection based on rough sets and particle swarm optimization.</article-title> <source><italic>Pattern Recognit. Lett.</italic></source> <volume>28</volume> <fpage>459</fpage>&#x2013;<lpage>471</lpage>. <pub-id pub-id-type="doi">10.1016/j.patrec.2006.09.003</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>Y. T.</given-names></name> <name><surname>Wang</surname> <given-names>J. D.</given-names></name> <name><surname>Liao</surname> <given-names>H.</given-names></name> <name><surname>Chen</surname> <given-names>H. Y.</given-names></name></person-group> (<year>2017</year>). <article-title>An efficient semi-supervised representatives feature selection algorithm based on information theory.</article-title> <source><italic>Pattern Recognit.</italic></source> <volume>61</volume> <fpage>511</fpage>&#x2013;<lpage>523</lpage>. <pub-id pub-id-type="doi">10.1016/j.patcog.2016.08.011</pub-id></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yang</surname> <given-names>C. S.</given-names></name> <name><surname>Chuang</surname> <given-names>L. Y.</given-names></name> <name><surname>Chen</surname> <given-names>Y. J.</given-names></name> <name><surname>Yang</surname> <given-names>C. H.</given-names></name></person-group> (<year>2008</year>). &#x201C;<article-title>Feature selection using memetic algorithms</article-title>,&#x201D; in <source><italic>Proceedings of the Third International Conference on Convergence and Hybrid Information Technology</italic></source>, <publisher-loc>Busan</publisher-loc>, <fpage>416</fpage>&#x2013;<lpage>423</lpage>.</citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>Y.</given-names></name> <name><surname>Gong</surname> <given-names>D. W.</given-names></name> <name><surname>Hu</surname> <given-names>Y.</given-names></name> <name><surname>Zhang</surname> <given-names>W. Q.</given-names></name></person-group> (<year>2015</year>). <article-title>Feature selection algorithm based on bare bones particle swarm optimization.</article-title> <source><italic>Neurocomputing</italic></source> <volume>148</volume> <fpage>150</fpage>&#x2013;<lpage>157</lpage>. <pub-id pub-id-type="doi">10.1109/TCYB.2017.2714145</pub-id> <pub-id pub-id-type="pmid">28650835</pub-id></citation></ref>
</ref-list>
</back>
</article>
