<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article article-type="review-article" dtd-version="2.3" xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Bioeng. Biotechnol.</journal-id>
<journal-title>Frontiers in Bioengineering and Biotechnology</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Bioeng. Biotechnol.</abbrev-journal-title>
<issn pub-type="epub">2296-4185</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">1195294</article-id>
<article-id pub-id-type="doi">10.3389/fbioe.2023.1195294</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Bioengineering and Biotechnology</subject>
<subj-group>
<subject>Review</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>A review of algorithmic approaches for cell culture media optimization</article-title>
<alt-title alt-title-type="left-running-head">Zhou et al.</alt-title>
<alt-title alt-title-type="right-running-head">
<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fbioe.2023.1195294">10.3389/fbioe.2023.1195294</ext-link>
</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Zhou</surname>
<given-names>Tianxun</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="fn" rid="fn1">
<sup>&#x2020;</sup>
</xref>
<uri xlink:href="https://loop.frontiersin.org/people/2255557/overview"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Reji</surname>
<given-names>Rinta</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<xref ref-type="fn" rid="fn1">
<sup>&#x2020;</sup>
</xref>
<uri xlink:href="https://loop.frontiersin.org/people/2261658/overview"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Kairon</surname>
<given-names>Ryanjit Singh</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Chiam</surname>
<given-names>Keng Hwee</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<xref ref-type="corresp" rid="c001">&#x2a;</xref>
<uri xlink:href="https://loop.frontiersin.org/people/468020/overview"/>
</contrib>
</contrib-group>
<aff id="aff1">
<sup>1</sup>
<institution>Bioinformatics Institute</institution>, <institution>Cellular Image Informatics Division</institution>, <institution>A&#x2a;STAR</institution>, <addr-line>Singapore</addr-line>, <country>Singapore</country>
</aff>
<aff id="aff2">
<sup>2</sup>
<institution>School of Biological Sciences</institution>, <institution>Nanyang Technological University</institution>, <addr-line>Singapore</addr-line>, <country>Singapore</country>
</aff>
<author-notes>
<fn fn-type="edited-by">
<p>
<bold>Edited by:</bold> <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/509285/overview">Eric von Lieres</ext-link>, Forschungszentrum J&#xfc;lich, Germany</p>
</fn>
<fn fn-type="edited-by">
<p>
<bold>Reviewed by:</bold> <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/2263695/overview">Ajoy Velayudhan</ext-link>, University College London, United Kingdom</p>
<p>
<ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/120084/overview">Jesus Pic&#xf3;</ext-link>, Universitat Polit&#xe8;cnica de Val&#xe8;ncia, Spain</p>
</fn>
<corresp id="c001">&#x2a;Correspondence: Keng Hwee Chiam, <email>chiamkh@bii.a-star.edu.sg</email>
</corresp>
<fn fn-type="equal" id="fn1">
<label>
<sup>&#x2020;</sup>
</label>
<p>These authors have contributed equally to this work and share first authorship</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>11</day>
<month>05</month>
<year>2023</year>
</pub-date>
<pub-date pub-type="collection">
<year>2023</year>
</pub-date>
<volume>11</volume>
<elocation-id>1195294</elocation-id>
<history>
<date date-type="received">
<day>28</day>
<month>03</month>
<year>2023</year>
</date>
<date date-type="accepted">
<day>03</day>
<month>05</month>
<year>2023</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#xa9; 2023 Zhou, Reji, Kairon and Chiam.</copyright-statement>
<copyright-year>2023</copyright-year>
<copyright-holder>Zhou, Reji, Kairon and Chiam</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<abstract>
<p>Cell culture media composition and culture conditions play a crucial role in product yield, quality and cost of production. Culture media optimization is the technique of improving media composition and culture conditions to achieve desired product outcomes. Many algorithmic methods have been proposed and used for culture media optimization in the literature. To help readers evaluate and decide on a method that best suits their specific application, we carried out a systematic review of the different methods from an algorithmic perspective that classifies, explains and compares the available methods. We also examine trends and new developments in the area. This review provides recommendations to researchers regarding the media optimization algorithms suited to their applications. We also hope to promote the development of new cell culture media optimization methods that are better suited to existing and upcoming challenges in this biotechnology field, which will be essential for more efficient production of various cell culture products.</p>
</abstract>
<kwd-group>
<kwd>culture media optimization</kwd>
<kwd>media optimization algorithm</kwd>
<kwd>fermentation optimization</kwd>
<kwd>surrogate model</kwd>
<kwd>metaheuristics</kwd>
<kwd>design of experiments</kwd>
<kwd>benchmark comparison</kwd>
</kwd-group>
<contract-num rid="cn001">H20H8a0003</contract-num>
<contract-sponsor id="cn001">Agency for Science, Technology and Research<named-content content-type="fundref-id">10.13039/501100001348</named-content>
</contract-sponsor>
<custom-meta-wrap>
<custom-meta>
<meta-name>section-at-acceptance</meta-name>
<meta-value>Bioprocess Engineering</meta-value>
</custom-meta>
</custom-meta-wrap>
</article-meta>
</front>
<body>
<sec id="s1">
<title>1 Introduction</title>
<p>Cell culture is widely used in biotechnology to manufacture various useful products for applications such as pharmaceuticals, food, biofuel, and industrial products. Cell culture production systems in industry and research span different kingdoms of life from free living microbes such as bacteria, archaea, and fungi, to cell lines derived from multicellular organisms including insect and mammalian species. Products include pharmaceuticals such as antibiotics and monoclonal antibodies; food products such as rennet, single-cell protein, and cultivated meat; biofuel from lipid-producing algae; industrial products such as cleaning enzymes and organic acids.</p>
<p>In cell culture, culture media is a crucial input that provides energy and materials required by cells to grow, proliferate and produce the products of interest. The components that make up culture media and their concentrations within the media affect important aspects of cell growth, productivity and product quality, and are instrumental to the success of the cell culture application (<xref ref-type="bibr" rid="B126">Yao and Asayama, 2017</xref>; <xref ref-type="bibr" rid="B96">Ritacco et al., 2018</xref>). In most laboratory applications, a universal standardized media is used but to achieve cost-efficient upscaling to industrial production, there is a need to identify a combination of concentrations for media components that optimize for the desired property, such as biomass or specific biomolecule yield in the cell culture system. There have been many methods using optimization algorithms developed for culture media optimization for both microbial systems and animal cell culture. Despite this, there is a lack of systematic review and comparison of these different methods from an algorithmic perspective that provides researchers with a comprehensive overview of the available algorithms and provides perspectives on the applicability of these methods in new contexts.</p>
<p>A few other review articles are available on culture media optimization. The review by <xref ref-type="bibr" rid="B103">Singh et al. (2017)</xref> focuses on fermentation media for microbial systems. While it lists some of the techniques used in the literature, it omits several types of algorithms typically used in media optimization and lacks a benchmark comparison between methods. Similarly, the reviews by <xref ref-type="bibr" rid="B119">van der Valk et al. (2010)</xref>, <xref ref-type="bibr" rid="B96">Ritacco et al. (2018)</xref> and <xref ref-type="bibr" rid="B30">Galbraith et al. (2018)</xref> do not go into depth on the available optimization algorithms. The unique requirements of the culture media optimization problem, such as the need for a low number of iterations due to experimental resource constraints, and noisy results, are also unaddressed by reviews of optimization algorithms for black-box functions that evaluate algorithms in a general context.</p>
<p>The purpose and unique contribution of this review is to provide readers with a comprehensive overview of the main types of algorithms applicable to culture media optimization. We synthesize existing works and provide a generalized framework for understanding and designing culture media optimization experiments. We examine and summarize the strategies adopted by previous works, classify and explain them using this framework, and highlight algorithmic features that were designed and chosen in past efforts to address specific challenges of this problem. We also provide recommendations on the type of algorithm to use based on benchmark comparisons and identify gaps for future research.</p>
</sec>
<sec sec-type="materials|methods" id="s2">
<title>2 Materials and methods</title>
<sec id="s2-1">
<title>2.1 Scope of review</title>
<p>The cell culture media optimization works covered by this review apply to both microbial and animal cell culture for a variety of outcomes including maximizing biomolecule yield, biomass production and cell proliferation. The methods reviewed cover algorithmic approaches for optimization that have been used in literature for optimizing culture media conditions.</p>
<p>Simple methods such as one-factor-at-a-time (OFAT) and factor screening through statistical design of experiments (DOE) are not covered in this review. These methods are relatively simple and standard, and are well discussed in other works. They are also generally recognized to be insufficient for more complex media formulations with many components (more than 10) and concentration levels, owing to combinatorial explosion and the complex interaction effects that may exist between components.</p>
<p>Bioinformatics methods such as metabolic network models and expression analysis are also excluded from the review. These methods require specific and customized analysis and modeling for the cell line and process involved.</p>
</sec>
<sec id="s2-2">
<title>2.2 Methodology of review</title>
<p>We carried out a systematic literature search for cell culture media on NCBI PubMed and Google Scholar with the query:<list list-type="simple">
<list-item>
<p>&#x2022; ((&#x201c;Cell culture&#x201d; OR &#x201c;Culture&#x201d; OR &#x201c;fermentation&#x201d;) AND (&#x201c;media&#x201d; OR &#x201c;medium&#x201d;) AND &#x201c;optimization&#x201d;) AND (algorithm[Text Word])</p>
</list-item>
</list>
</p>
<p>We also conducted a literature search for specific algorithms that we are aware have been or may be applied to the media optimization problem by replacing the generic term &#x201c;algorithm&#x201d; in the search with the following keywords:<list list-type="simple">
<list-item>
<p>&#x2022; iterative</p>
</list-item>
<list-item>
<p>&#x2022; direct search</p>
</list-item>
<list-item>
<p>&#x2022; simplex</p>
</list-item>
<list-item>
<p>&#x2022; metaheuristics</p>
</list-item>
<list-item>
<p>&#x2022; differential evolution</p>
</list-item>
<list-item>
<p>&#x2022; evolutionary strategies</p>
</list-item>
<list-item>
<p>&#x2022; swarm</p>
</list-item>
<list-item>
<p>&#x2022; simulated annealing</p>
</list-item>
<list-item>
<p>&#x2022; surrogate model</p>
</list-item>
<list-item>
<p>&#x2022; regression</p>
</list-item>
<list-item>
<p>&#x2022; kriging</p>
</list-item>
<list-item>
<p>&#x2022; bayesian</p>
</list-item>
<list-item>
<p>&#x2022; gaussian process</p>
</list-item>
<list-item>
<p>&#x2022; support vector</p>
</list-item>
<list-item>
<p>&#x2022; decision tree</p>
</list-item>
<list-item>
<p>&#x2022; random forest</p>
</list-item>
<list-item>
<p>&#x2022; ensemble</p>
</list-item>
<list-item>
<p>&#x2022; neural network</p>
</list-item>
<list-item>
<p>&#x2022; deep neural network</p>
</list-item>
<list-item>
<p>&#x2022; deep learning</p>
</list-item>
<list-item>
<p>&#x2022; machine learning</p>
</list-item>
<list-item>
<p>&#x2022; artificial intelligence</p>
</list-item>
</list>
</p>
</sec>
<sec id="s2-3">
<title>2.3 Method for simulation experiment for benchmarking</title>
<p>Most available works in culture media optimization propose a single optimization algorithm for a particular cell culture experiment, making it difficult to draw conclusions on the effectiveness of each algorithm vis-&#xe0;-vis the others. This presents an obstacle for researchers seeking to make informed choices about which algorithms to use in the face of experimental resource constraints.</p>
<p>Few of the papers reviewed provided code implementations of the algorithms. We default to open-source implementations of algorithms written in Python where available; otherwise, we modify open-source code or write our own implementations to recreate the algorithms. The code for the simulation experiments in this study can be found at <ext-link ext-link-type="uri" xlink:href="https://github.com/zhoutianxun/Review-of-culture-media-optimization-methods">https://github.com/zhoutianxun/Review-of-culture-media-optimization-methods</ext-link>.</p>
<p>To provide a benchmark comparison of the algorithms, we propose to compare them on a large set of test functions with different characteristics. The test functions used come from the Black-Box Optimization Benchmarking (BBOB) test suite (<xref ref-type="bibr" rid="B40">Hansen et al., 2021</xref>). Details about the test functions are given in <xref ref-type="sec" rid="s17">Supplementary Table S1</xref>.</p>
<p>For our main experiments, we used the 5-, 20- and 40-dimensional versions of the test functions to represent typical media optimization experiments. For each individual experiment, 10 runs were performed with a population size of 50. The methods were run to a maximum of only 10 iterations to model the constraints of typical cell culture optimization experiments.</p>
<p>To generate noisy functions, we simulated additive Gaussian noise by sampling from a Gaussian distribution and adding that to the true function value. Additive noise was used to better simulate the measurement uncertainty from assays used to quantify the yield of interest in a cell culture experiment.</p>
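The additive-noise setup described above can be sketched as a simple wrapper; the `sphere` function below is an illustrative stand-in, not one of the BBOB functions used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Illustrative test function (a stand-in, not a BBOB function)."""
    return float(np.sum(np.asarray(x) ** 2))

def noisy(f, sigma=0.1):
    """Additive Gaussian noise wrapper: y_obs = f(x) + eps, eps ~ N(0, sigma^2),
    mimicking assay measurement uncertainty on the true response."""
    def wrapped(x):
        return f(x) + rng.normal(0.0, sigma)
    return wrapped

noisy_sphere = noisy(sphere, sigma=0.1)
```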
</sec>
</sec>
<sec id="s3">
<title>3 Classification of algorithmic approaches</title>
<sec id="s3-1">
<title>3.1 Basic terminology</title>
<p>There are many terminologies used in culture media optimization literature that may be confusing to newcomers in the field, originating from different sources from the fields of experiment design, mathematical optimization, statistics, evolutionary computation, machine learning and others.</p>
<p>Here we define some of the basic terminologies that would be encountered in culture media optimization literature, with the most common names in bold:</p>
<p>Common terminology.<list list-type="simple">
<list-item>
<p>&#x2022; The adjustable components in the culture media</p>
<list list-type="simple">
<list-item>
<p>&#x2013; <bold>Component, Factor, Parameter, Variable</bold>, Input, Dimension, Features</p>
</list-item>
</list>
</list-item>
<list-item>
<p>&#x2022; The value of the factor. Levels are used specifically for discretized factor value</p>
<list list-type="simple">
<list-item>
<p>&#x2013; <bold>Concentration, Level, Value</bold>
</p>
</list-item>
</list>
</list-item>
<list-item>
<p>&#x2022; A list of culture media candidates, each with varying values for the factors</p>
<list list-type="simple">
<list-item>
<p>&#x2013; <bold>Design, Candidates</bold>, <bold>Population</bold>, Experiments, Runs, Formulation</p>
</list-item>
</list>
</list-item>
<list-item>
<p>&#x2022; The target value to be optimized for, obtained through experiment</p>
<list list-type="simple">
<list-item>
<p>&#x2013; <bold>Response, Objective value,</bold> Fitness, Output, Read-out</p>
</list-item>
</list>
</list-item>
<list-item>
<p>&#x2022; The combined set of culture media candidates and their corresponding response obtained through experiment</p>
<list list-type="simple">
<list-item>
<p>&#x2013; <bold>Data(point), Results,</bold> Training Set</p>
</list-item>
</list>
</list-item>
<list-item>
<p>&#x2022; In an iterative optimization workflow, each batch of experiment</p>
<list list-type="simple">
<list-item>
<p>&#x2013; <bold>Iteration, Generation, Round,</bold> Batch</p>
</list-item>
</list>
</list-item>
</list>
</p>
</sec>
<sec id="s3-2">
<title>3.2 A generalization of media optimization methods</title>
<p>The methods for media optimization can be generalized as an iterative computational-experimental workflow (<xref ref-type="fig" rid="F1">Figure 1</xref>). In this general workflow, a list of components and their range of values are defined prior to the optimization process. For each iteration, the optimization algorithm is used to propose a list of candidates, also known as the experiment design. Note that in the first iteration, the initial list of candidates, known as the initial design of experiment, is proposed not by the optimization algorithm but rather using standard designs or random sampling from the input space. Next, the culture media corresponding to these candidates are prepared and cells are cultured in each candidate media. A measure of the objective of interest, such as protein yield, is quantified experimentally. These values are then fed back to the optimization algorithm to propose the next iteration of candidates. Alternatively, the workflow terminates if no further improvement is observed, or if the experimental budget is reached.</p>
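This general workflow can be sketched as a plain loop. All three callables here are hypothetical placeholders: the initial design, the wet-lab evaluation of one candidate medium, and the optimization algorithm proposing the next batch from the accumulated results:

```python
def optimize_media(propose_initial, propose_next, run_experiment,
                   max_iterations=10):
    """Skeleton of the iterative computational-experimental workflow.

    `propose_initial` builds the initial DOE, `run_experiment` stands in
    for the wet-lab evaluation of one candidate medium, and `propose_next`
    is the optimization algorithm proposing the next batch of candidates
    from all (candidate, response) data gathered so far.
    """
    candidates = propose_initial()
    history = []                                   # (candidate, response)
    for _ in range(max_iterations):
        responses = [run_experiment(c) for c in candidates]
        history.extend(zip(candidates, responses))
        candidates = propose_next(history)         # next batch of media
    return max(history, key=lambda cr: cr[1])      # best medium found
```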
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption>
<p>Generalization of media optimization methods.</p>
</caption>
<graphic xlink:href="fbioe-11-1195294-g001.tif"/>
</fig>
<p>Thus, based on this general workflow, we can divide a media optimization method into several parts, and classify existing methods based on these parts:<list list-type="simple">
<list-item>
<p>1. Problem definition, i.e., optimization objective and media component space</p>
</list-item>
<list-item>
<p>2. Initial design of experiment</p>
</list-item>
<list-item>
<p>3. Optimization algorithm</p>
</list-item>
</list>
</p>
</sec>
<sec id="s3-3">
<title>3.3 Problem definition</title>
<sec id="s3-3-1">
<title>3.3.1 Objective type</title>
<p>Optimization can be classified based on the objective as single-objective or multi-objective. Single-objective is the most common type, especially if the value of the product greatly exceeds the cost of the culture media, in which case optimizing for yield alone is sufficient. Multi-objective optimization is used when multiple outcomes of interest are considered. For example, maximizing the yield of a useful product while minimizing the production of a toxic metabolite, or maximizing yield while minimizing the cost of culture media.</p>
<p>For multi-objective optimization, a range of possible optimal solutions exists known as the Pareto front. A Pareto optimal solution is a point where it is not possible to improve one objective without worsening another.</p>
<p>When trying to optimize culture media with multiple objectives in mind, it can either be tackled as a multi-objective problem or converted to a single-objective problem by modifying the objective function. For example, to balance the goal of increased cell proliferation, low cost and ease of use, <xref ref-type="bibr" rid="B19">Cosenza et al. (2021)</xref> used a single objective function that normalizes the measure of cell proliferation by the volume of fetal bovine serum (FBS) which is the costly component used in the media.</p>
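A scalarized objective of this kind can be as simple as a ratio. The function below is an illustrative form in the spirit of that approach; the function name, units and exact formula are assumptions, not taken from the cited work:

```python
def combined_objective(proliferation, fbs_volume_ml, eps=1e-9):
    """Scalarize two goals into one objective: reward cell proliferation
    while penalizing fetal bovine serum (FBS) usage by normalizing one
    by the other. Illustrative only, not the exact formula of
    Cosenza et al. (2021)."""
    return proliferation / (fbs_volume_ml + eps)
```

Maximizing this single value favors media that achieve high proliferation per unit of the costly FBS component.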
</sec>
<sec id="s3-3-2">
<title>3.3.2 Factor value type</title>
<p>The values of the input factors of the culture media can be classified as continuous values or discrete levels. Discrete levels simplify the problem by reducing possible combinations to a finite set. However, continuous values allow for a finer-grained optimization of the factors.</p>
<p>In some cases, by increasing the number of levels, it is possible to approximate a continuous value with discrete levels, as in <xref ref-type="bibr" rid="B41">Havel et al. (2006)</xref>, where 7 binary bits giving 128 levels were used to represent the values for a discrete genetic algorithm. Conversely, continuous values can be rounded off to fixed discrete levels, as seen in <xref ref-type="bibr" rid="B63">Kim &#x26; Audet (2019)</xref>. This can be useful when adapting algorithms originally designed for discrete or continuous problems to the requirements of the problem at hand.</p>
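Both adaptations are straightforward to implement. The helpers below are illustrative sketches, not the exact encodings used in the cited works:

```python
def bits_to_value(bits, low, high):
    """Decode a bit string into a concentration on [low, high]; with
    7 bits this gives 2**7 = 128 evenly spaced levels (illustrative of
    the discrete GA encoding in Havel et al., 2006)."""
    level = int("".join(map(str, bits)), 2)
    n_levels = 2 ** len(bits) - 1
    return low + (high - low) * level / n_levels

def round_to_level(value, low, high, n_levels):
    """The reverse adaptation: snap a continuous value to the nearest of
    n_levels evenly spaced levels (cf. the rounding approach of
    Kim & Audet, 2019)."""
    step = (high - low) / (n_levels - 1)
    return low + round((value - low) / step) * step
```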
<p>In the works reviewed, a few (2&#x2013;3) discrete levels are commonly adopted during the initial design of experiment. In subsequent iterations, continuous values are often employed for more precise media formulations.</p>
</sec>
<sec id="s3-3-3">
<title>3.3.3 Choice of factors</title>
<p>There are many important factors that affect cell proliferation and metabolite production, and these factors change according to the desired outcome of the optimization. However, not all media components contribute significantly to the desired outcome, and changing the concentrations of such components is inconsequential. It is therefore important to omit these factors from the optimization strategy to prevent the redundant and excessive use of laboratory resources. One strategy used by many researchers is to conduct a screening of factors using DOE methods such as the Plackett-Burman design or the Definitive Screening design. This serves to identify the most important factors that affect the response, thus reducing the input space to a more manageable dimension. The use of statistical DOE for factor screening is well established and we refer interested readers to <xref ref-type="bibr" rid="B4">Antony (2014)</xref>. Considerations on the number of factors given experimental budget constraints are further discussed in <xref ref-type="sec" rid="s7">Section 7</xref>.</p>
</sec>
</sec>
<sec id="s3-4">
<title>3.4 Initial design of experiment</title>
<p>An experiment design refers to a list of candidates that have varying values for each of the factors.</p>
<p>In the first iteration, assuming no data is available yet, an initial design of experiment (DOE) is conducted as a starting point for the optimization problem, without the need for the optimization algorithm. Designs can be classified into two types: statistical DOE and random designs. It is also possible to mix the two by supplementing statistical designs with some random designs.</p>
</sec>
<sec id="s3-5">
<title>3.5 Optimization algorithm</title>
<p>The types of algorithms used in cell culture media optimization works can be broadly classified into two classes: direct optimization, and surrogate-based optimization (<xref ref-type="fig" rid="F2">Figure 2</xref>).</p>
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption>
<p>Classification of optimization algorithms used in media optimization literature.</p>
</caption>
<graphic xlink:href="fbioe-11-1195294-g002.tif"/>
</fig>
<p>We define direct optimization algorithms as methods that optimize by evaluating the objective function directly. Direct search and metaheuristics algorithms are two types of methods that fall under our definition of direct optimization.</p>
<p>Direct search methods perform a sequential examination of trial solutions, involving a comparison of each trial solution with the best solution obtained up to that time (<xref ref-type="bibr" rid="B6">Audet and Hare, 2017</xref>). A strategy is used to determine what the next trial solution will be. Direct search methods are deterministic in nature. Commonly used direct search methods include Nelder-Mead downhill simplex, pattern search, and mesh adaptive direct search. None of the papers reviewed used direct search methods to directly optimize culture media. However, direct search methods have been used as acquisition methods in surrogate-based optimization, which will be discussed in <xref ref-type="sec" rid="s6-3">Section 6.3</xref>.</p>
<p>Metaheuristics optimization algorithms are designed with heuristics or strategies for generating candidates to find good solutions to an optimization problem (<xref ref-type="bibr" rid="B6">Audet and Hare, 2017</xref>). Metaheuristics algorithms are often inspired by naturally occurring optimizing phenomena, such as the optimization of fitness through evolution by natural selection by life, the efficient foraging of food by animal swarms or the reordering of atoms into lower energy states through the process of annealing. In metaheuristics optimization, the objective function values of candidates are used as information to inform the generation of new candidates by applying heuristic rules. They also contain elements of stochasticity that helps to avoid being trapped in local optima. This characteristic of stochasticity is a key difference compared to direct search methods. Commonly used metaheuristics optimization algorithms include genetic algorithm, simulated annealing and particle swarm algorithm.</p>
<p>Surrogate-based optimization (SBO) is the other major class of optimization algorithms. SBO was initially conceptualized as a way of optimizing computationally expensive simulation experiments but has since been used to optimize black-box functions in general (<xref ref-type="bibr" rid="B64">Koziel et al., 2011</xref>). A surrogate model is used to estimate the true objective function by training on a relatively small set of samples obtained from the true function. Because the surrogate model is much cheaper to evaluate than the true function, it can be queried extensively and used to suggest candidates for further testing. The surrogate model is trained on the initial DOE inputs and the experimentally obtained outputs. Using the surrogate model as an estimator of the true objective function, new candidate solutions are proposed based on a certain strategy. The new candidates are then evaluated experimentally, and the new data is used to update the surrogate model. The process is repeated until the result is satisfactory or the experimental budget is exceeded. Commonly used surrogate models include the quadratic response surface method (RSM), Kriging (also known as Gaussian process regression), and neural networks.</p>
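A minimal sketch of this loop follows, using a simple RBF-kernel interpolator as a stand-in for a Kriging surrogate and a greedy (predicted-best) acquisition over a random candidate pool. This is an assumption-laden toy: real SBO workflows typically use more principled acquisitions such as expected improvement, and `true_yield` stands in for the wet-lab experiment:

```python
import numpy as np

rng = np.random.default_rng(0)

def true_yield(x):
    """Stand-in for the wet-lab experiment (unknown to the optimizer)."""
    return -np.sum((np.atleast_2d(x) - 0.6) ** 2, axis=1)

def fit_surrogate(X, y, ls=0.3, reg=1e-6):
    """Fit an RBF-kernel interpolator, a simplified stand-in for Kriging."""
    K = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1) / (2 * ls**2))
    w = np.linalg.solve(K + reg * np.eye(len(X)), y)
    def predict(Z):
        Kz = np.exp(-((Z[:, None, :] - X[None, :, :]) ** 2).sum(-1) / (2 * ls**2))
        return Kz @ w
    return predict

X = rng.uniform(0, 1, size=(8, 2))   # initial DOE: 8 candidates, 2 factors
y = true_yield(X)

for _ in range(5):                   # optimization iterations
    predict = fit_surrogate(X, y)                 # (re)train surrogate
    pool = rng.uniform(0, 1, size=(2000, 2))      # cheap surrogate queries
    x_next = pool[np.argmax(predict(pool))]       # greedy acquisition
    X = np.vstack([X, x_next])                    # "run" the new experiment
    y = np.append(y, true_yield(x_next))          # and update the data

best = X[np.argmax(y)]
```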
<p>Other than the two classes described above, hybrid methods are also used within the optimization literature. Hybrid methods combine aspects of both metaheuristics and surrogate-based optimization (<xref ref-type="bibr" rid="B109">Stork et al., 2022</xref>). This may be achieved in several ways: for example, different algorithms can be run simultaneously and their proposed candidates combined, or one algorithm can control the outer loop of each optimization iteration while another is nested within the inner loop. In the culture media optimization literature surveyed, we have not come across studies that used hybrid methods. Some studies perform sets of experiments using metaheuristics and SBO in parallel to compare their performance, but the candidates are not combined between the two sets.</p>
</sec>
</sec>
<sec id="s4">
<title>4 Initial design of experiment</title>
<sec id="s4-1">
<title>4.1 Statistical design of experiment (DOE) designs</title>
<p>Statistical Design of Experiments (DOE) is a method used to systematically investigate the relationship between variables in a controlled experimental setting. Several types of DOE designs are available and can be used for the initial design (<xref ref-type="bibr" rid="B53">Jankovic et al., 2021</xref>).</p>
<p>Full factorial design: This design includes all possible combinations of the levels of each variable. While this method is the most comprehensive, it is infeasible for media with many components, as the total number of experiments would be x<sup>k</sup>, where x is the number of levels and k is the number of components.</p>
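The exponential growth in the number of runs is easy to see by enumerating such a design directly:

```python
from itertools import product

def full_factorial(levels_per_factor):
    """Enumerate a full factorial design: one run for every combination
    of levels, i.e. x**k runs for k factors with x levels each."""
    return list(product(*levels_per_factor))

# 3 components at 2 levels each -> 2**3 = 8 candidate media
design = full_factorial([[0, 1]] * 3)
```

Even a modest 10-component medium at 3 levels would already require 3**10 = 59,049 runs, which motivates the fractional and space-filling designs below.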
<p>Fractional factorial design: This design includes a fraction of all the possible combinations of levels of each variable. It makes more efficient use of resources but introduces confounding with interaction terms.</p>
<p>Taguchi design: This design improves on full and fractional factorial designs. It uses orthogonal arrays to arrange the factors affecting the experiment and to determine the levels at which they should be set. In contrast to traditional DOE, the Taguchi method treats noise as a focus of analysis (<xref ref-type="bibr" rid="B43">Hernadewita et al., 2019</xref>).</p>
<p>Central composite design (CCD): This design method starts with an embedded factorial or fractional factorial design with center points and adds &#x201c;star&#x201d; points to estimate curvature.</p>
<p>Box-Behnken Design (BBD): This design, unlike central composite design, is an independent quadratic design that does not contain an embedded factorial or fractional factorial design. The component combinations are the midpoints of edges of the process space and the center point. Compared to central composite design, the Box-Behnken design has limited capability for orthogonal blocking.</p>
</sec>
<sec id="s4-2">
<title>4.2 Random designs</title>
<p>When a large number of factors are present, statistical DOE designs are often infeasible to carry out, as the number of experiments required grows exponentially. An alternative to DOE designs is random designs.</p>
<p>A simple random sampling method that draws each factor from a uniform distribution is non-ideal, as it tends to result in uneven distances between samples in the input space. This means some parts of the input space are inadequately sampled, which reduces the effectiveness of subsequent optimization. In comparison, space-filling designs are able to cover the input space evenly with samples.</p>
<p>The most commonly used space-filling design is Latin hypercube sampling (LHS). Latin hypercube samples are generated such that each axis-aligned slice (stratum) of the input space contains exactly one sample. LHS is used by a few works (<xref ref-type="bibr" rid="B18">Cosenza et al., 2022</xref>; <xref ref-type="bibr" rid="B127">Yoshida et al., 2022</xref>) as the initial DOE.</p>
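For example, SciPy's quasi-Monte Carlo module provides an off-the-shelf LHS generator; the component count and concentration ranges below are hypothetical:

```python
from scipy.stats import qmc

# Latin hypercube design for 8 media components, 20 candidate media;
# each row is one candidate, each column one component.
sampler = qmc.LatinHypercube(d=8, seed=0)
unit_design = sampler.random(n=20)       # samples in [0, 1)^8

# Scale unit samples to (hypothetical) per-component concentration ranges.
lower = [0.0] * 8
upper = [5.0] * 8
design = qmc.scale(unit_design, lower, upper)
```

The LHS property guarantees that, for every component, each of the 20 equal-width concentration strata contains exactly one candidate.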
</sec>
</sec>
<sec id="s5">
<title>5 Metaheuristics optimization</title>
<p>Following the taxonomy proposed by <xref ref-type="bibr" rid="B109">Stork et al. (2022)</xref>, metaheuristics can be broadly classified into population, hill-climbing, and trajectory types. For iterative experiments in cell culture settings, the use of automated liquid handlers to prepare solutions allows multiple candidates to be tested in each iteration. Population-type algorithms are best suited to leverage this capability and provide faster optimization.</p>
<p>Popular population-type metaheuristics include evolution-inspired algorithms such as the genetic algorithm, evolution strategies, and differential evolution, and swarm-inspired algorithms such as particle swarm optimization and ant colony optimization. We introduce below the algorithms that have been applied to cell culture optimization. For more in-depth coverage of metaheuristics, we refer interested readers to Du and Swamy (2016).</p>
<sec id="s5-1">
<title>5.1 Genetic algorithm</title>
<p>The genetic algorithm (GA) is a metaheuristic inspired by the process of evolution through natural selection. It generates new candidates through the biologically inspired operations of selection, crossover, and mutation.</p>
<p>In the classic GA, candidates are represented as a &#x201c;chromosome&#x201d;, where the value of each factor, in discrete levels, is coded as a bit string. In each iteration, three steps take place to generate the subsequent population of candidates (<xref ref-type="bibr" rid="B81">Mirjalili, 2019</xref>).<list list-type="simple">
<list-item>
<p>1. Selection, which selects some candidates of the previous generation as &#x201c;parents&#x201d;. By analogy with natural selection, where fitter individuals are more likely to produce offspring, preference is given to candidates that score highly on the objective function.</p>
</list-item>
<list-item>
<p>2. Crossover, where &#x201c;parents&#x201d; are combined to produce new &#x201c;offspring&#x201d; candidates. Random sites are chosen on the &#x201c;chromosome&#x201d; to be retained on the offspring candidate.</p>
</list-item>
<list-item>
<p>3. Mutation, where random changes occur to the offspring candidate to create more diversity and to prevent premature convergence to local optima.</p>
</list-item>
</list>
</p>
<p>Many variants of GA exist, with different definitions of the three operations. The classic GA is designed for discrete problems as each input is represented as a string. However, GA has also been adopted for continuous problems, by adopting a continuous &#x201c;chromosome&#x201d; representation, and modifying the crossover and mutation operations.</p>
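<p>The three operations above, in a continuous-representation form, can be sketched as follows. This is an illustrative toy implementation (tournament selection, blend crossover, Gaussian mutation, with assumed parameter values), not the variant used in any particular study:</p>

```python
# Minimal continuous GA sketch: tournament selection, blend crossover,
# Gaussian mutation, maximizing a toy "response" with optimum at 0.5.
import numpy as np

def ga_maximize(f, dim, pop_size=30, generations=60, rng=None):
    rng = np.random.default_rng(rng)
    pop = rng.random((pop_size, dim))                 # initial candidates in [0, 1]
    for _ in range(generations):
        fitness = np.array([f(x) for x in pop])
        children = []
        for _ in range(pop_size):
            # Selection: better of two random candidates (tournament of size 2).
            i, j = rng.integers(pop_size, size=2)
            p1 = pop[i] if fitness[i] >= fitness[j] else pop[j]
            i, j = rng.integers(pop_size, size=2)
            p2 = pop[i] if fitness[i] >= fitness[j] else pop[j]
            w = rng.random(dim)                       # crossover: random blend of parents
            child = w * p1 + (1 - w) * p2
            child += rng.normal(0, 0.05, dim)         # mutation: small Gaussian noise
            children.append(np.clip(child, 0, 1))
        pop = np.array(children)
    fitness = np.array([f(x) for x in pop])
    return pop[fitness.argmax()]

best = ga_maximize(lambda x: -np.sum((x - 0.5) ** 2), dim=4, rng=0)
```

<p>In a media optimization campaign, the objective function call would be replaced by a batch of wet-lab measurements rather than an analytical formula.</p>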
<p>GA is a popular method in culture media optimization and has often been used for problems with many factors. GA is also one of the most popular acquisition methods in surrogate-based optimization.</p>
</sec>
<sec id="s5-2">
<title>5.2 Differential evolution</title>
<p>Differential evolution (DE) is another popular evolution-inspired metaheuristic, especially for continuous problems. Like GA, DE generates new candidates through the biologically inspired operations of mutation and crossover, but instead of bit strings, candidates are represented as real-valued vectors, implemented as floats.</p>
<p>In classic DE, in each iteration, three steps take place to generate the new population of candidates (<xref ref-type="bibr" rid="B92">Price, 2013</xref>).<list list-type="simple">
<list-item>
<p>1. Mutation is achieved by adding a scaled difference of two randomly selected solution vectors to a third randomly selected base vector, generating a mutant vector.</p>
</list-item>
<list-item>
<p>2. Crossover is achieved by discrete recombination of the parent vector and the mutant vector with a given crossover probability P<sub>CR</sub>, to obtain the offspring vector.</p>
</list-item>
<list-item>
<p>3. After evaluating the offspring vectors on the objective function, the fitter of the offspring and parent vectors is retained for the next iteration.</p>
</list-item>
</list>
</p>
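<p>The three steps above correspond to the classic DE/rand/1/bin scheme, sketched below on a toy sphere objective with typical parameter choices (F = 0.8, crossover probability 0.9); this is an illustration, not code from the reviewed studies:</p>

```python
# Sketch of classic DE (DE/rand/1/bin): mutation, crossover, selection,
# minimizing a toy sphere objective.
import numpy as np

def de_minimize(f, dim, pop_size=20, iters=100, F=0.8, p_cr=0.9, rng=None):
    rng = np.random.default_rng(rng)
    pop = rng.uniform(-5, 5, (pop_size, dim))
    cost = np.array([f(x) for x in pop])
    for _ in range(iters):
        for i in range(pop_size):
            # 1. Mutation: base vector plus a scaled difference of two others.
            a, b, c = rng.choice([k for k in range(pop_size) if k != i], 3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])
            # 2. Crossover: take coordinates from the mutant with probability
            #    p_cr; one coordinate is always forced from the mutant.
            mask = rng.random(dim) < p_cr
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            # 3. Selection: keep the fitter of parent and offspring.
            trial_cost = f(trial)
            if trial_cost <= cost[i]:
                pop[i], cost[i] = trial, trial_cost
    return pop[cost.argmin()]

best = de_minimize(lambda x: np.sum(x ** 2), dim=5, rng=1)
```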
<p>Advanced variants of DE such as L-SHADE have been proposed in black-box optimization literature. DE optimizes continuous problems by default but can be adapted to discrete problems easily by rounding off candidates to their nearest discrete level. Variants of DE have been applied to optimize media components in some studies.</p>
</sec>
<sec id="s5-3">
<title>5.3 Particle swarm optimization</title>
<p>Particle swarm optimization (PSO) is a metaheuristic inspired by animal flocking behavior. Each particle&#x2019;s movement is influenced by both its personal best-known position and the best-known position of the swarm as a whole, in the expectation that the swarm moves towards the global optimum (<xref ref-type="bibr" rid="B60">Kennedy and Eberhart, 1995</xref>). The algorithm starts with a population, also known as a swarm, of candidate solutions.<list list-type="simple">
<list-item>
<p>&#x2022; Each particle (candidate solution) moves around the search space.</p>
</list-item>
<list-item>
<p>&#x2022; The current location <italic>x</italic>
<sub>
<italic>id</italic>
</sub> and velocity <italic>v</italic>
<sub>
<italic>id</italic>
</sub> and personal best <italic>pbest</italic>
<sub>
<italic>i</italic>
</sub> of any particle <italic>i</italic> and the global best <italic>gbest</italic> of the entire population are used to compute how the particles should move next in the <italic>d</italic>-dimensional hyperspace.</p>
</list-item>
</list>
</p>
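<p>The position and velocity updates described above can be sketched in the standard inertia-weight form; the coefficients below are typical textbook choices, not values from the cited papers:</p>

```python
# Minimal PSO sketch (inertia-weight form): each velocity update blends
# inertia, a pull toward the personal best, and a pull toward the
# global best of the swarm.
import numpy as np

def pso_minimize(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, rng=None):
    rng = np.random.default_rng(rng)
    x = rng.uniform(-5, 5, (n_particles, dim))       # positions
    v = np.zeros((n_particles, dim))                 # velocities
    pbest, pbest_cost = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()        # global best of the swarm
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))   # random per-coordinate weights
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        cost = np.array([f(p) for p in x])
        improved = cost < pbest_cost                 # update personal bests
        pbest[improved], pbest_cost[improved] = x[improved], cost[improved]
        gbest = pbest[pbest_cost.argmin()].copy()    # update global best
    return gbest

best = pso_minimize(lambda p: np.sum(p ** 2), dim=4, rng=0)
```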
<p>This process is repeated until a satisfactory optimum is reached. A few papers have used this approach (<xref ref-type="bibr" rid="B16">Cockshott and Hartman, 2001</xref>; <xref ref-type="bibr" rid="B47">Huang et al., 2007</xref>; <xref ref-type="bibr" rid="B31">Garlapati et al., 2010</xref>; <xref ref-type="bibr" rid="B61">Khaouane et al., 2012</xref>).</p>
</sec>
</sec>
<sec id="s6">
<title>6 Surrogate-based optimization</title>
<p>Surrogate-based optimization (SBO) can be viewed as comprising three separate components: the surrogate model, the acquisition function, and the acquisition method (<xref ref-type="bibr" rid="B28">Forrester and Keane, 2009</xref>). The surrogate model is the model used to approximate the actual function of interest by fitting experimental data; before the first iteration, it is fitted with the data collected from the initial DOE. The acquisition function is the function or criterion used to decide which candidates to propose and evaluate in the next round of experiments. The acquisition method refers to how these candidates are found by optimizing the acquisition function. (Note: the acquisition of new points, i.e., both the choice of acquisition function and acquisition method, may also be referred to as the infill strategy or infill criteria in the literature (<xref ref-type="bibr" rid="B129">Zhou et al., 2020</xref>). It is also closely related to the concept of active learning in machine learning, which seeks to improve a model&#x2019;s accuracy with less training data by systematically acquiring training samples (Ren et al., 2021).)</p>
<p>To illustrate with an example, a common SBO workflow used in media optimization is the ANN-GA method. Here the surrogate model is an artificial neural network (ANN) that predicts the response given a valid media design as input. The ANN is trained with the actual response values collected from experiments performed with the initial DOE candidates. Assuming the ANN is an accurate predictor of the response, and since we would like to maximize the response, we computationally search for the input that maximizes the ANN&#x2019;s predicted response. The acquisition function in this case is the predicted value (PV), because new candidates are acquired based on the predicted value output by the surrogate model. These candidates are found by running GA with the ANN as the objective function; GA in this case is the acquisition method. The acquired candidates that maximize the ANN&#x2019;s predicted response are then evaluated experimentally and added to the training data to update the ANN model in the next iteration.</p>
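<p>The workflow above can be sketched as a loop. To keep the sketch dependency-light, a quadratic polynomial stands in for the ANN and random search stands in for GA, but the structure is the same: fit the surrogate, maximize its predicted value, evaluate the proposal experimentally, and refit:</p>

```python
# Sketch of the SBO loop. A quadratic polynomial stands in for the ANN
# surrogate and random search stands in for GA; the loop structure is
# the same: fit surrogate -> maximize predicted value -> evaluate -> refit.
import numpy as np

def true_response(x):                         # unknown in practice; toy stand-in here
    return -np.sum((x - 0.3) ** 2)

rng = np.random.default_rng(0)
X = rng.random((10, 2))                       # initial DOE candidates
y = np.array([true_response(x) for x in X])   # measured responses

def features(X):
    # Second-order features: 1, x_j, x_j^2 (interactions omitted for brevity).
    return np.hstack([np.ones((len(X), 1)), X, X ** 2])

for _ in range(5):                            # optimization iterations
    beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)   # fit surrogate
    cand = rng.random((2000, 2))              # acquisition method: random search
    pred = features(cand) @ beta              # acquisition function: predicted value
    x_new = cand[pred.argmax()]
    X = np.vstack([X, x_new])                 # "experiment" on the proposal, refit next
    y = np.append(y, true_response(x_new))

best = X[y.argmax()]
```

<p>In a real campaign the <monospace>true_response</monospace> call is a wet-lab experiment, and each iteration would typically propose a batch of candidates rather than one.</p>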
<p>We will introduce the common types of surrogate models, acquisition functions and acquisition methods (if they have not been covered in <xref ref-type="sec" rid="s5">Section 5</xref>), that have been applied for cell culture optimization. We also refer interested readers to (<xref ref-type="bibr" rid="B28">Forrester and Keane, 2009</xref>, Jiang et al., 2020) for more in-depth resources on surrogate optimization.</p>
<sec id="s6-1">
<title>6.1 Surrogate models</title>
<sec id="s6-1-1">
<title>6.1.1 Polynomial response surfaces</title>
<p>Response surface methodology (RSM) is a type of surrogate-based optimization that was first introduced in 1951 and has found wide application in optimizing industrial processes (<xref ref-type="bibr" rid="B62">Khuri and Mukhopadhyay, 2010</xref>). The typical RSM approach first determines significant factors with a linear model, a step known as factor screening; only the significant factors are kept for optimization. A linear model is then used again to find the range of inputs where the optimum is likely to lie, by moving along the path of steepest ascent or descent. Finally, a DOE is conducted, typically with CCD or BBD, and the data are used to fit a higher-order polynomial model. This polynomial model, which is essentially a surrogate model, is then used for optimization.</p>
<p>The most common response surface is the second-order polynomial, based on the assumption that the landscape of the objective function can be approximated with a second-order Taylor expansion. The standard form of the model is given as<disp-formula id="equ1">
<mml:math id="m1">
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mo>&#x3d;</mml:mo>
<mml:msub>
<mml:mi>&#x3b2;</mml:mi>
<mml:mn>0</mml:mn>
</mml:msub>
<mml:mo>&#x2b;</mml:mo>
<mml:mrow>
<mml:mstyle displaystyle="true">
<mml:munderover>
<mml:mo>&#x2211;</mml:mo>
<mml:mrow>
<mml:mi>j</mml:mi>
<mml:mo>&#x3d;</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mi>m</mml:mi>
</mml:munderover>
</mml:mstyle>
<mml:msub>
<mml:mi>&#x3b2;</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
<mml:msub>
<mml:mi>x</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x2b;</mml:mo>
<mml:mrow>
<mml:mstyle displaystyle="true">
<mml:munderover>
<mml:mo>&#x2211;</mml:mo>
<mml:mrow>
<mml:mi>j</mml:mi>
<mml:mo>&#x3d;</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mi>m</mml:mi>
</mml:munderover>
</mml:mstyle>
<mml:msub>
<mml:mi>&#x3b2;</mml:mi>
<mml:mrow>
<mml:mi>j</mml:mi>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:msubsup>
<mml:mi>x</mml:mi>
<mml:mi>j</mml:mi>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo>&#x2b;</mml:mo>
<mml:mo>&#x2211;</mml:mo>
<mml:mrow>
<mml:mstyle displaystyle="true">
<mml:munderover>
<mml:mo>&#x2211;</mml:mo>
<mml:mrow>
<mml:mi>j</mml:mi>
<mml:mo>&#x3c;</mml:mo>
<mml:mi>k</mml:mi>
</mml:mrow>
<mml:mi>m</mml:mi>
</mml:munderover>
</mml:mstyle>
<mml:msub>
<mml:mi>&#x3b2;</mml:mi>
<mml:mrow>
<mml:mi>j</mml:mi>
<mml:mi>k</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:msub>
<mml:mi>x</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:msub>
<mml:mi>x</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
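<p>The standard form above can be fitted by ordinary least squares. A minimal sketch for m = 2 factors, using noise-free toy data generated from assumed coefficients to check that they are recovered:</p>

```python
# Least-squares fit of the second-order model above for m = 2 factors:
# y = b0 + sum_j bj*xj + sum_j bjj*xj^2 + sum_{j<k} bjk*xj*xk.
# Toy data are generated from known coefficients to check recovery.
from itertools import combinations

import numpy as np

def quadratic_features(X):
    cols = [np.ones(len(X))]
    cols += [X[:, j] for j in range(X.shape[1])]               # linear terms
    cols += [X[:, j] ** 2 for j in range(X.shape[1])]          # squared terms
    cols += [X[:, j] * X[:, k] for j, k in combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (30, 2))                                # 30 runs, coded units
true_beta = np.array([1.0, 0.5, -0.3, -2.0, -1.0, 0.8])        # b0, b1, b2, b11, b22, b12
y = quadratic_features(X) @ true_beta                          # noise-free responses
beta_hat, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)
```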
<p>Polynomial response surfaces are easy to fit and computationally inexpensive. However, because they are relatively simple, they are empirically often less accurate than other types of surrogate models, and when many factors are involved they are often inadequate for accurate prediction. In the media optimization literature, polynomial response surfaces are mostly used for problems involving five factors or fewer.</p>
<p>Solving analytically for the stationary points of the polynomial through canonical and ridge analysis is one way to find the maxima/minima of the surrogate model, although other methods such as genetic algorithms are used as well, sometimes with better results.</p>
<p>Typical usage of RSM for optimization follows a one-step optimization process where the model is fitted with the DOE results and one set of candidates is proposed based on the model which is subsequently tested in the real-world experiment before the optimization process terminates. However, iterative refinement of the polynomial response surface is also possible. Although not in any of the reviewed culture media optimization literature that uses RSM, the iterative workflow has been used in other applications (<xref ref-type="bibr" rid="B34">Goswami et al., 2016</xref>).</p>
</sec>
<sec id="s6-1-2">
<title>6.1.2 Gaussian process models</title>
<p>The Gaussian process (GP) model, also known as the Kriging model, is one of the most widely used surrogate models (<xref ref-type="bibr" rid="B28">Forrester and Keane, 2009</xref>), especially in engineering applications. GP-based surrogate optimization is often known as Bayesian optimization (<xref ref-type="bibr" rid="B107">Snoek et al., 2012</xref>) in the machine learning literature, where it is widely used for optimizing model hyperparameters.</p>
<p>GPs are non-parametric probabilistic regression models that define a distribution over functions and can hence attach confidence to their predictions, a useful property exploited by acquisition functions. A GP first defines a prior over functions, which is converted into a posterior distribution over functions once data are observed. The main assumption is that any finite subset of the function&#x2019;s values has a joint Gaussian distribution: given a set of inputs, the corresponding outputs are distributed according to a multivariate Gaussian. The covariance of this joint distribution is computed with a kernel function, which can be thought of as a similarity measure between inputs. When observations, i.e., training data, are provided, we condition on them to update the prior and compute the posterior distribution. The prediction for an input with an unknown function value is obtained by marginalizing the posterior distribution at that input and taking the mean; the variance gives the confidence of the prediction.</p>
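<p>The conditioning step described above can be written out directly. The sketch below uses a squared-exponential kernel with fixed, assumed hyperparameters (in practice these are usually estimated from the data):</p>

```python
# Minimal GP regression sketch with an RBF (squared-exponential) kernel:
# condition the Gaussian prior on observations to obtain the posterior
# mean and standard deviation at new inputs.
import numpy as np

def rbf_kernel(A, B, length=0.5):
    d2 = np.sum(A ** 2, 1)[:, None] + np.sum(B ** 2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * length ** 2))

def gp_predict(X_train, y_train, X_new, noise=1e-8):
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_new, X_train)
    mean = K_s @ np.linalg.solve(K, y_train)                  # posterior mean
    cov = rbf_kernel(X_new, X_new) - K_s @ np.linalg.solve(K, K_s.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0, None))      # mean and uncertainty

X_train = np.array([[0.0], [0.5], [1.0]])
y_train = np.array([1.0, 2.0, 0.5])
mean, std = gp_predict(X_train, y_train, np.array([[0.0], [0.25]]))
```

<p>Note that the predicted standard deviation collapses to near zero at observed points and grows between them, which is exactly the behavior the expected improvement acquisition function exploits.</p>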
<p>GPs can be computationally expensive for large datasets. However, compared to the much more expensive cell culture experiments, the computational cost of GPs does not pose a practical challenge.</p>
</sec>
<sec id="s6-1-3">
<title>6.1.3 Artificial neural networks</title>
<p>Artificial neural networks (ANN) are an increasingly popular machine learning model that have been used in a wide variety of tasks. ANNs are universal function approximators that can be used to model arbitrary functions and this contributes to their success.</p>
<p>ANNs are made up of neurons arranged in layers and connected together by weighted edges (<xref ref-type="bibr" rid="B66">Krogh, 2008</xref>). A simple feedforward network where each neuron in a layer is connected to all neurons in the previous layer, is known as a multi-layer perceptron (MLP) or a fully-connected neural network (FCNN) (<xref ref-type="fig" rid="F3">Figure 3</xref>). The input, represented as a vector, is defined as the input layer, where each neuron takes on the value of one dimension of the input vector. The values of the neurons of each layer are computed by multiplying the value of the neurons in the previous layer by the weights of the edges connecting them to neurons of the current layer and summing them up. A non-linear function known as the activation function is applied to the value of each neuron which introduces non-linearity. For regression problems, no activation is applied to the final layer which outputs an unbounded continuous value.</p>
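<p>The forward pass described above amounts to a chain of affine maps and activations. A minimal sketch with random placeholder weights (untrained, for illustration of the computation only):</p>

```python
# Forward pass of a small MLP: each layer multiplies the previous
# layer's activations by a weight matrix, adds a bias, and applies a
# non-linearity (ReLU here); the final layer is linear for regression.
# Weights are random placeholders, not trained values.
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0.0, z)

# Network shape: 5 inputs -> 8 hidden -> 8 hidden -> 1 output.
sizes = [5, 8, 8, 1]
weights = [rng.normal(0, 0.5, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def mlp_forward(x):
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = relu(a @ W + b)               # hidden layers: affine map + activation
    return a @ weights[-1] + biases[-1]   # output layer: no activation (regression)

y = mlp_forward(np.ones(5))
```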
<fig id="F3" position="float">
<label>FIGURE 3</label>
<caption>
<p>Diagram of typical Multi-Layer Perceptron.</p>
</caption>
<graphic xlink:href="fbioe-11-1195294-g003.tif"/>
</fig>
<p>Common activation functions include the rectified linear unit (ReLU), the sigmoid function, and the hyperbolic tangent function. Radial basis functions can also be used as activation functions; such neural networks, normally with a single hidden layer, are called radial basis function neural networks (RBFNN) (<xref ref-type="bibr" rid="B32">Ghosh and Nag, 2001</xref>). A variation of the RBFNN is the generalized regression neural network (GRNN).</p>
<p>ANNs, specifically MLPs with generally 1&#x2013;2 hidden layers, are common surrogate models in the culture media optimization literature. Many works have compared ANNs and second-order polynomial response surfaces on goodness of fit and prediction RMSE for cell culture media responses; the general conclusion is that ANNs are more accurate surrogate models and provide better optimization performance. RBFNNs are also popular surrogate models in culture media optimization.</p>
<p>Deeper ANNs with many hidden layers, also known as deep learning models, are less common in media optimization. Two studies have used ANNs with 4 hidden layers (<xref ref-type="bibr" rid="B114">Tachibana et al., 2021</xref>; <xref ref-type="bibr" rid="B127">Yoshida et al., 2022</xref>) as surrogate models to predict the expression of GFP in <italic>E. coli</italic>. Having more layers increases the network&#x2019;s capacity to fit data; however, such overparameterized models generally require many training datapoints to avoid overfitting and poor generalization. A validation set for assessing the model after training is therefore useful to detect overfitting, and to decide whether a deep network is necessary by comparing its prediction performance against other types of surrogate models.</p>
</sec>
<sec id="s6-1-4">
<title>6.1.4 Support vector machines</title>
<p>Support vector machines (SVM) are a type of machine learning model originally designed for classification, where the model learns a hyperplane that separates the different classes with the greatest margin. SVM for regression is a modification of the original SVM used for predicting a continuous output value; it instead learns a hyperplane such that as many training points as possible fall within a certain margin of the hyperplane (<xref ref-type="bibr" rid="B28">Forrester and Keane, 2009</xref>).</p>
<p>Unlike SVM for classification, which tries to separate the data with a clear margin, SVM for regression uses an epsilon-insensitive loss that ignores prediction errors smaller than a threshold. The algorithm then seeks the hyperplane that minimizes the error between the predicted and actual output values of the training set.</p>
<p>SVMs are uncommon as surrogate models in culture media optimization literature but have been used in other applications (<xref ref-type="bibr" rid="B15">Chintalapati et al., 2013</xref>). Despite limited usage in culture media optimization literature, it is generally held that SVMs are less prone to overfitting for smaller datasets compared to ANNs (<xref ref-type="bibr" rid="B124">Wilson, 2008</xref>) which could be an advantage in culture media optimization problems where there is a lower amount of data.</p>
</sec>
</sec>
<sec id="s6-2">
<title>6.2 Acquisition functions</title>
<sec id="s6-2-1">
<title>6.2.1 Predicted value</title>
<p>Predicted value is the most straightforward acquisition function. It essentially treats the surrogate model as a faithful predictor of the actual objective function. Thus by optimizing the predicted value of the surrogate model, the hope is that the solution found is likely a near-optimum point (<xref ref-type="bibr" rid="B28">Forrester and Keane, 2009</xref>).</p>
<p>Predicted value is the most common acquisition function used for most surrogate models except for GPs. In culture media optimization, it has been used in combination with RSMs, ANNs and other models.</p>
<p>The drawback is that the surrogate model may give wrong predictions that deviate from the true objective function value by a large amount.</p>
</sec>
<sec id="s6-2-2">
<title>6.2.2 Expected improvement</title>
<p>For surrogate models that can provide confidence estimation of prediction, alternative acquisition functions that incorporate prediction uncertainty can be used instead. This helps to address the problem of poor predicted values in less sampled regions.</p>
<p>The expected improvement (EI) acquisition function is designed to balance between exploration and exploitation of the search space (<xref ref-type="bibr" rid="B28">Forrester and Keane, 2009</xref>). Based on the predicted value of the surrogate model, EI favors points that have a high probability of improving the current best solution by considering the difference between the current best solution and the predicted value of the objective function at a new point. At the same time, EI also favors points that are uncertain, measured by the standard deviation of the predicted value of the objective function at a new point. Points that have a high standard deviation are more uncertain, and therefore more likely to provide new information about the search space. Thus, by considering both the potential improvement of a new point and the uncertainty of the prediction, EI acquisition function provides both exploitation and exploration.</p>
<p>EI is computed as such:<disp-formula id="equ2">
<mml:math id="m2">
<mml:mrow>
<mml:mi>E</mml:mi>
<mml:mi>I</mml:mi>
<mml:mrow>
<mml:mfenced open="(" close=")" separators="|">
<mml:mrow>
<mml:mi>x</mml:mi>
</mml:mrow>
</mml:mfenced>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mrow>
<mml:mfenced open="(" close=")" separators="|">
<mml:mrow>
<mml:msub>
<mml:mi>f</mml:mi>
<mml:mi mathvariant="italic">min</mml:mi>
</mml:msub>
<mml:mo>&#x2212;</mml:mo>
<mml:mover accent="true">
<mml:mi>f</mml:mi>
<mml:mo>&#x5e;</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mfenced open="(" close=")" separators="|">
<mml:mrow>
<mml:mi>x</mml:mi>
</mml:mrow>
</mml:mfenced>
</mml:mrow>
</mml:mrow>
</mml:mfenced>
</mml:mrow>
<mml:mi mathvariant="bold">&#x3a6;</mml:mi>
<mml:mrow>
<mml:mfenced open="(" close=")" separators="|">
<mml:mrow>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mi>f</mml:mi>
<mml:mi mathvariant="italic">min</mml:mi>
</mml:msub>
<mml:mo>&#x2212;</mml:mo>
<mml:mover accent="true">
<mml:mi>f</mml:mi>
<mml:mo>&#x5e;</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mfenced open="(" close=")" separators="|">
<mml:mrow>
<mml:mi>x</mml:mi>
</mml:mrow>
</mml:mfenced>
</mml:mrow>
</mml:mrow>
<mml:mi>s</mml:mi>
</mml:mfrac>
</mml:mrow>
</mml:mfenced>
</mml:mrow>
<mml:mo>&#x2b;</mml:mo>
<mml:mi>s</mml:mi>
<mml:mi>&#x3d5;</mml:mi>
<mml:mrow>
<mml:mfenced open="(" close=")" separators="|">
<mml:mrow>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mi>f</mml:mi>
<mml:mi mathvariant="italic">min</mml:mi>
</mml:msub>
<mml:mo>&#x2212;</mml:mo>
<mml:mover accent="true">
<mml:mi>f</mml:mi>
<mml:mo>&#x5e;</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mfenced open="(" close=")" separators="|">
<mml:mrow>
<mml:mi>x</mml:mi>
</mml:mrow>
</mml:mfenced>
</mml:mrow>
</mml:mrow>
<mml:mi>s</mml:mi>
</mml:mfrac>
</mml:mrow>
</mml:mfenced>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>where <inline-formula id="inf1">
<mml:math id="m3">
<mml:mrow>
<mml:mi mathvariant="bold">&#x3a6;</mml:mi>
<mml:mrow>
<mml:mfenced open="(" close=")" separators="|">
<mml:mrow>
<mml:mo>.</mml:mo>
</mml:mrow>
</mml:mfenced>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula> represents the standard normal cumulative distribution function; <inline-formula id="inf2">
<mml:math id="m4">
<mml:mrow>
<mml:mi>&#x3d5;</mml:mi>
<mml:mrow>
<mml:mfenced open="(" close=")" separators="|">
<mml:mrow>
<mml:mo>.</mml:mo>
</mml:mrow>
</mml:mfenced>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula> represents the standard normal probability density function; <inline-formula id="inf3">
<mml:math id="m5">
<mml:mrow>
<mml:mover accent="true">
<mml:mi>f</mml:mi>
<mml:mo>&#x5e;</mml:mo>
</mml:mover>
</mml:mrow>
</mml:math>
</inline-formula> is the surrogate model predictor; <inline-formula id="inf4">
<mml:math id="m6">
<mml:mrow>
<mml:msub>
<mml:mi>f</mml:mi>
<mml:mi mathvariant="italic">min</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> is the current best functional value and <inline-formula id="inf5">
<mml:math id="m7">
<mml:mrow>
<mml:mi>s</mml:mi>
</mml:mrow>
</mml:math>
</inline-formula> is the standard deviation (<xref ref-type="bibr" rid="B10">Bhosekar, 2020</xref>).</p>
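<p>The expression above can be computed with the standard library alone. A sketch in the minimization form matching the equation, evaluated at two illustrative points:</p>

```python
# EI as in the equation above (minimization form): Phi is the standard
# normal CDF and phi the standard normal density.
import math

def expected_improvement(f_min, mean, s):
    if s <= 0:
        return max(f_min - mean, 0.0)      # no uncertainty: plain improvement
    z = (f_min - mean) / s
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal CDF
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal density
    return (f_min - mean) * Phi + s * phi

# Same (pessimistic) predicted mean, different uncertainty: the more
# uncertain point earns the larger expected improvement.
ei_low_s = expected_improvement(f_min=1.0, mean=1.2, s=0.1)
ei_high_s = expected_improvement(f_min=1.0, mean=1.2, s=1.0)
```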
<p>GPs inherently provide an uncertainty estimate of their predictions, so EI can be applied directly as the acquisition function and is often the default choice. For typical ANNs used in regression, an uncertainty estimate is not available; one technique for obtaining it is to measure the variance in predictions across an ensemble of neural networks, each trained separately on the same data. With this uncertainty estimate, EI can then be applied as the acquisition function (<xref ref-type="bibr" rid="B71">Lim et al., 2021</xref>). So far, in culture media optimization, EI has only been used in conjunction with GPs. Given that ANNs are a common surrogate model in many culture media optimization studies, it may be useful to investigate EI as the acquisition function over an ensemble of ANN surrogates.</p>
</sec>
<sec id="s6-2-3">
<title>6.2.3 Custom functions</title>
<p>Some works have designed custom acquisition functions that incorporate other aspects relevant to their goals of media design. For example, Cosenza et al. (2022) used a custom acquisition function that also accounts for the cost of producing the media, converting a multi-objective optimization problem into a single objective problem.</p>
</sec>
</sec>
<sec id="s6-3">
<title>6.3 Acquisition methods</title>
<sec id="s6-3-1">
<title>6.3.1 Analytical solution</title>
<p>For some surrogate models, namely, polynomial response surfaces, analytical solutions for the optima can be obtained by solving for the stationary points of the polynomial if they exist and checking the nature of the stationary points. This is known as canonical and ridge analysis (<xref ref-type="bibr" rid="B20">Dean et al., 2017</xref>).</p>
<p>One possible problem with this approach is that we may be interested in acquiring multiple points instead, which is more easily achieved through population-based metaheuristics. This may be why many papers chose to use methods such as genetic algorithms to optimize the predicted value of polynomial response surfaces rather than using an analytical solution.</p>
</sec>
<sec id="s6-3-2">
<title>6.3.2 Local optimization methods</title>
<sec id="s6-3-2-1">
<title>6.3.2.1 Nelder-Mead Downhill simplex</title>
<p>Nelder-Mead downhill simplex is a direct search algorithm that optimizes by constructing a nondegenerate simplex in the search space and uses rules of evolving the simplex to drive the search (<xref ref-type="bibr" rid="B70">Lewis et al., 2000</xref>). A simplex refers to a set of <italic>n</italic>&#x2b;1 points in an <italic>n-</italic>dimensional space.</p>
<p>In simplex search, the worst point of the simplex is reflected through the centroid of the opposite face of the simplex. If the new point improves upon the worst point, it is kept to form the new simplex. If not, the next worst point of the simplex is reflected to generate a new point. If no new point improves upon the existing simplex, the lengths of the edges adjacent to the current best vertex are halved. This process can be thought of as a variation of the method of steepest descent, where the direction of movement is opposite to the gradient of a plane fitted to the simplex points.</p>
<p>Nelder-Mead downhill simplex can be used as a local optimization method to optimize the acquisition function, as used in <xref ref-type="bibr" rid="B117">Tripathi et al. (2012)</xref>.</p>
</sec>
<sec id="s6-3-2-2">
<title>6.3.2.2 BFGS and L-BFGS</title>
<p>BFGS (Broyden&#x2013;Fletcher&#x2013;Goldfarb&#x2013;Shanno) is a widely used local optimization algorithm for solving non-linear optimization problems. BFGS belongs to the family of quasi-Newton methods, which approximate the Newton-Raphson method. The BFGS algorithm uses a limited number of gradient evaluations to approximate the Hessian matrix (which describes the curvature of the function at a given point) and updates this approximation at each iteration. L-BFGS (limited-memory BFGS) is a variant of BFGS better suited to problems with many variables: BFGS stores a dense n&#xd7;n approximation to the inverse Hessian (<italic>n</italic> is the number of variables), whereas L-BFGS stores only a limited number of vectors for the approximation (<xref ref-type="bibr" rid="B74">Liu and Nocedal, 1989</xref>). Quasi-Newton methods such as BFGS are often the default choice in optimization libraries and are reliable in finding the local optima of smooth functions; they serve as the default optimizer in many Bayesian optimization libraries such as GPyOpt, SMT, and scikit-optimize.</p>
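<p>Both simplex search and quasi-Newton methods are available off the shelf. A brief sketch using SciPy&#x2019;s <monospace>minimize</monospace> interface, with a toy quadratic standing in for the surrogate&#x2019;s predicted value:</p>

```python
# Using off-the-shelf local optimizers to minimize a surrogate's
# predicted value; a toy quadratic stands in for the surrogate here.
import numpy as np
from scipy.optimize import minimize

def predicted_value(x):                    # stand-in surrogate prediction
    return np.sum((x - 0.7) ** 2)

x0 = np.zeros(3)                           # starting point of the local search
res_nm = minimize(predicted_value, x0, method="Nelder-Mead")   # derivative-free simplex
res_bf = minimize(predicted_value, x0, method="L-BFGS-B")      # quasi-Newton
```

<p>On a smooth unimodal function both converge to the same point; on a multimodal surrogate, the result depends on the starting point, which motivates the multi-start strategies discussed below in Section 6.3.4.</p>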
</sec>
<sec id="s6-3-2-3">
<title>6.3.2.3 Drawbacks of local optimization</title>
<p>Local optimization will only converge upon the local optima close to the initial point of search. When the surrogate model is highly non-linear and contains multiple local optima, it is difficult to find the global optimum of the acquisition function. This could be addressed by either running multiple instances of local optimization starting at random locations or using global optimization methods.</p>
</sec>
</sec>
<sec id="s6-3-3">
<title>6.3.3 Global optimization methods</title>
<p>Any of the optimization algorithms introduced in <xref ref-type="sec" rid="s5">section 5</xref> can optimize the acquisition function including genetic algorithm and differential evolution. Here we introduce two others that have been used in culture media optimization.</p>
<sec id="s6-3-3-1">
<title>6.3.3.1 Simulated annealing</title>
<p>Simulated annealing (SA) is a metaheuristic inspired by the annealing of materials cooled down from high temperatures: as the temperature is slowly lowered, the atoms settle into a configuration with lower internal energy. In the algorithm, the current solution is repeatedly perturbed to generate new random solutions. A new solution may be accepted with a probability computed from the resulting change in function value and a &#x2018;temperature&#x2019; parameter that slowly decreases over iterations. Because the temperature allows solutions with worse objective values to be accepted occasionally, the search avoids becoming trapped in local minima.</p>
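<p>A minimal sketch of the acceptance rule and cooling schedule described above, on a toy objective with assumed schedule parameters:</p>

```python
# Minimal simulated annealing sketch: accept worse moves with
# probability exp(-delta / T) while the temperature T decays.
import math
import random

def sa_minimize(f, x0, iters=5000, T0=1.0, cooling=0.999, step=0.5, seed=0):
    rnd = random.Random(seed)
    x, fx, T = x0, f(x0), T0
    best, fbest = x, fx
    for _ in range(iters):
        cand = [xi + rnd.gauss(0, step) for xi in x]   # random perturbation
        fc = f(cand)
        # Accept improvements always; worse moves with Boltzmann probability.
        if fc <= fx or rnd.random() < math.exp(-(fc - fx) / max(T, 1e-12)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        T *= cooling                                   # cooling schedule
    return best, fbest

best, fbest = sa_minimize(lambda v: sum(xi ** 2 for xi in v), [3.0, -2.0])
```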
<p>SA has been used to optimize the acquisition function when an ANN or RSM is used as the surrogate model (<xref ref-type="bibr" rid="B5">Aquino et al., 2016</xref>; <xref ref-type="bibr" rid="B87">Parkhey et al., 2017</xref>; <xref ref-type="bibr" rid="B23">Dhagat and Jujjavarapu, 2021</xref>).</p>
</sec>
<sec id="s6-3-3-2">
<title>6.3.3.2 DYCORS</title>
<p>DYCORS (Dynamic Coordinate Search using Response Surface Models) is an optimization algorithm for solving non-linear and non-convex problems. It is a variant of the coordinate search method that dynamically selects which coordinates to perturb while optimizing on a radial basis function response surface, which allows it to converge faster than traditional coordinate search methods.</p>
<p>The DYCORS algorithm starts with a feasible point and, at each iteration, generates trial points by perturbing a randomly selected subset of coordinates of the current best solution; the probability of perturbing each coordinate decreases as the search progresses. Trial points are screened on the response surface and the most promising ones are evaluated, and this process continues until a stopping criterion is met. Because the coordinate selection adapts dynamically as the algorithm progresses, it can exploit the structure of the problem and avoid getting stuck in poor local optima. DYCORS has been used to optimize on an RBFNN (<xref ref-type="bibr" rid="B19">Cosenza et al., 2021</xref>).</p>
</sec>
</sec>
<sec id="s6-3-4">
<title>6.3.4 Strategies for acquiring multiple candidates</title>
<p>When proposing a new batch of candidates, having a diversity of candidates helps to explore the solution space better and avoid duplicate testing of similar candidates. If factors are continuous values, candidates can be technically different but have arbitrarily small differences. Hence when proposing candidates using the acquisition function, several strategies may be employed to promote diversity.</p>
<p>Early stopping of the acquisition optimization algorithm can promote diversity for population-based metaheuristics; this is illustrated in <xref ref-type="fig" rid="F4">Figure 4</xref>. When metaheuristic algorithms are allowed to run for many iterations, the solution population will generally converge upon an optimum. By stopping early, diversity in the solution population may be preserved. Truncated GA is one example of this strategy.</p>
<fig id="F4" position="float">
<label>FIGURE 4</label>
<caption>
<p>Heatmap of the L1 distance, where 0 represents the same value. As the number of generations of the algorithm increases, the diversity of the candidate population decreases.</p>
</caption>
<graphic xlink:href="fbioe-11-1195294-g004.tif"/>
</fig>
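<p>The convergence of population diversity shown in Figure 4 can be reproduced with a toy GA. The sketch below is a generic genetic algorithm minimizing an illustrative sphere function, not the implementation behind the figure; the population size, mutation settings, and the choice of generations 5 and 50 for comparison are arbitrary assumptions.</p>

```python
import random

def mean_pairwise_l1(pop):
    """Average L1 distance between all pairs of candidates in the population."""
    n = len(pop)
    total = sum(sum(abs(a - b) for a, b in zip(p, q))
                for i, p in enumerate(pop) for q in pop[i + 1:])
    return total / (n * (n - 1) / 2)

def ga_generation(pop, fitness, rng, mut_rate=0.1, mut_sigma=0.3):
    """One generation: tournament selection, uniform crossover, mutation."""
    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) < fitness(b) else b
    children = []
    while len(children) < len(pop):
        p1, p2 = tournament(), tournament()
        child = [x if rng.random() < 0.5 else y for x, y in zip(p1, p2)]
        child = [x + rng.gauss(0, mut_sigma) if rng.random() < mut_rate else x
                 for x in child]
        children.append(child)
    return children

# Minimize a sphere function; record population diversity early and late
sphere = lambda v: sum(x * x for x in v)
rng = random.Random(0)
pop = [[rng.uniform(-5, 5) for _ in range(4)] for _ in range(20)]
diversity = []
for gen in range(60):
    if gen in (5, 50):
        diversity.append(mean_pairwise_l1(pop))
    pop = ga_generation(pop, sphere, rng)
early, late = diversity
```

<p>Stopping the run at an early generation, as with truncated GA, keeps the larger spread measured by the early diversity value.</p>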
<p>Multiple runs of the acquisition optimization algorithm from random initial positions allow different local minima to be found when using local optimization methods or non-population metaheuristics like SA. As local optimization methods generally converge upon an optimum close to the initial position, running the process multiple times from different randomly selected starting positions can find multiple local minima if they exist in the function. An example is shown in <xref ref-type="fig" rid="F5">Figure 5</xref>.</p>
<fig id="F5" position="float">
<label>FIGURE 5</label>
<caption>
<p>Illustration of how multiple runs from random initial positions work, shown on a contour plot of the Rastrigin function. Candidates from the initial positions converge towards the nearest final position, with a few of them converging towards the global optimum.</p>
</caption>
<graphic xlink:href="fbioe-11-1195294-g005.tif"/>
</fig>
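<p>The restart strategy can be sketched on the Rastrigin function used in Figure 5. Plain gradient descent stands in here for a generic local optimizer; the learning rate, step count, and the rounding used to cluster converged minima are illustrative assumptions.</p>

```python
import math
import random

def rastrigin(v):
    return 10 * len(v) + sum(x * x - 10 * math.cos(2 * math.pi * x) for x in v)

def grad(v):
    return [2 * x + 20 * math.pi * math.sin(2 * math.pi * x) for x in v]

def local_descent(x0, lr=0.001, steps=2000):
    """Plain gradient descent: converges to the local minimum nearest x0."""
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Multi-start: run the local optimizer from several random initial positions
rng = random.Random(0)
starts = [[rng.uniform(-2, 2), rng.uniform(-2, 2)] for _ in range(10)]
minima = [local_descent(s) for s in starts]
# Round to cluster numerically identical endpoints; distinct local minima remain
distinct = {tuple(round(x, 1) for x in m) for m in minima}
```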
<p>When EI is used as the acquisition function, another strategy for generating multiple candidates is the parallel efficient global optimization method with the <italic>q</italic>-EI criterion, which scores a batch of <italic>q</italic> &#x2208; &#x2115; points jointly, where <italic>q</italic> is the number of candidates. Because directly optimizing <italic>q</italic>-EI is computationally demanding, <xref ref-type="bibr" rid="B33">Ginsbourger et al. (2008)</xref> proposed two heuristic methods, Kriging Believer and Constant Liar, for constructing candidate designs; the <italic>q</italic>-EI criterion is subsequently used to choose between the two.</p>
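<p>A minimal sketch of the Constant Liar heuristic is shown below, assuming a scikit-learn Gaussian process as the Kriging model. The 1-D objective, candidate grid, and the use of the current best observation as the &#x2018;lie&#x2019; are illustrative assumptions rather than the exact setup of Ginsbourger et al. (2008).</p>

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expected_improvement(mu, sigma, y_best):
    """EI for minimization; larger values indicate more promising points."""
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def constant_liar_batch(X, y, candidates, q, lie=None):
    """Pick q points sequentially, each time pretending the last pick
    returned the 'lie' value (here the current best) and refitting the GP."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    lie = float(np.min(y)) if lie is None else lie
    batch = []
    for _ in range(q):
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                      alpha=1e-6, normalize_y=True).fit(X, y)
        mu, sigma = gp.predict(candidates, return_std=True)
        ei = expected_improvement(mu, sigma, np.min(y))
        best = int(np.argmax(ei))
        batch.append(candidates[best])
        # Append the lie so the next pick avoids re-selecting the same region
        X = np.vstack([X, candidates[best]])
        y = np.append(y, lie)
    return np.array(batch)

# Tiny 1-D illustration on f(x) = (x - 0.3)^2
rng = np.random.default_rng(0)
X0 = rng.uniform(0, 1, (6, 1))
y0 = ((X0 - 0.3) ** 2).ravel()
cand = np.linspace(0, 1, 101).reshape(-1, 1)
batch = constant_liar_batch(X0, y0, cand, q=3)
```

<p>Kriging Believer differs only in that the appended value is the GP prediction at the chosen point rather than a constant.</p>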
</sec>
</sec>
</sec>
<sec id="s7">
<title>7 Characteristics of culture media optimization problems</title>
<p>There are several key characteristics of culture media optimization problems. These should be taken into consideration when choosing the method of optimization and are what differentiate this problem from others.</p>
<p>Firstly, the number of iterations for optimization is limited. Cell culture experiments are time-consuming, so experimental groups cannot afford to optimize over a large number of iterations. Many works perform a one-step optimization, in which only an initial design of experiments is cultured, followed by a single batch of candidates proposed based on the initial DOE. The average number of iterations performed across all works was 2.73 (including one-step optimization) and 6.22 (excluding one-step optimization). In most benchmarking of black-box optimization algorithms, the number of iterations far exceeds what can be afforded for culture media optimization.</p>
<p>Because of this, optimization algorithms that are slow to improve or converge are not suitable. Based on studies that compare the two, a surrogate model-based approach may be more suitable, as its rate of improvement tends to be higher than that of metaheuristic algorithms in early iterations.</p>
<p>Secondly, the number of candidates is limited and depends on the nature of the cell culture and the available experimental equipment. For example, using the standard cell culture plate size of 96 wells with at least three replicates per candidate limits an experiment to a maximum of 32 candidates, not accounting for limitations like the edge effect (<xref ref-type="bibr" rid="B79">Mansoury et al., 2021</xref>). Of course, researchers need not be restricted to one plate and can also use other multiwell plates like 384-well or 1536-well plates; however, these come with their own limitations, such as format restrictions of the available equipment. The average batch size used in all the works surveyed was 27.01.</p>
<p>Given the limitations in both iterations and batch size, the number of variables that can be optimized within a given experimental budget while achieving sufficient statistical power is an important consideration in media optimization experiments. In all works surveyed that used a surrogate model, the average number of candidates tested in total was found to be 40.46, with 7.40 candidates tested per factor. The average number of factors used was 8.10 (including one-step optimization) and 14.44 (excluding one-step optimization).</p>
<p>For experiments that use a second-order polynomial RSM for optimization, a typical number of factors is 5, which corresponds to 21 model coefficients. Using the average number of candidates tested, this gives roughly 2 datapoints per coefficient for fitting, which is sufficient to achieve a power of approximately 0.8 with an R<sup>2</sup> of 0.5 and a significance level of 0.05. With 6 factors, however, achieving the same power would require about 50 candidates. It is important to consider the experimental budget and the need for sufficient statistical power when designing experiments to fit the response surface.</p>
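<p>The budget arithmetic above can be checked directly: a full second-order polynomial in <italic>d</italic> factors has (<italic>d</italic> + 1)(<italic>d</italic> + 2)/2 coefficients. The short sketch below recomputes the datapoints-per-coefficient ratio from the survey average of 40.46 candidates quoted above.</p>

```python
def quadratic_rsm_terms(d):
    """Number of coefficients in a full second-order polynomial RSM:
    1 intercept + d linear + d squared + C(d, 2) interaction terms."""
    return 1 + d + d + d * (d - 1) // 2

terms_5 = quadratic_rsm_terms(5)   # 5 factors -> 21 coefficients
ratio = 40.46 / terms_5            # roughly 2 datapoints per coefficient
terms_6 = quadratic_rsm_terms(6)   # 6 factors -> 28 coefficients
```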
<p>For non-parametric approaches or other machine learning models, validation sets can be used to estimate the accuracy of the model and determine whether the number of datapoints is sufficient. Empirical rules of thumb may also be used as guides to decide on the number of factors given a certain budget or, conversely, the number of experiments given a certain number of factors. For example, a common guideline is to have 10 times as many datapoints as factors. For metaheuristic algorithms, several studies (Boluf&#xe9;-R&#xf6;hler and Chen 2013; Chen et al., 2015) have looked at how population size affects performance at different dimensions. However, these have focused on metrics such as convergence time and closeness to optima after convergence, which are not entirely relevant to media optimization. Nevertheless, we may reference guidelines such as having a population size larger than, and preferably 10 times, the dimension (Boluf&#xe9;-R&#xf6;hler and Chen 2013; Mallipeddi and Suganthan 2008).</p>
<p>Lastly, another key factor to consider is the effect of passaging. Often, bioprocesses require cells to be passaged continually for a few rounds. After each passage, the characteristics of the cells change which likely results in a change in the true function of the optimization problem. These characteristics include key gene functions, morphology, proliferation rate, and expression levels (<xref ref-type="bibr" rid="B48">Hughes et al., 2007</xref>). Currently, none of the black-box optimization algorithms account for this. One interesting approach used by <xref ref-type="bibr" rid="B18">Cosenza et al. (2022)</xref> was to include information about cell count by incorporating &#x2018;low-fidelity&#x2019; information sources such as biochemical assays and &#x2018;high-fidelity&#x2019; information sources like cell proliferation rate over one passage with a Bayesian optimization tool, thereby including single-passage and multiple-passage information into culture media optimization problems and accounting for the change in cell characteristics.</p>
</sec>
<sec id="s8">
<title>8 Trends in existing literature</title>
<p>The use of optimization algorithms for culture media optimization was first reported in 1992 by Freyer et al. (Weuster-Botz and Wandrey, 1995), where GA was used to optimize media for formate dehydrogenase production in <italic>Candida boidinii</italic>. In the late 1990s and early 2000s, many researchers followed suit and started using GA for culture media optimization problems (<xref ref-type="bibr" rid="B120">Viennet et al., 1996</xref>; <xref ref-type="bibr" rid="B89">Patil et al., 2002</xref>; <xref ref-type="bibr" rid="B80">Marteijn et al., 2003</xref>; <xref ref-type="bibr" rid="B7">Bapat and Wangikar, 2004</xref>). Notably, <xref ref-type="bibr" rid="B16">Cockshott &#x26; Hartman (2001)</xref> were the first to implement PSO for fermentation media optimization; they hypothesized that PSO would outperform GA based on previous reports that PSO performs better for smaller population sizes and converges to the optimum faster than GA. SBO for culture media optimization was first described by Coleman et al. (2003), who used an ANN ensemble surrogate model to optimize <italic>Escherichia coli</italic> fermentation, building on previous studies in enology and other fermentation processes. Since then, many researchers have used optimization algorithms for culture media optimization.</p>
<p>Of all the research articles available on the topic, approximately 70% reported using an SBO approach, of which roughly 43% used ANN as the surrogate model, PV as the acquisition function, and GA as the acquisition method (SBO(ANN)-PV-GA). The reason for the popularity of this method is unclear; many papers do not cite their reasons for choosing a particular model over others. Most of the papers compare SBO(ANN)-PV-GA to either classical methods such as OFAT screening (<xref ref-type="bibr" rid="B21">Desai et al., 2006</xref>) or to the use of statistical RSM as a surrogate (<xref ref-type="bibr" rid="B22">Desai et al., 2008</xref>; <xref ref-type="bibr" rid="B85">Pal et al., 2009</xref>; <xref ref-type="bibr" rid="B8">Baskar and Renganathan, 2010</xref>; <xref ref-type="bibr" rid="B24">Du et al., 2012</xref>; <xref ref-type="bibr" rid="B37">Gurunathan, 2012</xref>). Some cited the ability of ANN to excel in pattern recognition and modeling nonlinear relationships (<xref ref-type="bibr" rid="B39">Haider et al., 2008</xref>; <xref ref-type="bibr" rid="B102">Singh et al., 2008</xref>; <xref ref-type="bibr" rid="B85">Pal et al., 2009</xref>; <xref ref-type="bibr" rid="B24">Du et al., 2012</xref>; <xref ref-type="bibr" rid="B90">Peng et al., 2014</xref>) and its ability to work well with a small number of candidates (<xref ref-type="bibr" rid="B22">Desai et al., 2008</xref>) and noisy data (<xref ref-type="bibr" rid="B88">Pathak et al., 2015</xref>) as reasons why ANN models should be developed for biological systems. The next most popular approaches adopted were an SBO with RSM as the surrogate model, PV as the acquisition function, and GA as the acquisition method (SBO(RSM)-PV-GA); and direct optimization with GA. 
The papers that have compared SBO(ANN)-PV-GA and SBO(RSM)-PV-GA have usually concluded that ANN is a better model (<xref ref-type="bibr" rid="B75">Liu et al., 2009</xref>; <xref ref-type="bibr" rid="B61">Khaouane et al., 2012</xref>; <xref ref-type="bibr" rid="B59">Katla et al., 2019</xref>; <xref ref-type="bibr" rid="B99">Selvaraj et al., 2019</xref>; <xref ref-type="bibr" rid="B113">Suryawanshi et al., 2019</xref>) for media optimization problems, likely due to the nonlinear nature of the objective function.</p>
<p>Some less commonly used methods include neuro-fuzzy networks as a surrogate model (<xref ref-type="bibr" rid="B5">Aquino et al., 2016</xref>), an elastic net regularized general linear model as a surrogate model (<xref ref-type="bibr" rid="B35">Grzesik and Warth, 2021</xref>), and ensemble modeling (<xref ref-type="bibr" rid="B76">Liu and Gunawan, 2017</xref>). Interestingly, most researchers opted for parametric approaches over non-parametric approaches like GP or SVM, even though nonparametric methods are known to outperform parametric methods for high-dimensional and nonlinear data (<xref ref-type="bibr" rid="B73">Liu et al., 2020</xref>). <xref ref-type="fig" rid="F6">Figure 6</xref> shows the trend in methods used over the years.</p>
<fig id="F6" position="float">
<label>FIGURE 6</label>
<caption>
<p>Trend in methods used over the years.</p>
</caption>
<graphic xlink:href="fbioe-11-1195294-g006.tif"/>
</fig>
<p>As shown in <xref ref-type="fig" rid="F7">Figure 7</xref>, there is also an overrepresentation of bacteria and fungi as the organisms of choice in media optimization studies, driven largely by the interest in optimizing fermentation processes in yeast strains. Despite rising interest in cultured meat production and the importance of antibody production, mammalian cells have not been studied extensively in media optimization research.</p>
<fig id="F7" position="float">
<label>FIGURE 7</label>
<caption>
<p>Distribution of types (kingdom) of organisms studied.</p>
</caption>
<graphic xlink:href="fbioe-11-1195294-g007.tif"/>
</fig>
</sec>
<sec id="s9">
<title>9 Common challenges</title>
<sec id="s9-1">
<title>9.1 Noise in experiment results</title>
<p>It is inevitable that noise exists in biological experiment readouts. Noisy experimental results are detrimental to optimization because they hamper the ability of the algorithm to converge and reduce its speed of convergence. For example, in metaheuristic algorithms, the next generation of candidates is typically generated from high-performing candidates. With noise, apparently high-performing candidates may be spurious and perform poorly on average when tested repeatedly. This leads to poor candidates being generated for the next iteration and reduces the rate of improvement. Similarly, for surrogate models, noisy training data may result in a surrogate model that predicts the true (noise-free) function value poorly.</p>
<p>Measures to reduce noise can be implemented in both the experimental and algorithmic design. Biological replicates can be used for each media formulation and the response values can be averaged across replicates, reducing variability in experimental results and providing the optimization algorithm with less noisy response values.</p>
<p>In terms of algorithm choice, many studies (<xref ref-type="bibr" rid="B94">Rakshit et al., 2017</xref>; <xref ref-type="bibr" rid="B112">Sudholt, 2021</xref>) have looked at how noise affects the performance of various methods. Some metaheuristics algorithms are more sensitive to noise, for example, <xref ref-type="bibr" rid="B65">Krink et al. (2004)</xref> found that standard DE performance degrades more than other metaheuristics on noisy functions.</p>
<p>For GPs, modeling with noise can incorporate the variability between replicates into the modeling process. The noise associated with the response may be homoscedastic, if the noise variances are similar across observations, or heteroscedastic, if they vary across observations. If the response-measuring equipment is known to have a consistent range of uncertainty across the range of the response, then modeling with homoscedastic noise suffices. Otherwise, the noise variance can be estimated separately from the replicates of each candidate and heteroscedastic noise incorporated into the GP model.</p>
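<p>As a concrete illustration, scikit-learn allows per-observation noise variances, estimated from the replicates of each candidate, to be passed to a Gaussian process through its <italic>alpha</italic> parameter, while a single homoscedastic noise level can instead be fitted with a WhiteKernel. The replicate readouts below are simulated for illustration.</p>

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Simulated replicate readouts: 3 replicates per media candidate
rng = np.random.default_rng(0)
X = np.linspace(0, 1, 8).reshape(-1, 1)
replicates = np.sin(2 * np.pi * X) + rng.normal(0, 0.1, (8, 3))

y_mean = replicates.mean(axis=1)
# Variance of the mean, estimated separately for each candidate
noise_var = replicates.var(axis=1, ddof=1) / replicates.shape[1]

# Homoscedastic: a single fitted noise level via WhiteKernel
gp_homo = GaussianProcessRegressor(
    kernel=RBF(length_scale=0.2) + WhiteKernel(noise_level=0.01)
).fit(X, y_mean)

# Heteroscedastic: per-observation noise variances passed through `alpha`
gp_hetero = GaussianProcessRegressor(
    kernel=RBF(length_scale=0.2), alpha=noise_var
).fit(X, y_mean)

pred = gp_hetero.predict(np.array([[0.25]]))
```

<p>If the measurement uncertainty is consistent across the response range, the homoscedastic variant suffices, mirroring the distinction drawn above.</p>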
</sec>
<sec id="s9-2">
<title>9.2 Exploration and exploitation in proposing candidates</title>
<p>The trade-off between exploration and exploitation is a common challenge in optimization. Excessive exploration leads to slow convergence, while excessive exploitation increases the probability of being trapped in local optima. The design of most optimization algorithms implicitly tries to balance these two needs. For example, in genetic algorithms, the selection and cross-over steps are designed to exploit by proposing new candidates that are similar to previous high-fitness candidates, while the mutation step is designed for exploration by allowing random changes in the candidates to increase diversity in the population. It would be helpful for practitioners to understand which hyperparameters of their algorithm of choice control the exploration-exploitation tradeoff and adjust them accordingly.</p>
<p>For direct optimization methods, more emphasis on exploitation may be helpful given the limits of experimental capacity. A fast improvement on local optima to obtain &#x201c;good enough&#x201d; solutions could be sufficient for certain experiments.</p>
<p>When optimizing the acquisition function of the surrogate model, which occurs <italic>in silico</italic>, it may be advisable to allow more exploration to find the global optima of the acquisition function, since the number of iterations is not limited. Note that this does not mean that the candidates should concentrate around the global optima of the acquisition function, as this is in fact excessive <italic>exploitation</italic> with respect to the actual function being optimized. As discussed in <xref ref-type="sec" rid="s6-3-4">Section 6.3.4</xref>, diversity should still be preserved, both because the surrogate model is unlikely to be accurate at the start and to avoid very similar candidates.</p>
</sec>
</sec>
<sec id="s10">
<title>10 New directions in research</title>
<p>With the expansion of the cultured meat industry and the increasing need for therapeutic antibody discovery, media optimization in more mammalian cell contexts would be beneficial. Optimizing mammalian cell culture media poses a more complex challenge than microbial culture media due to the higher number of components. This means more, and potentially higher-order, interactions between components, leading to a much more complex objective function. An increased number of interactions would also complicate the screening process: the methods currently implemented may not effectively identify the most important components when the number and complexity of interactions between media components increase.</p>
<p>In most studies, researchers have used a single measure of the activity of the enzyme of interest, or a measurement representative of the protein expression level, such as fluorescence from GFP expression. However, these measurements may not always accurately indicate the actual protein yield in the cells. A related problem is the use of lower-fidelity measurements to reduce experimental burden, complemented by fewer high-fidelity but expensive or time-consuming measurements. In such cases, the ability to synthesize multiple information sources related to the outcome is useful. <xref ref-type="bibr" rid="B18">Cosenza et al. (2022)</xref> introduced multi-information source Bayesian optimization, where information from multiple assays was combined to measure cell growth. This is achieved with a modification to the kernel of the GP: an additional Gaussian kernel accounts for the deviation of the lower-fidelity measurement from the true function value (assumed to equal the high-fidelity measurement). The allocation of each batch of candidates between high- and low-fidelity measurement is decided through the combination that produces the highest multi-point expected improvement.</p>
<p>Another possible addition to media optimization problems would be multi-objective optimization. Optimization problems often involve multiple, sometimes conflicting objectives, such as maximizing cell growth while minimizing cost, which can be optimized simultaneously. Examples include maximizing metabolic activity (<xref ref-type="bibr" rid="B41">Havel et al., 2006</xref>) and maximizing cell count (<xref ref-type="bibr" rid="B63">Kim and Audet, 2019</xref>). Multi-objective optimization would allow the prediction of a more universal solution set from which solutions can be chosen depending on the unique constraints or subjective desired outcomes. For more information on multi-objective optimization, we refer interested readers to (Collette and Siarry 2004).</p>
<p>Lastly, there is also a lack of use of newly developed algorithms in culture media optimization research. Recently, <xref ref-type="bibr" rid="B127">Yoshida et al. (2022)</xref> and <xref ref-type="bibr" rid="B114">Tachibana et al. (2021)</xref> used deep neural networks (DNNs) with 4 hidden layers for fermentation media optimization and found that they perform better than other machine learning models. Although deep learning has gained popularity in many other applications, it remains to be seen whether DNNs will improve outcomes in media optimization applications, considering the limited number of studies. Other advanced optimization methods worth implementing for media optimization include deep kernel regression as a surrogate model (<xref ref-type="bibr" rid="B123">Wilson et al., 2015</xref>) and derivative-free reinforcement learning methods like neuroevolution of augmenting topologies (NEAT) (<xref ref-type="bibr" rid="B93">Qian and Yu, 2021</xref>). In this review, we do not exhaustively expand upon or analyze methods not currently found in the culture media optimization literature.</p>
</sec>
<sec id="s11">
<title>11 Comparison of methods with simulation experiment</title>
<p>As described in <xref ref-type="sec" rid="s2-3">Section 2.3</xref>, simulation experiments were done to compare the available methods for media optimization problems. To compare the algorithms&#x2019; performance across all functions, an average performance score is defined as follows:<disp-formula id="equ3">
<mml:math id="m8">
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>o</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>m</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>l</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>s</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>d</mml:mi>
<mml:mtext>&#x2009;</mml:mtext>
<mml:mi>f</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>o</mml:mi>
<mml:mi>n</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>l</mml:mi>
<mml:mtext>&#x2009;</mml:mtext>
<mml:mi>v</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>l</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>e</mml:mi>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mi>y</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>n</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>l</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>&#x2212;</mml:mo>
<mml:msub>
<mml:mi>y</mml:mi>
<mml:mrow>
<mml:mi>f</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>n</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>l</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>y</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>n</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>l</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>&#x2212;</mml:mo>
<mml:mi>G</mml:mi>
<mml:mi>l</mml:mi>
<mml:mi>o</mml:mi>
<mml:mi>b</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>l</mml:mi>
<mml:mtext>&#x2009;</mml:mtext>
<mml:mi>o</mml:mi>
<mml:mi>p</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>m</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>m</mml:mi>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
</disp-formula>where y<sub>initial</sub> refers to the functional value at the first iteration; y<sub>final</sub> refers to the functional value at the last iteration and the global optimum is the near-optimal solution found by running DE for 1000 iterations. The average performance score for an algorithm across all different functions can be defined as the mean of the scores on individual functions.</p>
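<p>In code, the score is a direct transcription of the formula above (the example numbers are arbitrary):</p>

```python
def normalized_value(y_initial, y_final, global_optimum):
    """Normalized functional value: 0 means no improvement over the first
    iteration, 1 means the global optimum was reached."""
    return (y_initial - y_final) / (y_initial - global_optimum)

# E.g. a run starting at 120, ending at 30, with a near-optimal value of 20
score = normalized_value(120.0, 30.0, 20.0)  # (120 - 30) / (120 - 20) = 0.9
```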
<p>The following methods were compared throughout our simulation experiments:</p>
<p>Metaheuristic methods.<list list-type="simple">
<list-item>
<p>&#x2022; Genetic Algorithm (GA)</p>
</list-item>
<list-item>
<p>&#x2022; Differential Evolution (DEbest1bin)</p>
</list-item>
<list-item>
<p>&#x2022; Particle Swarm Optimization (PSO)</p>
</list-item>
</list>
</p>
<p>Various SBOs with differing surrogate models, with PV as the acquisition function and GA as the acquisition method.<list list-type="simple">
<list-item>
<p>&#x2022; Second-order polynomial (SBO(2OP)-GA-PV)</p>
</list-item>
<list-item>
<p>&#x2022; Kriging (SBO(KRG)-GA-PV)</p>
</list-item>
<list-item>
<p>&#x2022; Multi-Layer Perceptron (SBO(MLP)-GA-PV)</p>
</list-item>
<list-item>
<p>&#x2022; Support Vector Machine (SBO(SVR)-GA-PV)</p>
</list-item>
</list>
</p>
<p>Kriging SBO with a different acquisition function.<list list-type="simple">
<list-item>
<p>&#x2022; EI (SBO(KRG)-GA-EI)</p>
</list-item>
</list>
</p>
<p>Kriging SBO with different acquisition methods, with PV as the acquisition function.<list list-type="simple">
<list-item>
<p>&#x2022; Truncated GA (maximum number of generations &#x3d; 10) (SBO(KRG)-truncGA-PV)</p>
</list-item>
<list-item>
<p>&#x2022; Truncated DE (maximum number of generations &#x3d; 10) (SBO(KRG)-truncDE-PV)</p>
</list-item>
<list-item>
<p>&#x2022; L-BFGS (SBO(KRG)-L-BFGS-B-PV)</p>
</list-item>
</list>
</p>
<p>These models were chosen as they were the most popular methods used in the field of culture media optimization. Kriging was chosen as the model for comparison against other infill strategies because, according to our preliminary data, it was the best-performing surrogate model. As explained in <xref ref-type="sec" rid="s2-3">Section 2.3</xref>, the experiments were conducted at three levels of dimensionality to represent the varying complexities of different types of culture media. The performance of the methods was also compared in experiments with noise to evaluate its effect on the various methods.</p>
<p>For the low dimension experiment (dim &#x3d; 5), a comparison of three different DOE methods, LHS, CCD and BBD, was also done to understand how each DOE affects the performance of the methods. BBD was supplemented with LHS to ensure an equal number of candidates across the DOEs. This comparison was only done for the low dimension experiment, as the number of candidates would be too high and infeasible to replicate experimentally at higher dimensions. BBD was found to be the best performing DOE (<xref ref-type="sec" rid="s17">Supplementary Table S2</xref>).</p>
<p>
<xref ref-type="fig" rid="F8">Figure 8</xref> shows the performance of the various methods in each respective context. In the &#x2018;noiseless&#x2019; experiments, some variation exists in output values across the replicates due to the stochastic nature of the methods. The relative performance of the methods does not differ significantly across the different DOEs in the low dimension experiments (data not shown). According to <xref ref-type="fig" rid="F8">Figure 8A</xref>, in the low dimension experiments without additive noise, Kriging with truncated DE as the acquisition method and PV as the acquisition function is the best method across all iterations.</p>
<fig id="F8" position="float">
<label>FIGURE 8</label>
<caption>
<p>Performance of the methods in each context, represented by average normalized functional value over each iteration. Bands represent the 95% confidence interval. Unless otherwise stated, the DOE used is LHS. <bold>(A)</bold> Low dimension experiment results, with BBD as the DOE and without additive noise <bold>(B)</bold> Low dimension experiment results, with BBD as the DOE and additive noise <bold>(C)</bold> Medium dimension experiment results without additive noise <bold>(D)</bold> Medium dimension experiment results with additive noise <bold>(E)</bold> High dimension experiment results without additive noise <bold>(F)</bold> High dimension experiment results with additive noise.</p>
</caption>
<graphic xlink:href="fbioe-11-1195294-g008.tif"/>
</fig>
<p>In the experiments with added noise, the performance of all the methods deteriorates significantly. However, SBO(KRG)-truncDE-PV remains the best performing method across all iterations (<xref ref-type="fig" rid="F8">Figure 8B</xref>), with PSO a close second at higher iterations. According to <xref ref-type="fig" rid="F8">Figure 8B</xref>, methods such as SBO(2OP)-GA-PV and SBO(SVR)-GA-PV perform comparatively well until iteration 4, which is congruent with the many papers that report success with RSM methods after just one iteration (<xref ref-type="bibr" rid="B75">Liu et al., 2009</xref>; <xref ref-type="bibr" rid="B61">Khaouane et al., 2012</xref>; <xref ref-type="bibr" rid="B87">Parkhey et al., 2017</xref>). Another popular method, SBO(MLP)-GA-PV, however, performed worse than the others. The SBO with a local optimizer as the acquisition method (SBO(KRG)-L-BFGS-B-PV) performed the worst consistently across all iterations, and its performance degrades significantly in the presence of noise (<xref ref-type="fig" rid="F8">Figures 8A, B</xref>).</p>
<p>
<xref ref-type="fig" rid="F8">Figure 8C</xref> shows the results for the medium dimension (dim &#x3d; 20) experiments without additive noise. Interestingly, in this case SBO(KRG)-truncGA-PV performs better than SBO(KRG)-truncDE-PV at certain iterations. These two methods perform significantly better than the other methods, while the performance of PSO, DE, SBO(KRG)-L-BFGS-B, SBO(KRG)-GA-PV and SBO(KRG)-GA-EI is comparable across iterations.</p>
<p>Similar to the low dimension experiments, the performance of all the methods deteriorates with the addition of noise. The performance of SBO(KRG)-truncGA-PV, SBO(KRG)-truncDE-PV and PSO are comparable from iteration 4 onwards, with significantly better performance as compared to the other methods. Notably, the performance of SBO(2OP)-GA-PV is much lower compared to its performance in the low dimension experiment.</p>
<p>The results of the high dimension (dim &#x3d; 40) experiments without additive noise (<xref ref-type="fig" rid="F8">Figure 8E</xref>) are very similar to that of <xref ref-type="fig" rid="F8">Figure 8C</xref>, barring a significant deterioration in the performance of SBO(SVR)-GA-PV.</p>
<p>In the high dimension experiments with additive noise, SBO(KRG)-truncDE-PV is clearly the best choice over almost all iterations, with PSO as a feasible substitute.</p>
<p>Overall, the SBO methods with truncated acquisition methods performed best across all the contexts. This was likely due to the conserved diversity in the proposed candidates, which prevented convergence to a local optimum and thus allowed the method to find a near-optimal solution faster. PSO also fared relatively well across the experiments, even with additive noise. Notably, the hypothesis that SVM would perform well with small datasets has generally held true. It is also surprising that SBO(KRG)-GA-EI did not perform better than its PV counterpart across all experiments, as EI helps to balance exploration of the input space with exploitation. This could be because the parameters set for GA already strike a good balance between exploration and exploitation, such that the use of EI as an acquisition function disrupted this balance, resulting in too much exploration and thus wasting resources on solutions that are less likely to be near-optimal. ANN did not seem to perform well across the experiments. The popularity of ANNs and their reported improved performance over RSM in many media optimization studies is therefore surprising. This could be because most such studies conduct a one-step optimization, which is not sufficient to determine performance at higher numbers of iterations. Another reason could be the lack of hyperparameter tuning in the implementation of the MLP, or other differences in implementation.</p>
<p>
<xref ref-type="table" rid="T1">Table 1</xref> shows a compilation of all the methods used in the culture media optimization literature and their corresponding rankings according to the normalized values reached at the 10th iteration, together with the number of iterations needed to achieve the first-iteration normalized value reached by SBO(KRG)-truncDE-PV, averaged across all noisy experiments. Few papers have used the best-performing methods we have found, meaning that the use of these methods could yield better improvements than those reported.</p>
<table-wrap id="T1" position="float">
<label>TABLE 1</label>
<caption>
<p>Compilation of all the methods used in culture media optimization. Rankings are listed for methods evaluated in this paper, based on final functional values across experiments with additive noise, ranked 1&#x2013;10 from best to worst. Iterations to reach set value is the number of iterations each method took to reach the normalized value achieved in the first iteration by the best-performing method, SBO(KRG)-truncDE-PV.</p>
</caption>
<table>
<thead valign="top">
<tr>
<th align="left">Type</th>
<th align="left">Surrogate Model/Metaheuristic</th>
<th align="left">Acquisition method</th>
<th align="left">References</th>
<th colspan="2" align="left">Count</th>
<th colspan="2" align="left">Ranking</th>
<th align="left">Iterations to reach set value</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td align="left">Metaheuristic</td>
<td align="left">GA</td>
<td align="left">-</td>
<td align="left">
<xref ref-type="bibr" rid="B31">Garlapati et al., 2010</xref>; <xref ref-type="bibr" rid="B89">Patil et al., 2002</xref>; <xref ref-type="bibr" rid="B80">Marteijn et al., 2003</xref>; <xref ref-type="bibr" rid="B7">Bapat and Wangikar, 2004</xref>; <xref ref-type="bibr" rid="B122">Weuster-Botz and Wandrey, 1995</xref>; <xref ref-type="bibr" rid="B46">Hofer et al., 2004</xref>; <xref ref-type="bibr" rid="B49">Hutwimmer et al., 2008a</xref> (2008b); <xref ref-type="bibr" rid="B98">Sarma et al., 2009</xref>; <xref ref-type="bibr" rid="B67">Kucharzyk et al., 2012</xref>; <xref ref-type="bibr" rid="B116">Ti&#x161;ma et al., 2012</xref>; <xref ref-type="bibr" rid="B14">Chauhan et al., 2013</xref>; <xref ref-type="bibr" rid="B106">Singha and Panda, 2014</xref>; <xref ref-type="bibr" rid="B13">Camacho-Rodr&#xed;guez et al., 2015</xref>; <xref ref-type="bibr" rid="B84">Munroe et al., 2019</xref>; <xref ref-type="bibr" rid="B11">Brinc and Beli&#x10d;, 2019</xref>; <xref ref-type="bibr" rid="B45">Heylen et al., 2006</xref>
</td>
<td colspan="2" align="right">16</td>
<td colspan="2" align="right">8</td>
<td align="right">7</td>
</tr>
<tr>
<td align="left">Metaheuristic</td>
<td align="left">PSO</td>
<td align="left">-</td>
<td align="left">
<xref ref-type="bibr" rid="B16">Cockshott and Hartman, 2001</xref>; <xref ref-type="bibr" rid="B47">Huang et al., 2007</xref>; <xref ref-type="bibr" rid="B31">Garlapati et al., 2010</xref>
</td>
<td colspan="2" align="right">3</td>
<td colspan="2" align="right">3</td>
<td align="right">2</td>
</tr>
<tr>
<td align="left">Metaheuristic</td>
<td align="left">Multi-objective GA</td>
<td align="left">-</td>
<td align="left">(<xref ref-type="bibr" rid="B41">Havel et al., 2006</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">Metaheuristic</td>
<td align="left">SA</td>
<td align="left">-</td>
<td align="left">(<xref ref-type="bibr" rid="B23">Dhagat and Jujjavarapu, 2021</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">Metaheuristic</td>
<td align="left">DE</td>
<td align="left">-</td>
<td align="left">(<xref ref-type="bibr" rid="B63">Kim and Audet, 2019</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">5</td>
<td align="right">3</td>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">GP</td>
<td align="left">Local optimizer</td>
<td align="left">(<xref ref-type="bibr" rid="B18">Cosenza et al., 2022</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">10</td>
<td align="right">&#x3e;10</td>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">ANN</td>
<td align="left">GA</td>
<td align="left">(<xref ref-type="bibr" rid="B21">Desai et al., 2006</xref>; <xref ref-type="bibr" rid="B22">Desai et al., 2008</xref>; <xref ref-type="bibr" rid="B85">Pal et al., 2009</xref>; <xref ref-type="bibr" rid="B8">Baskar and Renganathan, 2010</xref>; <xref ref-type="bibr" rid="B24">Du et al., 2012</xref>; <xref ref-type="bibr" rid="B39">Haider et al., 2008</xref>; <xref ref-type="bibr" rid="B102">Singh et al., 2008</xref>; <xref ref-type="bibr" rid="B90">Peng et al., 2014</xref>; <xref ref-type="bibr" rid="B88">Pathak et al., 2015</xref>; <xref ref-type="bibr" rid="B113">Suryawanshi et al., 2019</xref>; <xref ref-type="bibr" rid="B99">Selvaraj et al., 2019</xref>; <xref ref-type="bibr" rid="B42">He et al., 2008</xref>; <xref ref-type="bibr" rid="B111">Subba Rao et al., 2008</xref>; <xref ref-type="bibr" rid="B104">Singh et al., 2009</xref>; <xref ref-type="bibr" rid="B36">Guo et al., 2010</xref>; <xref ref-type="bibr" rid="B38">Gurunathan and Sahadevan, 2011</xref>; <xref ref-type="bibr" rid="B55">Kana et al., 2012</xref>; <xref ref-type="bibr" rid="B8">Baskar and Renganathan, 2010</xref>; <xref ref-type="bibr" rid="B55">Kana et al., 2012</xref>; <xref ref-type="bibr" rid="B95">Rekha et al., 2013</xref>; <xref ref-type="bibr" rid="B130">Zhou et al., 2015</xref>; <xref ref-type="bibr" rid="B121">Wei et al., 2017</xref>; <xref ref-type="bibr" rid="B69">Kumar et al., 2017</xref>; <xref ref-type="bibr" rid="B86">Pandey et al., 2018</xref>; <xref ref-type="bibr" rid="B59">Katla et al., 2019</xref>; <xref ref-type="bibr" rid="B54">Joji et al., 2019</xref>; <xref ref-type="bibr" rid="B91">Prabhu et al., 2020</xref>; <xref ref-type="bibr" rid="B129">Zhang et al., 2020</xref>; Imandi et al. n.d.)</td>
<td colspan="2" align="right">29</td>
<td colspan="2" align="right">9</td>
<td align="right">&#x3e;10</td>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">ANN</td>
<td align="left">PSO</td>
<td align="left">(<xref ref-type="bibr" rid="B61">Khaouane et al., 2012</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">ANN</td>
<td align="left">hybrid GA</td>
<td align="left">(<xref ref-type="bibr" rid="B17">Coleman et al., 2003</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">ANN</td>
<td align="left">Local optimizer</td>
<td align="left">(<xref ref-type="bibr" rid="B117">Tripathi et al., 2012</xref>; <xref ref-type="bibr" rid="B23">Dhagat and Jujjavarapu, 2021</xref>)</td>
<td colspan="2" align="right">2</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">RSM</td>
<td align="left">GA</td>
<td align="left">(<xref ref-type="bibr" rid="B87">Parkhey et al., 2017</xref>; <xref ref-type="bibr" rid="B99">Selvaraj et al., 2019</xref>; <xref ref-type="bibr" rid="B69">Kumar et al., 2017</xref>; <xref ref-type="bibr" rid="B78">Maiti et al., 2011</xref>; <xref ref-type="bibr" rid="B82">Moorthy and Baskar, 2013</xref>; <xref ref-type="bibr" rid="B105">Singh and Srivastava, 2013</xref>; <xref ref-type="bibr" rid="B58">Kanimozhi et al., 2017</xref>; <xref ref-type="bibr" rid="B101">Shirodkar and Muraleedharan, 2017</xref>; <xref ref-type="bibr" rid="B108">Srivastava et al., 2018</xref>)</td>
<td colspan="2" align="right">9</td>
<td colspan="2" align="right">7</td>
<td align="right">&#x3e;10</td>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">RSM</td>
<td align="left">DE</td>
<td align="left">(<xref ref-type="bibr" rid="B25">Eswari et al., 2013</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">RSM</td>
<td align="left">PSO</td>
<td align="left">(<xref ref-type="bibr" rid="B61">Khaouane et al., 2012</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">RSM</td>
<td align="left">SA</td>
<td align="left">(<xref ref-type="bibr" rid="B87">Parkhey et al., 2017</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">RSM</td>
<td align="left">Analytical Solution</td>
<td align="left">(<xref ref-type="bibr" rid="B3">Abdel-Fattah et al., 2007</xref>; <xref ref-type="bibr" rid="B2">Abdel-Fattah, 2009</xref>; <xref ref-type="bibr" rid="B12">Burrows et al., 2009</xref>; <xref ref-type="bibr" rid="B1">Abbasi et al., 2013</xref>; <xref ref-type="bibr" rid="B26">Farag et al., 2018</xref>; <xref ref-type="bibr" rid="B59">Katla et al., 2019</xref>)</td>
<td colspan="2" align="right">6</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">RSM</td>
<td align="left">Q2</td>
<td align="left">(<xref ref-type="bibr" rid="B12">Burrows et al., 2009</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">RSM</td>
<td align="left">Multi-objective GA</td>
<td align="left">(<xref ref-type="bibr" rid="B68">Kumar et al., 2015</xref>; <xref ref-type="bibr" rid="B118">Unuofin et al., 2019</xref>)</td>
<td colspan="2" align="right">2</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">SVM</td>
<td align="left">GA</td>
<td align="left">(<xref ref-type="bibr" rid="B125">Xu et al., 2014</xref>; <xref ref-type="bibr" rid="B44">Hesami and Jones, 2021</xref>)</td>
<td colspan="2" align="right">2</td>
<td colspan="2" align="right">6</td>
<td align="right">2</td>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">Random Forests</td>
<td align="left">Exhaustive grid search, k-means</td>
<td align="left">(<xref ref-type="bibr" rid="B35">Grzesik and Warth, 2021</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">RBFNN</td>
<td align="left">GA</td>
<td align="left">(<xref ref-type="bibr" rid="B115">Tian et al., 2013</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">RBFNN</td>
<td align="left">PSO</td>
<td align="left">(<xref ref-type="bibr" rid="B75">Liu et al., 2009</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">RBFNN</td>
<td align="left">truncated GA</td>
<td align="left">(<xref ref-type="bibr" rid="B128">Zhang and Block, 2009</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">RBFNN</td>
<td align="left">truncated GA &#x2b; DYCORS</td>
<td align="left">(<xref ref-type="bibr" rid="B19">Cosenza et al., 2021</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">Neuro-fuzzy networks</td>
<td align="left">SA</td>
<td align="left">(<xref ref-type="bibr" rid="B5">Aquino et al., 2016</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">Bayesian-regularized NN</td>
<td align="left">Batch relative information gain</td>
<td align="left">(<xref ref-type="bibr" rid="B128">Zhang and Block, 2009</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">Bayesian-regularized NN</td>
<td align="left">PSO</td>
<td align="left">(<xref ref-type="bibr" rid="B61">Khaouane et al., 2012</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">Elastic net regularized general linear models</td>
<td align="left">Exhaustive grid search, k-means</td>
<td align="left">(<xref ref-type="bibr" rid="B35">Grzesik and Warth, 2021</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">Non-parametric Regression with Gaussian Kernel</td>
<td align="left">Ranking pseudo-R-squared</td>
<td align="left">(<xref ref-type="bibr" rid="B131">Zou et al., 2020</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">GRNN</td>
<td align="left">Fruit fly optimization</td>
<td align="left">(<xref ref-type="bibr" rid="B97">Salehi et al., 2021</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">GRNN</td>
<td align="left">GA</td>
<td align="left">(<xref ref-type="bibr" rid="B52">Jafari et al., 2022</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">Adaptive neuro-fuzzy inference system</td>
<td align="left">GA</td>
<td align="left">(<xref ref-type="bibr" rid="B27">Farhadi et al., 2020</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">GP</td>
<td align="left">Markov Chain Monte Carlo</td>
<td align="left">(<xref ref-type="bibr" rid="B83">Morschett et al., 2017</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">GP</td>
<td align="left">Batch Contextual Local Penalization</td>
<td align="left">(<xref ref-type="bibr" rid="B57">Kanda et al., 2022</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">GP</td>
<td align="left">Full factorial in new expanded region</td>
<td align="left">(<xref ref-type="bibr" rid="B29">Freier et al., 2016</xref>)</td>
<td colspan="2" align="right">1</td>
<td colspan="2" align="right">-</td>
<td align="left"/>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">GP</td>
<td align="left">GA</td>
<td align="left"/>
<td colspan="2" align="right">0</td>
<td colspan="2" align="right">4</td>
<td align="right">2</td>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">GP</td>
<td align="left">Truncated GA</td>
<td align="left"/>
<td colspan="2" align="right">0</td>
<td colspan="2" align="right">2</td>
<td align="right">2</td>
</tr>
<tr>
<td align="left">SBO</td>
<td align="left">GP</td>
<td align="left">Truncated DE</td>
<td align="left"/>
<td colspan="2" align="right">0</td>
<td colspan="2" align="right">1</td>
<td align="right">1</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Based on the results of the simulation experiments, we recommend Kriging-based SBO combined with a truncated metaheuristic acquisition method as a highly competitive optimization approach that would likely meet the needs of most culture media optimization experiments.</p>
<p>While the simulation experiments conducted were extensive, they were not exhaustive and focused on the methods shown to yield promising results. For example, methods such as variable neighborhood search, simulated annealing, and a variety of direct search methods like mesh adaptive direct search have not been evaluated in the culture media optimization context. Given that PSO performs well as a metaheuristic, it would also be interesting to know whether using it as an acquisition method would improve on the current best method, SBO(KRG)-PV, or benefit other surrogate models more generally. It would also be useful to verify whether truncated acquisition methods improve the performance of surrogate models other than Kriging. Furthermore, the number of generations for truncated acquisition methods could itself be optimized so that the diversity of the suggested candidates is maintained while converging faster to the global optimum. Lastly, while the BBOB test suite is a suitable alternative to time-consuming experimental validation, these results should ultimately be confirmed experimentally.</p>
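<p>The truncated acquisition idea, including the role of the generation count discussed above, can be sketched as follows. This is a minimal illustration under our own assumptions (a DE/rand/1/bin scheme maximizing an arbitrary surrogate callable), not the code used in the benchmarks: DE is run over the surrogate for only a few generations, and the entire final population, still diverse because the search is cut short, is returned as the next batch of media candidates to assay.</p>

```python
import numpy as np

def truncated_de(surrogate, bounds, pop_size=10, generations=3,
                 f=0.8, cr=0.9, rng=None):
    """Run DE on a surrogate for a small, fixed number of generations and
    return the whole final population as the next experimental batch."""
    rng = np.random.default_rng(rng)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = lo.size
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([surrogate(x) for x in pop])
    for _ in range(generations):              # truncation: stop while diverse
        for i in range(pop_size):
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, 3, replace=False)]
            mutant = np.clip(a + f * (b - c), lo, hi)    # DE/rand/1 mutation
            cross = rng.random(dim) < cr                 # binomial crossover
            cross[rng.integers(dim)] = True              # force one gene over
            trial = np.where(cross, mutant, pop[i])
            t_fit = surrogate(trial)
            if t_fit > fit[i]:                           # maximize surrogate
                pop[i], fit[i] = trial, t_fit
    return pop
```

Raising the number of generations drives the population toward the surrogate optimum; lowering it preserves candidate diversity, which is the trade-off the generation count would tune.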
</sec>
<sec id="s12">
<title>12 Conclusion and outlook</title>
<p>Since its inception, cell culture has played an increasingly important role in the production of various useful products, including biopharmaceuticals, enzymes, and chemicals. The market size for products manufactured with microbial fermentation is estimated at USD 28.3 billion (<xref ref-type="bibr" rid="B72">Custom Market Insights, 2022</xref>), and the market size for biopharmaceuticals manufactured using animal cell systems is estimated at USD 24.6 billion (&#x2018;Cell Culture Market Size, Share &#x26; Trends Report, 2022&#x2013;2030&#x2019; 2018). In recent years, new applications have emerged in cell culture for food production, known collectively as cellular agriculture, which includes cultured meat, single-cell proteins, and precision fermentation. The market for cellular agriculture is projected to grow to USD 515.2 billion by 2030 (Research 2021). In scalable cell culture production systems, culture media is often the crucial determinant of techno-economic viability, through its effect on product yield and its role as the main contributor to production costs. For example, in monoclonal antibody production using animal cell cultures, culture media is estimated to account for 30%&#x2013;40% of the production cost (<xref ref-type="bibr" rid="B9">Batista and Fernandes, 2015</xref>). Thus, optimizing culture media for yield and cost is critical to the success of scalable commercial applications and to reducing the cost of many products.</p>
<p>Given the expansive literature on culture media optimization, we recognize the challenge researchers face in perusing existing works and selecting a suitable optimization methodology for their own cell culture application. In the absence of studies that summarize and compare the various available methods, it is common to default to methods used in similar studies even when a better method may exist.</p>
<p>In this review article, we have aimed to (i) provide a generalized framework for understanding and designing culture media optimization experiments; (ii) summarize and classify the large number of existing works; (iii) examine common challenges and the algorithmic features designed and chosen in past efforts to address them; and (iv) provide recommendations on the type of algorithm to use based on benchmark comparisons.</p>
<p>For standard applications of culture media optimization, the benchmark comparison may serve as a reference to help select an algorithm likely to perform well on generic unknown functions within experimental resource constraints. For applications where existing methods provide limited results and improved algorithms are sought, we identify a few areas where efforts might be fruitful.</p>
<p>Development of better surrogate models is one area where advances in data-driven predictive models can be applied. As discussed in <xref ref-type="sec" rid="s10">Section 10</xref>, advances in machine learning and deep learning may find useful application, especially for higher-dimensional problems. The benchmark experiments show that the method of acquisition has a significant effect on optimization performance; hence, designing better acquisition methods is also likely to yield improvements. Acquisition methods that successfully balance exploitation and exploration can achieve fast improvement while avoiding premature convergence. Other areas where new developments have been made are discussed in detail in <xref ref-type="sec" rid="s10">Section 10</xref>.</p>
<p>In this review, we have focused exclusively on knowledge-blind methods for cell culture media optimization. However, it should be recognized that there are methodological limitations to relying on approaches in which only media component concentrations/levels and output responses are collected and used. Biological data on the cellular response, such as cell transcriptomes or analyses of spent media, can be used to enhance predictive surrogate models. Chemical information about the media components may also be incorporated into such models, opening up the possibility of building surrogate models that generalize to media components outside the original list used in the experiment. Various AI, bioinformatic, and chemoinformatic approaches may find use in the discovery, design, and optimization of cell culture media.</p>
</sec>
</body>
<back>
<sec id="s13">
<title>Author contributions</title>
<p>TZ and KC contributed to the conception and design of the review study. TZ, RR, and RK collected and reviewed the literature and data. RR, TZ, and RK wrote the software to perform the simulation experiments. RR performed the analysis and visualization. TZ and RR wrote the first draft of the manuscript. All authors contributed to manuscript revision and read and approved the submitted version.</p>
</sec>
<sec id="s14">
<title>Funding</title>
<p>This work is supported by the &#x201c;CRISP Meats: CentRe of Innovation for Sustainable banking and Production of cultivated Meats&#x201d; project under the &#x201c;Singapore Food Story R&#x26;D Programme Theme 2 &#x2013; IAF-PP on Future Foods: Alternative Proteins&#x201d; program.</p>
</sec>
<sec sec-type="COI-statement" id="s15">
<title>Conflict of interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec sec-type="disclaimer" id="s16">
<title>Publisher&#x2019;s note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
<sec id="s17">
<title>Supplementary material</title>
<p>The Supplementary Material for this article can be found online at: <ext-link ext-link-type="uri" xlink:href="https://www.frontiersin.org/articles/10.3389/fbioe.2023.1195294/full#supplementary-material">https://www.frontiersin.org/articles/10.3389/fbioe.2023.1195294/full&#x23;supplementary-material</ext-link>
</p>
<supplementary-material xlink:href="Table1.DOCX" id="SM1" mimetype="application/DOCX" xmlns:xlink="http://www.w3.org/1999/xlink"/>
</sec>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Abbasi</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Sharafi</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Alidost</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Bodagh</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Zahiri</surname>
<given-names>H. S.</given-names>
</name>
<name>
<surname>Noghabi</surname>
<given-names>K. A.</given-names>
</name>
</person-group> (<year>2013</year>). <article-title>Response surface optimization of biosurfactant produced by <italic>Pseudomonas aeruginosa</italic> MA01 isolated from spoiled apples</article-title>. <source>Prep. Biochem. Biotechnol.</source> <volume>43</volume> (<issue>4</issue>), <fpage>398</fpage>&#x2013;<lpage>414</lpage>. <pub-id pub-id-type="doi">10.1080/10826068.2012.747966</pub-id>
</citation>
</ref>
<ref id="B2">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Abdel-Fattah</surname>
<given-names>Y. R.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>Bioprocess development for production of alkaline protease by Bacillus pseudofirmus Mn6 through statistical experimental designs</article-title>. <source>J. Microbiol. Biotechnol.</source> <volume>19</volume> (<issue>4</issue>), <fpage>378</fpage>&#x2013;<lpage>386</lpage>. <pub-id pub-id-type="doi">10.4014/jmb.0806.380</pub-id>
</citation>
</ref>
<ref id="B3">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Abdel-Fattah</surname>
<given-names>Y. R.</given-names>
</name>
<name>
<surname>El Enshasy</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Anwar</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Omar</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Abolmagd</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Abou Zahra</surname>
<given-names>R.</given-names>
</name>
</person-group> (<year>2007</year>). <article-title>Application of factorial experimental designs for optimization of cyclosporin. A production by tolypocladium inflatum in submerged culture</article-title>. <source>J. Microbiol. Biotechnol.</source> <volume>17</volume> (<issue>12</issue>), <fpage>1930</fpage>&#x2013;<lpage>1936</lpage>.</citation>
</ref>
<ref id="B4">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Antony</surname>
<given-names>J.</given-names>
</name>
</person-group> (<year>2014</year>). &#x201c;<article-title>5 - screening designs</article-title>,&#x201d; in <source>Design of experiments for engineers and scientists</source>. Editor <person-group person-group-type="editor">
<name>
<surname>Antony</surname>
<given-names>J.</given-names>
</name>
</person-group> <edition>Second Edition</edition> (<publisher-loc>Oxford</publisher-loc>: <publisher-name>Elsevier</publisher-name>), <fpage>51</fpage>&#x2013;<lpage>62</lpage>. <pub-id pub-id-type="doi">10.1016/B978-0-08-099417-8.00005-5</pub-id>
</citation>
</ref>
<ref id="B5">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Aquino</surname>
<given-names>P. L. M.</given-names>
</name>
<name>
<surname>Fonseca</surname>
<given-names>F. S.</given-names>
</name>
<name>
<surname>Mozzer</surname>
<given-names>O. D.</given-names>
</name>
<name>
<surname>Giordano</surname>
<given-names>R. C.</given-names>
</name>
<name>
<surname>Sousa</surname>
<given-names>R.</given-names>
</name>
</person-group> (<year>2016</year>). <article-title>Optimization of the production of inactivated Clostridium novyi type B vaccine using computational intelligence techniques</article-title>. <source>Appl. Biochem. Biotechnol.</source> <volume>179</volume> (<issue>5</issue>), <fpage>895</fpage>&#x2013;<lpage>909</lpage>. <pub-id pub-id-type="doi">10.1007/s12010-016-2038-3</pub-id>
</citation>
</ref>
<ref id="B6">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Audet</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Hare</surname>
<given-names>W.</given-names>
</name>
</person-group> (<year>2017</year>). &#x201c;<article-title>Derivative-free and blackbox optimization</article-title>,&#x201d; in <source>Springer series in operations research and financial engineering</source> (<publisher-loc>Cham</publisher-loc>: <publisher-name>Springer International Publishing</publisher-name>).</citation>
</ref>
<ref id="B7">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bapat</surname>
<given-names>P. M.</given-names>
</name>
<name>
<surname>Wangikar</surname>
<given-names>P. P.</given-names>
</name>
</person-group> (<year>2004</year>). <article-title>Optimization of rifamycin B fermentation in shake flasks via a machine-learning-based approach</article-title>. <source>Biotechnol. Bioeng.</source> <volume>86</volume> (<issue>2</issue>), <fpage>201</fpage>&#x2013;<lpage>208</lpage>. <pub-id pub-id-type="doi">10.1002/bit.20056</pub-id>
</citation>
</ref>
<ref id="B8">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Baskar</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Renganathan</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2010</year>). <article-title>Optimization of L-asparaginase production by Aspergillus terreus MTCC 1782 using response surface methodology and artificial neural network-linked genetic algorithm</article-title>. <source>Asia-Pacific J. Chem. Eng.</source> <volume>7</volume> (<issue>2</issue>), <fpage>212</fpage>&#x2013;<lpage>220</lpage>. <pub-id pub-id-type="doi">10.1002/apj.520</pub-id>
</citation>
</ref>
<ref id="B9">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Batista</surname>
<given-names>K. A.</given-names>
</name>
<name>
<surname>Fernandes</surname>
<given-names>K. F.</given-names>
</name>
</person-group> (<year>2015</year>). <article-title>Development and optimization of a new culture media using extruded bean as nitrogen source</article-title>. <source>MethodsX</source> <volume>2</volume>, <fpage>154</fpage>&#x2013;<lpage>158</lpage>. <pub-id pub-id-type="doi">10.1016/j.mex.2015.03.001</pub-id>
</citation>
</ref>
<ref id="B10">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Bhosekar</surname>
<given-names>A.</given-names>
</name>
</person-group> (<year>2020</year>). <source>Supply chain optimization and modular process design using machine learning-based frameworks</source>. <publisher-loc>New Jersey</publisher-loc>: <publisher-name>Rutgers University Community Repository</publisher-name>.</citation>
</ref>
<ref id="B11">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Brinc</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Beli&#x10d;</surname>
<given-names>A.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Optimization of process conditions for mammalian fed-batch cell culture in automated micro-bioreactor system using genetic algorithm</article-title>. <source>J. Biotechnol.</source> <volume>300</volume>, <fpage>40</fpage>&#x2013;<lpage>47</lpage>. <pub-id pub-id-type="doi">10.1016/j.jbiotec.2019.05.001</pub-id>
</citation>
</ref>
<ref id="B12">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Burrows</surname>
<given-names>E. H.</given-names>
</name>
<name>
<surname>Wong</surname>
<given-names>W. K.</given-names>
</name>
<name>
<surname>Fern</surname>
<given-names>X.</given-names>
</name>
<name>
<surname>Chaplen</surname>
<given-names>F. W. R.</given-names>
</name>
<name>
<surname>Ely</surname>
<given-names>R. L.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>Optimization of pH and nitrogen for enhanced hydrogen production by Synechocystis sp. PCC 6803 via statistical and machine learning methods</article-title>. <source>Biotechnol. Prog.</source> <volume>25</volume> (<issue>4</issue>), <fpage>1009</fpage>&#x2013;<lpage>1017</lpage>. <pub-id pub-id-type="doi">10.1002/btpr.213</pub-id>
</citation>
</ref>
<ref id="B13">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Camacho-Rodr&#xed;guez</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Cer&#xf3;n-Garc&#xed;a</surname>
<given-names>M. C.</given-names>
</name>
<name>
<surname>Fern&#xe1;ndez-Sevilla</surname>
<given-names>J. M.</given-names>
</name>
<name>
<surname>Molina-Grima</surname>
<given-names>E.</given-names>
</name>
</person-group> (<year>2015</year>). <article-title>Genetic algorithm for the medium optimization of the microalga nannochloropsis gaditana cultured to aquaculture</article-title>. <source>Bioresour. Technol.</source> <volume>177</volume>, <fpage>102</fpage>&#x2013;<lpage>109</lpage>. <pub-id pub-id-type="doi">10.1016/j.biortech.2014.11.057</pub-id>
</citation>
</ref>
<ref id="B14">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chauhan</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Chauhan</surname>
<given-names>R. S.</given-names>
</name>
<name>
<surname>Garlapati</surname>
<given-names>V. K.</given-names>
</name>
</person-group> (<year>2013</year>). <article-title>Modelling and optimization studies on a novel lipase production by Staphylococcus arlettae through submerged fermentation</article-title>. <source>Enzyme Res.</source> <volume>2013</volume>, <fpage>1</fpage>&#x2013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1155/2013/353954</pub-id>
</citation>
</ref>
<ref id="B15">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chintalapati</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Kumar</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Prasad</surname>
<given-names>R. K.</given-names>
</name>
<name>
<surname>Mathur</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2013</year>). <article-title>Optimal design of an <italic>in-situ</italic> bioremediation system using support vector machine and particle swarm optimization</article-title>. <source>J. Contam. Hydrology</source> <volume>151C</volume>, <fpage>105</fpage>&#x2013;<lpage>116</lpage>. <pub-id pub-id-type="doi">10.1016/j.jconhyd.2013.05.003</pub-id>
</citation>
</ref>
<ref id="B16">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cockshott</surname>
<given-names>A. R.</given-names>
</name>
<name>
<surname>Hartman</surname>
<given-names>B. E.</given-names>
</name>
</person-group> (<year>2001</year>). <article-title>Improving the fermentation medium for echinocandin B production Part II: Particle swarm optimization</article-title>. <source>Process Biochem.</source> <volume>36</volume> (<issue>7</issue>), <fpage>661</fpage>&#x2013;<lpage>669</lpage>. <pub-id pub-id-type="doi">10.1016/S0032-9592(00)00261-2</pub-id>
</citation>
</ref>
<ref id="B17">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Coleman</surname>
<given-names>M. C.</given-names>
</name>
<name>
<surname>Buck</surname>
<given-names>K. K. S.</given-names>
</name>
<name>
<surname>Block</surname>
<given-names>D. E.</given-names>
</name>
</person-group> (<year>2003</year>). <article-title>An integrated approach to optimization of Escherichia coli fermentations using historical data</article-title>. <source>Biotechnol. Bioeng.</source> <volume>84</volume> (<issue>3</issue>), <fpage>274</fpage>&#x2013;<lpage>285</lpage>. <pub-id pub-id-type="doi">10.1002/bit.10719</pub-id>
</citation>
</ref>
<ref id="B18">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cosenza</surname>
<given-names>Z.</given-names>
</name>
<name>
<surname>Astudillo</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Frazier</surname>
<given-names>P. I.</given-names>
</name>
<name>
<surname>Baar</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Block</surname>
<given-names>D. E.</given-names>
</name>
</person-group> (<year>2022</year>). <article-title>Multi-information source Bayesian optimization of culture media for cellular agriculture</article-title>. <source>Biotechnol. Bioeng.</source> <volume>119</volume> (<issue>9</issue>), <fpage>2447</fpage>&#x2013;<lpage>2458</lpage>. <pub-id pub-id-type="doi">10.1002/bit.28132</pub-id>
</citation>
</ref>
<ref id="B19">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cosenza</surname>
<given-names>Z.</given-names>
</name>
<name>
<surname>Block</surname>
<given-names>D. E.</given-names>
</name>
<name>
<surname>Baar</surname>
<given-names>K.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Optimization of muscle cell culture media using nonlinear design of experiments</article-title>. <source>Biotechnol. J.</source> <volume>16</volume> (<issue>11</issue>), <fpage>2100228</fpage>. <pub-id pub-id-type="doi">10.1002/biot.202100228</pub-id>
</citation>
</ref>
<ref id="B20">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Dean</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Voss</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Dragulji&#x107;</surname>
<given-names>D.</given-names>
</name>
</person-group> (<year>2017</year>). &#x201c;<article-title>Response surface methodology</article-title>,&#x201d; in <source>Design and analysis of experiments</source>. Editors <person-group person-group-type="editor">
<name>
<surname>Dean</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Voss</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Dragulji&#x107;</surname>
<given-names>D.</given-names>
</name>
</person-group> (<publisher-loc>Cham</publisher-loc>: <publisher-name>Springer International Publishing</publisher-name>). <comment>565&#x2013;614. Springer Texts in Statistics</comment>.</citation>
</ref>
<ref id="B21">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Desai</surname>
<given-names>K. M.</given-names>
</name>
<name>
<surname>Akolkar</surname>
<given-names>S. K.</given-names>
</name>
<name>
<surname>Badhe</surname>
<given-names>Y. P.</given-names>
</name>
<name>
<surname>Tambe</surname>
<given-names>S. S.</given-names>
</name>
<name>
<surname>Lele</surname>
<given-names>S. S.</given-names>
</name>
</person-group> (<year>2006</year>). <article-title>Optimization of fermentation media for exopolysaccharide production from Lactobacillus plantarum using artificial intelligence-based techniques</article-title>. <source>Process Biochem.</source> <volume>41</volume> (<issue>8</issue>), <fpage>1842</fpage>&#x2013;<lpage>1848</lpage>. <pub-id pub-id-type="doi">10.1016/j.procbio.2006.03.037</pub-id>
</citation>
</ref>
<ref id="B22">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Desai</surname>
<given-names>K. M.</given-names>
</name>
<name>
<surname>Survase</surname>
<given-names>S. A.</given-names>
</name>
<name>
<surname>Saudagar</surname>
<given-names>P. S.</given-names>
</name>
<name>
<surname>Lele</surname>
<given-names>S. S.</given-names>
</name>
<name>
<surname>Singhal</surname>
<given-names>R. S.</given-names>
</name>
</person-group> (<year>2008</year>). <article-title>Comparison of artificial neural network (ANN) and response surface methodology (RSM) in fermentation media optimization: Case study of fermentative production of scleroglucan</article-title>. <source>Biochem. Eng. J.</source> <volume>41</volume> (<issue>3</issue>), <fpage>266</fpage>&#x2013;<lpage>273</lpage>. <pub-id pub-id-type="doi">10.1016/j.bej.2008.05.009</pub-id>
</citation>
</ref>
<ref id="B23">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Dhagat</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Jujjavarapu</surname>
<given-names>S. E.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Simulated annealing and artificial neural network as optimization tools to enhance yields of bioemulsifier and exopolysaccharides by thermophilic Brevibacillus borstelensis</article-title>. <source>J. Environ. Chem. Eng.</source> <volume>9</volume> (<issue>4</issue>), <fpage>105499</fpage>. <pub-id pub-id-type="doi">10.1016/j.jece.2021.105499</pub-id>
</citation>
</ref>
<ref id="B24">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Du</surname>
<given-names>L. N.</given-names>
</name>
<name>
<surname>Song</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>H. B.</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Yang</surname>
<given-names>Z. Z.</given-names>
</name>
<name>
<surname>Meng</surname>
<given-names>L. J.</given-names>
</name>
<etal/>
</person-group> (<year>2012</year>). <article-title>Optimization of the fermentation medium for Paecilomyces tenuipes N45 using statistical approach</article-title>. <source>Afr. J. Microbiol. Res.</source> <volume>6</volume>, <fpage>6130</fpage>&#x2013;<lpage>6141</lpage>.</citation>
</ref>
<ref id="B25">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Eswari</surname>
<given-names>J. S.</given-names>
</name>
<name>
<surname>Anand</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Venkateswarlu</surname>
<given-names>C.</given-names>
</name>
</person-group> (<year>2013</year>). <article-title>Optimum culture medium composition for rhamnolipid production by Pseudomonas aeruginosa AT10 using a novel multi-objective optimization method</article-title>. <source>J. Chem. Technol. Biotechnol.</source> <volume>88</volume> (<issue>2</issue>), <fpage>271</fpage>&#x2013;<lpage>279</lpage>. <pub-id pub-id-type="doi">10.1002/jctb.3825</pub-id>
</citation>
</ref>
<ref id="B26">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Farag</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Soliman</surname>
<given-names>N. A.</given-names>
</name>
<name>
<surname>Abdel-Fattah</surname>
<given-names>Y. R.</given-names>
</name>
</person-group> (<year>2018</year>). <article-title>Statistical optimization of crude oil bio-degradation by a local marine bacterium isolate Pseudomonas sp. Sp48</article-title>. <source>J. Genet. Eng. Biotechnol.</source> <volume>16</volume> (<issue>2</issue>), <fpage>409</fpage>&#x2013;<lpage>420</lpage>. <pub-id pub-id-type="doi">10.1016/j.jgeb.2018.01.001</pub-id>
</citation>
</ref>
<ref id="B27">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Farhadi</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Salehi</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Moieni</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Safaie</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Sabet</surname>
<given-names>M. S.</given-names>
</name>
</person-group> (<year>2020</year>). <article-title>Modeling of paclitaxel biosynthesis elicitation in Corylus avellana cell culture using adaptive neuro-fuzzy inference system-genetic algorithm (ANFIS-GA) and multiple regression methods</article-title>. <source>PLOS ONE</source> <volume>15</volume> (<issue>8</issue>), <fpage>e0237478</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0237478</pub-id>
</citation>
</ref>
<ref id="B28">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Forrester</surname>
<given-names>A. I. J.</given-names>
</name>
<name>
<surname>Keane</surname>
<given-names>A. J.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>Recent advances in surrogate-based optimization</article-title>. <source>Prog. Aerosp. Sci.</source> <volume>45</volume> (<issue>1</issue>), <fpage>50</fpage>&#x2013;<lpage>79</lpage>. <pub-id pub-id-type="doi">10.1016/j.paerosci.2008.11.001</pub-id>
</citation>
</ref>
<ref id="B29">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Freier</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Hemmerich</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Sch&#xf6;ler</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Wiechert</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Oldiges</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>von Lieres</surname>
<given-names>E.</given-names>
</name>
</person-group> (<year>2016</year>). <article-title>Framework for kriging-based iterative experimental analysis and design: Optimization of secretory protein production in Corynebacterium glutamicum</article-title>. <source>Eng. Life Sci.</source> <volume>16</volume> (<issue>6</issue>), <fpage>538</fpage>&#x2013;<lpage>549</lpage>. <pub-id pub-id-type="doi">10.1002/elsc.201500171</pub-id>
</citation>
</ref>
<ref id="B30">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Galbraith</surname>
<given-names>S. C.</given-names>
</name>
<name>
<surname>Bhatia</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Yoon</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2018</year>). <article-title>Media formulation optimization: Current and future opportunities</article-title>. <source>Curr. Opin. Chem. Eng.</source> <volume>22</volume>, <fpage>42</fpage>&#x2013;<lpage>47</lpage>. <pub-id pub-id-type="doi">10.1016/j.coche.2018.08.004</pub-id>
</citation>
</ref>
<ref id="B31">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Garlapati</surname>
<given-names>V. K.</given-names>
</name>
<name>
<surname>Vundavilli</surname>
<given-names>P. R.</given-names>
</name>
<name>
<surname>Banerjee</surname>
<given-names>R.</given-names>
</name>
</person-group> (<year>2010</year>). <article-title>Evaluation of lipase production by genetic algorithm and particle swarm optimization and their comparative study</article-title>. <source>Appl. Biochem. Biotechnol.</source> <volume>162</volume> (<issue>5</issue>), <fpage>1350</fpage>&#x2013;<lpage>1361</lpage>. <pub-id pub-id-type="doi">10.1007/s12010-009-8895-2</pub-id>
</citation>
</ref>
<ref id="B32">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Ghosh</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Nag</surname>
<given-names>A.</given-names>
</name>
</person-group> (<year>2001</year>). &#x201c;<article-title>An overview of radial basis function networks</article-title>,&#x201d; in <source>Radial basis function networks 2: New advances in design</source>. Editors <person-group person-group-type="editor">
<name>
<surname>Howlett</surname>
<given-names>R. J.</given-names>
</name>
<name>
<surname>Jain</surname>
<given-names>L. C.</given-names>
</name>
</person-group> (<publisher-loc>Heidelberg</publisher-loc>: <publisher-name>Physica-Verlag HD</publisher-name>). <comment>1&#x2013;36. Studies in Fuzziness and Soft Computing</comment>. <pub-id pub-id-type="doi">10.1007/978-3-7908-1826-0_1</pub-id>
</citation>
</ref>
<ref id="B33">
<citation citation-type="thesis">
<person-group person-group-type="author">
<name>
<surname>Ginsbourger</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Riche</surname>
<given-names>R. L.</given-names>
</name>
<name>
<surname>Carraro</surname>
<given-names>L.</given-names>
</name>
</person-group> (<year>2008</year>). &#x201c;<article-title>A multi-points criterion for deterministic parallel global optimization based on Gaussian processes</article-title>&#x201d;. <comment>Report</comment>.</citation>
</ref>
<ref id="B34">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Goswami</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Ghosh</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Chakraborty</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2016</year>). <article-title>Reliability analysis of structures by iterative improved response surface method</article-title>. <source>Struct. Saf.</source> <volume>60</volume>, <fpage>56</fpage>&#x2013;<lpage>66</lpage>. <pub-id pub-id-type="doi">10.1016/j.strusafe.2016.02.002</pub-id>
</citation>
</ref>
<ref id="B35">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Grzesik</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Warth</surname>
<given-names>S. C.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>One-time optimization of advanced T cell culture media using a machine learning pipeline</article-title>. <source>Front. Bioeng. Biotechnol.</source> <volume>9</volume>, <fpage>614324</fpage>. <pub-id pub-id-type="doi">10.3389/fbioe.2021.614324</pub-id>
</citation>
</ref>
<ref id="B36">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Guo</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Xu</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Xu</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Yuan</surname>
<given-names>Z.</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>D.</given-names>
</name>
</person-group> (<year>2010</year>). <article-title>Medium optimization for ethanol production with Clostridium autoethanogenum with carbon monoxide as sole carbon source</article-title>. <source>Bioresour. Technol.</source> <volume>101</volume> (<issue>22</issue>), <fpage>8784</fpage>&#x2013;<lpage>8789</lpage>. <pub-id pub-id-type="doi">10.1016/j.biortech.2010.06.072</pub-id>
</citation>
</ref>
<ref id="B37">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gurunathan</surname>
<given-names>B.</given-names>
</name>
</person-group> (<year>2012</year>). <article-title>Optimization of culture conditions and bench-scale production of L-asparaginase by submerged fermentation of Aspergillus terreus MTCC</article-title>. <source>J. Microbiol. Biotechnol.</source> <volume>22</volume> (<issue>7</issue>), <fpage>923</fpage>&#x2013;<lpage>929</lpage>. <pub-id pub-id-type="doi">10.4014/jmb.1112.12002</pub-id>
</citation>
</ref>
<ref id="B38">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gurunathan</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Sahadevan</surname>
<given-names>R.</given-names>
</name>
</person-group> (<year>2011</year>). <article-title>Design of experiments and artificial neural network linked genetic algorithm for modeling and optimization of L-asparaginase production by Aspergillus terreus MTCC 1782</article-title>. <source>Biotechnol. Bioprocess Eng.</source> <volume>16</volume> (<issue>1</issue>), <fpage>50</fpage>&#x2013;<lpage>58</lpage>. <pub-id pub-id-type="doi">10.1007/s12257-010-0119-7</pub-id>
</citation>
</ref>
<ref id="B39">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Haider</surname>
<given-names>M. A.</given-names>
</name>
<name>
<surname>Pakshirajan</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Singh</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Chaudhry</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2008</year>). <article-title>Artificial neural network-genetic algorithm approach to optimize media constituents for enhancing lipase production by a soil microorganism</article-title>. <source>Appl. Biochem. Biotechnol.</source> <volume>144</volume> (<issue>3</issue>), <fpage>225</fpage>&#x2013;<lpage>235</lpage>. <pub-id pub-id-type="doi">10.1007/s12010-007-8017-y</pub-id>
</citation>
</ref>
<ref id="B40">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hansen</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Auger</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Ros</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Mersmann</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Tu&#x161;ar</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Brockhoff</surname>
<given-names>D.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>COCO: A platform for comparing continuous optimizers in a black-box setting</article-title>. <source>Optim. Methods Softw.</source> <volume>36</volume> (<issue>1</issue>), <fpage>114</fpage>&#x2013;<lpage>144</lpage>. <pub-id pub-id-type="doi">10.1080/10556788.2020.1808977</pub-id>
</citation>
</ref>
<ref id="B41">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Havel</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Link</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Hofinger</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Franco-Lara</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Weuster-Botz</surname>
<given-names>D.</given-names>
</name>
</person-group> (<year>2006</year>). <article-title>Comparison of genetic algorithms for experimental multi-objective optimization on the example of medium design for cyanobacteria</article-title>. <source>Biotechnol. J.</source> <volume>1</volume> (<issue>5</issue>), <fpage>549</fpage>&#x2013;<lpage>555</lpage>. <pub-id pub-id-type="doi">10.1002/biot.200500052</pub-id>
</citation>
</ref>
<ref id="B42">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>He</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Xu</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>X.</given-names>
</name>
</person-group> (<year>2008</year>). <article-title>Medium factor optimization and fermentation kinetics for phenazine-1-carboxylic acid production by Pseudomonas sp. M18G</article-title>. <source>Biotechnol. Bioeng.</source> <volume>100</volume> (<issue>2</issue>), <fpage>250</fpage>&#x2013;<lpage>259</lpage>. <pub-id pub-id-type="doi">10.1002/bit.21767</pub-id>
</citation>
</ref>
<ref id="B43">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hernadewita</surname>
<given-names>R. I.</given-names>
</name>
<name>
<surname>Hendra</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Yuliani</surname>
<given-names>E. N. S.</given-names>
</name>
<name>
<surname>Hermiyetti</surname>
<given-names>H.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>An analysis of implementation of Taguchi method to improve production of pulp on hydrapulper milling</article-title>. <source>Int. J. Prod. Manag. Eng.</source> <volume>7</volume> (<issue>2</issue>), <fpage>125</fpage>&#x2013;<lpage>132</lpage>. <pub-id pub-id-type="doi">10.4995/ijpme.2019.10163</pub-id>
</citation>
</ref>
<ref id="B44">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hesami</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Jones</surname>
<given-names>A. M. P.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Modeling and optimizing callus growth and development in Cannabis sativa using random forest and support vector machine in combination with a genetic algorithm</article-title>. <source>Appl. Microbiol. Biotechnol.</source> <volume>105</volume> (<issue>12</issue>), <fpage>5201</fpage>&#x2013;<lpage>5212</lpage>. <pub-id pub-id-type="doi">10.1007/s00253-021-11375-y</pub-id>
</citation>
</ref>
<ref id="B45">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Heylen</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Vanparys</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Wittebolle</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Verstraete</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Boon</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Vos</surname>
<given-names>P. D.</given-names>
</name>
</person-group> (<year>2006</year>). <article-title>Cultivation of denitrifying bacteria: Optimization of isolation conditions and diversity study</article-title>. <source>Appl. Environ. Microbiol.</source> <volume>72</volume> (<issue>4</issue>), <fpage>2637</fpage>&#x2013;<lpage>2643</lpage>. <pub-id pub-id-type="doi">10.1128/AEM.72.4.2637-2643.2006</pub-id>
</citation>
</ref>
<ref id="B46">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hofer</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Mandl</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Steiner</surname>
<given-names>W.</given-names>
</name>
</person-group> (<year>2004</year>). <article-title>Diketone cleaving enzyme Dke1 production by Acinetobacter johnsonii&#x2014;optimization of fermentation conditions</article-title>. <source>J. Biotechnol.</source> <volume>107</volume> (<issue>1</issue>), <fpage>73</fpage>&#x2013;<lpage>81</lpage>. <pub-id pub-id-type="doi">10.1016/j.jbiotec.2003.09.010</pub-id>
</citation>
</ref>
<ref id="B47">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Huang</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Mei</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Xia</surname>
<given-names>J.</given-names>
</name>
</person-group> (<year>2007</year>). <article-title>Application of artificial neural network coupling particle swarm optimization algorithm to biocatalytic production of GABA</article-title>. <source>Biotechnol. Bioeng.</source> <volume>96</volume> (<issue>5</issue>), <fpage>924</fpage>&#x2013;<lpage>931</lpage>. <pub-id pub-id-type="doi">10.1002/bit.21162</pub-id>
</citation>
</ref>
<ref id="B48">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hughes</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Marshall</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Reid</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Parkes</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Gelber</surname>
<given-names>C.</given-names>
</name>
</person-group> (<year>2007</year>). <article-title>The costs of using unauthenticated, over-passaged cell lines: How much more data do we need?</article-title> <source>BioTechniques</source> <volume>43</volume> (<issue>5</issue>), <fpage>575</fpage>&#x2013;<lpage>586</lpage>. <pub-id pub-id-type="doi">10.2144/000112598</pub-id>
</citation>
</ref>
<ref id="B49">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hutwimmer</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Wagner</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Affenzeller</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Burgstaller</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Strasser</surname>
<given-names>H.</given-names>
</name>
</person-group> (<year>2008a</year>). <article-title>Algorithm&#x2010;based design of novel synthetic media for Metarhizium anisopliae simulating its nutritional conditions in the environment</article-title>. <source>J. Appl. Microbiol.</source> <volume>105</volume> (<issue>2</issue>), <fpage>459</fpage>&#x2013;<lpage>468</lpage>. <pub-id pub-id-type="doi">10.1111/j.1365-2672.2008.03764.x</pub-id>
</citation>
</ref>
<ref id="B50">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hutwimmer</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Wagner</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Affenzeller</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Burgstaller</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Strasser</surname>
<given-names>H.</given-names>
</name>
</person-group> (<year>2008b</year>). <article-title>Algorithm&#x2010;based design of synthetic growth media stimulating virulence properties of Metarhizium anisopliae conidia</article-title>. <source>J. Appl. Microbiol.</source> <volume>105</volume> (<issue>6</issue>), <fpage>2026</fpage>&#x2013;<lpage>2034</lpage>. <pub-id pub-id-type="doi">10.1111/j.1365-2672.2008.03872.x</pub-id>
</citation>
</ref>
<ref id="B51">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Imandi</surname>
<given-names>S. B.</given-names>
</name>
<name>
<surname>Karanam</surname>
<given-names>S. K.</given-names>
</name>
<name>
<surname>Nagumantri</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Srivastava</surname>
<given-names>R. K.</given-names>
</name>
<name>
<surname>Sarangi</surname>
<given-names>P. K.</given-names>
</name>
</person-group> (<year>2022</year>). <article-title>Neural networks and genetic algorithm as robust optimization tools for modeling the microbial production of poly-&#x3b2;-hydroxybutyrate (PHB) from brewers&#x2019; spent grain</article-title>. <source>Biotechnol. Appl. Biochem.</source> <volume>2022</volume>, <fpage>1</fpage>&#x2013;<lpage>17</lpage>. <pub-id pub-id-type="doi">10.1002/bab.2412</pub-id>
</citation>
</ref>
<ref id="B52">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jafari</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Daneshvar</surname>
<given-names>M. H.</given-names>
</name>
<name>
<surname>Jafari</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Hesami</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2022</year>). <article-title>Machine learning-assisted <italic>in vitro</italic> rooting optimization in Passiflora caerulea</article-title>. <source>Forests</source> <volume>13</volume> (<issue>12</issue>), <fpage>2020</fpage>. <pub-id pub-id-type="doi">10.3390/f13122020</pub-id>
</citation>
</ref>
<ref id="B53">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jankovic</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Chaudhary</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Goia</surname>
<given-names>F.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Designing the design of experiments (DOE) &#x2013; an investigation on the influence of different factorial designs on the characterization of complex systems</article-title>. <source>Energy Build.</source> <volume>250</volume>, <fpage>111298</fpage>. <pub-id pub-id-type="doi">10.1016/j.enbuild.2021.111298</pub-id>
</citation>
</ref>
<ref id="B54">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Joji</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Santhiagu</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Salim</surname>
<given-names>N.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Computational modeling of culture media for enhanced production of fibrinolytic enzyme from marine bacterium Fictibacillus sp. strain SKA27 and <italic>in vitro</italic> evaluation of fibrinolytic activity</article-title>. <source>3 Biotech.</source> <volume>9</volume> (<issue>9</issue>), <fpage>323</fpage>. <pub-id pub-id-type="doi">10.1007/s13205-019-1853-y</pub-id>
</citation>
</ref>
<ref id="B55">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kana</surname>
<given-names>E. B. G.</given-names>
</name>
<name>
<surname>Oloke</surname>
<given-names>J. K.</given-names>
</name>
<name>
<surname>Lateef</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Adesiyan</surname>
<given-names>M. O.</given-names>
</name>
</person-group> (<year>2012</year>). <article-title>Modeling and optimization of biogas production on saw dust and other Co-substrates using artificial neural network and genetic algorithm</article-title>. <source>Renew. Energy</source> <volume>46</volume>, <fpage>276</fpage>&#x2013;<lpage>281</lpage>. <pub-id pub-id-type="doi">10.1016/j.renene.2012.03.027</pub-id>
</citation>
</ref>
<ref id="B56">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kana</surname>
<given-names>E. B. G.</given-names>
</name>
<name>
<surname>Oloke</surname>
<given-names>J. K.</given-names>
</name>
<name>
<surname>Lateef</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Oyebanji</surname>
<given-names>A.</given-names>
</name>
</person-group> (<year>2012</year>). <article-title>Comparative evaluation of artificial neural network coupled genetic algorithm and response surface methodology for modeling and optimization of citric acid production by Aspergillus niger MCBN297</article-title>. <source>Chem. Eng.</source> <volume>27</volume>, <fpage>397</fpage>&#x2013;<lpage>402</lpage>.</citation>
</ref>
<ref id="B57">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kanda</surname>
<given-names>G. N.</given-names>
</name>
<name>
<surname>Tsuzuki</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Terada</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Sakai</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Motozawa</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Masuda</surname>
<given-names>T.</given-names>
</name>
<etal/>
</person-group> (<year>2022</year>). <article-title>Robotic search for optimal cell culture in regenerative medicine</article-title>. <source>ELife</source> <volume>11</volume>, <fpage>e77007</fpage>. <pub-id pub-id-type="doi">10.7554/eLife.77007</pub-id>
</citation>
</ref>
<ref id="B58">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kanimozhi</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Moorthy</surname>
<given-names>I. G.</given-names>
</name>
<name>
<surname>Sivashankar</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Sivasubramanian</surname>
<given-names>V.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Optimization of dextran production by Weissella cibaria NITCSK4 using response surface methodology&#x2013;genetic algorithm based technology</article-title>. <source>Carbohydr. Polym.</source> <volume>174</volume>, <fpage>103</fpage>&#x2013;<lpage>110</lpage>. <pub-id pub-id-type="doi">10.1016/j.carbpol.2017.06.021</pub-id>
</citation>
</ref>
<ref id="B59">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Katla</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Karmakar</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Tadi</surname>
<given-names>S. R. R.</given-names>
</name>
<name>
<surname>Mohan</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Anand</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Pal</surname>
<given-names>U.</given-names>
</name>
<etal/>
</person-group> (<year>2019</year>). <article-title>High level extracellular production of recombinant human interferon alpha 2b in glycoengineered Pichia pastoris: Culture medium optimization, high cell density cultivation and biological characterization</article-title>. <source>J. Appl. Microbiol.</source> <volume>126</volume> (<issue>5</issue>), <fpage>1438</fpage>&#x2013;<lpage>1453</lpage>. <pub-id pub-id-type="doi">10.1111/jam.14227</pub-id>
</citation>
</ref>
<ref id="B60">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kennedy</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Eberhart</surname>
<given-names>R.</given-names>
</name>
</person-group> (<year>1995</year>). <article-title>Particle swarm optimization</article-title>. <source>Proc. ICNN&#x2019;95 - Int. Conf. Neural Netw.</source> <volume>4</volume>, <fpage>1942</fpage>&#x2013;<lpage>1948</lpage>. <pub-id pub-id-type="doi">10.1109/ICNN.1995.488968</pub-id>
</citation>
</ref>
<ref id="B61">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Khaouane</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Si-Moussa</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Hanini</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Benkortbi</surname>
<given-names>O.</given-names>
</name>
</person-group> (<year>2012</year>). <article-title>Optimization of culture conditions for the production of pleuromutilin from Pleurotus mutilus using a hybrid method based on central composite design, neural network, and particle swarm optimization</article-title>. <source>Biotechnol. Bioprocess Eng.</source> <volume>17</volume> (<issue>5</issue>), <fpage>1048</fpage>&#x2013;<lpage>1054</lpage>. <pub-id pub-id-type="doi">10.1007/s12257-012-0254-4</pub-id>
</citation>
</ref>
<ref id="B62">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Khuri</surname>
<given-names>A. I.</given-names>
</name>
<name>
<surname>Mukhopadhyay</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2010</year>). <article-title>Response surface methodology</article-title>. <source>WIREs Comput. Stat.</source> <volume>2</volume> (<issue>2</issue>), <fpage>128</fpage>&#x2013;<lpage>149</lpage>. <pub-id pub-id-type="doi">10.1002/wics.73</pub-id>
</citation>
</ref>
<ref id="B63">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kim</surname>
<given-names>M. M.</given-names>
</name>
<name>
<surname>Audet</surname>
<given-names>J.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>On-demand serum-free media formulations for human hematopoietic cell expansion using a high dimensional search algorithm</article-title>. <source>Commun. Biol.</source> <volume>2</volume> (<issue>1</issue>), <fpage>48</fpage>&#x2013;<lpage>11</lpage>. <pub-id pub-id-type="doi">10.1038/s42003-019-0296-7</pub-id>
</citation>
</ref>
<ref id="B64">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Koziel</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Ciaurri</surname>
<given-names>D. E.</given-names>
</name>
<name>
<surname>Leifsson</surname>
<given-names>L.</given-names>
</name>
</person-group> (<year>2011</year>). &#x201c;<article-title>Surrogate-based methods</article-title>,&#x201d; in <source>Computational optimization, methods and algorithms</source>. Editors <person-group person-group-type="editor">
<name>
<surname>Koziel</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Yang</surname>
<given-names>X.-S.</given-names>
</name>
</person-group> (<publisher-loc>Berlin, Heidelberg</publisher-loc>: <publisher-name>Springer</publisher-name>). <comment>33&#x2013;59. Studies in Computational Intelligence</comment>.</citation>
</ref>
<ref id="B65">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Krink</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Filipic</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Fogel</surname>
<given-names>G. B.</given-names>
</name>
</person-group> (<year>2004</year>). &#x201c;<article-title>Noisy optimization problems - a particular challenge for differential evolution?</article-title>&#x201d; in <conf-name>Proceedings of the 2004 Congress on Evolutionary Computation</conf-name>, <conf-loc>Portland</conf-loc>, <conf-date>19-23 June 2004</conf-date>.</citation>
</ref>
<ref id="B66">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Krogh</surname>
<given-names>A.</given-names>
</name>
</person-group> (<year>2008</year>). <article-title>What are artificial neural networks?</article-title> <source>Nat. Biotechnol.</source> <volume>26</volume> (<issue>2</issue>), <fpage>195</fpage>&#x2013;<lpage>197</lpage>. <pub-id pub-id-type="doi">10.1038/nbt1386</pub-id>
</citation>
</ref>
<ref id="B67">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kucharzyk</surname>
<given-names>K. H.</given-names>
</name>
<name>
<surname>Crawford</surname>
<given-names>R. L.</given-names>
</name>
<name>
<surname>Paszczynski</surname>
<given-names>A. J.</given-names>
</name>
<name>
<surname>Soule</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Hess</surname>
<given-names>T. F.</given-names>
</name>
</person-group> (<year>2012</year>). <article-title>Maximizing microbial degradation of perchlorate using a genetic algorithm: Media optimization</article-title>. <source>J. Biotechnol.</source> <volume>157</volume> (<issue>1</issue>), <fpage>189</fpage>&#x2013;<lpage>197</lpage>. <pub-id pub-id-type="doi">10.1016/j.jbiotec.2011.10.011</pub-id>
</citation>
</ref>
<ref id="B68">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kumar</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Pathak</surname>
<given-names>A. K.</given-names>
</name>
<name>
<surname>Guria</surname>
<given-names>C.</given-names>
</name>
</person-group> (<year>2015</year>). <article-title>NPK-10:26:26 complex fertilizer assisted optimal cultivation of dunaliella tertiolecta using response surface methodology and genetic algorithm</article-title>. <source>Bioresour. Technol.</source> <volume>194</volume>, <fpage>117</fpage>&#x2013;<lpage>129</lpage>. <pub-id pub-id-type="doi">10.1016/j.biortech.2015.06.082</pub-id>
</citation>
</ref>
<ref id="B69">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kumar</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Chhabra</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Shukla</surname>
<given-names>P.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Xylanase production from thermomyces lanuginosus VAPS-24 using low cost agro-industrial residues via hybrid optimization tools and its potential use for saccharification</article-title>. <source>Bioresour. Technol.</source> <volume>243</volume>, <fpage>1009</fpage>&#x2013;<lpage>1019</lpage>. <pub-id pub-id-type="doi">10.1016/j.biortech.2017.07.094</pub-id>
</citation>
</ref>
<ref id="B70">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lewis</surname>
<given-names>R. M.</given-names>
</name>
<name>
<surname>Torczon</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Trosset</surname>
<given-names>M. W.</given-names>
</name>
</person-group> (<year>2000</year>). <article-title>Direct search methods: Then and now</article-title>. <source>J. Comput. Appl. Math.</source> <volume>124</volume> (<issue>1</issue>), <fpage>191</fpage>&#x2013;<lpage>207</lpage>. <comment>Optimization and Nonlinear Equations</comment>. <pub-id pub-id-type="doi">10.1016/S0377-0427(00)00423-4</pub-id>
</citation>
</ref>
<ref id="B71">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lim</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Ng</surname>
<given-names>C. K.</given-names>
</name>
<name>
<surname>Vaitesswar</surname>
<given-names>U. S.</given-names>
</name>
<name>
<surname>Hippalgaonkar</surname>
<given-names>K.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Extrapolative Bayesian optimization with Gaussian process and neural network ensemble surrogate models</article-title>. <source>Adv. Intell. Syst.</source> <volume>3</volume> (<issue>11</issue>), <fpage>2100101</fpage>. <pub-id pub-id-type="doi">10.1002/aisy.202100101</pub-id>
</citation>
</ref>
<ref id="B72">
<citation citation-type="web">
<collab>Custom Market Insights</collab> (<year>2022</year>). <article-title>[Latest] global microbial fermentation technology market size/share worth USD 52.12 billion by 2030 at a 7.8% CAGR: Custom market insights (analysis, outlook, leaders, report, trends, forecast, segmentation, growth, growth rate, value). GlobeNewswire news room</article-title>
<comment>. <ext-link ext-link-type="uri" xlink:href="https://www.globenewswire.com/en/news-release/2022/11/07/2549546/0/en/Latest-Global-Microbial-Fermentation-Technology-Market-Size-Share-Worth-USD-52-12-Billion-by-2030-at-a-7-8-CAGR-Custom-Market-Insights-Analysis-Outlook-Leaders-Report-Trends-Foreca.html">https://www.globenewswire.com/en/news-release/2022/11/07/2549546/0/en/Latest-Global-Microbial-Fermentation-Technology-Market-Size-Share-Worth-USD-52-12-Billion-by-2030-at-a-7-8-CAGR-Custom-Market-Insights-Analysis-Outlook-Leaders-Report-Trends-Foreca.html</ext-link> (Accessed November 7, 2022)</comment>.</citation>
</ref>
<ref id="B73">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Liu</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Gao</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Marcos-Martinez</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Bryan</surname>
<given-names>B. A.</given-names>
</name>
</person-group> (<year>2020</year>). <article-title>Nonparametric machine learning for mapping forest cover and exploring influential factors</article-title>. <source>Landsc. Ecol.</source> <volume>35</volume> (<issue>7</issue>), <fpage>1683</fpage>&#x2013;<lpage>1699</lpage>. <pub-id pub-id-type="doi">10.1007/s10980-020-01046-0</pub-id>
</citation>
</ref>
<ref id="B74">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Liu</surname>
<given-names>D. C.</given-names>
</name>
<name>
<surname>Nocedal</surname>
<given-names>J.</given-names>
</name>
</person-group> (<year>1989</year>). <article-title>On the limited memory BFGS method for large scale optimization</article-title>. <source>Math. Program.</source> <volume>45</volume> (<issue>1</issue>), <fpage>503</fpage>&#x2013;<lpage>528</lpage>. <pub-id pub-id-type="doi">10.1007/BF01589116</pub-id>
</citation>
</ref>
<ref id="B75">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Liu</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Sun</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Du</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Xu</surname>
<given-names>W.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>Culture conditions optimization of hyaluronic acid production by Streptococcus zooepidemicus based on radial basis function neural network and quantum-behaved particle swarm optimization algorithm</article-title>. <source>Enzyme Microb. Technol.</source> <volume>44</volume> (<issue>1</issue>), <fpage>24</fpage>&#x2013;<lpage>32</lpage>. <pub-id pub-id-type="doi">10.1016/j.enzmictec.2008.09.015</pub-id>
</citation>
</ref>
<ref id="B76">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Liu</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Gunawan</surname>
<given-names>R.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Bioprocess optimization under uncertainty using ensemble modeling</article-title>. <source>J. Biotechnol.</source> <volume>244</volume>, <fpage>34</fpage>&#x2013;<lpage>44</lpage>. <pub-id pub-id-type="doi">10.1016/j.jbiotec.2017.01.013</pub-id>
</citation>
</ref>
<ref id="B77">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Maaranen</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Miettinen</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>M&#xe4;kel&#xe4;</surname>
<given-names>M. M.</given-names>
</name>
</person-group> (<year>2004</year>). <article-title>Quasi-random initial population for genetic algorithms</article-title>. <source>Comput. Math. Appl.</source> <volume>47</volume> (<issue>12</issue>), <fpage>1885</fpage>&#x2013;<lpage>1895</lpage>. <pub-id pub-id-type="doi">10.1016/j.camwa.2003.07.011</pub-id>
</citation>
</ref>
<ref id="B78">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Maiti</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Rathore</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Srivastava</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Shekhawat</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Srivastava</surname>
<given-names>P.</given-names>
</name>
</person-group> (<year>2011</year>). <article-title>Optimization of process parameters for ethanol production from sugar cane molasses by Zymomonas mobilis using response surface methodology and genetic algorithm</article-title>. <source>Appl. Microbiol. Biotechnol.</source> <volume>90</volume> (<issue>1</issue>), <fpage>385</fpage>&#x2013;<lpage>395</lpage>. <pub-id pub-id-type="doi">10.1007/s00253-011-3158-x</pub-id>
</citation>
</ref>
<ref id="B79">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mansoury</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Hamed</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Karmustaji</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Hannan</surname>
<given-names>F. A.</given-names>
</name>
<name>
<surname>Safrany</surname>
<given-names>S. T.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>The edge effect: A global problem. The trouble with culturing cells in 96-well plates</article-title>. <source>Biochem. Biophys. Rep.</source> <volume>26</volume>, <fpage>100987</fpage>. <pub-id pub-id-type="doi">10.1016/j.bbrep.2021.100987</pub-id>
</citation>
</ref>
<ref id="B80">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Marteijn</surname>
<given-names>R. C. L.</given-names>
</name>
<name>
<surname>Jurrius</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Dhont</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>De Gooijer</surname>
<given-names>C. D.</given-names>
</name>
<name>
<surname>Tramper</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Martens</surname>
<given-names>D. E.</given-names>
</name>
</person-group> (<year>2003</year>). <article-title>Optimization of a feed medium for fed-batch culture of insect cells using a genetic algorithm</article-title>. <source>Biotechnol. Bioeng.</source> <volume>81</volume> (<issue>3</issue>), <fpage>269</fpage>&#x2013;<lpage>278</lpage>. <pub-id pub-id-type="doi">10.1002/bit.10465</pub-id>
</citation>
</ref>
<ref id="B81">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Mirjalili</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2019</year>). &#x201c;<article-title>Genetic algorithm</article-title>,&#x201d; in <source>Evolutionary algorithms and neural networks: Theory and applications</source>. Editor <person-group person-group-type="editor">
<name>
<surname>Mirjalili</surname>
<given-names>S.</given-names>
</name>
</person-group> (<publisher-loc>Cham</publisher-loc>: <publisher-name>Springer International Publishing</publisher-name>). <comment>43&#x2013;55. Studies in Computational Intelligence</comment>.</citation>
</ref>
<ref id="B82">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Moorthy</surname>
<given-names>I. M. G.</given-names>
</name>
<name>
<surname>Baskar</surname>
<given-names>R.</given-names>
</name>
</person-group> (<year>2013</year>). <article-title>Statistical modeling and optimization of alkaline protease production from a newly isolated alkalophilic Bacillus species BGS using response surface methodology and genetic algorithm</article-title>. <source>Prep. Biochem. Biotechnol.</source> <volume>43</volume> (<issue>3</issue>), <fpage>293</fpage>&#x2013;<lpage>314</lpage>. <pub-id pub-id-type="doi">10.1080/10826068.2012.719850</pub-id>
</citation>
</ref>
<ref id="B83">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Morschett</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Freier</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Rohde</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Wiechert</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>von Lieres</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Oldiges</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>A framework for accelerated phototrophic bioprocess development: Integration of parallelized microscale cultivation, laboratory automation and kriging-assisted experimental design</article-title>. <source>Biotechnol. Biofuels</source> <volume>10</volume> (<issue>1</issue>), <fpage>26</fpage>. <pub-id pub-id-type="doi">10.1186/s13068-017-0711-6</pub-id>
</citation>
</ref>
<ref id="B84">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Munroe</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Sandoval</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Martens</surname>
<given-names>D. E.</given-names>
</name>
<name>
<surname>Sipkema</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Pomponi</surname>
<given-names>S. A.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Genetic algorithm as an optimization tool for the development of sponge cell culture media</article-title>. <source>In Vitro Cell. Dev. Biol.-Anim.</source> <volume>55</volume> (<issue>3</issue>), <fpage>149</fpage>&#x2013;<lpage>158</lpage>. <pub-id pub-id-type="doi">10.1007/s11626-018-00317-0</pub-id>
</citation>
</ref>
<ref id="B85">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pal</surname>
<given-names>M. P.</given-names>
</name>
<name>
<surname>Vaidya</surname>
<given-names>B. K.</given-names>
</name>
<name>
<surname>Desai</surname>
<given-names>K. M.</given-names>
</name>
<name>
<surname>Joshi</surname>
<given-names>R. M.</given-names>
</name>
<name>
<surname>Nene</surname>
<given-names>S. N.</given-names>
</name>
<name>
<surname>Kulkarni</surname>
<given-names>B. D.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>Media optimization for biosurfactant production by Rhodococcus erythropolis MTCC 2794: Artificial intelligence versus a statistical approach</article-title>. <source>J. Ind. Microbiol. Biotechnol.</source> <volume>36</volume> (<issue>5</issue>), <fpage>747</fpage>&#x2013;<lpage>756</lpage>. <pub-id pub-id-type="doi">10.1007/s10295-009-0547-6</pub-id>
</citation>
</ref>
<ref id="B86">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pandey</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Kumar</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Prabhu</surname>
<given-names>A. A.</given-names>
</name>
<name>
<surname>Veeranki</surname>
<given-names>V. D.</given-names>
</name>
</person-group> (<year>2018</year>). <article-title>Application of medium optimization tools for improving recombinant human interferon gamma production from Kluyveromyces lactis</article-title>. <source>Prep. Biochem. Biotechnol.</source> <volume>48</volume> (<issue>3</issue>), <fpage>279</fpage>&#x2013;<lpage>287</lpage>. <pub-id pub-id-type="doi">10.1080/10826068.2018.1425714</pub-id>
</citation>
</ref>
<ref id="B87">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Parkhey</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Gupta</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Eswari</surname>
<given-names>J. S.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Optimization of cellulase production from isolated cellulolytic bacterium: Comparison between genetic algorithms, simulated annealing, and response surface methodology</article-title>. <source>Chem. Eng. Commun.</source> <volume>204</volume> (<issue>1</issue>), <fpage>28</fpage>&#x2013;<lpage>38</lpage>. <pub-id pub-id-type="doi">10.1080/00986445.2016.1230736</pub-id>
</citation>
</ref>
<ref id="B88">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pathak</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Singh</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Niwas</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Osama</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Khan</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Haque</surname>
<given-names>S.</given-names>
</name>
<etal/>
</person-group> (<year>2015</year>). <article-title>Artificial intelligence versus statistical modeling and optimization of cholesterol oxidase production by using Streptomyces sp.</article-title> <source>PLOS ONE</source> <volume>10</volume> (<issue>9</issue>), <fpage>e0137268</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0137268</pub-id>
</citation>
</ref>
<ref id="B89">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Patil</surname>
<given-names>S. V.</given-names>
</name>
<name>
<surname>Jayaraman</surname>
<given-names>V. K.</given-names>
</name>
<name>
<surname>Kulkarni</surname>
<given-names>B. D.</given-names>
</name>
</person-group> (<year>2002</year>). <article-title>Optimization of media by evolutionary algorithms for production of polyols</article-title>. <source>Appl. Biochem. Biotechnol.</source> <volume>102</volume> (<issue>1</issue>), <fpage>119</fpage>&#x2013;<lpage>128</lpage>. <pub-id pub-id-type="doi">10.1385/ABAB:102-103:1-6:119</pub-id>
</citation>
</ref>
<ref id="B90">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Peng</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Zhong</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Yang</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Ren</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Xu</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Xiao</surname>
<given-names>S.</given-names>
</name>
<etal/>
</person-group> (<year>2014</year>). <article-title>The artificial neural network approach based on uniform design to optimize the fed-batch fermentation condition: Application to the production of iturin A</article-title>. <source>Microb. Cell Factories</source> <volume>13</volume> (<issue>1</issue>), <fpage>54</fpage>. <pub-id pub-id-type="doi">10.1186/1475-2859-13-54</pub-id>
</citation>
</ref>
<ref id="B91">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Prabhu</surname>
<given-names>A. A.</given-names>
</name>
<name>
<surname>Thomas</surname>
<given-names>D. J.</given-names>
</name>
<name>
<surname>Ledesma-Amaro</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Leeke</surname>
<given-names>G. A.</given-names>
</name>
<name>
<surname>Medina</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Verheecke-Vaessen</surname>
<given-names>C.</given-names>
</name>
<etal/>
</person-group> (<year>2020</year>). <article-title>Biovalorisation of crude glycerol and xylose into xylitol by oleaginous yeast Yarrowia lipolytica</article-title>. <source>Microb. Cell Factories</source> <volume>19</volume> (<issue>1</issue>), <fpage>121</fpage>. <pub-id pub-id-type="doi">10.1186/s12934-020-01378-1</pub-id>
</citation>
</ref>
<ref id="B92">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Price</surname>
<given-names>K. V.</given-names>
</name>
</person-group> (<year>2013</year>). &#x201c;<article-title>Differential evolution</article-title>,&#x201d; in <source>Handbook of optimization: From classical to modern approach</source>. Editors <person-group person-group-type="editor">
<name>
<surname>Zelinka</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Sn&#xe1;&#x161;el</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Abraham</surname>
<given-names>A.</given-names>
</name>
</person-group> (<publisher-loc>Berlin, Heidelberg</publisher-loc>: <publisher-name>Springer</publisher-name>). <comment>187&#x2013;214. Intelligent Systems Reference Library</comment>.</citation>
</ref>
<ref id="B93">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Qian</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Yu</surname>
<given-names>Y.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Derivative-free reinforcement learning: A review</article-title>. <source>Front. Comput. Sci.</source> <volume>15</volume>, <fpage>156336</fpage>. <pub-id pub-id-type="doi">10.1007/s11704-020-0241-4</pub-id>
</citation>
</ref>
<ref id="B94">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rakshit</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Konar</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Das</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Noisy evolutionary optimization algorithms &#x2013; a comprehensive survey</article-title>. <source>Swarm Evol. Comput.</source> <volume>33</volume>, <fpage>18</fpage>&#x2013;<lpage>45</lpage>. <pub-id pub-id-type="doi">10.1016/j.swevo.2016.09.002</pub-id>
</citation>
</ref>
<ref id="B95">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rekha</surname>
<given-names>V. P. B.</given-names>
</name>
<name>
<surname>Ghosh</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Adapa</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Oh</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Pulicherla</surname>
<given-names>K. K.</given-names>
</name>
<name>
<surname>Rao</surname>
<given-names>K. R. S. S.</given-names>
</name>
</person-group> (<year>2013</year>). <article-title>Optimization of polygalacturonase production from a newly isolated <italic>Thalassospira frigidphilosprofundus</italic> to use in pectin hydrolysis: Statistical approach</article-title>. <source>BioMed Res. Int.</source> <volume>2013</volume>, <fpage>1</fpage>&#x2013;<lpage>12</lpage>. <pub-id pub-id-type="doi">10.1155/2013/750187</pub-id>
</citation>
</ref>
<ref id="B96">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ritacco</surname>
<given-names>F. V.</given-names>
</name>
<name>
<surname>Wu</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Khetan</surname>
<given-names>A.</given-names>
</name>
</person-group> (<year>2018</year>). <article-title>Cell culture media for recombinant protein expression in Chinese hamster ovary (CHO) cells: History, key components, and optimization strategies</article-title>. <source>Biotechnol. Prog.</source> <volume>34</volume> (<issue>6</issue>), <fpage>1407</fpage>&#x2013;<lpage>1426</lpage>. <pub-id pub-id-type="doi">10.1002/btpr.2706</pub-id>
</citation>
</ref>
<ref id="B97">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Salehi</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Farhadi</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Moieni</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Safaie</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Hesami</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>A hybrid model based on general regression neural network and fruit fly optimization algorithm for forecasting and optimizing paclitaxel biosynthesis in Corylus avellana cell culture</article-title>. <source>Plant Methods</source> <volume>17</volume> (<issue>1</issue>), <fpage>13</fpage>. <pub-id pub-id-type="doi">10.1186/s13007-021-00714-9</pub-id>
</citation>
</ref>
<ref id="B98">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sarma</surname>
<given-names>M. V. R. K.</given-names>
</name>
<name>
<surname>Sahai</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Bisaria</surname>
<given-names>V. S.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>Genetic algorithm-based medium optimization for enhanced production of fluorescent pseudomonad R81 and siderophore</article-title>. <source>Biochem. Eng. J.</source> <volume>47</volume> (<issue>1</issue>), <fpage>100</fpage>&#x2013;<lpage>108</lpage>. <pub-id pub-id-type="doi">10.1016/j.bej.2009.07.010</pub-id>
</citation>
</ref>
<ref id="B99">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Selvaraj</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Vytla</surname>
<given-names>R. M.</given-names>
</name>
<name>
<surname>Vijay</surname>
<given-names>G. S.</given-names>
</name>
<name>
<surname>Natarajan</surname>
<given-names>K.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Modeling and optimization of tannase production with triphala in packed bed reactor by response surface methodology, genetic algorithm, and artificial neural network</article-title>. <source>3 Biotech.</source> <volume>9</volume>, <fpage>259</fpage>. <pub-id pub-id-type="doi">10.1007/s13205-019-1763-z</pub-id>
</citation>
</ref>
<ref id="B101">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Shirodkar</surname>
<given-names>P. V.</given-names>
</name>
<name>
<surname>Muraleedharan</surname>
<given-names>U. D.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Enhanced <italic>&#x3b1;</italic>-amylase production by a marine protist, <italic>Ulkenia</italic> sp. using response surface methodology and genetic algorithm</article-title>. <source>Prep. Biochem. Biotechnol.</source> <volume>47</volume> (<issue>10</issue>), <fpage>1043</fpage>&#x2013;<lpage>1049</lpage>. <pub-id pub-id-type="doi">10.1080/10826068.2017.1373293</pub-id>
</citation>
</ref>
<ref id="B102">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Singh</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Majumder</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Goyal</surname>
<given-names>A.</given-names>
</name>
</person-group> (<year>2008</year>). <article-title>Artificial intelligence based optimization of exocellular glucansucrase production from Leuconostoc dextranicum NRRL B-1146</article-title>. <source>Bioresour. Technol.</source> <volume>99</volume> (<issue>17</issue>), <fpage>8201</fpage>&#x2013;<lpage>8206</lpage>. <pub-id pub-id-type="doi">10.1016/j.biortech.2008.03.038</pub-id>
</citation>
</ref>
<ref id="B103">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Singh</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Haque</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Niwas</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Srivastava</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Pasupuleti</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Tripathi</surname>
<given-names>C. K. M.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Strategies for fermentation medium optimization: An in-depth review</article-title>. <source>Front. Microbiol.</source> <volume>7</volume>, <fpage>2087</fpage>. <pub-id pub-id-type="doi">10.3389/fmicb.2016.02087</pub-id>
</citation>
</ref>
<ref id="B104">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Singh</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Khan</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Khan</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Tripathi</surname>
<given-names>C. K. M.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>Optimization of actinomycin V production by Streptomyces triostinicus using artificial neural network and genetic algorithm</article-title>. <source>Appl. Microbiol. Biotechnol.</source> <volume>82</volume> (<issue>2</issue>), <fpage>379</fpage>&#x2013;<lpage>385</lpage>. <pub-id pub-id-type="doi">10.1007/s00253-008-1828-0</pub-id>
</citation>
</ref>
<ref id="B105">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Singh</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Srivastava</surname>
<given-names>S. K.</given-names>
</name>
</person-group> (<year>2013</year>). <article-title>Statistical and evolutionary optimization for enhanced production of an anti-leukemic enzyme, L-asparaginase, in a protease-deficient Bacillus aryabhattai ITBHU02 isolated from the soil contaminated with hospital waste</article-title>. <source>Indian J. Exp. Biol.</source> <volume>51</volume> (<issue>4</issue>), <fpage>322</fpage>&#x2013;<lpage>335</lpage>.</citation>
</ref>
<ref id="B106">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Singha</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Panda</surname>
<given-names>T.</given-names>
</name>
</person-group> (<year>2014</year>). <article-title>Improved production of laccase by Daedalea flavida: Consideration of evolutionary process optimization and batch-fed culture</article-title>. <source>Bioprocess Biosyst. Eng.</source> <volume>37</volume> (<issue>3</issue>), <fpage>493</fpage>&#x2013;<lpage>503</lpage>. <pub-id pub-id-type="doi">10.1007/s00449-013-1014-3</pub-id>
</citation>
</ref>
<ref id="B107">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Snoek</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Larochelle</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Adams</surname>
<given-names>R. P.</given-names>
</name>
</person-group> (<year>2012</year>). &#x201c;<article-title>Practical Bayesian optimization of machine learning algorithms</article-title>,&#x201d; in <source>Proceedings of the 25th international conference on neural information processing systems</source> (<publisher-loc>Red Hook, NY, USA</publisher-loc>: <publisher-name>Curran Associates Inc</publisher-name>).</citation>
</ref>
<ref id="B108">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Srivastava</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Singh</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Haque</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Pandey</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Mishra</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Jawed</surname>
<given-names>A.</given-names>
</name>
<etal/>
</person-group> (<year>2018</year>). <article-title>Response surface methodology&#x2013;genetic algorithm based medium optimization, purification, and characterization of cholesterol oxidase from Streptomyces rimosus</article-title>. <source>Sci. Rep.</source> <volume>8</volume> (<issue>1</issue>), <fpage>10913</fpage>&#x2013;<lpage>13</lpage>. <pub-id pub-id-type="doi">10.1038/s41598-018-29241-9</pub-id>
</citation>
</ref>
<ref id="B109">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Stork</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Eiben</surname>
<given-names>A. E.</given-names>
</name>
<name>
<surname>Bartz-Beielstein</surname>
<given-names>T.</given-names>
</name>
</person-group> (<year>2022</year>). <article-title>A new taxonomy of global optimization algorithms</article-title>. <source>Nat. Comput.</source> <volume>21</volume> (<issue>2</issue>), <fpage>219</fpage>&#x2013;<lpage>242</lpage>. <pub-id pub-id-type="doi">10.1007/s11047-020-09820-4</pub-id>
</citation>
</ref>
<ref id="B110">
<citation citation-type="web">
<collab>Straits Research</collab> (<year>2021</year>). <article-title>Cellular agriculture market analysis, share forecast to 2030</article-title>
<comment>. <ext-link ext-link-type="uri" xlink:href="https://straitsresearch.com/report/cellular-agriculture-market">https://straitsresearch.com/report/cellular-agriculture-market</ext-link> (Accessed March 9, 2023)</comment>.</citation>
</ref>
<ref id="B111">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Subbarao</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Sathish</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Mahalaxmi</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Suvarnalaxmi</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Rao</surname>
<given-names>R. S.</given-names>
</name>
<name>
<surname>Prakasham</surname>
<given-names>R. S.</given-names>
</name>
</person-group> (<year>2008</year>). <article-title>Modelling and optimization of fermentation factors for enhancement of alkaline protease production by isolated Bacillus circulans using feed-forward neural network and genetic algorithm</article-title>. <source>J. Appl. Microbiol.</source> <volume>104</volume> (<issue>3</issue>), <fpage>889</fpage>&#x2013;<lpage>898</lpage>. <pub-id pub-id-type="doi">10.1111/j.1365-2672.2007.03605.x</pub-id>
</citation>
</ref>
<ref id="B112">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sudholt</surname>
<given-names>D.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Analysing the robustness of evolutionary algorithms to noise: Refined runtime bounds and an example where noise is beneficial</article-title>. <source>Algorithmica</source> <volume>83</volume> (<issue>4</issue>), <fpage>976</fpage>&#x2013;<lpage>1011</lpage>. <pub-id pub-id-type="doi">10.1007/s00453-020-00671-0</pub-id>
</citation>
</ref>
<ref id="B113">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Suryawanshi</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Naik</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Eswari</surname>
<given-names>J. S.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Extraction and optimization of exopolysaccharide from Lactobacillus sp. using response surface methodology and artificial neural networks</article-title>. <source>Prep. Biochem. Biotechnol.</source> <volume>49</volume> (<issue>10</issue>), <fpage>987</fpage>&#x2013;<lpage>996</lpage>. <pub-id pub-id-type="doi">10.1080/10826068.2019.1645695</pub-id>
</citation>
</ref>
<ref id="B114">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Tachibana</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Chiou</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Konishi</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Machine learning modeling of the effects of media formulated with various yeast extracts on heterologous protein production in Escherichia coli</article-title>. <source>MicrobiologyOpen</source> <volume>10</volume> (<issue>3</issue>), <fpage>e1214</fpage>. <pub-id pub-id-type="doi">10.1002/mbo3.1214</pub-id>
</citation>
</ref>
<ref id="B115">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Tian</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Gao</surname>
<given-names>X. D.</given-names>
</name>
<name>
<surname>Yao</surname>
<given-names>W. B.</given-names>
</name>
</person-group> (<year>2013</year>). <article-title>Optimization of auto-induction medium for G-CSF production by Escherichia coli using artificial neural networks coupled with genetic algorithm</article-title>. <source>World J. Microbiol. Biotechnol.</source> <volume>29</volume> (<issue>3</issue>), <fpage>505</fpage>&#x2013;<lpage>513</lpage>. <pub-id pub-id-type="doi">10.1007/s11274-012-1204-1</pub-id>
</citation>
</ref>
<ref id="B116">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ti&#x161;ma</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>&#x17d;nidar&#x161;i&#x10d;-Plazl</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Vasi&#x107;-Ra&#x10d;ki</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Zeli&#x107;</surname>
<given-names>B.</given-names>
</name>
</person-group> (<year>2012</year>). <article-title>Optimization of laccase production by Trametes versicolor cultivated on industrial waste</article-title>. <source>Appl. Biochem. Biotechnol.</source> <volume>166</volume> (<issue>1</issue>), <fpage>36</fpage>&#x2013;<lpage>46</lpage>. <pub-id pub-id-type="doi">10.1007/s12010-011-9401-1</pub-id>
</citation>
</ref>
<ref id="B117">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Tripathi</surname>
<given-names>C. K. M.</given-names>
</name>
<name>
<surname>Khan</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Praveen</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Khan</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Srivastava</surname>
<given-names>A.</given-names>
</name>
</person-group> (<year>2012</year>). <article-title>Enhanced antibiotic production by Streptomyces sindenensis using artificial neural networks coupled with genetic algorithm and Nelder-Mead downhill simplex</article-title>. <source>J. Microbiol. Biotechnol.</source> <volume>22</volume> (<issue>7</issue>), <fpage>939</fpage>&#x2013;<lpage>946</lpage>. <pub-id pub-id-type="doi">10.4014/jmb.1109.09018</pub-id>
</citation>
</ref>
<ref id="B118">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Unuofin</surname>
<given-names>J. O.</given-names>
</name>
<name>
<surname>Okoh</surname>
<given-names>A. I.</given-names>
</name>
<name>
<surname>Nwodo</surname>
<given-names>U. U.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Utilization of agroindustrial wastes for the production of laccase by Achromobacter xylosoxidans HWN16 and Bordetella bronchiseptica HSO16</article-title>. <source>J. Environ. Manag.</source> <volume>231</volume>, <fpage>222</fpage>&#x2013;<lpage>231</lpage>. <pub-id pub-id-type="doi">10.1016/j.jenvman.2018.10.016</pub-id>
</citation>
</ref>
<ref id="B119">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>van der Valk</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Brunner</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Smet</surname>
<given-names>K. D.</given-names>
</name>
<name>
<surname>Svenningsen</surname>
<given-names>A. F.</given-names>
</name>
<name>
<surname>Honegger</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Knudsen</surname>
<given-names>L. E.</given-names>
</name>
<etal/>
</person-group> (<year>2010</year>). <article-title>Optimization of chemically defined cell culture media &#x2013; replacing fetal bovine serum in mammalian <italic>in vitro</italic> methods</article-title>. <source>Toxicol. Vitro</source> <volume>24</volume> (<issue>4</issue>), <fpage>1053</fpage>&#x2013;<lpage>1063</lpage>. <pub-id pub-id-type="doi">10.1016/j.tiv.2010.03.016</pub-id>
</citation>
</ref>
<ref id="B120">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Viennet</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Fonteix</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Marc</surname>
<given-names>I.</given-names>
</name>
</person-group> (<year>1996</year>). <article-title>Multicriteria optimization using a genetic algorithm for determining a Pareto set</article-title>. <source>Int. J. Syst. Sci.</source> <volume>27</volume> (<issue>2</issue>), <fpage>255</fpage>&#x2013;<lpage>260</lpage>. <pub-id pub-id-type="doi">10.1080/00207729608929211</pub-id>
</citation>
</ref>
<ref id="B121">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wei</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Si</surname>
<given-names>Z.</given-names>
</name>
<name>
<surname>Lu</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Yu</surname>
<given-names>Q.</given-names>
</name>
<name>
<surname>Huang</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Xu</surname>
<given-names>Z.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Medium optimization for pyrroloquinoline quinone (PQQ) production by Methylobacillus sp. Zju323 using response surface methodology and artificial neural network&#x2013;genetic algorithm</article-title>. <source>Prep. Biochem. Biotechnol.</source> <volume>47</volume> (<issue>7</issue>), <fpage>709</fpage>&#x2013;<lpage>719</lpage>. <pub-id pub-id-type="doi">10.1080/10826068.2017.1315596</pub-id>
</citation>
</ref>
<ref id="B122">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Weuster-Botz</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Wandrey</surname>
<given-names>C.</given-names>
</name>
</person-group> (<year>1995</year>). <article-title>Medium optimization by genetic algorithm for continuous production of formate dehydrogenase</article-title>. <source>Process Biochem.</source> <volume>30</volume> (<issue>6</issue>), <fpage>563</fpage>&#x2013;<lpage>571</lpage>. <pub-id pub-id-type="doi">10.1016/0032-9592(94)00036-0</pub-id>
</citation>
</ref>
<ref id="B123">
<citation citation-type="thesis">
<person-group person-group-type="author">
<name>
<surname>Wilson</surname>
<given-names>A. G.</given-names>
</name>
<name>
<surname>Hu</surname>
<given-names>Z.</given-names>
</name>
<name>
<surname>Salakhutdinov</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Xing</surname>
<given-names>E. P.</given-names>
</name>
</person-group> (<year>2015</year>). &#x201c;<article-title>Deep kernel learning</article-title>,&#x201d;. <comment>arXiv</comment>.</citation>
</ref>
<ref id="B124">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Wilson</surname>
<given-names>M. D.</given-names>
</name>
</person-group> (<year>2008</year>). &#x201c;<article-title>Support vector machines</article-title>,&#x201d; in <source>Encyclopedia of ecology</source>. Editors <person-group person-group-type="editor">
<name>
<surname>J&#xf8;rgensen</surname>
<given-names>S. E.</given-names>
</name>
<name>
<surname>Fath</surname>
<given-names>B. D.</given-names>
</name>
</person-group> (<publisher-loc>Oxford</publisher-loc>: <publisher-name>Academic Press</publisher-name>).</citation>
</ref>
<ref id="B125">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Xu</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Yan</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>Z.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Sheng</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>Y.</given-names>
</name>
</person-group> (<year>2014</year>). <article-title>Serum-free medium optimization based on trial design and support vector regression</article-title>. <source>BioMed Res. Int.</source> <volume>2014</volume>, <fpage>1</fpage>&#x2013;<lpage>7</lpage>. <pub-id pub-id-type="doi">10.1155/2014/269305</pub-id>
</citation>
</ref>
<ref id="B126">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Yao</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Asayama</surname>
<given-names>Y.</given-names>
</name>
</person-group> (<year>2017</year>). <article-title>Animal-cell culture media: History, characteristics, and current issues</article-title>. <source>Reprod. Med. Biol.</source> <volume>16</volume> (<issue>2</issue>), <fpage>99</fpage>&#x2013;<lpage>117</lpage>. <pub-id pub-id-type="doi">10.1002/rmb2.12024</pub-id>
</citation>
</ref>
<ref id="B127">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Yoshida</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Watanabe</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Chiou</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Konishi</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2022</year>). <article-title>High throughput optimization of medium composition for Escherichia coli protein expression using deep learning and Bayesian optimization</article-title>. <source>J. Biosci. Bioeng.</source> <volume>135</volume> (<issue>2</issue>), <fpage>127</fpage>&#x2013;<lpage>133</lpage>. <pub-id pub-id-type="doi">10.1016/j.jbiosc.2022.12.004</pub-id>
</citation>
</ref>
<ref id="B128">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zhang</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Block</surname>
<given-names>D. E.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>Using highly efficient nonlinear experimental design methods for optimization of Lactococcus lactis fermentation in chemically defined media</article-title>. <source>Biotechnol. Prog.</source> <volume>25</volume> (<issue>6</issue>), <fpage>1587</fpage>&#x2013;<lpage>1597</lpage>. <pub-id pub-id-type="doi">10.1002/btpr.277</pub-id>
</citation>
</ref>
<ref id="B129">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zhang</surname>
<given-names>Q.</given-names>
</name>
<name>
<surname>Deng</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Dai</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Jin</surname>
<given-names>X.</given-names>
</name>
</person-group> (<year>2020</year>). <article-title>Optimization of culture conditions for differentiation of melon based on artificial neural network and genetic algorithm</article-title>. <source>Sci. Rep.</source> <volume>10</volume> (<issue>1</issue>), <fpage>3524</fpage>&#x2013;<lpage>3528</lpage>. <pub-id pub-id-type="doi">10.1038/s41598-020-60278-x</pub-id>
</citation>
</ref>
<ref id="B130">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zhou</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Zhao</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Guo</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Guan</surname>
<given-names>X.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>X.</given-names>
</name>
</person-group> (<year>2015</year>). <article-title>Optimization of culture medium for maximal production of spinosad using an artificial neural network&#x2013;genetic algorithm modeling</article-title>. <source>Microb. Physiol.</source> <volume>25</volume> (<issue>4</issue>), <fpage>253</fpage>&#x2013;<lpage>261</lpage>. <pub-id pub-id-type="doi">10.1159/000381312</pub-id>
</citation>
</ref>
<ref id="B131">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zou</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Zhou</surname>
<given-names>Z.</given-names>
</name>
<name>
<surname>Fan</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Zhao</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>X.</given-names>
</name>
<etal/>
</person-group> (<year>2020</year>). <article-title>A novel method based on nonparametric regression with a Gaussian kernel algorithm identifies the critical components in CHO media and feed optimization</article-title>. <source>J. Ind. Microbiol. Biotechnol.</source> <volume>47</volume> (<issue>1</issue>), <fpage>63</fpage>&#x2013;<lpage>72</lpage>. <pub-id pub-id-type="doi">10.1007/s10295-019-02248-5</pub-id>
</citation>
</ref>
</ref-list>
</back>
</article>