<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Comput. Neurosci.</journal-id>
<journal-title>Frontiers in Computational Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Comput. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-5188</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fncom.2022.930827</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>An Infrared Sequence Image Generating Method for Target Detection and Tracking</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name><surname>Huang</surname> <given-names>Zhijian</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name><surname>Hui</surname> <given-names>Bingwei</given-names></name>
<xref ref-type="aff" rid="aff3"><sup>3</sup></xref>
<xref ref-type="corresp" rid="c001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/1789516/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Sun</surname> <given-names>Shujin</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>School of Computer Engineering and Applied Mathematics, Changsha University</institution>, <addr-line>Changsha</addr-line>, <country>China</country></aff>
<aff id="aff2"><sup>2</sup><institution>Hunan Province Key Laboratory of Industrial Internet Technology and Security</institution>, <addr-line>Changsha</addr-line>, <country>China</country></aff>
<aff id="aff3"><sup>3</sup><institution>Automatic Target Recognition (ATR) Key Laboratory, School of Electronic Science, National University of Defense Technology</institution>, <addr-line>Changsha</addr-line>, <country>China</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Deepika Koundal, University of Petroleum and Energy Studies, India</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Arvind Dhaka, Manipal University Jaipur, India; Amita Nandal, Manipal University Jaipur, India</p></fn>
<corresp id="c001">&#x0002A;Correspondence: Hui Bingwei <email>huibingwei07&#x00040;163.com</email></corresp>
</author-notes>
<pub-date pub-type="epub">
<day>15</day>
<month>07</month>
<year>2022</year>
</pub-date>
<pub-date pub-type="collection">
<year>2022</year>
</pub-date>
<volume>16</volume>
<elocation-id>930827</elocation-id>
<history>
<date date-type="received">
<day>28</day>
<month>04</month>
<year>2022</year>
</date>
<date date-type="accepted">
<day>10</day>
<month>06</month>
<year>2022</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2022 Huang, Hui and Sun.</copyright-statement>
<copyright-year>2022</copyright-year>
<copyright-holder>Huang, Hui and Sun</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract>
<p>Training infrared target detection and tracking models based on deep learning requires a large number of infrared sequence images. The cost of acquiring real infrared target sequence images is high, while conventional simulation methods lack authenticity. This paper proposes a novel infrared data simulation method that combines real infrared images and simulated 3D infrared targets. First, real infrared images are stitched into a panoramic image that serves as the background. Then, the infrared characteristics of a 3D aircraft are simulated for the tail nozzle, skin, and tail flame, which serve as the targets. Finally, the background and targets are fused in Unity3D, where the aircraft trajectories and attitudes can be edited freely to generate rich multi-target infrared data. The experimental results show that the simulated images are not only visually similar to real infrared images but also consistent with them in terms of the performance of target detection algorithms. The method can provide training and testing samples for deep learning models for infrared target detection and tracking.</p></abstract>
<kwd-group>
<kwd>infrared image simulation</kwd>
<kwd>infrared target simulation</kwd>
<kwd>infrared radiation</kwd>
<kwd>deep learning</kwd>
<kwd>Unity3D</kwd>
</kwd-group>
<counts>
<fig-count count="8"/>
<table-count count="2"/>
<equation-count count="6"/>
<ref-count count="26"/>
<page-count count="9"/>
<word-count count="4664"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="intro" id="s1">
<title>Introduction</title>
<p>With the rapid development of deep-learning technology, data-driven models and algorithms have become a hot topic in infrared target detection and tracking (Dai et al., <xref ref-type="bibr" rid="B5">2021</xref>; Hou et al., <xref ref-type="bibr" rid="B9">2022</xref>). Unlike conventional methods, data-driven methods require a large amount of infrared data for model training and testing (Yi et al., <xref ref-type="bibr" rid="B21">2019</xref>; Junhong et al., <xref ref-type="bibr" rid="B11">2020</xref>).</p>
<p>However, the infrared image datasets currently used for object detection and tracking are of limited quality (Hui et al., <xref ref-type="bibr" rid="B10">2020</xref>). Measured data are expensive to collect, and it is difficult to obtain infrared images covering diverse scenarios (Zhang et al., <xref ref-type="bibr" rid="B25">2018</xref>). For example, real data typically contain only a single target type, and infrared images of important aircraft types are hard to acquire. Simulated data, in turn, lack authenticity (Xia et al., <xref ref-type="bibr" rid="B18">2015</xref>). The battlefield in modern warfare involves a wide range of complex environments, which knowledge-based models struggle to simulate. These problems significantly limit research progress in infrared target detection and tracking.</p>
<p>Currently, infrared target simulation follows two approaches: methods based on infrared characteristic modeling (Shuwei and Bo, <xref ref-type="bibr" rid="B16">2018</xref>; Guanfeng et al., <xref ref-type="bibr" rid="B7">2019</xref>; Yongjie et al., <xref ref-type="bibr" rid="B22">2020</xref>) and methods based on deep neural networks (Mirza and Osindero, <xref ref-type="bibr" rid="B13">2014</xref>; Alec et al., <xref ref-type="bibr" rid="B1">2016</xref>; Junyan et al., <xref ref-type="bibr" rid="B12">2017</xref>; Chenyang, <xref ref-type="bibr" rid="B3">2019</xref>; Yi, <xref ref-type="bibr" rid="B20">2020</xref>). The former are typically based on infrared radiation theory: physical models of the various parts of an aircraft (such as the engine, tail nozzle, tail flame, and skin) are established, atmospheric radiation is modeled, and infrared simulation data under various conditions are obtained. These methods start from a physical model and are highly interpretable; given sufficient parameters, they can produce high-fidelity infrared images (Yunjey et al., <xref ref-type="bibr" rid="B24">2020</xref>). However, owing to the large number of parameters and heavy computation involved, they are practical only for simple target simulations and are unsuitable for real-environment simulations with complex types of ground objects (Chenyang, <xref ref-type="bibr" rid="B3">2019</xref>; Rani et al., <xref ref-type="bibr" rid="B14">2022</xref>). Methods based on deep learning, typically generative adversarial networks (GANs), learn the style of infrared imagery from a large number of real infrared images and then transfer visible-light images to the infrared domain (Alec et al., <xref ref-type="bibr" rid="B1">2016</xref>; Junyan et al., <xref ref-type="bibr" rid="B12">2017</xref>; Chenyang, <xref ref-type="bibr" rid="B3">2019</xref>; Yi, <xref ref-type="bibr" rid="B20">2020</xref>). These methods require no complex physical modeling process and are fast, but they lack authenticity and reliability (Shi et al., <xref ref-type="bibr" rid="B15">2021</xref>; Bhalla et al., <xref ref-type="bibr" rid="B2">2022</xref>). More importantly, deep-learning-based methods cannot add infrared targets on demand, nor can they edit flight trajectories and attitudes, which is exactly what infrared target datasets need most.</p>
<p>Therefore, it is meaningful and valuable to study an infrared data generation method that conforms to real infrared radiation characteristics and can freely add multiple aircraft targets of multiple types. This paper proposes such a method; its main contributions are as follows:</p>
<list list-type="simple">
<list-item><p>(1) A method that combines real infrared background data with simulated infrared target data is proposed, which can easily generate multi-target infrared simulation data with high authenticity. It uses a panorama stitched from real infrared images as the background rather than a direct 3D infrared simulation of the ground objects, thereby avoiding the complex problem of infrared modeling of ground scenes. Compared with a 3D infrared simulation of the whole scene, this is much easier, and the generated data are more authentic.</p></list-item>
<list-item><p>(2) The method fuses the target model with the infrared scene based on Unity3D. It can freely set the type and number of aircraft and edit their trajectories and attitudes, so it can generate rich multi-target infrared simulation data.</p></list-item>
<list-item><p>(3) Starting from infrared radiation characteristics, our method simulates the physical characteristics of the key parts of the 3D target (the tail nozzle, skin, and tail flame), which yields highly authentic infrared target data.</p></list-item>
</list>
</sec>
<sec sec-type="methods" id="s2">
<title>Methods</title>
<sec>
<title>Overall Framework</title>
<p><xref ref-type="fig" rid="F1">Figure 1</xref> shows the overall framework of this study, divided into three branches: infrared background stitching, infrared radiation modeling, and flight trajectory editing. The infrared radiation modeling branch first establishes a 3D model based on the dimensions of the aircraft and then builds an infrared radiation model of the aircraft (covering the engine nozzle, skin, and tail flame) according to infrared radiation theory. The infrared background stitching branch performs panoramic stitching on a real infrared dataset; after uniform-light processing, a uniform infrared panoramic image is obtained and used as the background of the 3D scene. The flight trajectory editing branch provides trajectory-editing tools: users can create flight trajectories based on the performance parameters of the aircraft, where each trajectory node specifies a time, position, and attitude. The observation window can track and record targets within a field of view of a specified size. Because multiple aircraft of various types can be selected and various trajectories can be edited, a rich variety of infrared simulation data can be obtained.</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p>Overall framework of this study.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fncom-16-930827-g0001.tif"/>
</fig>
</sec>
<sec>
<title>Infrared Target Modeling</title>
<p>As an infrared radiation source, an aircraft exhibits markedly different radiation characteristics across its parts owing to their different degrees of heat generation. The components with the strongest infrared radiation are the engine nozzle, aircraft skin, and tail flame (Haixing et al., <xref ref-type="bibr" rid="B8">1997</xref>). Starting from the basic theory of infrared radiation, this study captures the main radiation characteristics of each component and establishes a model of its infrared radiation intensity.</p>
<p>Assuming that the infrared detector perceives wavelengths from &#x003BB;<sub>1</sub> to &#x003BB;<sub>2</sub> (this study considers only mid-wave infrared, that is, wavelengths of 3&#x02013;5 &#x003BC;m), according to Planck&#x00027;s law (Yu, <xref ref-type="bibr" rid="B23">2012</xref>), the infrared radiation intensity of a gray body can be expressed as:</p>
<disp-formula id="E1"><label>(1)</label><mml:math id="M1"><mml:mrow><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mo>&#x0007E;</mml:mo><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>2</mml:mn></mml:msub></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mstyle displaystyle='true'><mml:mrow><mml:msubsup><mml:mo>&#x0222B;</mml:mo><mml:mrow><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:mrow><mml:mrow><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>2</mml:mn></mml:msub></mml:mrow></mml:msubsup><mml:mrow><mml:mfrac><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:mrow><mml:mrow><mml:msup><mml:mi>&#x003BB;</mml:mi><mml:mn>5</mml:mn></mml:msup></mml:mrow></mml:mfrac><mml:mfrac><mml:mn>1</mml:mn><mml:mrow><mml:msup><mml:mi>e</mml:mi><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mo>/</mml:mo><mml:mi>&#x003BB;</mml:mi><mml:mi>T</mml:mi></mml:mrow></mml:msup><mml:mo>&#x02212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:mfrac><mml:mi>d</mml:mi><mml:mi>&#x003BB;</mml:mi></mml:mrow></mml:mrow></mml:mstyle><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:msup><mml:mi>T</mml:mi><mml:mn>4</mml:mn></mml:msup></mml:mrow><mml:mrow><mml:msubsup><mml:mi>c</mml:mi><mml:mn>2</mml:mn><mml:mn>4</mml:mn></mml:msubsup></mml:mrow></mml:mfrac><mml:mstyle displaystyle='true'><mml:mrow><mml:msubsup><mml:mo>&#x0222B;</mml:mo><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mo>/</mml:mo><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mi>T</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mo>/</mml:mo><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mi>T</mml:mi></mml:mrow></mml:msubsup><mml:mrow><mml:mfrac><mml:mrow><mml:msup><mml:mrow><mml:mo stretchy='false'>(</mml:mo><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mo>/</mml:mo><mml:mi>&#x003BB;</mml:mi><mml:mi>T</mml:mi><mml:mo stretchy='false'>)</mml:mo></mml:mrow><mml:mn>3</mml:mn></mml:msup></mml:mrow><mml:mrow><mml:msup><mml:mi>e</mml:mi><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mo>/</mml:mo><mml:mi>&#x003BB;</mml:mi><mml:mi>T</mml:mi></mml:mrow></mml:msup><mml:mo>&#x02212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:mfrac><mml:mi>d</mml:mi><mml:mrow><mml:mo stretchy='false'>(</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub></mml:mrow><mml:mrow><mml:mi>&#x003BB;</mml:mi><mml:mi>T</mml:mi></mml:mrow></mml:mfrac><mml:mo stretchy='false'>)</mml:mo></mml:mrow></mml:mrow></mml:mrow></mml:mstyle></mml:mrow></mml:math></disp-formula>
<p>where <italic>T</italic> is the gray body surface temperature, <italic>c</italic><sub>1</sub> is the first radiation constant, typically (3.741774 &#x000B1; 0.0000022) &#x000D7; 10<sup>&#x02212;16</sup> W &#x000B7; m<sup>2</sup>, and <italic>c</italic><sub>2</sub> is the second radiation constant, typically (1.4387869 &#x000B1; 0.00000012) &#x000D7; 10<sup>&#x02212;2</sup> m &#x000B7; K. Letting <italic>x</italic> &#x0003D; <italic>c</italic><sub>2</sub>/&#x003BB;<italic>T</italic>, the above equation simplifies to:</p>
<disp-formula id="E2"><label>(2)</label><mml:math id="M3"><mml:mrow><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mo>&#x02212;</mml:mo><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>2</mml:mn></mml:msub></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:msup><mml:mi>T</mml:mi><mml:mn>4</mml:mn></mml:msup></mml:mrow><mml:mrow><mml:msubsup><mml:mi>c</mml:mi><mml:mn>2</mml:mn><mml:mn>4</mml:mn></mml:msubsup></mml:mrow></mml:mfrac><mml:mstyle displaystyle='true'><mml:mrow><mml:msubsup><mml:mo>&#x0222B;</mml:mo><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mo>/</mml:mo><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mi>T</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mo>/</mml:mo><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mi>T</mml:mi></mml:mrow></mml:msubsup><mml:mrow><mml:mfrac><mml:mrow><mml:msup><mml:mi>x</mml:mi><mml:mn>3</mml:mn></mml:msup></mml:mrow><mml:mrow><mml:msup><mml:mi>e</mml:mi><mml:mi>x</mml:mi></mml:msup><mml:mo>&#x02212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:mfrac><mml:mi>d</mml:mi><mml:mi>x</mml:mi></mml:mrow></mml:mrow></mml:mstyle></mml:mrow></mml:math></disp-formula>
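<p>As a quick numerical check, the band exitance in Equations (1) and (2) can be evaluated directly. The sketch below (Python, with illustrative values; the function names are ours, not part of the published toolchain) integrates Planck&#x00027;s law over the 3&#x02013;5 &#x003BC;m band both before and after the substitution <italic>x</italic> &#x0003D; <italic>c</italic><sub>2</sub>/&#x003BB;<italic>T</italic>, confirming that the two forms agree:</p>

```python
import math

C1 = 3.741774e-16    # first radiation constant, W*m^2
C2 = 1.4387869e-2    # second radiation constant, m*K

def band_exitance(lam1, lam2, T, n=2000):
    """Equation (1), left-hand form: midpoint-rule integral of Planck's law
    over [lam1, lam2] (metres) at temperature T (kelvin), in W/m^2."""
    h = (lam2 - lam1) / n
    return sum(C1 / lam**5 / (math.exp(C2 / (lam * T)) - 1.0)
               for lam in (lam1 + (i + 0.5) * h for i in range(n))) * h

def band_exitance_sub(lam1, lam2, T, n=2000):
    """Equation (2): the same integral after substituting x = c2/(lam*T);
    the limits become c2/(lam2*T) (lower) and c2/(lam1*T) (upper)."""
    x_lo, x_hi = C2 / (lam2 * T), C2 / (lam1 * T)
    h = (x_hi - x_lo) / n
    integral = sum(x**3 / (math.exp(x) - 1.0)
                   for x in (x_lo + (i + 0.5) * h for i in range(n))) * h
    return C1 * T**4 / C2**4 * integral

T = 600.0   # an assumed hot-part temperature, K
print(band_exitance(3e-6, 5e-6, T), band_exitance_sub(3e-6, 5e-6, T))
```

Both calls evaluate the same physical quantity, so their results coincide up to discretization error.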
<sec>
<title>Nozzle Radiation Model</title>
<p>When the fuel in an engine burns, it emits high-temperature radiation, which is the main heat source when the aircraft is flying (Chuanyu, <xref ref-type="bibr" rid="B4">2013</xref>). As an extension of the engine outside the fuselage, the tail nozzle also exhibits relatively strong infrared radiation. The tail nozzle is a typical gray body, and the surface emissivity is approximately in the range of 0.8&#x02013;0.9. According to Equation (2), the relationship between the infrared radiation intensity of the tail nozzle <italic>I</italic><sub><italic>W</italic></sub> and temperature <italic>T</italic><sub><italic>W</italic></sub> is as follows:</p>
<disp-formula id="E3"><label>(3)</label><mml:math id="M4"><mml:mrow><mml:msub><mml:mi>I</mml:mi><mml:mi>W</mml:mi></mml:msub><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>&#x003B5;</mml:mi><mml:mi>W</mml:mi></mml:msub></mml:mrow><mml:mi>&#x003C0;</mml:mi></mml:mfrac><mml:mstyle displaystyle='true'><mml:mrow><mml:msubsup><mml:mo>&#x0222B;</mml:mo><mml:mrow><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:mrow><mml:mrow><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>2</mml:mn></mml:msub></mml:mrow></mml:msubsup><mml:mrow><mml:mfrac><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:mrow><mml:mrow><mml:msup><mml:mi>&#x003BB;</mml:mi><mml:mn>5</mml:mn></mml:msup></mml:mrow></mml:mfrac><mml:mfrac><mml:mn>1</mml:mn><mml:mrow><mml:msup><mml:mi>e</mml:mi><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mo>/</mml:mo><mml:mi>&#x003BB;</mml:mi><mml:msub><mml:mi>T</mml:mi><mml:mtext>w</mml:mtext></mml:msub></mml:mrow></mml:msup><mml:mo>&#x02212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:mfrac><mml:mi>d</mml:mi><mml:mi>&#x003BB;</mml:mi><mml:mo>&#x000B7;</mml:mo><mml:msub><mml:mi>S</mml:mi><mml:mi>W</mml:mi></mml:msub><mml:mo>&#x000B7;</mml:mo><mml:mi>cos</mml:mi><mml:msub><mml:mi>&#x003B8;</mml:mi><mml:mi>W</mml:mi></mml:msub></mml:mrow></mml:mrow></mml:mstyle></mml:mrow></mml:math></disp-formula>
<p>where &#x003B5;<sub><italic>W</italic></sub> is the emissivity of the nozzle surface, which is determined by its material; <italic>S</italic><sub><italic>W</italic></sub> is the cross-sectional area of the nozzle facing the probe; and &#x003B8;<sub><italic>W</italic></sub> is the angle between the orientation of the probe and the orientation of the infrared radiation.</p>
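<p>A minimal sketch of Equation (3) follows; the nozzle values (900 K surface, 0.3 m<sup>2</sup> projected area, emissivity 0.85 within the 0.8&#x02013;0.9 range quoted above) are illustrative assumptions, and the function names are ours:</p>

```python
import math

C1 = 3.741774e-16   # first radiation constant, W*m^2
C2 = 1.4387869e-2   # second radiation constant, m*K

def band_exitance(lam1, lam2, T, n=2000):
    """Planck's law integrated over [lam1, lam2] (midpoint rule), W/m^2."""
    h = (lam2 - lam1) / n
    return sum(C1 / lam**5 / (math.exp(C2 / (lam * T)) - 1.0)
               for lam in (lam1 + (i + 0.5) * h for i in range(n))) * h

def nozzle_intensity(T_w, S_w, theta_w, eps_w=0.85, lam1=3e-6, lam2=5e-6):
    """Equation (3): in-band radiant intensity of the tail nozzle, W/sr.

    S_w is the projected nozzle area facing the probe and theta_w the
    angle between the probe and the radiation direction.
    """
    return eps_w / math.pi * band_exitance(lam1, lam2, T_w) * S_w * math.cos(theta_w)

# Illustrative (assumed) values: a 900 K nozzle, 0.3 m^2 projected area, seen head-on
print(nozzle_intensity(T_w=900.0, S_w=0.3, theta_w=0.0))
```

As expected, the intensity grows with nozzle temperature and falls off as the viewing angle increases.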
</sec>
<sec>
<title>Aircraft Skin Radiation Model</title>
<p>Aircraft skin temperature is mainly determined by two factors: the ambient temperature of the atmosphere and the heat generated by friction between the aircraft and the atmosphere during high-speed flight. Because this study considers only aircraft flying at medium and low altitudes, the atmospheric ambient temperature <italic>T</italic><sub>0</sub> varies linearly with altitude <italic>H</italic> as <italic>T</italic><sub>0</sub> &#x0003D; (288.2 &#x02212; 0.0065<italic>H</italic>) K; we take <italic>T</italic><sub>0</sub> &#x0003D; 280 K for simplicity. The skin temperature <italic>T</italic><sub>M</sub> resulting from friction satisfies the following relationship with flight speed: <inline-formula><mml:math id="M5"><mml:msub><mml:mrow><mml:mi>T</mml:mi></mml:mrow><mml:mrow><mml:mstyle class="text"><mml:mtext class="textrm" mathvariant="normal">M</mml:mtext></mml:mstyle></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mi>T</mml:mi></mml:mrow><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x0002B;</mml:mo><mml:mn>0</mml:mn><mml:mo>.</mml:mo><mml:mn>16</mml:mn><mml:msup><mml:mrow><mml:mi>M</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:math></inline-formula>, where <italic>M</italic> is the Mach number of the aircraft.</p>
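<p>The skin-temperature model above is easy to reproduce numerically; the following sketch (a hypothetical helper using the constants from the text) evaluates <italic>T</italic><sub>M</sub> &#x0003D; <italic>T</italic><sub>0</sub>(1 &#x0002B; 0.16<italic>M</italic><sup>2</sup>):</p>

```python
def skin_temperature(mach, altitude_m=None):
    """Skin temperature T_M = T0 * (1 + 0.16 * M^2) from the text above.

    T0 follows the linear low/medium-altitude model T0 = (288.2 - 0.0065*H) K;
    when the altitude is unspecified, the paper's simplification T0 = 280 K
    is used.
    """
    T0 = 280.0 if altitude_m is None else 288.2 - 0.0065 * altitude_m
    return T0 * (1.0 + 0.16 * mach**2)

print(skin_temperature(1.0))   # about 324.8 K at Mach 1
print(skin_temperature(2.3))   # supersonic flight heats the skin much more
```

The quadratic dependence on Mach number is what makes fast targets markedly brighter in the mid-wave band.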
<p>Furthermore, according to Equation (2), the functional relationship between the aircraft skin radiation intensity <italic>I</italic><sub>M</sub> and temperature <italic>T</italic><sub>M</sub> is as follows:</p>
<disp-formula id="E4"><label>(4)</label><mml:math id="M6"><mml:mrow><mml:msub><mml:mi>I</mml:mi><mml:mi>M</mml:mi></mml:msub><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>&#x003B5;</mml:mi><mml:mi>M</mml:mi></mml:msub></mml:mrow><mml:mi>&#x003C0;</mml:mi></mml:mfrac><mml:mstyle displaystyle='true'><mml:mrow><mml:msubsup><mml:mo>&#x0222B;</mml:mo><mml:mrow><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:mrow><mml:mrow><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>2</mml:mn></mml:msub></mml:mrow></mml:msubsup><mml:mrow><mml:mfrac><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:mrow><mml:mrow><mml:msup><mml:mi>&#x003BB;</mml:mi><mml:mn>5</mml:mn></mml:msup></mml:mrow></mml:mfrac><mml:mfrac><mml:mn>1</mml:mn><mml:mrow><mml:msup><mml:mi>e</mml:mi><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mo>/</mml:mo><mml:mi>&#x003BB;</mml:mi><mml:msub><mml:mi>T</mml:mi><mml:mi>M</mml:mi></mml:msub></mml:mrow></mml:msup><mml:mo>&#x02212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:mfrac><mml:mi>d</mml:mi><mml:mi>&#x003BB;</mml:mi><mml:mo>&#x000B7;</mml:mo><mml:msub><mml:mi>S</mml:mi><mml:mi>M</mml:mi></mml:msub><mml:mo>&#x000B7;</mml:mo><mml:mi>cos</mml:mi><mml:msub><mml:mi>&#x003B8;</mml:mi><mml:mi>M</mml:mi></mml:msub></mml:mrow></mml:mrow></mml:mstyle></mml:mrow></mml:math></disp-formula>
<p>where &#x003B5;<sub><italic>M</italic></sub> is the skin surface emissivity, which is determined by the surface material of the aircraft skin. <italic>S</italic><sub>M</sub> is the cross-sectional area of the aircraft skin facing the probe, and &#x003B8;<sub><italic>M</italic></sub> is the angle between the probe and infrared radiation orientation.</p>
</sec>
<sec>
<title>Tail Flame Radiation Model</title>
<p>The high-temperature flame and gas ejected by the engine form the tail flame of the aircraft. We assume that the gas temperature in the tail nozzle is <italic>T</italic><sub>F</sub>, the tail flame temperature is <italic>T</italic><sub>P</sub>, and the gas pressures inside and outside the tail nozzle are <italic>P</italic><sub>F</sub> and <italic>P</italic><sub>P</sub>, respectively; then, we have:</p>
<disp-formula id="E5"><label>(5)</label><mml:math id="M7"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:msub><mml:mrow><mml:mi>T</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mi>T</mml:mi></mml:mrow><mml:mrow><mml:mi>F</mml:mi></mml:mrow></mml:msub><mml:msup><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi></mml:mrow></mml:msub><mml:mo>/</mml:mo><mml:msub><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>F</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>&#x003B3;</mml:mi><mml:mo>-</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>/</mml:mo><mml:mi>&#x003B3;</mml:mi></mml:mrow></mml:msup></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>where &#x003B3; is the specific heat ratio of the gas; its value for turbofan aeroengines is approximately 1.3. According to Equation (2), the functional relationship between the radiation intensity <italic>I</italic><sub>P</sub> of the tail flame and its temperature <italic>T</italic><sub>P</sub> can be established as follows:</p>
<disp-formula id="E6"><label>(6)</label><mml:math id="M8"><mml:mrow><mml:msub><mml:mi>I</mml:mi><mml:mi>p</mml:mi></mml:msub><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>&#x003B5;</mml:mi><mml:mi>p</mml:mi></mml:msub></mml:mrow><mml:mi>&#x003C0;</mml:mi></mml:mfrac><mml:mstyle displaystyle='true'><mml:mrow><mml:msubsup><mml:mo>&#x0222B;</mml:mo><mml:mrow><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:mrow><mml:mrow><mml:msub><mml:mi>&#x003BB;</mml:mi><mml:mn>2</mml:mn></mml:msub></mml:mrow></mml:msubsup><mml:mrow><mml:mfrac><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:mrow><mml:mrow><mml:msup><mml:mi>&#x003BB;</mml:mi><mml:mn>5</mml:mn></mml:msup></mml:mrow></mml:mfrac><mml:mfrac><mml:mn>1</mml:mn><mml:mrow><mml:msup><mml:mi>e</mml:mi><mml:mrow><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mo>/</mml:mo><mml:mi>&#x003BB;</mml:mi><mml:msub><mml:mi>T</mml:mi><mml:mi>p</mml:mi></mml:msub></mml:mrow></mml:msup><mml:mo>&#x02212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:mfrac><mml:mi>d</mml:mi><mml:mi>&#x003BB;</mml:mi><mml:mo>&#x000B7;</mml:mo><mml:msub><mml:mi>S</mml:mi><mml:mi>p</mml:mi></mml:msub><mml:mo>&#x000B7;</mml:mo><mml:mi>cos</mml:mi><mml:msub><mml:mi>&#x003B8;</mml:mi><mml:mi>p</mml:mi></mml:msub></mml:mrow></mml:mrow></mml:mstyle></mml:mrow></mml:math></disp-formula>
<p>where &#x003B5;<sub><italic>P</italic></sub> is the surface emissivity of the aircraft tail flame, <italic>S</italic><sub>P</sub> is the cross-sectional area of the aircraft tail flame facing the probe, and &#x003B8;<sub><italic>P</italic></sub> is the angle between the probe and infrared radiation orientation. To improve the visual effect, the tail flame is typically simulated by a particle flow. Based on the above infrared radiation model, a 3D target with infrared radiation characteristics is obtained. The infrared radiation intensity of an aircraft changes dynamically with its speed and attitude. <xref ref-type="fig" rid="F2">Figure 2</xref> shows the simulation effect of the F-35 aircraft at different attitudes, and <xref ref-type="fig" rid="F3">Figure 3</xref> that of the Su-35 aircraft at different speeds.</p>
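<p>Equation (5) is straightforward to evaluate; a minimal sketch with illustrative (assumed) pressure and temperature values follows, pairing <italic>P</italic><sub>F</sub> with the nozzle-gas temperature <italic>T</italic><sub>F</sub> and <italic>P</italic><sub>P</sub> with the tail flame, as the subscripts in Equation (5) suggest:</p>

```python
def tail_flame_temperature(T_F, P_p, P_F, gamma=1.3):
    """Equation (5): tail-flame temperature after isentropic expansion.

    T_F is the gas temperature in the nozzle, P_F the pressure inside the
    nozzle, P_p the pressure outside it; gamma = 1.3 is the specific heat
    ratio quoted for turbofan aeroengines.
    """
    return T_F * (P_p / P_F) ** ((gamma - 1.0) / gamma)

# Illustrative (assumed) values: 900 K gas expanding from 1.8 atm down to 1 atm
print(tail_flame_temperature(T_F=900.0, P_p=101325.0, P_F=1.8 * 101325.0))
```

With the outside pressure below the nozzle pressure, the exhaust expands and cools, so the flame temperature comes out below <italic>T</italic><sub>F</sub>.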
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p>Simulation effect of F-35 aircraft at different attitudes. The speed is Mach 1, and the background is a real infrared image. The coordinates are roll, yaw, and pitch.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fncom-16-930827-g0002.tif"/>
</fig>
<fig id="F3" position="float">
<label>Figure 3</label>
<caption><p>Infrared characteristics of Su-35 aircraft at different speeds. The speed varies from Mach 0.6 to Mach 2.3.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fncom-16-930827-g0003.tif"/>
</fig>
</sec>
</sec>
<sec>
<title>Panoramic Stitching of Infrared Images</title>
<p>We expect the targets to fly in a wide infrared scene to obtain a simulated image sequence of moving targets. However, the field of view of infrared sensors is typically narrow. For example, the field of view in the public infrared dataset (Hui et al., <xref ref-type="bibr" rid="B10">2020</xref>) (dataset used for infrared detection and tracking of dim-small aircraft targets under a ground/air background, <ext-link ext-link-type="uri" xlink:href="http://www.csdata.org/p/387/">http://www.csdata.org/p/387/</ext-link>) is only 1&#x000B0; &#x000D7; 1&#x000B0;.</p>
<p>To obtain a continuous projection of the moving target in a real infrared scene, it is necessary to stitch infrared images with a narrow field of view into a panoramic image. Given the weak texture and low contrast of infrared images, a stitching and fusion method tailored to infrared imagery must be adopted; the details are given in our previous paper (Zhijian et al., <xref ref-type="bibr" rid="B26">2021</xref>), which describes how to stitch a panoramic image from infrared image sequences. <xref ref-type="fig" rid="F4">Figure 4</xref> shows part of the stitching results.</p>
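<p>The full registration-and-stitching pipeline is described in our previous paper; purely as an illustration of the seam-smoothing idea, the sketch below linearly feathers two already-registered grayscale strips across their overlap (a simplified stand-in, not the published algorithm):</p>

```python
import numpy as np

def feather_blend(left, right, overlap):
    """Linearly feather two already-registered grayscale strips across
    `overlap` shared columns.  This illustrates only the seam-smoothing
    step of mosaicking low-contrast IR frames; image registration itself
    (feature matching, homography estimation) is not shown.
    """
    h, wl = left.shape
    wr = right.shape[1]
    out = np.zeros((h, wl + wr - overlap))
    out[:, :wl - overlap] = left[:, :wl - overlap]     # left-only region
    out[:, wl:] = right[:, overlap:]                   # right-only region
    w = np.linspace(1.0, 0.0, overlap)                 # weight ramp, 1 -> 0
    out[:, wl - overlap:wl] = left[:, wl - overlap:] * w + right[:, :overlap] * (1.0 - w)
    return out

a = np.full((4, 6), 100.0)   # darker strip
b = np.full((4, 6), 120.0)   # brighter strip
pano = feather_blend(a, b, overlap=2)
print(pano.shape)            # (4, 10)
```

The ramped weights avoid the visible seam that a hard cut would leave between frames of slightly different gain.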
<fig id="F4" position="float">
<label>Figure 4</label>
<caption><p>Panoramic stitching results of real infrared images.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fncom-16-930827-g0004.tif"/>
</fig>
</sec>
<sec>
<title>Fusion of Simulated Targets and Real Infrared Scene</title>
<p>This study realized the fusion of a static real infrared scene and dynamic simulated targets based on the Unity3D engine. The main steps are as follows: (1) Construct a hemisphere centered at the camera position whose radius is the real farthest observation distance; the panoramic image obtained by stitching real infrared images is used as the surface texture covering the hemisphere, yielding a pseudo-3D scene, as shown in <xref ref-type="fig" rid="F5">Figure 5</xref>. (2) Fly the 3D infrared simulation target through the 3D space along the flight trajectory, which sets information such as the position, attitude, and speed of the aircraft at each moment. (3) Dynamically adjust the observation position and viewing angle through human&#x02013;computer interaction to track and observe the targets. (4) In each observation frame, project the target onto the infrared background to obtain target infrared data with a real infrared background. With continuous observation, dynamic simulated image sequences of the targets can be obtained.</p>
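<p>Step (4) amounts to a perspective projection of each target into the narrow-field camera. The following simplified pinhole-camera stand-in for what the Unity3D camera performs each frame is illustrative only; its defaults mirror the 1&#x000B0; &#x000D7; 1&#x000B0;, 256 &#x000D7; 256 sensor of the real dataset:</p>

```python
import math

def project_to_pixels(p_cam, fov_deg=1.0, width=256, height=256):
    """Pinhole projection of a target point in camera coordinates
    (x right, y up, z forward, metres) onto the image plane.  A simplified
    stand-in for the Unity3D observation camera; returns None when the
    point is behind the camera or outside the field of view.
    """
    x, y, z = p_cam
    if z <= 0.0:
        return None
    # focal length in pixels from the horizontal field of view
    f = (width / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    u = width / 2.0 + f * x / z
    v = height / 2.0 - f * y / z
    if not (0.0 <= u < width and 0.0 <= v < height):
        return None
    return (u, v)

# A target 5 km ahead of the camera, 20 m right and 10 m above the optical axis
print(project_to_pixels((20.0, 10.0, 5000.0)))
```

With a 1&#x000B0; field of view, the focal length is roughly 14,700 pixels, which is why even small lateral offsets at long range remain within the 256-pixel frame.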
<fig id="F5" position="float">
<label>Figure 5</label>
<caption><p>Fusion of simulation targets and real infrared scene.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fncom-16-930827-g0005.tif"/>
</fig>
</sec>
</sec>
<sec id="s3">
<title>Experiment and Analysis</title>
<sec>
<title>Dataset and Experiment Setting</title>
<p>The real infrared data used in this experiment come from the public infrared dataset (Hui et al., <xref ref-type="bibr" rid="B10">2020</xref>) (dataset used for infrared detection and tracking of dim-small aircraft targets under a ground/air background, <ext-link ext-link-type="uri" xlink:href="http://www.csdata.org/p/387/">http://www.csdata.org/p/387/</ext-link>). The dataset covers a variety of scenes, such as sky and ground, with a total of 22 data segments, 30 tracks, 16,177 images, and 16,944 targets. Each frame is a grayscale image with a resolution of 256 &#x000D7; 256 pixels, stored in BMP format and captured with a 1&#x000B0; &#x000D7; 1&#x000B0; field of view. Each target corresponds to a labeled position, and each data segment corresponds to a label file. This dataset is commonly used for basic research on dim-small target detection, precision guidance, and infrared target characteristics.</p>
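<p>For readers who wish to iterate over the dataset, a minimal loader sketch follows; the directory layout and zero-padded file names are assumptions made for illustration, so consult the dataset page for the actual structure and label-file format:</p>

```python
from pathlib import Path

def list_sequence_frames(root, segment):
    """Collect the BMP frames of one data segment in sorted order.

    The layout root/<segment>/*.bmp and zero-padded frame names are
    assumptions made for illustration; consult the dataset page
    (http://www.csdata.org/p/387/) for the actual directory structure and
    the format of the per-segment label files.
    """
    return [str(p) for p in sorted(Path(root, segment).glob("*.bmp"))]
```

Zero-padded names make the lexicographic sort coincide with temporal frame order.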
<p>The hardware environment of this experiment was a dual-core CPU above 2.0 GHz with at least 4 GB of memory; the software environment was Windows 7 or later. The experiment was developed on Unity3D version 2021.2.6f1; the development language was C&#x00023;, and the development platform was Visual Studio 2017.</p>
</sec>
<sec>
<title>Subjective Analysis</title>
<p>We selected four scenes from the real infrared data introduced in (Hui et al., <xref ref-type="bibr" rid="B10">2020</xref>): sky background, ground background, mixed background, and sky multi-target, taken from data 1, data 7, data 3, and data 2, respectively, in the public dataset. Correspondingly, we extracted the same four scenarios from the simulation data; the comparative results are shown in <xref ref-type="fig" rid="F6">Figure 6</xref>. Visually, both the real and simulated data share the following characteristics: (1) the images are gray overall, which conforms to the characteristics of infrared images; (2) the images have low contrast and relatively few textural features; (3) the targets appear as bright spots that diffuse into their surroundings. Therefore, the simulated and real infrared data are intuitively similar.</p>
<fig id="F6" position="float">
<label>Figure 6</label>
<caption><p>Real infrared scene and simulated infrared scene.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fncom-16-930827-g0006.tif"/>
</fig>
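<p>Characteristic (3), a bright spot diffusing into its surroundings, is commonly modeled as a 2D Gaussian blob added onto a low-contrast gray background. The sketch below illustrates the idea; the amplitude and spread values are illustrative choices, not parameters from the paper.</p>

```python
import math

def inject_point_target(image, cx, cy, amplitude=60.0, sigma=1.2):
    """Add a 2D Gaussian blob (a dim point target diffusing into its
    surroundings) onto a grayscale image given as a list of rows of
    floats.  Intensity is clipped to the 8-bit maximum of 255."""
    h, w = len(image), len(image[0])
    radius = int(3 * sigma) + 1  # truncate the Gaussian at ~3 sigma
    for r in range(max(0, int(cy) - radius), min(h, int(cy) + radius + 1)):
        for c in range(max(0, int(cx) - radius), min(w, int(cx) + radius + 1)):
            d2 = (r - cy) ** 2 + (c - cx) ** 2
            image[r][c] = min(255.0,
                              image[r][c]
                              + amplitude * math.exp(-d2 / (2 * sigma ** 2)))
    return image
```

<p>Applied to a uniform gray patch, the peak lands at the target center and the added intensity decays smoothly to the background level within a few pixels, matching the visual appearance described above.</p>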
</sec>
<sec>
<title>Objective Analysis</title>
<p>The purpose of this study was to provide simulation data for the training and testing of infrared target detection and tracking models. Therefore, determining whether the performance of an algorithm on simulated data is consistent with its performance on real data is the most effective evaluation method (Deng et al., <xref ref-type="bibr" rid="B6">2022</xref>). We used two algorithms (Tianjun et al., <xref ref-type="bibr" rid="B17">2019</xref>; Xianbu et al., <xref ref-type="bibr" rid="B19">2019</xref>) employed in the 2nd Sky Cup National Innovation and Creativity Competition in 2019 for testing, comparing their performance on real infrared data and on simulated data generated by our method.</p>
<p>In the experiment, the data shown in <xref ref-type="fig" rid="F6">Figure 6</xref> were used; the real infrared data came from data 1, data 7, data 3, and data 2 in the public dataset (Hui et al., <xref ref-type="bibr" rid="B10">2020</xref>). The simulation data also included the sky background, ground background, mixed background, and multiple targets. The resolution was 256 &#x000D7; 256. The targets were all small, that is, &#x0003C;10 pixels.</p>
<p>As in (Zhijian et al., <xref ref-type="bibr" rid="B26">2021</xref>; Deng et al., <xref ref-type="bibr" rid="B6">2022</xref>), four indicators were used to evaluate the performance of the algorithms: the accurate detection rate, correct detection rate, missed detection rate, and false alarm rate. An accurate detection (Acc) occurs when the detection result is within the 3 &#x000D7; 3 pixel range of the ground truth. A correct detection (Corr) occurs when the detection result is within the 9 &#x000D7; 9 pixel range of the ground truth. A missed detection (Miss) occurs when the detection result is outside the 9 &#x000D7; 9 pixel range of the ground truth. A false alarm (FA) refers to a detected non-real target. <xref ref-type="table" rid="T1">Tables 1</xref>, <xref ref-type="table" rid="T2">2</xref> present the detection results without changing any parameters of the original algorithms.</p>
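<p>The four indicators can be computed as in the sketch below, reading the 3 &#x000D7; 3 and 9 &#x000D7; 9 windows as &#x000B1;1 and &#x000B1;4 pixels around the ground truth; the greedy nearest-detection matching rule and the false-alarm denominator are simplifying assumptions of this sketch, not the competition's scoring protocol.</p>

```python
def evaluate(detections, ground_truths):
    """Score detections against ground-truth positions with the four
    indicators above: Acc (within the 3x3 window, i.e. Chebyshev
    distance <= 1 pixel), Corr (within 9x9, i.e. <= 4), Miss, and FA.
    Greedy nearest-detection matching and the FA denominator are
    simplifying assumptions, not the paper's exact protocol."""
    acc = corr = miss = 0
    matched = set()
    for gx, gy in ground_truths:
        best, best_d = None, None
        for i, (dx, dy) in enumerate(detections):
            if i in matched:
                continue
            d = max(abs(dx - gx), abs(dy - gy))  # Chebyshev distance
            if best_d is None or d < best_d:
                best, best_d = i, d
        if best is not None and best_d <= 4:
            matched.add(best)
            corr += 1
            if best_d <= 1:
                acc += 1  # an accurate detection is also a correct one
        else:
            miss += 1
    fa = len(detections) - len(matched)  # detections matching no target
    n = max(1, len(ground_truths))
    return {"Acc": acc / n, "Corr": corr / n, "Miss": miss / n,
            "FA": fa / max(1, len(detections))}
```

<p>Note that every accurate detection also counts as a correct detection, which is why Acc never exceeds Corr in the tables.</p>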
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p>Infrared target detection results on real and simulated data with algorithm (Tianjun et al., <xref ref-type="bibr" rid="B17">2019</xref>).</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th/>
<th valign="top" align="center" colspan="2" style="border-bottom: thin solid #000000;"><bold>Sky</bold></th>
<th valign="top" align="center" colspan="2" style="border-bottom: thin solid #000000;"><bold>Ground</bold></th>
<th valign="top" align="center" colspan="2" style="border-bottom: thin solid #000000;"><bold>Mixed</bold></th>
<th valign="top" align="center" colspan="2" style="border-bottom: thin solid #000000;"><bold>Multi-targets</bold></th>
</tr>
<tr>
<th/>
<th valign="top" align="center"><bold>Real</bold></th>
<th valign="top" align="center"><bold>Simu</bold></th>
<th valign="top" align="center"><bold>Real</bold></th>
<th valign="top" align="center"><bold>Simu</bold></th>
<th valign="top" align="center"><bold>Real</bold></th>
<th valign="top" align="center"><bold>Simu</bold></th>
<th valign="top" align="center"><bold>Real</bold></th>
<th valign="top" align="center"><bold>Simu</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Acc (%)</td>
<td valign="top" align="center">100</td>
<td valign="top" align="center">99.5</td>
<td valign="top" align="center">91.5</td>
<td valign="top" align="center">84.7</td>
<td valign="top" align="center">94.7</td>
<td valign="top" align="center">90.2</td>
<td valign="top" align="center">99.0</td>
<td valign="top" align="center">98.4</td>
</tr>
<tr>
<td valign="top" align="left">Corr (%)</td>
<td valign="top" align="center">100</td>
<td valign="top" align="center">100</td>
<td valign="top" align="center">93.0</td>
<td valign="top" align="center">90.1</td>
<td valign="top" align="center">96.0</td>
<td valign="top" align="center">93.4</td>
<td valign="top" align="center">99.5</td>
<td valign="top" align="center">99.5</td>
</tr>
<tr>
<td valign="top" align="left">Miss (%)</td>
<td valign="top" align="center">0.0</td>
<td valign="top" align="center">0.0</td>
<td valign="top" align="center">4.0</td>
<td valign="top" align="center">9.9</td>
<td valign="top" align="center">4.0</td>
<td valign="top" align="center">6.6</td>
<td valign="top" align="center">0.5</td>
<td valign="top" align="center">0.5</td>
</tr>
<tr>
<td valign="top" align="left">FA (%)</td>
<td valign="top" align="center">0.0</td>
<td valign="top" align="center">0.0</td>
<td valign="top" align="center">1.8</td>
<td valign="top" align="center">3.0</td>
<td valign="top" align="center">1.2</td>
<td valign="top" align="center">0.5</td>
<td valign="top" align="center">0.0</td>
<td valign="top" align="center">0.0</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap position="float" id="T2">
<label>Table 2</label>
<caption><p>Infrared target detection results on real and simulated data with algorithm (Xianbu et al., <xref ref-type="bibr" rid="B19">2019</xref>).</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th/>
<th valign="top" align="center" colspan="2" style="border-bottom: thin solid #000000;"><bold>Sky</bold></th>
<th valign="top" align="center" colspan="2" style="border-bottom: thin solid #000000;"><bold>Ground</bold></th>
<th valign="top" align="center" colspan="2" style="border-bottom: thin solid #000000;"><bold>Mixed</bold></th>
<th valign="top" align="center" colspan="2" style="border-bottom: thin solid #000000;"><bold>Multi-targets</bold></th>
</tr>
<tr>
<th/>
<th valign="top" align="center"><bold>Real</bold></th>
<th valign="top" align="center"><bold>Simu</bold></th>
<th valign="top" align="center"><bold>Real</bold></th>
<th valign="top" align="center"><bold>Simu</bold></th>
<th valign="top" align="center"><bold>Real</bold></th>
<th valign="top" align="center"><bold>Simu</bold></th>
<th valign="top" align="center"><bold>Real</bold></th>
<th valign="top" align="center"><bold>Simu</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Acc (%)</td>
<td valign="top" align="center">100</td>
<td valign="top" align="center">99.2</td>
<td valign="top" align="center">92.7</td>
<td valign="top" align="center">88.3</td>
<td valign="top" align="center">65.0</td>
<td valign="top" align="center">70.0</td>
<td valign="top" align="center">98.7</td>
<td valign="top" align="center">92.4</td>
</tr>
<tr>
<td valign="top" align="left">Corr (%)</td>
<td valign="top" align="center">100</td>
<td valign="top" align="center">100</td>
<td valign="top" align="center">97.2</td>
<td valign="top" align="center">92.1</td>
<td valign="top" align="center">79.0</td>
<td valign="top" align="center">83.3</td>
<td valign="top" align="center">99.2</td>
<td valign="top" align="center">95.3</td>
</tr>
<tr>
<td valign="top" align="left">Miss (%)</td>
<td valign="top" align="center">0.0</td>
<td valign="top" align="center">0.0</td>
<td valign="top" align="center">2.8</td>
<td valign="top" align="center">17.9</td>
<td valign="top" align="center">21.0</td>
<td valign="top" align="center">16.7</td>
<td valign="top" align="center">0.8</td>
<td valign="top" align="center">4.7</td>
</tr>
<tr>
<td valign="top" align="left">FA (%)</td>
<td valign="top" align="center">0.0</td>
<td valign="top" align="center">0.0</td>
<td valign="top" align="center">1.5</td>
<td valign="top" align="center">2.3</td>
<td valign="top" align="center">0.0</td>
<td valign="top" align="center">1.3</td>
<td valign="top" align="center">0.0</td>
<td valign="top" align="center">0.0</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>As shown in <xref ref-type="table" rid="T1">Table 1</xref>, the algorithm reported in (Tianjun et al., <xref ref-type="bibr" rid="B17">2019</xref>) performed well on all four scene types, particularly on the Acc and Corr indicators for the sky background and multi-target scenes, where it reached more than 99%. Its performance on the ground and mixed backgrounds was slightly worse; nevertheless, the accurate detection rate remained above 90%. On the simulation data, the algorithm likewise performed well on the sky background and multi-target scenes, with results similar to those on the real data. On the ground and mixed backgrounds, the detection results on the simulated data were slightly worse than on the real data; nevertheless, the maximum difference in accurate detection rates was no more than 7 percentage points (on the ground background, the accurate detection rates of the real and simulated data differed by 6.8 percentage points).</p>
<p><xref ref-type="fig" rid="F7">Figure 7</xref> compares the performance of the algorithm (Tianjun et al., <xref ref-type="bibr" rid="B17">2019</xref>) on the simulation data generated by our method and on the real data. When the algorithm performs well on the real data, it also performs well on the simulation data, as in the sky and multi-target scenarios. When its performance on the real data is poor, its performance on the simulation data is also poor, as in the ground and mixed scenarios. This consistency is reflected in both the Acc and Corr indicators. Therefore, with respect to the performance of the algorithm (Tianjun et al., <xref ref-type="bibr" rid="B17">2019</xref>), the simulation data generated by our method are consistent with the real data.</p>
<fig id="F7" position="float">
<label>Figure 7</label>
<caption><p>Performance of simulation data and real data on algorithm (Tianjun et al., <xref ref-type="bibr" rid="B17">2019</xref>).</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fncom-16-930827-g0007.tif"/>
</fig>
<p>As shown in <xref ref-type="table" rid="T2">Table 2</xref>, the performance of the algorithm (Xianbu et al., <xref ref-type="bibr" rid="B19">2019</xref>) is similar to that of the algorithm (Tianjun et al., <xref ref-type="bibr" rid="B17">2019</xref>) on the sky background, ground background, and multi-target scenes; however, its Acc drops to 65% on the mixed background. This may be related to the applicability of the algorithm to different scenarios. Interestingly, its Acc on the simulated data also drops, to 70%. Both the simulation data and the real data thus expose the low performance of the algorithm (Xianbu et al., <xref ref-type="bibr" rid="B19">2019</xref>) in mixed scenarios. Regardless of the scenario, the maximum difference between the accurate detection rates on the simulated and real data is still &#x0003C;7 percentage points (in the multi-target scenario, the accurate detection rates of the real and simulated data differ by 6.3 percentage points).</p>
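<p>The quoted gaps can be checked directly from the accurate detection rates in Tables 1, 2; the following sketch recomputes the maximum real-versus-simulated difference per table.</p>

```python
# Accurate detection rates (%) transcribed from Tables 1 and 2,
# as (real, simulated) pairs per scene.
ACC = {
    "Table 1": {"Sky": (100.0, 99.5), "Ground": (91.5, 84.7),
                "Mixed": (94.7, 90.2), "Multi-targets": (99.0, 98.4)},
    "Table 2": {"Sky": (100.0, 99.2), "Ground": (92.7, 88.3),
                "Mixed": (65.0, 70.0), "Multi-targets": (98.7, 92.4)},
}

def max_acc_gap(table):
    """Largest absolute real-vs-simulated Acc gap (percentage points)."""
    return max(abs(real - simu) for real, simu in ACC[table].values())
```

<p>Running it reproduces the gaps quoted in the text: 6.8 percentage points for Table 1 (on the ground background) and 6.3 for Table 2 (in the multi-target scenario).</p>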
<p>Similarly, <xref ref-type="fig" rid="F8">Figure 8</xref> compares the performance of the algorithm (Xianbu et al., <xref ref-type="bibr" rid="B19">2019</xref>) on the simulation data generated by our method and on the real data. When the algorithm performs well on the real data, it also performs well on the simulation data, as in the sky, ground, and multi-target scenarios. When its performance on the real data is poor, its performance on the simulation data is also poor, as in the mixed scene. Therefore, with respect to the performance of the algorithm (Xianbu et al., <xref ref-type="bibr" rid="B19">2019</xref>), the simulation data generated by our method are consistent with the real data.</p>
<fig id="F8" position="float">
<label>Figure 8</label>
<caption><p>Performance of simulation data and real data on algorithm (Xianbu et al., <xref ref-type="bibr" rid="B19">2019</xref>).</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fncom-16-930827-g0008.tif"/>
</fig>
</sec>
</sec>
<sec id="s4">
<title>Conclusion and Future Work</title>
<p>Training infrared target detection and tracking models based on deep learning requires a large number of infrared sequence images. The cost of acquiring real infrared target sequence images is high, while conventional simulation methods lack authenticity. This paper proposes a novel infrared data simulation method that combines real infrared images with simulated 3D infrared targets. First, real infrared images are stitched into a panoramic image, which is used as the background. Then, the infrared characteristics of a 3D aircraft are simulated for the tail nozzle, skin, and tail flame, which serve as the targets. Finally, the background and targets are fused in Unity3D, where the aircraft trajectory and attitude can be edited freely to generate rich multi-target infrared data. The experimental results show that the simulated images are not only visually similar to real infrared images but also consistent with them in terms of the performance of target detection algorithms. The method can provide training and testing samples for deep learning models for infrared target detection and tracking.</p>
<p>The infrared simulation of the target in this method does not yet consider environmental factors (such as weather, temperature, and illumination) or sensor error. The precision of the target infrared simulation needs to be further improved to meet the requirements of special application scenarios; this is the direction of our future work.</p>
</sec>
<sec sec-type="data-availability" id="s5">
<title>Data Availability Statement</title>
<p>The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding authors.</p>
</sec>
<sec id="s6">
<title>Author Contributions</title>
<p>HZ contributed the main ideas and designed the algorithm. HB contributed the main ideas. SS contributed to the experiments and result analysis. All authors contributed to the article and approved the submitted version.</p>
</sec>
<sec sec-type="funding-information" id="s7">
<title>Funding</title>
<p>This work was supported by the Key Laboratory Fund of Basic Strengthening Program (JKWATR-210503), Changsha Municipal Natural Science Foundation (kq2202067), and the Basic Science and Technology Research Project of the National Key Laboratory of Science and Technology on Automatic Target Recognition of Scientific Research under Grant (WDZC20205500209).</p>
</sec>
<sec sec-type="COI-statement" id="conf1">
<title>Conflict of Interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec sec-type="disclaimer" id="s8">
<title>Publisher&#x00027;s Note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
</body>
<back>
<ack><p>Professor Fan Hongqi from the ATR Key Laboratory, National University of Defense Technology, provided real infrared data. We thank Shi Tianjun of the Harbin Institute of Technology and Dong Xiaohu of the National University of Defense Technology for their outstanding contributions in the second Sky-Cup National Innovation and Creativity Competition, which provided the test algorithms.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Alec</surname> <given-names>R.</given-names></name> <name><surname>Luke</surname> <given-names>M.</given-names></name> <name><surname>Soumith</surname> <given-names>C.</given-names></name></person-group> (<year>2016</year>). <source>Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks</source>. <publisher-loc>San Juan</publisher-loc>: <publisher-name>ICLR</publisher-name>.</citation></ref>
<ref id="B2">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bhalla</surname> <given-names>K.</given-names></name> <name><surname>Koundal</surname> <given-names>D.</given-names></name> <name><surname>Bhatia</surname> <given-names>S.</given-names></name> <name><surname>Rahmani</surname> <given-names>M. K.</given-names></name> <name><surname>Tahir</surname> <given-names>M.</given-names></name></person-group> (<year>2022</year>). <article-title>Fusion of infrared and visible images using fuzzy based siamese convolutional network</article-title>. <source>Comput. Mater. Continua.</source> <volume>3</volume>, <fpage>2022</fpage>. <pub-id pub-id-type="doi">10.32604/cmc.2022.021125</pub-id></citation>
</ref>
<ref id="B3">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Chenyang</surname> <given-names>L.</given-names></name></person-group> (<year>2019</year>). <source>The Infrared Imaging Simulation System Based on Three-Dimensional Scene and its Implementation</source>. <publisher-loc>Beijing</publisher-loc>: <publisher-name>The University of Chinese Academy of Sciences</publisher-name>.</citation>
</ref>
<ref id="B4">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Chuanyu</surname> <given-names>Z.</given-names></name></person-group> (<year>2013</year>). <source>Infrared Image Formation for Multiple Targets</source>. <publisher-loc>Harbin</publisher-loc>: <publisher-name>Harbin Institute of Technology</publisher-name>.</citation></ref>
<ref id="B5">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dai</surname> <given-names>Y.</given-names></name> <name><surname>Wu</surname> <given-names>Y.</given-names></name> <name><surname>Zhou</surname> <given-names>F.</given-names></name> <name><surname>Barnard</surname> <given-names>K.</given-names></name></person-group> (<year>2021</year>). <article-title>Attentional local contrast networks for infrared small target detection</article-title>. <source>IEEE Trans. Geosci. Remote Sens</source>. <volume>59</volume>, <fpage>9813</fpage>&#x02013;<lpage>9824</lpage>. <pub-id pub-id-type="doi">10.1109/TGRS.2020.3044958</pub-id></citation>
</ref>
<ref id="B6">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Deng</surname> <given-names>L.</given-names></name> <name><surname>Xu</surname> <given-names>D.</given-names></name> <name><surname>Xu</surname> <given-names>G.</given-names></name> <name><surname>Zhu</surname> <given-names>H.</given-names></name></person-group> (<year>2022</year>). <article-title>A generalized low-rank double-tensor nuclear norm completion framework for infrared small target detection</article-title>. <source>IEEE Trans. Aerosp. Electr. Syst</source>. <fpage>1</fpage>. <pub-id pub-id-type="doi">10.1109/TAES.2022.3147437</pub-id></citation>
</ref>
<ref id="B7">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Guanfeng</surname> <given-names>Y.</given-names></name> <name><surname>Changhao</surname> <given-names>Z.</given-names></name> <name><surname>Yue</surname> <given-names>C.</given-names></name></person-group> (<year>2019</year>). <article-title>Research on infrared imaging simulation for enhanced synthetic vision system</article-title>. <source>Aeronaut. Comput. Tech</source>. <volume>49</volume>, <fpage>100</fpage>&#x02013;<lpage>103</lpage>.</citation>
</ref>
<ref id="B8">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Haixing</surname> <given-names>Z.</given-names></name> <name><surname>Jianqi</surname> <given-names>Z.</given-names></name> <name><surname>Wei</surname> <given-names>Y.</given-names></name></person-group> (<year>1997</year>). <article-title>Theoretical calculation of the IR radiation of an aeroplane</article-title>. <source>J. Xidian Univ</source>. <volume>24</volume>, <fpage>78</fpage>&#x02013;<lpage>82</lpage>.</citation>
</ref>
<ref id="B9">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hou</surname> <given-names>Q.</given-names></name> <name><surname>Wang</surname> <given-names>Z.</given-names></name> <name><surname>Tan</surname> <given-names>F.</given-names></name> <name><surname>Zhao</surname> <given-names>Y.</given-names></name> <name><surname>Zheng</surname> <given-names>H.</given-names></name> <name><surname>Zhang</surname> <given-names>W.</given-names></name> <etal/></person-group>. (<year>2022</year>). <article-title>RISTDnet: robust infrared small target detection network</article-title>. <source>IEEE Geosci. Remote Sens. Lett</source>. <volume>19</volume>, <fpage>1</fpage>&#x02013;<lpage>5</lpage>. <pub-id pub-id-type="doi">10.1109/LGRS.2021.3050828</pub-id></citation>
</ref>
<ref id="B10">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hui</surname> <given-names>B.</given-names></name> <name><surname>Song</surname> <given-names>Z.</given-names></name> <name><surname>Fan</surname> <given-names>H.</given-names></name> <name><surname>Zhong</surname> <given-names>P.</given-names></name> <name><surname>Hu</surname> <given-names>W.</given-names></name> <name><surname>Zhang</surname> <given-names>X.</given-names></name> <etal/></person-group>. (<year>2020</year>). <article-title>A dataset for infrared detection and tracking of dim-small aircraft targets under ground/air background</article-title>. <source>Chinese Sci. Data</source>. <volume>5</volume>, <fpage>291</fpage>&#x02013;<lpage>302</lpage>. <pub-id pub-id-type="doi">10.11922/csdata.2019.0074.zh</pub-id></citation>
</ref>
<ref id="B11">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Junhong</surname> <given-names>L.</given-names></name> <name><surname>Ping</surname> <given-names>Z.</given-names></name> <name><surname>Xiaowei</surname> <given-names>W.</given-names></name> <name><surname>Shize</surname> <given-names>H.</given-names></name></person-group> (<year>2020</year>). <article-title>Infrared small-target detection algorithms: a survey</article-title>. <source>J. Image Graph</source>. <volume>25</volume>, <fpage>1739</fpage>&#x02013;<lpage>1753</lpage>. <pub-id pub-id-type="doi">10.11834/jig.190574</pub-id></citation>
</ref>
<ref id="B12">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Junyan</surname> <given-names>Z.</given-names></name> <name><surname>Taesung</surname> <given-names>P.</given-names></name> <name><surname>Isola</surname> <given-names>P.</given-names></name></person-group> (<year>2017</year>). <source>Unpaired Image-to-Image Translation Using Cycle-Consistent Adversarial Networks</source>. <publisher-loc>Venice</publisher-loc>: <publisher-name>ICCV</publisher-name>.</citation>
</ref>
<ref id="B13">
<citation citation-type="web"><person-group person-group-type="author"><name><surname>Mirza</surname> <given-names>M.</given-names></name> <name><surname>Osindero</surname> <given-names>S.</given-names></name></person-group> (<year>2014</year>). <article-title>Conditional generative adversarial nets</article-title>. <source>Comput. Sci.</source> <fpage>2672</fpage>&#x02013;<lpage>80</lpage>. Available online at: <ext-link ext-link-type="uri" xlink:href="https://arxiv.org/abs/1411.1784">https://arxiv.org/abs/1411.1784</ext-link></citation>
</ref>
<ref id="B14">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rani</surname> <given-names>S.</given-names></name> <name><surname>Singh</surname> <given-names>B. K.</given-names></name> <name><surname>Koundal</surname> <given-names>D.</given-names></name> <name><surname>Athavale</surname> <given-names>V. A.</given-names></name></person-group> (<year>2022</year>). <article-title>Localization of stroke lesion in MRI images using object detection techniques: a comprehensive review</article-title>. <source>Neurosci Inform</source>. <volume>2</volume>, <fpage>100070</fpage>. <pub-id pub-id-type="doi">10.1016/j.neuri.2022.100070</pub-id></citation>
</ref>
<ref id="B15">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shi</surname> <given-names>Q.</given-names></name> <name><surname>Gao</surname> <given-names>Y.</given-names></name> <name><surname>Zhang</surname> <given-names>X.</given-names></name> <name><surname>Li</surname> <given-names>Z.</given-names></name> <name><surname>Du</surname> <given-names>J.</given-names></name> <name><surname>Shi</surname> <given-names>R.</given-names></name> <etal/></person-group>. (<year>2021</year>). <article-title>Cryogenic background infrared scene generation method based on a light-driven blackbody micro cavity array</article-title>. <source>Infrared Phys. Technol</source>. <volume>117</volume>, <fpage>103841</fpage>. <pub-id pub-id-type="doi">10.1016/j.infrared.2021.103841</pub-id></citation>
</ref>
<ref id="B16">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shuwei</surname> <given-names>T.</given-names></name> <name><surname>Bo</surname> <given-names>X.</given-names></name></person-group> (<year>2018</year>). <article-title>Research on infrared scene built by computer</article-title>. <source>Electro-Optic. Technol. Appl</source>. <volume>33</volume>, <fpage>58</fpage>&#x02013;<lpage>61</lpage>.</citation>
</ref>
<ref id="B17">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tianjun</surname> <given-names>S.</given-names></name> <name><surname>Guangzhen</surname> <given-names>B.</given-names></name> <name><surname>Fuhai</surname> <given-names>W.</given-names></name> <name><surname>Chaofei</surname> <given-names>L.</given-names></name> <name><surname>Jinnan</surname> <given-names>G.</given-names></name></person-group> (<year>2019</year>). <article-title>An infrared small target detection and tracking algorithm applying for multiple scenarios</article-title>. <source>Aero Weaponry</source>. <volume>26</volume>, <fpage>35</fpage>&#x02013;<lpage>42</lpage>. <pub-id pub-id-type="doi">10.12132/ISSN.1673-5048.2019.0220</pub-id></citation>
</ref>
<ref id="B18">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Xia</surname> <given-names>W.</given-names></name> <name><surname>Hao</surname> <given-names>W.</given-names></name> <name><surname>Chao</surname> <given-names>X.</given-names></name></person-group> (<year>2015</year>). <article-title>Overview on development of infrared scene simulation</article-title>. <source>Infrared Technol</source>. <volume>7</volume>, <fpage>537</fpage>&#x02013;<lpage>43</lpage>. <pub-id pub-id-type="doi">10.11846/j.issn.1001_8891.201507001</pub-id></citation>
</ref>
<ref id="B19">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Xianbu</surname> <given-names>D.</given-names></name> <name><surname>Ruigang</surname> <given-names>F.</given-names></name> <name><surname>Yinghui</surname> <given-names>G.</given-names></name> <name><surname>Bio</surname> <given-names>L.</given-names></name></person-group> (<year>2019</year>). <article-title>Detecting and tracking of small infrared targets adaptively in complex background</article-title>. <source>Aero Weaponry</source>. <volume>26</volume>, <fpage>22</fpage>&#x02013;<lpage>28</lpage>. <pub-id pub-id-type="doi">10.12132/ISSN.1673-5048.2019.0233</pub-id></citation>
</ref>
<ref id="B20">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Yi</surname> <given-names>H.</given-names></name></person-group> (<year>2020</year>). <source>RGB-to-NIR Image Translation Using Generative Adversarial Network</source>. <publisher-loc>Wuhan</publisher-loc>: <publisher-name>Central China Normal University</publisher-name>.</citation>
</ref>
<ref id="B21">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yi</surname> <given-names>Y.</given-names></name> <name><surname>Changbin</surname> <given-names>X.</given-names></name> <name><surname>Yuying</surname> <given-names>M.</given-names></name></person-group> (<year>2019</year>). <article-title>A review of infrared dim small target detection algorithms with low SNR</article-title>. <source>Laser Infrared</source>. <volume>49</volume>, <fpage>643</fpage>&#x02013;<lpage>649</lpage>. <pub-id pub-id-type="doi">10.3969/j.issn.1001-5078.2019.06.001</pub-id></citation>
</ref>
<ref id="B22">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yongjie</surname> <given-names>Z.</given-names></name> <name><surname>Zhenya</surname> <given-names>X.</given-names></name> <name><surname>Jianxun</surname> <given-names>L.</given-names></name></person-group> (<year>2020</year>). <article-title>Study on simulation model of aircraft infrared hyperspectral image</article-title>. <source>Aero Weaponry</source>. <volume>27</volume>, <fpage>91</fpage>&#x02013;<lpage>96</lpage>. <pub-id pub-id-type="doi">10.12132/ISSN.1673-5048.2019.0082</pub-id></citation>
</ref>
<ref id="B23">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Yu</surname> <given-names>C.</given-names></name></person-group> (<year>2012</year>). <source>Design of Infrared Decoy HIL Simulation System Based on Finite Element Module</source>. <publisher-loc>Harbin</publisher-loc>: <publisher-name>Harbin Institute of Technology</publisher-name>.</citation>
</ref>
<ref id="B24">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Yunjey</surname> <given-names>C.</given-names></name> <name><surname>Youngjung</surname> <given-names>U.</given-names></name> <name><surname>Jaejun</surname> <given-names>Y.</given-names></name> <name><surname>Ha</surname> <given-names>J.-W.</given-names></name></person-group> (<year>2020</year>). <article-title>&#x0201C;StarGAN v2: diverse image synthesis for multiple domains,&#x0201D;</article-title> in <source>2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)</source> (<publisher-loc>Seattle</publisher-loc>).</citation>
</ref>
<ref id="B25">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>R.</given-names></name> <name><surname>Mu</surname> <given-names>C.</given-names></name> <name><surname>Yang</surname> <given-names>Y.</given-names></name> <name><surname>Xu</surname> <given-names>L.</given-names></name></person-group> (<year>2018</year>). <article-title>Research on simulated infrared image utility evaluation using deep representation</article-title>. <source>J. Electr. Imag</source>. <volume>27</volume>, <fpage>013012</fpage>. <pub-id pub-id-type="doi">10.1117/1.JEI.27.1.013012</pub-id></citation>
</ref>
<ref id="B26">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Zhijian</surname> <given-names>H.</given-names></name> <name><surname>Bingwei</surname> <given-names>H.</given-names></name> <name><surname>Shujin</surname> <given-names>S.</given-names></name></person-group> (<year>2021</year>). <source>An Automatic Image Stitching Method for Infrared Image Series</source>. <publisher-loc>Xi&#x00027;an</publisher-loc>: <publisher-name>ICCAIS</publisher-name>.</citation>
</ref>
</ref-list>
</back>
</article>