
REVIEW article

Front. Mar. Sci., 23 February 2023

Sec. Marine Fisheries, Aquaculture and Living Resources

Volume 10 - 2023 | https://doi.org/10.3389/fmars.2023.1010761

Artificial intelligence for fish behavior recognition may unlock fishing gear selectivity

  • Laboratoire de Technologie et de Biologie Halieutique, Institut Agro, Dynamique et durabilité des écosystèmes (DECOD) (Ecosystem Dynamics and Sustainability), L'Institut français de recherche pour l'exploitation de la mer (IFREMER), l'Institut National de Recherche pour l'Agriculture, l'Alimentation et l'Environnement (INRAE), Lorient, France


Abstract

Through the advancement of observation systems, our vision has extended far into the world of fishes and their interactions with fishing gears, breaking through physical boundaries and visually adapting to challenging conditions in marine environments. As marine sciences step into the era of artificial intelligence (AI), deep learning models now provide tools for researchers to process large amounts of imagery data (i.e., image sequences, video) on fish behavior in a more time-efficient and cost-effective manner. The latest AI models for detecting fish and categorizing species now reach human-like accuracy. Nevertheless, robust tools to track fish movements in situ are still under development and primarily focused on tropical species. Data needed to accurately interpret fish interactions with fishing gears are still lacking, especially for temperate fishes; yet this is an essential step for selectivity studies to advance and integrate AI methods in assessing the effectiveness of modified gears. We here conduct a bibliometric analysis to review the recent advances and applications of AI in automated tools for fish tracking, classification, and behavior recognition, highlighting how they may ultimately help improve gear selectivity. We further show how transforming the external stimuli that influence fish behavior, such as sensory cues and gears as background, into interpretable features that models learn to distinguish remains challenging. By presenting the recent advances in AI on fish behavior applied to fishing gear improvement (e.g., Long Short-Term Memory (LSTM), Generative Adversarial Network (GAN), coupled networks), we discuss the potential and limits of AI in meeting the demands of fishing policies and sustainability goals, as scientists and developers continue to collaborate in building the databases needed to train deep learning models.

1 Introduction

In observing fishes, the human eye can efficiently distinguish swimming movements: where the fish is, how it is swimming, and how it is interacting with other fishes and its environment (He, 2010). For ethologists, interpreting behaviors from visual observations comes almost instantaneously. As developments of non-invasive and autonomous underwater video cameras continue to advance (Graham et al., 2004; Moustahfid et al., 2020), behavioral observations can now be derived from a plethora of high-resolution marine imagery and videos (Logares et al., 2021). The reach of human vision continues to extend as cameras can be used in most conditions (Shafait et al., 2016; Christensen et al., 2018; Jalal et al., 2020), such as light, dark and turbid underwater environments, and can operate at greater depths and for longer periods (Torres et al., 2020; Bilodeau et al., 2022; Xia et al., 2022). Cameras can now provide 2D or 3D vision of how fishes interact with the fishing gears used to capture marine species (e.g., pots, lines, trawls and nets), with behavior recorded by an observation system. This has allowed direct observation of how gear components affect catches and escapements (Graham, 2003; Nian et al., 2013; Rosen et al., 2013; Williams et al., 2016; Langlois et al., 2020; Sokolova et al., 2021; Lomeli et al., 2021) and has opened windows to observe the behaviors of fishes in any kind of environmental condition (Robert et al., 2020; Cuende et al., 2022).

This marked an important step toward capturing finer details of the fishing gear selectivity process (i.e., the gear's ability to retain only targeted species, while avoiding bycatch of vulnerable, unwanted species or undersized individuals). Innovations in gear selectivity continue to bring new types of selection and bycatch reduction devices into gear designs (for reviews of selective and bycatch reduction devices, see Vogel, 2016; Matt et al., 2021; for grids, see Brinkhof et al., 2020; for mesh size: Kim et al., 2008; Aydin and Tosunoğlu, 2010; Cuende et al., 2020b; Cuende et al., 2022; for panels: Bullough et al., 2007; Ferro et al., 2007). By observing the influence of these modifications, finer selectivity patterns have been unraveled, highlighting how the visual, auditory and tactile cues that species are sensitive to are key in the capture process of fishes (Arimoto et al., 2010; Yan et al., 2010). As studies in fish vision show behavioral differences across species in relation to their spectral sensitivity (Goldsmith and Fernandez, 1968; Carleton et al., 2020), gears continue to be developed with visual components, such as light and color, that aim to make them more or less detectable (Ellis and Stadler, 2005; Sarriá et al., 2009; Underwood et al., 2021). Mesh and panel configurations affect tactile cues and herding behavior that can differ among species (Ryer et al., 2006). They are thus continually tested across different fishing zones (Ferro et al., 2007; Cuende et al., 2020a), as environmental conditions such as depth and light penetration change fish behavior (Blaxter, 1988). How visual, acoustic, or mechanosensory stimuli elicit fish movement has been extensively studied (e.g., Forlim and Pinto, 2014; Popper and Hawkins, 2019; Xu et al., 2022). Quantifying the reactions of fishes to stimuli or gear modifications requires assessing swimming patterns that are highly variable and nonlinear, as fish are under stress and in constant locomotion (Kim and Wardle, 2003; Kim and Wardle, 2005) and are affected by several environmental factors (Schwarz, 1985; Baatrup, 2009; Yu et al., 2021; Xu et al., 2022). Moreover, their movements often differ between individual and group behavior (Viscido et al., 2004; Stienessen and Parrish, 2013; Harpaz et al., 2017; Schaerf et al., 2017).

As of today, automated tools for fish recognition have been mostly driven by economic frameworks, such as monitoring fish welfare on farms (Zhou et al., 2017; Muñoz-Benavent et al., 2018; Cheng et al., 2019; Måløy et al., 2019; Bekkozhayeva et al., 2021; X. Yang et al., 2020), directing migratory trajectories in river passageways (Stuart et al., 2008; Cooke et al., 2020; Eickholt et al., 2020; Jones and Hale, 2020) and stock assessments (Mellody, 2015; Myrum et al., 2019; Connolly et al., 2021; Ovchinnikova et al., 2021). Artificial Intelligence (AI) has thus become a multi-purpose data processing tool in marine science, integrated in model simulations, predictions of physical and ecological events (Chen et al., 2013) and imagery data processing from large-scale to fine-scale observations (Beyan and Browman, 2020). Yet, observations often focus on the temporal aspects of swimming behavior at a 2D scale (Lee et al., 2004; G. Wang et al., 2021), lacking the spatial depth and 3D components of the real world and thus providing only a narrow window onto actual behavior as a whole. These movements and their complexity need to be transformed into meaningful metrics derived from video observations (Aguzzi et al., 2015; Pereira et al., 2020). This requires tremendous time, focus and effort, and is subject to error and incomplete manual observation (Huang et al., 2015; Guidi et al., 2020). This is where AI methods enter (Packard et al., 2021): the principle is to translate what the human eye sees and what the brain interprets into computer vision (or machine vision) and artificial neural networks (van Gerven and Bohte, 2017; Boyun et al., 2019). For computer vision, images of fishes and their corresponding features (temporal and spatial) must therefore be translated into numerical units that the computer can understand (Aguzzi et al., 2015).

Studies and innovations on fish observations over the past decade have successfully generated models that can automatically see fishes on videos, identify taxa and follow their swimming direction with considerable accuracy (Hsiao et al., 2014; Nasreddine and Benzinou, 2015; Ravanbakhsh et al., 2015; Boudhane and Nsiri, 2016; Qin et al., 2016; Marini et al., 2018; Xu and Matzner, 2018; Salman et al., 2019; Cai et al., 2020; Cui et al., 2020; Jalal et al., 2020; Raza and Hong, 2020; Yuan et al., 2020; Ben Tamou et al., 2021; Cao et al., 2021; Crescitelli et al., 2021; Li et al., 2021; Lopez-Marcano et al., 2021; Knausgård et al., 2021). Despite these advancements, it remains challenging to train existing AI models (e.g., Convolutional Neural Network, CNN; Faster Region-based CNN, Faster R-CNN; Residual Network, ResNet; Long Short-Term Memory, LSTM; Convolutional 3-dimensional network, C3D, etc.) to recognize fish behaviors from their swimming movements in 3D (Li et al., 2022), given the myriad sources of variability at sea (Christensen et al., 2018). Artificial Intelligence may help further improve the sustainability of fishing, as classical selectivity studies are reaching a plateau due to a bottleneck in data collection inherent to the challenge of obtaining direct, in situ observations.

This paper addresses common stimuli that trigger fish reactions to selective devices in fishing gears, and how these behavioral responses can be transformed into quantifiable metrics through selectivity modeling and classification methods that can be pipelined into AI workflows (Section 2). Section 3 presents the current state and limitations of AI applied to fish-gear interactions, through a bibliometric analysis and a review of recent developments in automatic behavior recognition. Section 4 addresses the hurdles of observing fish interactions across fishing gear selectivity studies and how AI methods may help face these challenges.

2 Observing stimuli-response in fishing gears: The teaching base of AI models for behavior recognition

“Researchers now realised that, like the rest of the vertebrate kingdom, fishes exhibit a rich array of sophisticated behaviour and that learning plays a pivotal role in behavioural development of fishes. Gone, or at least redundant, are the days where fishes were looked down upon as pea-brained machines whose only behavioural flexibility was severely curtailed by their infamous 3-second memory” (Brown et al., 2006)

2.1 Observations of fish behavior in fishing gears

Early testing, through manual counting, size measurement, and quantification of catches/retention, paved the way for selective devices and gear modifications to be integrated into the design of commercial fishing gear. Mesh modifications were suggested through empirical approaches studying catch retention (e.g., catch comparison or covered codend methods) (Dealteris and Reifsteck, 1993; Ordines et al., 2006; Aydin and Tosunoğlu, 2010b; Anders et al., 2017b), tank experiments with manual observation of fish passing through meshes (Glass et al., 1993; Glass et al., 1995) and even numerical approaches that estimate catches a posteriori (e.g., SELECT; Fonseca et al., 2005). Optical and sonar imaging rapidly came into play to directly estimate catches during capture (Silicon Intensified Target, SIT camera system, Krag et al., 2009; acoustic imaging, Ferro et al., 2007), and were then applied to observe species behavior in gears (Mortensen et al., 2017). Over the years, observing fishes became achievable in various conditions with the breadth of available technology that can be autonomously deployed for ecological and fisheries monitoring (Durden et al., 2016; Moustahfid et al., 2020). Examples of technological solutions to observe behavior in real-world conditions are presented in Table 1.

Table 1

| Condition of observation | Technological solutions | Limitations | Examples |
|---|---|---|---|
| High turbidity; backscatter of natural light | High spatial acuity cameras, laser imaging, cameras with polarized filters or light sources | High cost | Lu et al., 2017 |
| Dark environment | Far-red illumination (680 nm LED), near-infrared illumination | Fewer features in images; narrow range of view | Chidami et al., 2007; Shcherbakov et al., 2012 |
| Species-level recognition | High-definition cameras | High cost; limited to RGB cameras | Crescitelli et al., 2021; Murugaiyan et al., 2021 |
| Abrupt changes in animal orientation, fast-swimming species | High shutter speed (> 200 frames per second) | High cost | Catania et al., 2008 |
| Continuous recordings of species distribution | Long-battery/low-energy/cabled cameras | Limited spatial range | Rosen and Holst, 2013; DeCelles et al., 2017 |
| Collective behavior, capturing large elements or objects | Stage-wide or multiple set-up cameras, far-range sonars, hydroacoustics | High cost; logistically demanding | Wei et al., 2022 |
| Capture of depth/3D features | 3D/holographic/stereo cameras, cameras with distance-compensated structured lighting, optical-acoustic hybrid imaging | Heavy computational cost; logistically demanding | Sawada et al., 2004; Negahdaripour, 2005; Pautsina et al., 2015 |
| Small compartments/spaces | Compact/micro cameras | Low image resolution | Duecker et al., 2020 |

Examples of technological solutions to observe behavior in varying conditions.

Interesting fish behaviors have since been unearthed, such as anti-predatory responses (Rieucau et al., 2014), encounters of fish with nets (Jones et al., 2008; Rudstam et al., 2011), differences in swimming speed (He, 1993; Breen et al., 2004; Spangler and Collins, 2011), avoidance (de Robertis and Handegard, 2013), exhaustion (Krag et al., 2009), orientation (Odling-Smee and Braithwaite, 2003; Holbrook and Perera, 2009; Haro et al., 2020), escapement (Glass et al., 1993; Mandralis et al., 2021), herding behavior (Ryer et al., 2006), and unique social behaviors (Anders et al., 2017a), on which selectivity studies in gears are based. Knowledge of fish reaction and escape behavior has thus grown, leading to the development of novel gears with more open meshes, careful placement of sorting grids, and other devices to improve both size and species selectivity (Stewart, 2001; Watson and Kerstetter, 2006; Vogel, 2016; O'Neill et al., 2019). Gear selectivity might also be improved by triggering active species responses, using light, sound, and physical stimuli (O'Neill and Mutch, 2017).

2.2 Current observations of fish stimuli-response

2.2.1 Responses to light and color stimuli

Fish responses to light have mainly been studied in controlled environments and in aquaculture, as light attenuation at sea makes direct observation of light responses challenging. The response to light (i.e., phototaxis) can improve gear selectivity, as fishes greatly depend on vision for sensory information (Guthrie, 1986). Depending on the species and the development stage (Kunz, 2006), fishes can exhibit either positive phototaxis (swimming toward a light source) or negative phototaxis (swimming away) in response to different wavelengths and intensities of light (Raymond and Widder, 2007; Underwood et al., 2021). Artificial illumination is thus attracting considerable attention for the behavioral guidance of fishes, either to dissuade them from entering the gear (Larsen et al., 2017) or to help them escape from within (Southworth et al., 2020). Illumination in gears takes the form either of LED light installations (e.g., illuminated escape rings for non-targeted species, Watson, 2013; illuminated separation grids for ground fishes, O'Neill et al., 2018b) or of glow-in-the-dark netting material (Karlsen et al., 2021). In dark environments, near-infrared or red light is usually used to observe fishes instead of white light, which may disrupt their behaviors (Widder et al., 2005; Raymond and Widder, 2007; Underwood et al., 2021).

Responses of fish to color also play an important part, as most bony fishes are tetrachromatic, allowing them to see colors more vividly than humans (Bowmaker and Kunz, 1987). Some fishes may be more visually sensitive to certain light wavelengths and intensities (Lomeli and Wakefield, 2019), while others may be non-responsive (Underwood et al., 2021). Researchers thus use these species-selective traits to install light devices (LED lights, infrared light, laser beams) on gears or to change the color of the fishing nets (white, transparent, black) depending on the selected species (Simon et al., 2020; Méhault et al., 2022).

2.2.2 Responses to acoustic stimuli

Sound has long been used by fishers to scare fishes and gather them for bottom trawling. Yet the response to sound (i.e., phonotaxis) can also be used for selectivity, as hearing species are generally sensitive to specific frequencies (Dijkgraaf, 1960). Selectivity studies typically observe negative phonotaxis (i.e., avoidance) triggered by low-frequency sound (Schwarz and Greer, 2011), which fishes can display in different ways (Popper and Carlson, 1998; de Robertis and Handegard, 2013). As with light responses, some fishes tend to be more sensitive to certain sound frequencies; some, such as Atlantic herring and cod, are called "hearing specialists" (Chapman and Hawkins, 1973; Doksæter et al., 2012; Pieniazek et al., 2020). O'Neill et al. (2019) also suggested that passive acoustic approaches with sound reflectors can be built into gears to make them more detectable by echolocating species (He, 2010). Mainly, sound and light added to fishing gears can help attract targeted species and deter vulnerable or harmful animals such as mammals or fish predators (Putland and Mensinger, 2019; Lucas and Berggren, 2022). Although fishing techniques using sound have been practiced for a long time (He, 2010), the exploration of species-selective sound devices is still at an early stage.

2.2.3 Responses to physical stimuli

The response to physical contact (i.e., thigmotaxis) reflects the tendency of fishes to remain close to the seabed or to the lateral structure of gears (Millot et al., 2009). This behavior can be exploited by modifying mechanical structures and panels in gears. Physical stimuli can play an important role in allowing fishes to escape (Mandralis et al., 2021) or be sorted (Larsen and Larsen, 1993; Brinkhof et al., 2020). Such structures are usually installed in or on the gears after a series of behavioral trials testing fish responses to different configurations (Santos et al., 2016). Designs exploiting physical stimuli are thus often drawn from species-specific behavior (Ferro et al., 2007; Cuende et al., 2020a).

Fishes tend to orient themselves to face the water flow, holding a stationary position and lowering the amount of energy they spend; this is called rheotaxis (Painter, 2021). This directional response to water flow may be used to improve selectivity in trawls. For example, veil nets in shrimp fisheries can modify the flow within gears, directing fishes to selective grids and net structures (Graham, 2003), and water jets projecting downward or forward can elicit early avoidance from fishes about to enter the gear (Jordan et al., 2013).

2.2.4 Other stimuli and combination of stimuli

Other stimuli relating to chemical responses (i.e., chemotaxis; Løkkeborg, 1990) and electrosensory responses (i.e., electrotaxis; Sharber et al., 1994; O'Connell et al., 2014) in fishes still need to undergo trials. Chemotaxis, which fishes use for foraging, may help them acquire information from greater distances (Weissburg, 2016) and is exploited in baited fisheries (Rose et al., 2005). Electrotaxis, which elasmobranchs use to detect weak electromagnetic signals, is exploited in longline fishing to reduce bycatch with electropositive metals and magnets (Kaimmer and Stoner, 2008; Robbins et al., 2011; O'Connell et al., 2014). Combinations of multiple stimuli, such as acoustic and visual signals, also promote different responses from fishes, enhancing or impeding the responses to other cues (Lukas et al., 2021). Overall, understanding the multi-sensory modalities of marine animals may help adjust selective devices, reducing bycatch and focusing catches on targeted species (Walsh et al., 2004; Jordan et al., 2013).

2.3 AI application to fish stimuli

Studying fish responses to stimuli requires empirical studies, which are often limited in replicates due to logistical constraints and the time demanded to collect and process raw data. Stimuli have thus been studied manually, since automation remains difficult to apply in in situ conditions with heterogeneous, moving backgrounds and variable environmental conditions. Manual observations of stimulus responses currently provide the reference point for behavior recognition, which now faces ever more data to process from continued observations at sea. Applying AI models may ease data processing and enable the exploitation of larger amounts of data. As opposed to traditional tracking methods applicable to controlled experiments (e.g., background subtraction and Kalman filters, Simon et al., 2020; see the sketch after Table 2), deep learning models are less sensitive to such variability and may be applied in harsher conditions (Sokolova et al., 2021). Computer vision can also be improved by selecting the observation system most appropriate to produce imagery data for the fishing gear used; the variety of systems and data processing approaches per stimulus is presented in Table 2.

Table 2

| Stimulus | Taxis (response) | Short-term behavior | Current observation systems (computer vision) | Behavioral data processing | AI application | Advantage/limitation of computer vision | Fishing gears | Selective device/method |
|---|---|---|---|---|---|---|---|---|
| Chemical | Chemotaxis | Attraction, repulsion, feeding, herding | Baited Remote Underwater Videos (BRUV), optical (RGB or infrared) and hydroacoustic cameras | Automated | Cascade Faster R-CNN (Méhault et al., 2022), C3D model (G. Wang et al., 2021), Dual-Stream Recurrent Network (Måløy et al., 2019) | Zero to low visibility of chemical diffusion in water for computer vision | Baited gears (fish pots, hook-and-line, longline, gillnets, trawling) | Natural or artificial baits |
| Light | Phototaxis | Attraction, repulsion, herding | Optical cameras, hydroacoustic camera | Automated | C3D model (G. Wang et al., 2021), YOLOv2 + behavioral metric pipeline (Barreiros et al., 2021) | Light attenuation in water | Any type of gear (pots, longline, gillnet, surrounding nets, lift nets, seine, trawl, dredge) | LED lights, laser beams |
| Sound | Phonotaxis | Attraction or repulsion | Optical cameras (RGB or infrared), hydroacoustic camera | Automated | C3D model (G. Wang et al., 2021), YOLOv2 + behavioral metric pipeline (Barreiros et al., 2021) | Sound diffusion can only be detected with acoustic cameras; stimulus origin not visible to optical cameras | Gillnet, purse seine | Acoustic beams (Gan et al., 2012), pingers/sonar reflectors, fish-calling devices (donburi, payao) (Yan et al., 2010) |
| Water current | Rheotaxis | Change in orientation, herding, or speed | Optical cameras (RGB or infrared), hydroacoustic camera | Manual | Particle image velocimetry (PIV) (Oteiza et al., 2017) | Requires additional measurement of current speed | Any type of gear (pots, longline, gillnet, surrounding nets, lift nets, seine, trawl, dredge) | Bait diffusion from source, water jets, gear panels |
| Physical barriers/touch | Thigmotaxis | Herding, sheltering behavior | Optical cameras (RGB or infrared), hydroacoustic camera | Automated | Motion influence map + RNN (Zhao et al., 2018) | Requires wide angles of video recording and image capture | Any type of gear (pots, longline, gillnet, surrounding nets, lift nets, seine, trawl, dredge) | Panels, mesh size and shape, netting grids |

Examples of fish behavior studies exploring species' responses to stimuli using AI and their application to fisheries.

For a comprehensive summary of fishing gears, see He et al. (2021).
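To make the contrast with deep learning concrete, the sketch below illustrates the classical background-subtraction baseline mentioned above, here using OpenCV's MOG2 subtractor on a fixed-camera video. This is a minimal sketch rather than the method of any cited study: the file name and the blob-area threshold are hypothetical.

```python
# Minimal sketch of a classical tracking baseline: background subtraction
# (OpenCV MOG2) segments moving fish against a static background, then
# contours give candidate bounding boxes. Works in controlled set-ups but
# degrades with moving backgrounds, which is where deep learning helps.
import cv2

cap = cv2.VideoCapture("tank_experiment.mp4")  # hypothetical input video
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                       # foreground mask
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,        # remove speckle noise
                            cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 200:                     # ignore tiny blobs
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

cap.release()
```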

3 Artificial Intelligence for fish behavior applications

3.1 Bibliometric analysis

3.1.1 Bibliometric analysis methods

A bibliographic search was conducted in February 2022 on SCOPUS for scientific journals using two sets of five queries (Figure 1). Each query of the first set (256 articles) included AI-related keywords, selected to retrieve studies focused on fish behavior, underwater observations, fishing gears, and ecological studies. The second set used the same keywords as the first but added, in all five queries, keywords covering both saltwater and freshwater ecosystems, in order to exclude automatic detection and classification of fish species performed onboard fishing vessels. This narrowed the number of extracted publications down to 138 articles (Figure 1). However, both sets still included studies not relevant to the topic, so a manual screening was undertaken: the extracted studies were reviewed one by one to keep only the relevant ones, which were then cross-analyzed with other pertinent studies not returned by SCOPUS but mentioned in this review. The removed studies focused on topology mapping, stock assessment, climatological studies, biochemical studies, and automatic identification of other marine fauna and flora such as sea cucumbers and algae. A final list of 384 relevant studies was collected and reviewed to extract those applying automated fish detection, counting, species classification, motion tracking and behavior recognition with deep learning models in underwater systems.

Figure 1

Visualization of the bibliographic search. Top panel: set of queries in SCOPUS and number of resulting articles. The "Fish* W/2 ecology" keyword was used to focus the search on ecologically based studies. Bottom panel: bibliometric landscape of topics from the articles (linkage of keywords, occurrence > 5).

3.1.2 Bibliometric analysis results

The gathered studies show that the automation of tasks such as fish detection, species classification, fish counting, fish tracking, and behavior recognition has progressively materialized in the 21st century (Figure 2). Ecological studies of fishes based on AI and computer vision have surfaced over the past 10 years (87 publications related to fish detection and classification and 36 related to fish behavior recognition extracted from the SCOPUS bibliographic search). Developments are still in their early stages but are rapidly gaining attention, particularly for automatic detection and classification techniques, thanks to the rise of deep learning (LeCun et al., 2015). Studies of automatic motion tracking and behavior recognition are fewer than detection and classification studies, as they build on the AI methods of the latter and require more complex processing. While fish detection has been widely applied in marine habitats for several years (Fisher et al., 2016), automatic tracking and behavior recognition of fishes during the capture process has yet to be applied. The following sections expand the results of the bibliometric analysis, give a brief explanation of AI, and provide examples of current behavior recognition applications that can be transferred to selectivity studies.

Figure 2

Number of publications between 1989 and 2022 for the 3 categories. The number of publications in all categories is from the cross-analysis between bibliographic search in SCOPUS and manual search in both Google Scholar and Web of Science. The final list includes 388 relevant articles reviewed one by one and categorized by the authors according to the methods included in each study.

3.2 Introduction to Artificial Intelligence

As current observations of fish behaviors in fishing gears step into the era of AI and deep learning along with other domains in marine science (Malde et al., 2020; Logares et al., 2021; Packard et al., 2021), the Internet of Underwater Things (IoUT) and Big Data coupled to AI will inevitably revolutionize the field (Jahanbakht et al., 2021). Today, behavioral studies in fisheries science stand on rapidly evolving tools to automate the analysis and processing of data. These tools are curated from interdisciplinary fields, among them marine science, computer science, neuroscience, and mechanical science, which are now converging because of AI (Xu et al., 2021). Useful references and reviews for AI in marine science can be found in Beyan and Browman (2020); Malde et al. (2020) and Goodwin et al. (2021).

In marine sciences, neural networks used for object detection are usually "supervised" (Cunningham et al., 2008), meaning that they are trained using ground-truth objects, manually located in images and classified into pre-defined classes. These objects, defined by the four coordinates of their bounding boxes and their associated classes (see Figure 3 for examples of bounding boxes), are used to train the model to localize and classify the target objects within new images. Objects are assigned to one or several categories based on the probability of belonging to each of the classes used to train the model (Pang et al., 2017; Ciaparrone et al., 2020). Once object detection is done on different frames (Figures 4E, F), the tracking model pairs the bounding boxes among frames to reconstruct the track of each object through time and space (Belmouhcine et al., 2021; Park et al., 2021). During training, if the model can predict classes and bounding boxes that match the ground-truth validation data with minor error, depending on the given parameters, it can be considered an accurate model; if its predictive performance is poor, the learning continues.
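To make the notion of a prediction "matching" the ground truth concrete, the sketch below computes the Intersection-over-Union (IoU) between two boxes in the four-coordinate format described above. The 0.5 acceptance threshold is a common convention, and the box values are invented for illustration; this is not code from any cited study.

```python
# Minimal sketch: Intersection-over-Union (IoU), the standard measure used
# to decide whether a predicted bounding box matches a ground-truth box.
# Boxes are (x1, y1, x2, y2) in pixel coordinates.

def iou(box_a, box_b):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((ax2 - ax1) * (ay2 - ay1) +
             (bx2 - bx1) * (by2 - by1) - inter)
    return inter / union if union > 0 else 0.0

# A prediction is commonly counted as correct when IoU >= 0.5
ground_truth = (120, 80, 260, 150)   # hand-labelled fish (hypothetical)
prediction   = (130, 85, 270, 160)   # model output (hypothetical)
print(iou(ground_truth, prediction) >= 0.5)  # True (IoU is about 0.71)
```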

Figure 3

Examples of bounding boxes of fishes. Top panel: Tracking of fishes on the open-source VIAME platform for image and video analysis (Dawkins et al., 2017). Bottom left: multiple trajectories of black seabreams around a fixed bait. Bottom right: In situ detections of sardines and horse mackerel inside a gear (Game of Trawls Project).

Broadly speaking, images are fed into computer algorithms to extract information. These algorithms contain artificial neural networks that apply a sequence of mathematical operations (convolution, pooling, etc.) to perform object detection. Those operations are linked together into a pipeline, so that image processing is not interrupted (Figure 4G). The operations can detect objects because they determine patterns in pixels (i.e., the binary units of computers; Shaw, 2004; Pietikäinen et al., 2011) from the input images that define features (Blum and Langley, 1997). Features are measurable variables that can be interpreted from images, such as the shapes and textures of objects (Chandrashekar and Sahin, 2014). Algorithms trained to detect patterns from features automatically are called detection models. Before training the model, images are preprocessed and enhanced (e.g., to neutralize biases and scale dimensions) so that models can learn better (Nawi et al., 2013; Calmon et al., 2017), since data captured in real-world conditions are generally noisy. Recent artificial neural networks contain attention modules (Vaswani et al., 2017; Gupta et al., 2021) to capture long-range dependencies and understand an image globally (Grauman and Leibe, 2011).
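As a minimal illustration of this operation pipeline, the sketch below (in PyTorch) chains convolution and pooling layers into a feature extractor followed by a final classification layer. The layer sizes and the two-class fish/no-fish set-up are assumptions for illustration, not the architecture of any model cited in this review.

```python
# Minimal sketch of a convolution + pooling pipeline: stacked layers extract
# pixel patterns ("features"), and a linear layer scores candidate classes.
import torch
import torch.nn as nn

class TinyFishNet(nn.Module):
    def __init__(self, n_classes=2):                     # e.g., fish / no fish
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # convolution
            nn.ReLU(),
            nn.MaxPool2d(2),                             # pooling
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 56 * 56, n_classes)

    def forward(self, x):                        # x: (batch, 3, 224, 224)
        x = self.features(x)                     # extract feature maps
        x = torch.flatten(x, 1)                  # flatten for the classifier
        return self.classifier(x)                # class scores (logits)

model = TinyFishNet()
logits = model(torch.randn(1, 3, 224, 224))      # one preprocessed image
```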

Figure 4

General process from in situ observations to behavior classification. (A) Representation of a section of an active gear (i.e., pelagic net) with a white-colored material that acts as a clear background for video capture. (B) Representation of a passive gear (i.e., baited gear): a baitfish prototype fixed on the seafloor with a remote underwater video set-up. (C) Field of vision of a camera securely attached to one side of the pelagic net section. (D) Field of vision of a camera facing the bait. (E, F) Frames from video footage of the underwater observation systems. (G) General workflow for deep learning model application to object detection. (H, I) Samples of fish detections with bounding boxes and fish tracking with bounding boxes and line trails (Game of Trawls and BAITFISH). (J) Representation of behavior classification labels inside an active gear: the "region of interest" labels the section of the gear near the exit, and "escaping" labels the fishes that are exiting. (K) Representation of behavior classification labels with a passive gear: the "region of interest" labels the area in proximity to the bait, and "approaching" labels the fish within this proximity. 3D model of baited gear credit to the BAITFISH project; image of fishes inside the pelagic net credit to the Game of Trawls project.

Current deep learning methods are mostly "black boxes", since humans cannot see how individual neurons work together to compute the final output (e.g., why a fish in an image has been detected or not); improving model accuracy therefore relies on better inputs and comparison of trainings (LeCun et al., 2015). However, unsupervised learning is gaining interest as it allows the transition from recognition to cognition (Forbus and Hinrichs, 2006; Xu et al., 2021): innovations in the AI domain are now producing interpretable models that can figure out why and how they localize and classify objects in a scene (Ribeiro et al., 2016; Hoffman et al., 2018; Gilpin et al., 2019). Among unsupervised learning models, Generative Adversarial Networks (GAN) are composed of two networks: a generator that produces synthetic data and a discriminator that classifies data as real or fake. The generator learns to fool the discriminator by learning the real data distribution and generating synthetic data that follow it; ideally, the discriminator cannot distinguish real from synthetic data. Object detection models can thus be coupled to a GAN and learn in a semi-supervised manner from artificially generated sets of images, with the generator producing synthetic images of fishes that feed the object detector (Creswell et al., 2018). Applying these AI methods to fish interactions with fishing gears would enable us to decipher which behaviors lead to the catch and escapement of fish at larger scales than could be reached until today. For a comprehensive review of available deep learning-based architectures, see Aziz et al. (2020).
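A minimal sketch of the generator-discriminator loop described above is given below in PyTorch. The network sizes, the flattened 28 x 28 image format, and the training hyperparameters are illustrative assumptions, not those of a published fish-imagery GAN.

```python
# Minimal GAN sketch: a generator maps random noise to synthetic image crops
# while a discriminator learns to tell real crops from synthetic ones.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, 28 * 28), nn.Tanh())
D = nn.Sequential(nn.Linear(28 * 28, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_images):                   # (batch, 784) flattened crops
    batch = real_images.size(0)
    fake = G(torch.randn(batch, 64))

    # 1) Discriminator: label real crops 1, synthetic crops 0
    d_loss = bce(D(real_images), torch.ones(batch, 1)) + \
             bce(D(fake.detach()), torch.zeros(batch, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Generator: try to make the discriminator call the fakes "real"
    g_loss = bce(D(fake), torch.ones(batch, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()
```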

3.3 AI for fish behavior

Tools for automatic behavior recognition are being developed mainly in aquaculture (Valletta et al., 2017; Niu et al., 2018) and in coastal fish communities (e.g., Kim, 2003; Fisher et al., 2016; Capoccioni et al., 2019; Lopez-Marcano et al., 2021; Ditria et al., 2021a). Over the last decade, automatic fish detection and species classification have emerged in combination with tracking innovations, contributing a robust foundation for behavior recognition. Behavioral studies of fishes in aquaculture have examined feeding behavior to monitor appetite, as well as abnormal behaviors in intensive farming conditions (Kadri et al., 1991; Zhou et al., 2017; Niu et al., 2018; Måløy et al., 2019; Pylatiuk et al., 2019; Li et al., 2020). Automatically detected behaviors include feeding movements at the individual and school level, feeding intensity (Zhou et al., 2019), abnormal behaviors due to lack of oxygen or stress (J. Wang et al., 2020), and curiosity expressed as inspection behavior toward bait or objects in experimental set-ups (Papadakis et al., 2012).

In laboratory experiments, goal-directed behaviors of fishes have also been recognized by computer vision and automatically detected (Long et al., 2020), such as the construction of spawning nests by cichlid fishes that either form mounds or burrow in the sand. This type of complex behavior can be distilled into recognizable patterns, such as manipulation of the physical environment (a cichlid fish uses its mouth and fins to move sand) and distinct movements such as quivering (a usual mating movement observed in cichlid fishes). Automatically recognizing these behavior patterns contributes to the systematic analysis of such traits across taxa (York et al., 2015) and can be an effective metric for measuring natural variation (Long et al., 2020).

Artificial Intelligence methods trained to recognize fish behavior have multiple components that are all connected in branching streams of mathematical and statistical operations. From a video of swimming schools of fishes, the attributes of what is happening in the scene would be broken down into features of the fishes, their appearance in terms of shape, texture, or color, and their reaction to different types of stimuli translated into quantifiable metrics. Some additional examples of applications can be found in Spampinato et al. (2010); Fouad et al. (2014); Hernández-Serna and Jiménez-Segura (2014); Iqbal et al. (2021) and Lopez-Marcano et al. (2021).

3.3.1 AI-based automatic behavior recognition for fishes

Fish detection by AI models occurs when individuals or species are recognized in a single image (Sung et al., 2017): an algorithm is trained to identify fish features and localize regions in a scene. The YOLO (You Only Look Once; Redmon et al., 2016) object detection framework has frequently been used for fish detection and species classification on 2D images (Cai et al., 2020; Jalal et al., 2020; McIntosh et al., 2020; Raza and Hong, 2020; Bonofiglio et al., 2022; Knausgård et al., 2021). The YOLO algorithm and its successive versions are widely used because they process an entire image faster, and often more accurately, than classic object detectors (for technical specifications, see Redmon et al., 2016). A trained detection model can thus differentiate targeted from non-targeted species and identify differences in their morphology (i.e., round vs flat fish). Moreover, a cluster of individual detections can also illustrate herding behavior from crowd movements.
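The studies above train YOLO variants on fish imagery; since no fish-specific weights are assumed here, the sketch below illustrates the same detect, filter and report workflow with torchvision's pretrained, general-purpose Faster R-CNN (also discussed in this review), on a hypothetical frame file.

```python
# Hedged sketch: running a pretrained, generic detector on one video frame.
# The pretrained weights cover generic categories, so real use would require
# fine-tuning on annotated fish imagery (see Section 3.3.4).
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

frame = convert_image_dtype(read_image("frame_0001.png"), torch.float)  # hypothetical frame
with torch.no_grad():
    output = model([frame])[0]          # dict with 'boxes', 'labels', 'scores'

keep = output["scores"] > 0.5           # discard low-confidence detections
for box, score in zip(output["boxes"][keep], output["scores"][keep]):
    print(box.tolist(), float(score))   # bounding boxes to feed a tracker
```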

Identifying different swimming patterns between targeted and non-targeted species, however, requires following the spatial alignment of trajectories inside gears and the direction of swimming through time, i.e., tracking. Fish tracking is done using motion algorithms applied to successions of images with multiple or individual fish, until the fish are no longer seen in the footage (Li et al., 2021). To track fishes, algorithms are thus trained as a single network or coupled into a pipeline of networks for more complex behavior interpretation (Table 3). Different implementations of deep learning-based tracking have been used across studies, depending on their tracking objectives or available resources (for object detection: Faster R-CNN; for instance segmentation: Mask R-CNN; for tracking based on loss: Minimum Output Sum of Squared Error, MOSSE; for tracking based on similarity learning among masks: Siamese Mask, SiamMask; and for tracking based on Non-Maximum Suppression applied to sequences: Seq-NMS). Their differences lie in the way they compute detections from frame to frame and associate them with new or existing tracks of detected fishes (Lopez-Marcano et al., 2021; see the sketch after Table 3). Coupled networks in AI pipelines are thus used for tracking to interpret finer details of behavior (Table 3).

Table 3

| AI & pipelines | Application | Results | Behavior classes | Scene | Light source | Underwater observation system | Database source | Reference |
|---|---|---|---|---|---|---|---|---|
| YOLOv3 + dense optical flow method + trajectory image compression with VGG19 + data augmentation generative sampling + binary behavior classification | Response of zebrafish (Danio rerio) to odorants | Best accuracy among tested classifiers of 0.867, with data augmentation and decision tree classifier | Olfactory response | Lab | Low-light intensity | Infrared video camera | Own dataset + PASCAL VOC and MS-COCO | Banerjee et al., 2021 |
| Pre-trained ResNet50 + motion estimation algorithm for optical flow data | Grazing behavior of free-swimming luderick (Girella tricuspidata) on seagrass patches | Recall, precision and F1 score between 0.73 and 0.79 (without spatiotemporal filtering); between 0.84 and 0.87 (with spatiotemporal filtering) | Grazing/non-grazing | In situ, open water | Natural light | Action cameras (Haldex Sports Action Cam HD 1080p) | Own dataset (RGB videos) | Ditria et al., 2021b |
| YOLOv3 + MobileNetv2 backbone + improved detection method with pyramid pooling block and multiscale feature extraction | Quantify feeding and stress behavior of carps (Carassius auratus) and catfish (Pelteobagrus fulvidraco) | Precision of 0.897, recall of 0.884, intersection over union of 0.892 | Separate feeding and hypoxia experimental conditions | Fish tank | 120 light-emitting diodes | GoPro Hero 7 Black | Own dataset (RGB videos) | Hu et al., 2021 |
| Idtracker.ai hybrid system (Gaussian mixture model + greedy acceleration minimization principle) | Characterized mutual motor coordination and multi-functional maneuvers in zebrafish (Danio rerio) | Identification accuracy of 0.98 | Fighting behavior | Lab | Light intensity of 200-300 lx at the water surface | Video camera (not specified) | Own dataset | Laan et al., 2018 |
| 3D Residual Networks | Behavior of cichlids (foraging, construction, and social behavior) | Accuracy for recognition of construction behavior of 0.76 | Construction, feeding, mating | Lab | Artificial light source | Raspberry Pi camera v2 | Own dataset (RGB videos) | Long et al., 2020 |
| Mask R-CNN + 3 different trackers (MOSSE, Seq-NMS, SiamMask) | Characterizing movement behavior of yellowfin seabream (Acanthopagrus australis) | Detection F1 score of 0.91; 120 of 169 individuals correctly identified; tracking accuracy of 0.78 (MOSSE and SiamMask) and 0.84 (Seq-NMS) | Tracking angles | In situ (rocky reefs and seagrass meadows) | Natural light | Action cameras (1080p Haldex Sports Action Cam HD) | Own dataset (RGB videos) | Lopez-Marcano et al., 2021 |
| Dual-Stream Recurrent Network (DSRN) (spatial network + 3D CNN + LSTM) | Tracking of feeding behavior of salmon (Salmo salar) | Behavior prediction of 0.80 | Feeding/non-feeding | Breeding cages at sea | Natural light | Video camera (not specified) | Own dataset (RGB videos) | Måløy et al., 2019 |
| YOLOv3 + LSTM networks | Identifying startle behavior in sablefish (Anoplopoma fimbria) | Average precision of 0.85 | Startle/non-startle events in video clips | In situ, open water, tropical | Natural light | Fixed in situ camera | Own dataset (RGB videos) | McIntosh et al., 2020 |
| Built-in algorithms in LabVIEW software + Vision Development Module | Detection of gilthead seabream (Sparus aurata) changes in speed and position | Fewer than 21 frames (equivalent to 2.3 s) lost from a total of 778,378 recorded frames per day | Net inspection and net biting | Lab | Artificial light | Charge-coupled device (CCD) cameras | Own dataset | Papadakis et al., 2012 |
| 2 converging streams of event classifier with SVM + trajectory-based algorithms | Influence of typhoons on mixed coral reef fish behavior in tropical underwater scenes | Accuracy of 0.80 for fish detection, 0.95 for tracking, 0.97 for event detection | Typhoon/non-typhoon videos | In situ | Natural light | GoPro cameras | Fish4Knowledge dataset | Spampinato et al., 2014 |
| Dual-stream 3D convolutional network with state definition + tracking encoding + decoding by directed cycle graph (DSC3D) | Behavior of spotted knifejaw (Oplegnathus punctatus) in high-stress environments | Mean correct behavior recognition of 0.950 | 5 behavioral states: feeding, hypoxia, hypothermia, frightening, normal | Lab | Artificial light source | HD digital camera (Hikvision DS-2CD2T87E(D)WD-L) | Own dataset (RGB and optical flow videos) | G. Wang et al., 2021 |
| Motion influence map + RNN | Detection, localization and recognition of unusual local behavior in tilapia (Oreochromis niloticus) | Accuracy for detection (0.98), localization (0.92), recognition (0.90) | Unusual (3 behavioral subcategories of sudden movements) | Aquaculture tank | Artificial light | Charge-coupled device (CCD) cameras (DS-2CD6233F-SDI, Hikvision) | Own dataset (RGB images) | Zhao et al., 2018 |
| Clustering index with near-infrared images | Analysis of feeding behavior of carp (Cyprinus carpio var. specularis) | Accuracy of 0.945 | Temporal feeding states before, during and after feeding (t = 5, 15, 30, 60 s) | Aquaculture | Near-infrared light source | Industrial camera (Mako G-223B NIR) | Own dataset (infrared images) | Zhou et al., 2017 |
| LeNet5 7-layered CNN structure | Assessing fish appetite of tilapia (Oreochromis niloticus) | Accuracy of 0.90 | Fish appetite (none, weak, medium, strong) | Aquaculture | Near-infrared light | Industrial camera (Mako G-223B NIR) | Own dataset (infrared images) | Zhou et al., 2019 |

Summary of AI pipelines for fish behavior recognition in different underwater environments.

Full terms of included abbreviations: LSTM, Long Short-Term Memory; CNN, Convolutional Neural Network; MOSSE, Minimum Output Sum of Squared Error; RNN, Recurrent Neural Network; R-CNN, Region-based Convolutional Neural Network; Seq-NMS, Sequential Non-Maximum Suppression; SiamMask, Siamese Mask; SVM, Support Vector Machine; YOLOv3, You Only Look Once version 3.
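The sketch below illustrates the frame-to-frame association step underlying the trackers in Table 3 in its simplest form: each new detection is greedily linked to the existing track whose last box it overlaps most. Real trackers such as MOSSE, SiamMask, or Seq-NMS add appearance and motion models on top of this principle; the IoU threshold and box values here are illustrative only.

```python
# Minimal greedy IoU tracker sketch: continue an existing track when a new
# detection overlaps its last box enough, otherwise open a new track.

def iou(a, b):
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = (a[2]-a[0])*(a[3]-a[1]) + (b[2]-b[0])*(b[3]-b[1]) - inter
    return inter / union if union else 0.0

def update_tracks(tracks, detections, threshold=0.3):
    """tracks: {track_id: [boxes...]}; detections: list of (x1, y1, x2, y2)."""
    next_id = max(tracks, default=-1) + 1
    for det in detections:
        scores = {tid: iou(boxes[-1], det) for tid, boxes in tracks.items()}
        best = max(scores, key=scores.get, default=None)
        if best is not None and scores[best] >= threshold:
            tracks[best].append(det)        # continue an existing track
        else:
            tracks[next_id] = [det]         # start a new track
            next_id += 1
    return tracks

tracks = {}
for frame_dets in [[(10, 10, 50, 40)], [(14, 12, 54, 42), (200, 80, 240, 110)]]:
    tracks = update_tracks(tracks, frame_dets)
print(tracks)   # two tracks: one continued over 2 frames, one newly opened
```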

To decipher underlying behavioral patterns of fishes from manually or automatically generated fish tracks, repeated patterns can be translated into sets of labelled classes (e.g., n trajectories moving in a given x, y direction = escaping toward the upper panel), each representing one or several specific behaviors. In AI, classes that can be labelled and quantified (e.g., a fish passing through a mesh) can be learned by a deep learning model, so that manual behavior classification can then be automated; a simple rule-based labeling example is sketched below. In aquaculture, swimming behaviors have been manually classified and fed to algorithms that learn to recognize the behavioral classes from computer vision (Long et al., 2020; J. Wang et al., 2020; Yu et al., 2021). In commercial fishing, the challenge lies in deciphering these patterns as fishes interact with different gear structures, modified parts, and selective devices. For AI models to classify these types of interactions, a systematic approach may first be needed in controlled environments, such as fish tanks or behavioral chambers. This would allow stimuli to be restricted and localized (Skinner, 2010) rather than enhanced or inhibited by spatiotemporal conditions (Ryer and Olla, 2000; Owen et al., 2010; Maia and Volpato, 2013; Heydarnejad et al., 2017; Lomeli et al., 2021).
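As a toy illustration of turning tracks into labelled classes, the sketch below assigns an "escaped", "attempted_escape", or "retained" label from a trajectory's pixel coordinates. The exit line, thresholds, and class names are hypothetical simplifications of the gear geometry, not labels from any cited study.

```python
# Illustrative sketch: rule-based labeling of a fish track. The gear exit is
# idealized as a vertical line at x = exit_x in image coordinates.

def label_track(track, exit_x=600.0):
    """track: list of (x, y) centroids in pixel coordinates, one per frame."""
    xs = [x for x, _ in track]
    if xs[-1] > exit_x:
        return "escaped"            # last position beyond the mesh boundary
    if max(xs) > exit_x * 0.9:
        return "attempted_escape"   # approached the exit but stayed inside
    return "retained"

print(label_track([(100, 50), (300, 60), (650, 55)]))   # escaped
print(label_track([(100, 50), (560, 60), (400, 58)]))   # attempted_escape
```

Labels produced this way can serve as training classes for a model that later recognizes the same behaviors directly from video.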

Recurrent AI models based on the LSTM architecture are gaining attention for fish tracking, since they are designed to give more weight to significant movement patterns among chaotic ones as they are trained, adding a more cognitive ability to the learning of AI models. For instance, Gupta et al. (2021) investigated different vision-based object-tracking algorithms for multiple fishes in underwater scenes, in both controlled and uncontrolled environments. They combined two complex networks (a siamese network and a recurrent network) into an object tracker named DFTNet (Deep Fish Tracking Network): the first network uses two identical neural networks to re-identify fish, and the second is an LSTM that allows the model to learn from the fish's chaotic motions.

In fishing activities, AI architectures with attention and memory are thus particularly important to address the chaotic patterns seen among species during the capture process. Tracks can show swimming angles or abrupt changes in movement that measure distance from gear structures (Santos et al., 2020), mean trajectory in relation to the stimulus source (Peterson, 2022), selective device placement, or differences in position between group and individual trajectories within gears. The visual features from automatic detection (i.e., color, texture, shape at the species, group, or individual level) and the spatiotemporal features from tracking (i.e., swimming direction, angle, speed) (Figures 4H, I) can then be combined to define the behavior classification (Figures 4J, K).

3.3.2 Behavioral classes tailored with AI architecture

Fish behavior recognition occurs when a model recognizes a behavior based on tracking features identified as events. An event is a scene (Figures 4A, B) directly observed from videos, for example when a group of fish swims out of a fishing gear. The combination of fish detections and tracks (swimming patterns) can be categorized as a class, "escapement", and behavioral metrics can be derived from such events (see Figure 4J). Automatic behavior recognition is thus trained from classified sets of tracking features and is the final step in synthesizing chaotic fish swimming into distinct sets of behaviors.

Classes of behaviors are defined by scientists and are used to label an image sequence or video clip that shows a defined behavior. For example, a class label of escapement behavior can be defined from a clip of a fish passing through a mesh, starting when the detected body of the fish overlaps or touches the mesh. A behavior class of a fish not escaping is when the detected trajectory stays within the mesh barrier, while another class can show it has escaped if the tracked fish is detected outside the gear. Exactly when to label a fish as escaped depends on the study's classification decisions (i.e., either when the fish's body is entirely outside the gear or as the fish passes through the mesh). Classes can also be separated into action and non-action classes (see Table 3), where a defined behavior present in a video clip is labeled as the action class, and another clip presenting unchanged or normal fish movement is labeled as the non-action class. McIntosh et al. (2020) defined four features that translate the startle behavior of sablefish trajectories into measurable metrics: direction of travel, speed, aspect ratio, and a Local Momentary Change metric. They combined the four features into a form suited to training an AI-based classifier with an LSTM architecture (i.e., tensor data). As when applying LSTM for tracking, an AI behavior recognizer with LSTM efficiently remembers important features to classify swimming movements (Niu et al., 2018; L. Yang et al., 2021). Behavior classes have been defined in selectivity studies as events classified in empirical models (Santos et al., 2016; Santos et al., 2020) or video tracking software (Noldus et al., 2001). J. Wang et al. (2020) proposed a method for real-world detection of anomalous behavior in multiple fish under high stress with a 3-stage pipeline. Examples of AI pipelines are summarized in Table 3, including the underwater scene, light source, and type of underwater observation system used, for comparison.
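In the spirit of the McIntosh et al. (2020) approach, the sketch below stacks per-frame track features into a tensor and classifies it with an LSTM. The feature count, hidden size, and class encoding are assumptions for illustration, not the published configuration.

```python
# Hedged sketch of an LSTM behavior classifier over track-feature tensors:
# per-frame features (e.g., direction, speed, aspect ratio, momentary change)
# form a sequence that the LSTM summarizes into a behavior class.
import torch
import torch.nn as nn

class TrackBehaviorLSTM(nn.Module):
    def __init__(self, n_features=4, hidden=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):              # x: (batch, timesteps, n_features)
        _, (h_n, _) = self.lstm(x)     # h_n: final hidden state of the LSTM
        return self.head(h_n[-1])      # class logits per track

model = TrackBehaviorLSTM()
# One track of 30 frames, 4 features per frame; random values stand in for
# real trajectory features here.
track = torch.randn(1, 30, 4)
logits = model(track)
print(logits.argmax(dim=1))            # 0 = non-startle, 1 = startle (assumed)
```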

3.3.3 The problem of occlusion emphasized in the crowded scenes of fishing

The occlusion problem arises when fishes overlap or swim behind one another, leading to lost detections and fragmented tracks (Gupta et al., 2021). Tracking multiple objects in videos is challenging since overlaps are flattened in a 2D view (see Figures 4C, D, F), and this problem is acute when studying behaviors in the crowded scenes of fishing. In 2D images and videos, training models to recognize the body parts of fish can help overcome occlusion: in general, if a detector fails to locate an entire fish, a tracker can still follow the movement from other features of the fish (i.e., eye, fins, tail). For example, Liu et al. (2019) simultaneously track the fish head and its center body, so the head can be detected even when the center body is hidden. Trackers can therefore maintain fish identity through occlusion if the model has learned more appearance features (Qian et al., 2014). Fish heads have relatively fixed shapes and colors, so tracking them from frame to frame remains feasible even after frequent occlusions (L. Yang et al., 2021): the darker intensity of a head behind another fish and its elliptical shape can be characterized as a blob and still be tracked.
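A minimal sketch of this blob idea is given below: dark, compact, roughly elliptical regions are extracted as candidate head positions with OpenCV thresholding and contour analysis. The intensity and size thresholds are illustrative and would need tuning to real footage; this is a simplification of the part-based tracking described above, not code from the cited studies.

```python
# Illustrative sketch: extract head-like dark blobs as trackable points that
# can survive partial body occlusion.
import cv2

frame = cv2.imread("frame_0001.png")                    # hypothetical frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
_, dark = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)  # dark regions

contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
heads = []
for c in contours:
    area = cv2.contourArea(c)
    if 50 < area < 2000 and len(c) >= 5:                # head-sized blobs only
        (cx, cy), (w, h), angle = cv2.fitEllipse(c)     # elliptical shape fit
        if max(w, h) / max(min(w, h), 1e-6) < 3.0:      # roughly compact
            heads.append((cx, cy))
print(heads)   # candidate head centroids to hand to the tracker
```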

Three-dimensional tracking from stereo cameras or multiple-camera systems, where 3D components can be triangulated, can help address occlusion problems. By reconstructing trajectories in a 3D view, fish trajectories gain depth, improving the re-identification of a fish after an occlusion (Cachat et al., 2011; Huang et al., 2021). However, AI models trained to recognize 3D trajectories demand computationally intensive algorithms to associate the deconstructed features together (L. Yang et al., 2021).
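The sketch below shows the core triangulation step with OpenCV: matched 2D points from two calibrated views are lifted to a single 3D position. The projection matrices here are placeholder values standing in for a real stereo calibration, and the pixel coordinates are invented.

```python
# Minimal sketch of stereo triangulation: two calibrated views of the same
# fish head yield one 3D point, usable to re-identify the fish after occlusion.
import numpy as np
import cv2

P1 = np.hstack([np.eye(3), np.zeros((3, 1))])               # left camera
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])  # right camera, small baseline

# The same fish head seen in both views (pixel coordinates, shape 2 x N)
pts_left = np.array([[320.0], [240.0]])
pts_right = np.array([[310.0], [240.0]])

homog = cv2.triangulatePoints(P1, P2, pts_left, pts_right)  # 4 x N homogeneous
xyz = (homog[:3] / homog[3]).ravel()                        # Euclidean 3D point
print(xyz)
```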

3.3.4 Transfer learning for data-deficient environments

We have shown that assessing fish behavior relies on analyzing trajectories. Considering tracks instead of detections generates even larger amounts of data than single detections of fishes on frames, and thousands to millions of such fish trajectories have likely been generated worldwide. These data may now be used to train models to detect fishes, at the species level or as generic fish, in unseen environments. A few examples of published datasets that have been used to train models are provided in Table 4.

Table 4

| Dataset | Access | Size | Included fishes | Labels | Location | Source | References for model training | Content type |
|---|---|---|---|---|---|---|---|---|
| BYU (Brigham Young University) Fish dataset | http://roboticvision.groups.et.byu.net/Machine_Vision/BYUFish/BYU_Fish.html | 12 labelled species (2.5 GB) | 90 Asian carp, 110 crucian carp, 74 predatory carp, 89 Colossoma, and four non-invasive species (120 cottids, 137 speckled dace, 172 whitefish, 240 Yellowstone cutthroat images) | Species | Not specified | Lillywhite and Lee, 2013 | Chua et al., 2011; Rasheed, 2021; Simons and Lee, 2021 | Images |
| Croatian Fish Dataset | http://www.inf-cv.uni-jena.de/fine_grained_recognition.html#datasets (currently not accessible) | ~700 images of 12 different fish species in real-life conditions (120 images in training set, 674 in testing set) | Mixed | Species | Adriatic Sea, Croatia | Jäger et al., 2015 | Okafor et al., 2018; Qiu et al., 2018; Zhao et al., 2018; Guo et al., 2020; Yan et al., 2021; Pang et al., 2022 | Images |
| DeepFish | https://github.com/alzayats/DeepFish | ~40,000 annotated classification labels, collected from 20 different habitats | Mixed, in situ | Fish/no fish | Coastal marine environments in Australia | Saleh et al., 2020 | Laradji et al., 2020; Saleh et al., 2020 | Images |
| Deep Vision Fish Dataset | http://metadata.nmdc.no/metadata-api/landingpage/01d102345aef4639f063a13ea20cd3f3 | Two surveys from 2017 to 2019 with the Deep Vision system | Blue whiting, Atlantic herring, other mesopelagic fishes | Species | In situ from surveys + synthetic data | Allken et al., 2021b | Allken et al., 2021b | Images |
| FathomNet | http://fathomnet.org | ~80,000 images of marine animals, 106,000 localizations, 26,000 h of video, 6.8 million annotations, 4,349 terms | Mixed | Mixed | Worldwide | Boulais et al., 2020; Katija et al., 2021a | Katija et al., 2021b | Images and videos |
| Fish4Knowledge (F4K) | http://www.perceivelab.com/datasets | 3.5k bounded fishes / 700k 10-minute video clips | Tropical fish species | Fish/no fish | Taiwan coral reefs | Boom et al., 2014; Fisher et al., 2016 | Rathi et al., 2017; Pramunendar et al., 2019; Wang et al., 2019; Alshdaifat et al., 2020; Guo et al., 2020; Murugaiyan et al., 2021; Prasetyo et al., 2021; Knausgård et al., 2021 | Images and videos |
| FishNet | https://www.fishnet.ai/ | 406,463 bounding boxes in 86,029 images from 73 different electronic monitoring cameras | Majority tuna species (albacore, yellowfin, skipjack, bigeye) | Species | Longline tuna vessels in the western and central Pacific | Kay and Merrifield, 2021 | Mujtaba and Mahapatra, 2022 | Images |
| Fish-Pak | https://data.mendeley.com/datasets/n3ydw29sbz/3 | 915 images of carps with different orientations, positions, mouths | Ctenopharyngodon idella, Cyprinus carpio, Cirrhinus mrigala, Labeo rohita, Hypophthalmichthys molitrix, and Catla catla | Species | Pakistan, local farms and river systems | Shah et al., 2019 | Shah et al., 2019 | Images |
| HabCam (Habitat Mapping Camera System) | https://habcam.whoi.edu/ | 2,500,000 images | Mixed marine vertebrates and invertebrates | Species | Not specified | Northeast Fisheries Science Center (NEFSC) | Unknown | Images |
| J-EDI (JAMSTEC E-library of Deep-sea Images) | https://www.godac.jamstec.go.jp/jedi/e/ | 1,500,000 images | Deep-sea species | Species | Not specified | Japan Agency for Marine-Earth Science and Technology | Unknown | Images and videos |
| LifeCLEF 2015/FishCLEF/SeaCLEF | https://www.imageclef.org/2014/lifeclef/fish | 73 annotated videos | Tropical fish species | Species | Taiwan coral reefs | Joly et al., 2016 | Hossain et al., 2016; Salman et al., 2016; Salman et al., 2020; Zhang et al., 2020; Ben Tamou et al., 2021 | Videos |
| NINA204 | Not retrievable | 204 video clips (101 stocked fish and 103 wild fish) | Brown trout | No fish/stocked fish/wild fish | Stocked fish and freshwater wild species in Norway | Pedersen and Mohammed, 2021 | Albawi et al., 2017; Myrum et al., 2019 | Videos |
| NorFisk | https://dataverse.no/dataset.xhtml?persistentId=doi:10.18710/H5G3K5 | 12,514 annotated images (timestamped 2021; expected to grow from footage recorded in 2020): 3,027 of saithe, 9,487 of salmonids | Saithe and salmonids | Species | Norwegian fish farms, using GoPro Hero 4, 5 and 8 cameras (49 hours of video) | Crescitelli et al., 2021 | Crescitelli et al., 2021 | Images |
| ONC Video Data | https://github.com/bonorico/analysis-of-ONC-video-data | 9,772 video clips, 9,205 annotated sablefish individuals | Sablefish | Species | Berkeley Canyon, North America | Bonofiglio et al., 2022 | Fier et al., 2015; Bonofiglio et al., 2022 | Videos |
| OzFish | https://github.com/open-AIMS/ozfish | 80k labeled crop images, 45k bounding box annotations, 507 fish species | Mixed (e.g., scarids, Chlorurus, Capistratoides sp.) | Species/no species, fish/no fish | Stereo BRUVs | Australian Institute of Marine Science (AIMS), 2019 | Ditria et al., 2021b | Images |
| QUT Fish Dataset | https://www.dropbox.com/s/e2xya1pzr2tm9xr/QUT_fish_data.zip?dl=0 | ~4,000 classification images, 486 species | Mixed, in situ | Species | Varying ex situ and in situ habitats | Anantharajah et al., 2014 | Qiu et al., 2018; Guo et al., 2020; Pang et al., 2022 | Images |
| RockFish/Labeled Fishes in the Wild | https://swfscdata.nmfs.noaa.gov/labeled-fishes-in-the-wild/ | 929 images with 1,005 marked fish; 17 videos of 10 min at 5 fps; ~1k bounding boxes (fish) | Groundfishes | Fish/no fish | Southern California Bight, from 2000-2012 surveys | Cutter et al., 2015 | Cutter et al., 2015 | Images and videos |
| The Nature Conservancy Fisheries Monitoring Database | https://www.kaggle.com/competitions/the-nature-conservancy-fisheries-monitoring/data | Unspecified | Albacore tuna, bigeye tuna, yellowfin tuna, mahi mahi, opah, sharks | Species | Mixed | The Nature Conservancy Fisheries | Pelletier et al., 2018 | Images |
| TROUT39 | Not retrievable | 39 images of brown trout in 288 frames | Brown trout | Species | Stocked fish and freshwater wild species in Norway | Pedersen and Mohammed, 2021 | Pedersen and Mohammed, 2021 | Videos |
| VOC2012/PASCAL Visual Object Classes | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/ | 17,000 annotated fishes | Unspecified | Fish/no fish | Unspecified | Everingham and Winn, 2012 | Li et al., 2021 | Images |
| WildFish | https://github.com/PeiqinZhuang/WildFish | ~2,000 fish categories with 103,034 wild fish images based on several professional fish websites (e.g., FishBase, Encyclopedia of Life, Fishes of Australia) | Mixed | Species | Mixed | Zhuang et al., 2021 | Pang et al., 2022 | Images |

Summary of public datasets of fish images and videos for AI model training, merged from open-access databases, from collections of generic image datasets (which include objects other than fishes), and from Ditria et al. (2020); Saleh et al. (2020); and Pedersen et al. (2021).
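As a hypothetical illustration of the point made before Table 4, the sketch below shows one plausible way to reshape stored fish tracks into per-frame training labels, the common starting point for training detectors. The tracks.csv file, its column names, and the label layout are assumptions for illustration, not a schema from any of the datasets above.

```python
# Hypothetical sketch: turn stored fish tracks into per-frame detection labels.
# The CSV layout (track_id, frame, x, y, w, h, species) is an assumption.
import csv
from collections import defaultdict

labels_per_frame = defaultdict(list)
with open("tracks.csv") as f:
    for row in csv.DictReader(f):
        labels_per_frame[int(row["frame"])].append({
            "bbox": [float(row[k]) for k in ("x", "y", "w", "h")],
            "species": row["species"],
            "track_id": int(row["track_id"]),
        })

# Each frame now carries the bounding boxes a detector can be trained on,
# while track_id preserves the trajectory for later behavior analysis.
print(f"{len(labels_per_frame)} annotated frames")
```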

For tropical fishes, Fish4Knowledge (F4K; Fisher et al., 2016), a project that started in 2010, garnered millions of images from GoPro cameras set up in coral reef areas of Taiwan. The project resulted in 87K hours of video (95 TB) and 145 million fish identifications. The successfully curated database was then made available to the rest of the world, and most developments in automatic classification and identification tools for fishes have used it to train deep learning models (see Table 4 for uses of F4K; Spampinato et al., 2010; Palazzo and Murabito, 2014; Shafait et al., 2016; Jalal et al., 2020; Murugaiyan et al., 2021). For temperate fishes, only a few commercial species can be automatically identified by existing models, but they are nonetheless gaining more recognition. Bonofiglio et al. (2022) trained an AI pipeline to detect and track sablefish, Anoplopoma fimbria, in an underwater canyon in North America, using ~650 hours of video recordings with ~9,000 manual annotations. Thanks to growing fish databases and the application of image processing techniques, AI models can now detect some species with human-like accuracy, such as the scythe butterflyfish (Benson et al., 2013), some tropical species (Spampinato et al., 2010), and mesopelagic species (Allken et al., 2021a).

Studying fish-gear interactions is particularly difficult due to the unique and challenging conditions often met at sea. Automatic detection pipelines have applied transfer learning and data augmentation techniques to cope with the lack of available data. For example, Knausgård et al. (2021) applied transfer learning to train an AI system to identify commercially valuable temperate fishes, such as wrasses (Ctenolabrus rupestris, C. exoletus and Symphodus melops) and gadoids (Gadus morhua, Pollachius virens, P. pollachius, Molva molva, and Melanogrammus aeglefinus). Using models pre-trained on available public datasets (see Table 4, e.g., Fish4Knowledge and ImageNet), they obtained high accuracies with their fine-tuned models in object detection and classification (86.96% and 99.27%, respectively). Transfer learning from pre-existing object detection algorithms, coupled with existing data from other environments, can thus be a promising approach for the automatic analysis of fish species even in environments that still lack data (Fisher et al., 2016; Siddiqui et al., 2018; Knausgård et al., 2021). In addition, augmentation methods such as generating synthetic datasets may help overcome the insufficiency of small training datasets (Allken et al., 2019; Villon et al., 2021).
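To make the transfer-learning recipe concrete, here is a minimal sketch in the spirit of the approach described above: an ImageNet-pretrained backbone is frozen and only a new classification head is trained on a small folder of fish images. The dataset path, number of species, and hyperparameters are placeholders, not values from the cited studies.

```python
# Minimal transfer-learning sketch (illustrative only): fine-tune an
# ImageNet-pretrained backbone on a small set of labeled fish images.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_SPECIES = 8  # hypothetical number of temperate species to classify

# Standard ImageNet preprocessing so inputs match the pretrained weights
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

train_set = datasets.ImageFolder("data/fish_train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
for param in model.parameters():
    param.requires_grad = False          # freeze the pretrained features
model.fc = nn.Linear(model.fc.in_features, NUM_SPECIES)  # new classifier head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:            # one illustrative epoch
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```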

4 Discussion

4.1 Insights from AI applications for behavior recognition from other domains

Automated behavior recognition has been applied to several domains outside of fisheries. Dynamic systems of fish schools, like any large group of moving individuals such as birds or insects (Chapman et al., 2010; Altshuler and Srinivasan, 2018), produce a bundle of condensed and overlapping trajectories when tracked. Directional patterns of behavior (i.e., individual or collective) can be interpreted from them (Sinhuber et al., 2019), but visual details of targets can be lost in footage due to occlusions or motion blur (Liu et al., 2016). Conveniently, beyond data enhancement methods, algorithms and AI methods are already available that specifically address this challenge in natural systems of humans, social animals and insects (e.g., swarm intelligence, Ahmed and Glasgow, 2012; Boids algorithms, Alaliyat et al., 2014). Algorithms to track behavior in congested human crowds have been developed based on motion capture and optical flow techniques (Krausz and Bauckhage, 2011). Different types of human behavior can now be recognized by AI in all sorts of environments, owing to the considerable attention the domain has received and to the gigantic training databases of diverse human behavior from which high-performing models learn (Popoola and Wang, 2012; Vinicius et al., 2013).
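As a concrete illustration of the optical-flow techniques mentioned above, the following sketch computes dense Farneback optical flow with OpenCV and summarizes the collective motion in each frame. The video path is hypothetical and the summary statistics are arbitrary choices, not a published crowd-analysis pipeline.

```python
# Illustrative dense optical-flow sketch (OpenCV Farneback), the kind of
# motion-based technique used for congested scenes.
import cv2
import numpy as np

cap = cv2.VideoCapture("crowded_school.mp4")  # hypothetical footage
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense flow: one (dx, dy) motion vector per pixel between frames
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude, angle = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    # A crude collective-motion summary: mean speed and dominant direction
    print(f"mean speed={magnitude.mean():.2f} px/frame, "
          f"dominant direction={np.degrees(np.median(angle)):.0f} deg")
    prev_gray = gray
cap.release()
```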

Three-dimensional motion capture techniques can also provide more information, such as depth and detailed tracking of animal paths (Wu et al., 2009). Moreover, 3D trajectories provide the analytics (i.e., positions, velocities, accelerations) needed to study cohesive and unique behaviors (Sinhuber et al., 2019). For instance, Liu et al. (2016) proposed an automatic tracking system that reconstructs 3D trajectories of fruit flies using three high-speed cameras and that can be generally adapted to large swarms of moving objects. Dollár et al. (2012) made use of features of human pedestrians to geometrically quantify their overlaps and distances in 2D. The AI models that recognize facial features and postures of humans or other animals therefore have the algorithmic backbone to extract behavior. Since algorithms can be scalable and adaptable (see Section 3.3.4 on transfer learning), such AI models may now be adapted to fish features and postures.
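The kinematic analytics mentioned above can be derived directly from a 3D trajectory by finite differences, as in this minimal sketch (the helical swim path and the 25 Hz frame rate are assumptions for illustration):

```python
# Sketch: derive velocity and acceleration from a 3D trajectory by
# finite differences. The positions array is synthetic.
import numpy as np

fps = 25.0                      # assumed camera frame rate
t = np.arange(0, 4, 1 / fps)
# Hypothetical helical swim path: (x, y, z) in metres, one row per frame
positions = np.column_stack([np.cos(t), np.sin(t), 0.1 * t])

velocity = np.gradient(positions, 1 / fps, axis=0)       # m/s per axis
acceleration = np.gradient(velocity, 1 / fps, axis=0)    # m/s^2 per axis

speed = np.linalg.norm(velocity, axis=1)
peak_acc = np.linalg.norm(acceleration, axis=1).max()
print(f"mean speed: {speed.mean():.2f} m/s, peak acceleration: {peak_acc:.2f} m/s^2")
```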

4.2 Towards smart fishing

The way we fish is constantly evolving. The more we understand the impact of fishing, the more we look for ways to make our fishing gears more selective. We are no longer just modifying the components of gears but also adding devices and camera systems to them to create intelligent fishing gears. This turns fishing operations into interactive, fine-scale observation platforms rather than catch-then-see operations (Rosen et al., 2013; Kyllingstad et al., 2022). The performance of modified fishing gears can almost be assessed in real time, which can raise the plateau of gear selectivity studies by exploring fish-gear interactions at finer scales. The challenge now lies in obtaining consistent findings from these direct observations. In the highly stimulating, crowded, and stressful scenes of fishing activities, subtle movements of fishes may turn into sharp and chaotic escapes where learned behavior and predispositions are overridden by survival instincts (Manière and Coureaud, 2020). Large volumes of fishes can also be influenced by herding behavior, and individuals may tend to follow the swimming routes of the group (Måløy et al., 2019). Addressing this herding constraint currently relies on applying complex pipelines, often coupled with stereovision (Rosen et al., 2013; Kyllingstad et al., 2022). Handling such data in real time is one of the current bottlenecks because it has to be processed within embedded AI systems. To equip fishing gears, these embedded systems have to remain as light as possible, with controlled size, memory and power consumption. These issues will be partially solved as the algorithms presented above (see Section 3.3: The problem of occlusion emphasized in the crowded scenes of fishing and Table 3) keep improving in handling the occlusion problem, and as observation systems keep improving to meet the image quality required for AI applications (see Section 2.1 Observations of fish behavior in fishing gears and Table 1).
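As one hedged illustration of how such embedded systems can be kept light, the sketch below applies post-training dynamic quantization in PyTorch to shrink a network's weights to 8-bit integers; the network itself is a stand-in, not an actual fish-detection model.

```python
# Sketch: shrink a model for embedded deployment via post-training
# dynamic quantization. The network is a placeholder, not a detector.
import os
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 2)  # stand-in network
)
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8  # 8-bit weights for Linear layers
)

def size_mb(m: nn.Module) -> float:
    """Serialize a model and report its on-disk size in MB."""
    torch.save(m.state_dict(), "tmp.pt")
    return os.path.getsize("tmp.pt") / 1e6

print(f"fp32: {size_mb(model):.2f} MB -> int8: {size_mb(quantized):.2f} MB")
```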

In the meantime, AI may already facilitate the assessment of fishing gear modifications. When a fishing gear is designed with a new stimulus (e.g., Southworth et al., 2020; Ruokonen et al., 2021) or when its parts are modified (e.g., Feekings et al., 2019), the certainty that these changes dominantly cause the behavioral changes leading to escapes or retention is impossible to establish, due to the large variability in external and internal factors affecting the fishes’ responses. It is also unlikely that the exact same movements by the same community of fishes can be observed on two successive occasions (Ryer and Barnett, 2006; Ryer et al., 2010; Lomeli and Wakefield, 2019). Applying automatic behavior recognition in such situations would enable the processing of much larger amounts of data on fine-scale differences than could be handled manually, even if it comes with some level of error inherent to any fully automatic recognition algorithm (Faillettaz et al., 2016; Villon et al., 2021). Complementary laboratory studies may also help collect consistent findings (Hannah and Jones, 2012), which are needed to build a database of automatically classifiable behaviors. For example, the influence of light intensity on juvenile walleye pollock Theragra chalcogramma was studied in laboratory conditions and in situ, showing that juveniles either struck the nets more often or swam closer to them in darkness than at the highest illumination (Olla et al., 2000). Such systematic behavioral responses could thus be used to train an AI model, which could then automatically analyze replicates of additional trials (a hypothetical sketch of such a model follows). Similarly, AI applications would make it possible to amplify the number of replicates of sea or laboratory trials, for example when assessing how changes in the positions of stimuli influence species’ behaviors (Larsen et al., 2017; Yochum et al., 2021).
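The sketch below outlines one hypothetical form such a model could take: an LSTM that classifies short trajectory snippets into behavior classes learned from replicated trials. All feature names, class labels, shapes, and data are placeholders, not elements of any cited study.

```python
# Hypothetical sketch: an LSTM classifying short trajectory snippets into
# behavior classes (e.g., "strike", "follow", "escape").
import torch
import torch.nn as nn

N_FEATURES = 4    # e.g., x, y, speed, heading per frame (assumed)
N_CLASSES = 3     # e.g., strike / follow / escape (assumed)
SEQ_LEN = 50      # frames per snippet

class BehaviorLSTM(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(N_FEATURES, hidden, batch_first=True)
        self.head = nn.Linear(hidden, N_CLASSES)

    def forward(self, x):                 # x: (batch, SEQ_LEN, N_FEATURES)
        _, (h_n, _) = self.lstm(x)        # last hidden state summarizes the snippet
        return self.head(h_n[-1])

model = BehaviorLSTM()
snippets = torch.randn(8, SEQ_LEN, N_FEATURES)   # stand-in trajectory batch
labels = torch.randint(0, N_CLASSES, (8,))
loss = nn.CrossEntropyLoss()(model(snippets), labels)
loss.backward()                                   # one illustrative training step
print(f"illustrative loss: {loss.item():.3f}")
```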

4.3 Sharing and collaboration for the sake of fishes

Transfer learning of adaptable deep learning models from other behavioral studies and sceneries is key for automated fish behavior recognition, but executing this technically requires collaboration within the scientific community. Advances in fish behavior recognition in aquaculture and in situ environments often stem from joint efforts between ecologists and computer scientists. AI practitioners mostly know which algorithm or AI network is appropriate for a specific study case, while marine scientists provide the underlying ecological question and the inherent parameters (i.e., classification of fish behaviors, metrics for quantification) to fine-tune the algorithms. Successful automated behavior recognition models have benefited from huge streams of imagery data and unprecedented funding for technological specifications. Existing and previous data mining and collection practices have included outsourcing efforts. Fish4Knowledge branched out to volunteers, subprojects, and gamifying techniques (Fisher et al., 2016). Popular datasets such as ImageNet and COCO used Amazon Mechanical Turk to crowdsource annotations of objects (Gauen et al., 2017). McClure et al. (2020) discussed how citizen science benefits AI applied to ecological monitoring, as it can fast-track data collection now that AI is within reach through its integration into mobile devices and user-friendly platforms. The phytoplankton world is benefitting from citizen science, as online portals are used by volunteers to perform simple classification tasks, which has led to millions of plankton IDs being verified (Robinson et al., 2017). Moreover, scientists are adopting FAIR (Findability, Accessibility, Interoperability, and Reuse) data principles to realize the full value of fish behavior data and to carefully curate a unifying database (Guidi et al., 2020; Bilodeau et al., 2022).

Bridging the gap between computer and marine sciences can accelerate the development of powerful tools for automated fish monitoring (Goodwin et al., 2021). User-friendly software platforms for image processing and for analyzing animal tracks and events are publicly accessible and designed for non-AI experts (Dawkins et al., 2017). So even though observations of fish-gear interactions are more demanding in their observation requirements, tending to produce small and distinctly case-specific datasets, model training can still be aided by data transfer, open-access databases, and participatory platforms. This will benefit everyone, as end-tools that grow in performance will also grow in scalability thanks to shared data. With enough collaboration across domains, extensive engagement with fish ethologists to construct behavioral classifiers, and consistent sharing of reproducible, understandable, and scalable data, it might become possible to quantify, in near completeness, what a fish is doing or how it is interacting with its environment under any conditions.

4.4 Limitation of AI: A critical view

AI-adapted electronic fishing is still fairly new to fisheries, so practical applications that improve the selectivity of fishing gears may not be seen immediately. AI models depend on the quality of their training data, and imagery is still lacking. Contrary to fisheries-based observation, land- and air-based behavior studies have more opportunities to use AI for automatic behavior recognition, as aerial and terrestrial devices can be smaller and lighter than underwater camera systems (e.g., Rosen and Holst (2013) for an underwater example; Liu et al. (2016) for a land example).

The environmental impact of these developing hardware and software systems in fisheries must also not be taken for granted. They may reduce operational energy consumption through automation, but if intelligent tools are eventually applied at a commercial level, this may imply significant extraction of heavy metals to manufacture the hardware and an increase in the carbon footprint of storage servers (Gupta et al., 2022). Scientists should be cautious not to be swept away by the promise of intelligent fishing without also weighing the environmental cost of making and maintaining it. AI applications may tip the scale in favor of fishes, but the integration of AI into fisheries must be accompanied by environmental impact assessments and an active search for alternative materials for machines.

Furthermore, our perception of animal behavior can be anthropomorphic, and this bias may be transferred to artificial tools. Researchers have consistently pointed to the possible transfer of human bias into artificial intelligence, which can be worsened by training models with limited data (Horowitz and Bekoff, 2015). As of today, humans still need to be cautious in identifying behaviors, in both manual and automatic methods; unsupervised learning may help remove anthropomorphic biases (Sengupta et al., 2018).

Another critical view of the use of AI in fisheries acknowledges the reality that it can be a double-edged sword. On one hand, it may help scientists understand fish behavior and reduce bycatch (e.g., Knausgård et al., 2021; Sokolova et al., 2021). On the other hand, it may help the fishing industry increase its catch with the use of automated tools (Walsh et al., 2002). As with any other technological advancement, its practical value depends on how humans decide to use it (Bostrom and Yudkowsky, 2018). It is therefore in the hands of stakeholders to discuss with one another, to stress both the negative and positive impacts of AI, and to lay down ethical practices that prevent mishandling of this new technology. Debates about using AI tools in fisheries will arise, but if we go forward with the intention of addressing ecological problems and emphasize its use for selectivity, it may provide the tools for a sustainable use of our resources.

4.5 Navigating a rapidly evolving field of research

The main challenge of studying fish-gear interactions is not one of scarcity but of abundance. The growing body of fish behavior data and the existing footage of fish-gear interactions carry vital information for better gears, waiting to be synthesized. Automating data collection and processing not only frees scientists' time and effort from laborious practices but also redirects their focus toward deeper scientific and creative endeavors. User-friendly platforms that translate complex AI algorithms into software tools can encourage even non-practitioners to participate in model training and fish tracking.

As we write this review, powerful and cognitive AI models in the field of computer science are advancing at an unparalleled speed. This will inevitably spill over into the development of models for fisheries. AI applied in other sectors already exhibits cognitive understanding, allowing machines higher levels of induction, reasoning and knowledge acquisition. The evolution of future AI models for automatic recognition of fish-gear interactions now depends on multiple factors:

  • First is the careful and accurate classification of fish trajectories that considers 3D components in a moving world.

  • Second is the adaptation and re-training of pre-trained models from different human and animal behavioral studies.

  • Third is the production of scalable and adaptable models for different case studies in gears, and the shareability of fish behavior data among scientists.

  • Fourth is the reliance on a continued and harmonious engagement of both marine scientists and AI practitioners to develop cognitive AI for fish-gear interaction systems.

There is no magic gear that completely selects targeted species, allows all unwanted species to escape, and incurs no economic or biological losses. However, equipping fishing gear with state-of-the-art technologies may help address ecological problems, shed light on overlooked species' behavior and make our fishing practices more sustainable, laying the right track as we step into a technological era.

Statements

Author contributions

AA, DK, RF conceptualized the content of the review. AA conducted the bibliometric analysis, generated the illustrations and drafted the initial manuscript. DK, RF wrote, commented on, and reviewed the manuscript. All authors contributed to the article and approved the submitted version.

Funding

This work was done as a part of the Game of Trawls S2 project, funded by the European Maritime and Fisheries Fund and France Filière Pêche (PFEA390021FA1000002). AA’s PhD program is funded by IFREMER (grant DS/2021/10).

Acknowledgments

The authors are thankful to Abdelbadie Belmouhcine for his valuable insights on deep learning methods, to Sonia Méhault, Julien Simon and Pascal Larnaud for the discussion on gear selectivity, to Marie Savina-Rolland for her comments on the manuscript and to Megan Quimbre (IFREMER, Bibliothèque La Pérouse) for the bibliometric analysis. The authors are also grateful to the two reviewers for their time and criticisms, which greatly improved the manuscript.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

  • 1

    Aguzzi J. Doya C. Tecchio S. de Leo F. C. Azzurro E. Costa C. et al . (2015). Coastal observatories for monitoring of fish behaviour and their responses to environmental changes. Rev. Fish Biol. Fish25, 463483. doi: 10.1007/s11160-015-9387-9

  • 2

    Ahmed H. Glasgow J. (2012). Swarm Intelligence: Concepts, Models and Applications. [online] Ontario, Canada: School of Computing Queen’s University Kingston. Available at: https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=116b67cf2ad2c948533e6890a9fccc5543dded89.

  • 3

    Alaliyat S. Yndestad H. Sanfilippo F. (2014). Optimisation of boids swarm model based on genetic algorithm and particle swarm optimisation algorithm (Comparative study). Proceedings - 28th European Conference on Modelling and Simulation, ECMS 2014. doi: 10.7148/2014-0643

  • 4

    Albawi S. Mohammed T. A. Al-Zawi S. (2017). “Understanding of a convolutional neural network,” in Proceedings of 2017 International Conference on Engineering and Technology, ICET 2017, Antalya, Turkey, 2018-January. 16. doi: 10.1109/ICENGTECHNOL.2017.8308186

  • 5

    Allken V. Handegard N. O. Rosen S. Schreyeck T. Mahiout T. Malde K. (2019). Fish species identification using a convolutional neural network trained on synthetic data. ICES J. Mar. Sci.76, 342349. doi: 10.1093/icesjms/fsy147

  • 6

    Allken V. Rosen S. Handegard N. O. Malde K. (2021a). A deep learning-based method to identify and count pelagic and mesopelagic fishes from trawl camera images. ICES J. Mar. Sci.78, 37803792. doi: 10.1093/icesjms/fsab227

  • 7

    Allken V. Rosen S. Handegard N. O. Malde K. (2021b). A real-world dataset and data simulation algorithm for automated fish species identification. Geoscience Data Journal8, 199209. doi: 10.1002/gdj3.114

  • 8

    Alshdaifat N. F. F. Talib A. Z. Osman M. A. (2020). Improved deep learning framework for fish segmentation in underwater videos. Ecol. Inform59, 101121. doi: 10.1016/j.ecoinf.2020.101121

  • 9

    Altshuler D. L. Srinivasan M. V. (2018). Comparison of visually guided flight in insects and birds. Front. Neurosci.12. doi: 10.3389/fnins.2018.00157

  • 10

    Anantharajah K. Ge Z. Y. McCool C. Denman S. Fookes C. Corke P. et al . (2014). “Local inter-session variability modelling for object classification,” in 2014 IEEE Winter Conference on Applications of Computer Vision, WACV, Steamboat Springs, CO, USA. 309316. doi: 10.1109/WACV.2014.6836084

  • 11

    Anders N. Fernö A. Humborstad O. B. Løkkeborg S. Rieucau G. Utne-Palm A. C. (2017a). Size-dependent social attraction and repulsion explains the decision of Atlantic cod Gadus morhua to enter baited pots. J. Fish Biol.91, 15691581. doi: 10.1111/JFB.13453

  • 12

    Anders N. Fernö A. Humborstad O. B. Løkkeborg S. Utne-Palm A. C. (2017b). Species specific behaviour and catchability of gadoid fish to floated and bottom set pots. ICES J. Mar. Sci.74, 769779. doi: 10.1093/icesjms/fsw200

  • 13

    Arimoto T. Glass C. W. Zhang X. (2010)Fish vision and its role in fish capture. In: Behavior of marine fishes: Capture processes and conservation challenges. Available at: https://books.google.fr/books?hl=en&lr=&id=Rp28-2cAaD8C&oi=fnd&pg=PA25&ots=R4AIAl7dAS&sig=2gJyoWORuHB8iycWs3bu6s_BJug&redir_esc=y#v=onepage&q&f=false (Accessed June 29, 2022).

  • 14

    Aydin C. Tosunòlu Z. (2010). Selectivity of diamond, square and hexagonal mesh codends for Atlantic horse mackerel Trachurus trachurus, European hake Merluccius merluccius, and greater forkbeard Phycis blennoides in the eastern Mediterranean. J. Appl. Ichthyology26, 7177. doi: 10.1111/j.1439-0426.2009.01376.x

  • 15

    Aziz L. Salam H. Bin M. Sheikh U. U. Ayub S. (2020). Exploring deep learning-based architecture, strategies, applications and current trends in generic object detection: A comprehensive review. IEEE Access8, 170461170495. doi: 10.1109/ACCESS.2020.3021508

  • 16

    Baatrup E. (2009). Measuring complex behavior patterns in fish - effects of endocrine disruptors on the guppy reproductive behavior. Hum. Ecol. Risk Assess.15, 5362. doi: 10.1080/10807030802615097

  • 17

    Banerjee S. Alvey L. Brown P. Yue S. Li L. Scheirer W. J. (2021). An assistive computer vision tool to automatically detect changes in fish behavior in response to ambient odor. Sci. Rep.11, 547. doi: 10.1038/s41598-020-79772-3

  • 18

    Barreiros M. de O. Dantas D.de O. Silva L.C.de O. Ribeiro S. Barros A. K. (2021). Zebrafish tracking using YOLOv2 and kalman filter. Sci. Rep.11, 3219. doi: 10.1038/s41598-021-81997-9

  • 19

    Bekkozhayeva D. Saberioon M. Cisar P. (2021). Automatic individual non-invasive photo-identification of fish (Sumatra barb Puntigrus tetrazona) using visible patterns on a body. Aquaculture Int.29, 14811493. doi: 10.1007/s10499-021-00684-8

  • 20

    Belmouhcine A. Simon J. Courtrai L. Lefevre S. (2021). “Robust deep simple online real-time tracking,” in 2021 12th International Symposium on Image and Signal Processing and Analysis, ISPA, Zagreb, Croatia, 2021-September. 138144. doi: 10.1109/ISPA52656.2021.9552062

  • 21

    Benson B. Cho J. Goshorn D. Kastner R. (2013) Field programmable gate array (FPGA) based fish detection using haar classifiers. Available at: https://agris.fao.org/agris-search/search.do?recordID=AV2012071748 (Accessed July 7, 2022).

  • 22

    Ben Tamou A. Benzinou A. Nasreddine K. (2021). Multi-stream fish detection in unconstrained underwater videos by the fusion of two convolutional neural network detectors. Appl. Intell.51, 58095821. doi: 10.1007/s10489-020-02155-8

  • 23

    Beyan C. Browman H. I. (2020). Setting the stage for the machine intelligence era in marine science. ICES J. Mar. Sci.77, 12671273. doi: 10.1093/ICESJMS/FSAA084

  • 24

    Bilodeau S. M. Schwartz A. W. H. Xu B. Pauca V. P. Silman M. R. (2022). A low-cost, long-term underwater camera trap network coupled with deep residual learning image analysis. PloS One17, e0263377. doi: 10.1371/JOURNAL.PONE.0263377

  • 25

    Blaxter J. H. S. (1988). ‘Sensory performance, behavior, and ecology of fish’. in AtemaJ.et al (eds.) Sensory Biol. Aquat. Anim. Berlin Heidelberg New York: Springer, 203232. doi: 10.1007/978-1-4612-3714-3_8

  • 26

    Blum A. L. Langley P. (1997). Selection of relevant features and examples in machine learning. Artif. Intell.97, 245271. doi: 10.1016/S0004-3702(97)00063-5

  • 27

    Bonofiglio F. de Leo F. C. Yee C. Chatzievangelou D. Aguzzi J. Marini S. (2022). Machine learning applied to big data from marine cabled observatories: A case study of sablefish monitoring in the NE pacific. Front. Mar. Sci.9. doi: 10.3389/fmars.2022.842946

  • 28

    Boom B. J. He J. Palazzo S. Huang P. X. Beyan C. Chou H.-M. et al . (2014). A research tool for long-term and continuous analysis of fish assemblage in coral-reefs using underwater camera footage. Ecol. Inform23, 8397. doi: 10.1016/j.ecoinf.2013.10.006

  • 29

    Bostrom N. Yudkowsky E. (2018). ‘The ethics of artificial intelligence’, in FrankishK.RamseyW. M. (eds.) The Cambridge Handbook of Artificial Intelligence1, 316334. doi: 10.1017/CBO9781139046855.020

  • 30

    Boudhane M. Nsiri B. (2016). Underwater image processing method for fish localization and detection in submarine environment. J. Vis. Commun. Image Represent39, 226238. doi: 10.1016/j.jvcir.2016.05.017

  • 31

    Boulais O. E. Woodward B. Schlining B. Lundsten L. Barnard K. Croff Bell K. et al . (2020). FathomNet: An underwater image training database for ocean exploration and discovery. arXiv preprint arXiv:2007.00114. doi: 10.48550/arxiv.2007.00114

  • 32

    Bowmaker J. K. Kunz Y. W. (1987). Ultraviolet receptors, tetrachromatic colour vision and retinal mosaics in the brown trout (Salmon trutta): Age-dependent changes. Vision Res.27, 21012108. doi: 10.1016/0042-6989(87)90124-6

  • 33

    Boyun V. P. Voznenko L. O. Malkush I. F. (20192019). Principles of organization of the human eye retina and their use in computer vision systems. Cybernetics Syst. Anal.55, 5 55, 701713. doi: 10.1007/S10559-019-00181-0

  • 34

    Breen M. Dyson J. O’Neill F. G. Jones E. Haigh M. (2004). Swimming endurance of haddock (Melanogrammus aeglefinus l.) at prolonged and sustained swimming speeds, and its role in their capture by towed fishing gears. ICES J. Mar. Sci.61(7), 10711079. doi: 10.1016/j.icesjms.2004.06.014

  • 35

    Brinkhof J. Larsen R. B. Herrmann B. Sistiaga M. (2020). Size selectivity and catch efficiency of bottom trawl with a double sorting grid and diamond mesh codend in the north-east Atlantic gadoid fishery. Fish Res.231, 105647. doi: 10.1016/j.fishres.2020.105647

  • 36

    Brown C. Laland K. Krause J. (2006). Fish cognition and behavior. Fish Cogn. Behav.page1. doi: 10.1002/9780470996058

  • 37

    Bullough L. W. Napier I. R. Laurenson C. H. Riley D. Fryer R. J. Ferro R. S. T. et al . (2007). A year-long trial of a square mesh panel in a commercial demersal trawl. Fish Res.83, 105112. doi: 10.1016/J.FISHRES.2006.09.008

  • 38

    Cachat J. M. Stewart A. Utterback E. Kyzar E. Hart P. C. Carlos D. et al . (2011). Deconstructing adult zebrafish behavior with swim trace visualizations. Neuromethods51, 191201. doi: 10.1007/978-1-60761-953-6_16

  • 39

    Cai K. Miao X. Wang W. Pang H. Liu Y. Song J. (2020). A modified YOLOv3 model for fish detection based on MobileNetv1 as backbone. Aquac Eng.91, 102117. doi: 10.1016/j.aquaeng.2020.102117

  • 40

    Calmon F. P. Wei D. Vinzamuri B. Ramamurthy K. N. Varshney K. R. (2017). arXiv preprint arXiv:1703.02476

  • 41

    Cao S. Zhao D. Sun Y. Ruan C. (2021). Learning-based low-illumination image enhancer for underwater live crab detection. ICES J. Mar. Sci.78, 979993. doi: 10.1093/icesjms/fsaa250

  • 42

    Capoccioni F. Leone C. Pulcini D. Cecchetti M. Rossi A. Ciccotti E. (2019). Fish movements and schooling behavior across the tidal channel in a Mediterranean coastal lagoon: An automated approach using acoustic imaging. Fish Res.219, 105318. doi: 10.1016/j.fishres.2019.105318

  • 43

    Carleton K. L. Escobar-Camacho D. Stieb S. M. Cortesi F. Justin Marshall N. (2020). Seeing the rainbow: Mechanisms underlying spectral sensitivity in teleost fishes. J. Exp. Biol.223. doi: 10.1242/JEB.193334/223810

  • 44

    Catania K. C. Hare J. F. Campbell K. L. (2008). Water shrews detect movement, shape, and smell to find prey underwater. Proceedings of the National Academy of Sciences105(2), 571576. doi: 10.1073/pnas.0709534104

  • 45

    Chandrashekar G. Sahin F. (2014). A survey on feature selection methods. Comput. Electrical Eng.40, 1628. doi: 10.1016/J.COMPELECENG.2013.11.024

  • 46

    Chapman C. J. Hawkins A. D. (1973). A field study of hearing in the cod,Gadus morhua l. J. Comp. Physiol.2 (85), 147167. doi: 10.1007/BF00696473

  • 47

    Chapman J. W. Nesbit R. L. Burgin L. E. Reynolds D. R. Smith A. D. Middleton D. R. et al . (2010). Flight orientation behaviors promote optimal migration trajectories in high-flying insects. Science327, 682685. doi: 10.1126/science.1182990

  • 48

    Cheng S. Zhao K. Zhang D. (2019). Abnormal water quality monitoring based on visual sensing of three-dimensional motion behavior of fish. Symmetry11, 1179. doi: 10.3390/sym11091179

  • 49

    Chen Q. Zhang C. Zhao J. Ouyang Q. (2013). Recent advances in emerging imaging techniques for non-destructive detection of food quality and safety. TrAC Trends Analytical Chem.52, 261274. doi: 10.1016/J.TRAC.2013.09.007

  • 50

    Chidami S. Guénard G. Amyot M. (2007). Underwater infrared video system for behavioral studies in lakes. Limnol Oceanogr Methods5, 371378. doi: 10.4319/lom.2007.5.371

  • 51

    Christensen J. H. Mogensen L. V. Galeazzi R. Andersen J. C. (2018). Detection, localization and classification of fish and fish species in poor conditions using convolutional neural networks; detection, localization and classification of fish and fish species in poor conditions using convolutional neural networks. IEEE/OES Autonomous Underwater Vehicle Workshop (AUV), Porto, Portugal, 16. doi: 10.1109/AUV.2018.8729798

  • 52

    Chua Y. Tan C. Lee Z. Chai T. Seet G. Sluzek A. (2011). Using MTF with fixed-zoning method for automated gated imaging system in turbid medium. Indian J. Mar. Sci.40, 236241.

  • 53

    Ciaparrone G. Luque Sánchez F. Tabik S. Troiano L. Tagliaferri R. Herrera F. (2020). Deep learning in video multi-object tracking: A survey. Neurocomputing381, 6188. doi: 10.1016/J.NEUCOM.2019.11.023

  • 54

    Connolly R. M. Fairclough D. Jinks E. L. Ditria E. M. Jackson G. Lopez-Marcano S. et al . (2021). Improved accuracy for automated counting of a fish in baited underwater videos for stock assessment. Front. Mar. Sci.8. doi: 10.3389/fmars.2021.658135

  • 55

    Cooke S. J. Cech J. J. Glassman D. M. Simard J. Louttit S. Lennox R. J. et al . (2020). Water resource development and sturgeon (Acipenseridae): state of the science and research gaps related to fish passage, entrainment, impingement and behavioural guidance. Rev. Fish Biol. Fish30, 219244. doi: 10.1007/s11160-020-09596-x

  • 56

    Crescitelli A. M. Gansel L. C. Zhang H. (2021). NorFisk: fish image dataset from Norwegian fish farms for species recognition using deep neural networks. Modeling, Identification and Control: A Norwegian Research Bulletin42, 116. doi: 10.4173/MIC.2021.1.1

  • 57

    Creswell A. White T. Dumoulin V. Arulkumaran K. Sengupta B. Bharath A. A. (2018). Generative adversarial networks: An overview. IEEE Signal Process Mag35, 5365. doi: 10.1109/MSP.2017.2765202

  • 58

    Cuende E. Arregi L. Herrmann B. Sistiaga M. Aboitiz X. (2020a). Prediction of square mesh panel and codend size selectivity of blue whiting based on fish morphology. ICES J. Mar. Sci.77, 28572869. doi: 10.1093/icesjms/fsaa156

  • 59

    Cuende E. Arregi L. Herrmann B. Sistiaga M. Onandia I. (2020b). Stimulating release of undersized fish through a square mesh panel in the Basque otter trawl fishery. Fish Res.224, 105431. doi: 10.1016/J.FISHRES.2019.105431

  • 60

    Cuende E. Herrmann B. Sistiaga M. Basterretxea M. Edridge A. Mackenzie E. K. et al . (2022). Species separation efficiency and effect of artificial lights with a horizonal grid in the Basque bottom trawl fishery. Ocean Coast. Manag221, 106105. doi: 10.1016/J.OCECOAMAN.2022.106105

  • 61

    Cui S. Zhou Y. Wang Y. Zhai L. (2020). Fish detection using deep learning. Appl. Comput. Intell. Soft Computing. 6, 66. doi: 10.1155/2020/3738108

  • 62

    Cunningham P. Cord M. Delany S. J. (2008). Supervised learning, in Machine Learning Techniques for MultimediaBerlin Heidelberg, Germany1, 2149. doi: 10.1007/978-3-540-75171-7_2

  • 63

    Cutter G. Stierhoff K. Zeng J. (2015). “Automated detection of rockfish in unconstrained underwater videos using haar cascades and a new image dataset: Labeled fishes in the wild,” in Proceedings - 2015 IEEE Winter Conference on Applications of Computer Vision Workshops, , WACVW 2015. 5762. doi: 10.1109/WACVW.2015.11

  • 64

    Dawkins M. Sherrill L. Fieldhouse K. Hoogs A. Richards B. Zhang D. et al . (2017). “An open-source platform for underwater image & video analytics,” in Proceedings - 2017 IEEE Winter Conference on Applications of Computer Vision (WACV), Santa Rosa, CA, USA2017, 898906 (Institute of Electrical and Electronics Engineers Inc). doi: 10.1109/WACV.2017.105

  • 65

    Dealteris J. T. Reifsteck D. M. (1993). Escapement and survival of fish from the codend of a demersal trawl. ICES m ar. Sci. Sym196, 128131.

  • 66

    DeCelles G. R. Keiley E. F. Lowery T. M. Calabrese N. M. Stokesbury K. D. (2017). Development of a video trawl survey system for New England groundfish. Transactions of the American Fisheries Society146(3), 462477. doi: 10.1080/00028487.2017.1282888

  • 67

    de Robertis A. Handegard N. O. (2013). Fish avoidance of research vessels and the efficacy of noise-reduced vessels: A review. ICES J. Mar. Sci.70, 3445. doi: 10.1093/icesjms/fss155

  • 68

    Dijkgraaf S. (1960). Hearing in bony fishes. Proc. R Soc. Lond B Biol. Sci.152, 5154. doi: 10.1098/RSPB.1960.0022

  • 69

    Ditria E. M. Connolly R. M. Jinks E. L. Lopez-Marcano S. (2021a). Annotated video footage for automated identification and counting of fish in unconstrained seagrass habitats. Front. Mar. Sci.8, 629485. doi: 10.3389/fmars.2021.629485

  • 70

    Ditria E. M. Jinks E. L. Connolly R. M. (2021b). Automating the analysis of fish grazing behaviour from videos using image classification and optical flow. Anim. Behav.177, 3137. doi: 10.1016/j.anbehav.2021.04.018

  • 71

    Ditria E. M. Sievers M. Lopez-Marcano S. Jinks E. L. Connolly R. M. (2020). Deep learning for automated analysis of fish abundance: the benefits of training across multiple habitats. Environ. Monit Assess.192. doi: 10.1007/s10661-020-08653-z

  • 72

    Doksæter L. Handegard N. O. Godø O. R. Kvadsheim P. H. Nordlund N. (2012). Behavior of captive herring exposed to naval sonar transmissions (1.0–1.6 kHz) throughout a yearly cycle. J. Acoust Soc. Am.131, 16321642. doi: 10.1121/1.3675944

  • 73

    Dollár P. Wojek C. Schiele B. Perona P. (2012). Pedestrian detection: An evaluation of the state of the art. IEEE Trans. Pattern Anal. Mach. Intell.34, 743761. doi: 10.1109/TPAMI.2011.155

  • 74

    Duecker D. A. Hansen T. Kreuzer E. (2020). “RGB-D camera-based navigation for autonomous underwater inspection using low-cost micro AUVs,” in 2020 IEEE/OES Autonomous Underwater Vehicles Symposium AUV, St. Johns, NL, Canada. doi: 10.1109/AUV50043.2020.9267890

  • 75

    Durden J. M. Schoening T. Althaus F. Friedman A. Garcia R. Glover A. G. et al . (2016). Perspectives in visual imaging for marine biology and ecology: from acquisition to understanding. Oceanography Mar. Biology: Annu. Rev.54, 315366. doi: 10.1201/9781315368597

  • 76

    Eickholt J. Kelly D. Bryan J. Miehls S. Zielinski D. (2020). Advancements towards selective barrier passage by automatic species identification: Applications of deep convolutional neural networks on images of dewatered fish. ICES J. Mar. Sci.77, 28042813. doi: 10.1093/icesjms/fsaa150

  • 77

    Ellis W. L. Stadler J. (2005). Application of an in situ infrared camera system for evaluating icthyofaunal utilization of restored and degraded mangrove habitats: developing a set of reference conditions from a NERRS site. Final Report. NOAA/UNH Cooperative Institute for Coastal and Estuarine Environmental Technology (CICEET)

  • 78

    Everingham M. Winn J. (2012). The PASCAL Visual Object Challenge 2012 (VOC2012) Results. http://www.pascalnetwork.org/challenges/VOC/voc2012/workshop/index.html.

  • 79

    Faillettaz R. Picheral M. Luo J. Y. Guigand C. Cowen R. K. Irisson J. O. (2016). Imperfect automatic image classification successfully describes plankton distribution patterns. Methods Oceanography15–16, 6077. doi: 10.1016/j.mio.2016.04.003

  • 80

    Feekings J. O’Neill F. G. Krag L. Ulrich C. Veiga Malta T. (2019). An evaluation of European initiatives established to encourage industry-led development of selective fishing gears. Fish Manag Ecol.26, 650660. doi: 10.1111/FME.12379

  • 81

    Ferro R. S. T. Jones E. G. Kynoch R. J. Fryer R. J. Buckett B. E. (2007). Separating species using a horizontal panel in the Scottish north Sea whitefish trawl fishery. ICES J. Mar. Sci.64, 15431550. doi: 10.1093/ICESJMS/FSM099

  • 82

    Fier R. Albu A. B. Hoeberechts M. (20152014). Automatic fish counting system for noisy deep-sea videos. 2014 Oceans - St. John’s OCEANS. St. John's, NL, Canada, 2014, 16. doi: 10.1109/OCEANS.2014.7003118

  • 83

    Fisher R. Chen-Burger Y.-H. Giordano D. Hardman L. Lin F.-P. (Eds.). (2016). Fish4Knowledge: Collecting and analyzing massive coral reef fish video data104, 319. Berlin Heidelberg, Germany: Springer. doi: 10.1007/978-3-319-30208-9

  • 84

    Fonseca P. Martins R. Campos A. Sobral P. (2005). Gill-net selectivity off the Portuguese western coast. Fish Res.73, 323339. doi: 10.1016/j.fishres.2005.01.015

  • 85

    Forbus K. D. Hinrichs T. R. (2006). Companion cognitive systems: A step toward human-level AI. AI Mag27, 8383. doi: 10.1609/AIMAG.V27I2.1882

  • 86

    Forlim C. G. Pinto R. D. (2014). Automatic realistic real time stimulation/recording in weakly electric fish: Long time behavior characterization in freely swimming fish and stimuli discrimination. PloS One9, e84885. doi: 10.1371/journal.pone.0084885

  • 87

    Fouad M. M. M. Zawbaa H. M. El-Bendary N. Hassanien A. E. (2014). “Automatic Nile tilapia fish classification approach using machine learning techniques,” in 13th International Conference on Hybrid Intelligent Systems, HIS 2013, Gammath, Tunisia. 173178. doi: 10.1109/HIS.2013.6920477

  • 88

    Gan W. S. Yang J. Kamakura T. (2012). A review of parametric acoustic array in air. Appl. Acoustics73, 12111219. doi: 10.1016/J.APACOUST.2012.04.001

  • 89

    Gauen K. Dailey R. Laiman J. Zi Y. Asokan N. Lu Y. H. et al . (2017). “Comparison of visual datasets for machine learning,” in 2017 IEEE International Conference on Information Reuse and Integration (IRI), San Diego, CA, USA. 346355. doi: 10.1109/IRI.2017.59

  • 90

    Gilpin L. H. Bau D. Yuan B. Z. Bajwa A. Specter M. Kagal L. (2019). “Explaining explanations: An overview of interpretability of machine learning,” in 2018 IEEE 5th International Conference on Data Science and Advanced Analytics (DSAA), Turin, Italy. 8089. doi: 10.1109/DSAA.2018.00018

  • 91

    Glass C. W. Wardle C. S. Glass S. J. G. Wardle C. W. Gosden C. S. Behavioural S. J. et al . (1993). Behavioural studies of the principles underlying mesh penetration by fish. ICES Mar. Sei. Symp196, 9297.

  • 92

    Glass C. W. Wardle C. S. Gosden S. J. Racey D. N. (1995). Studies on the use of visual stimuli to control fish escape from codends. i. laboratory studies on the effect of a black tunnel on mesh penetration. Fish Res.23, 157164. doi: 10.1016/0165-7836(94)00330-Y

  • 93

    Goldsmith T. H. Fernandez H. R. (1968). Comparative studies of crustacean spectral sensitivity. Z. Vergl. Physiol.60, 156175. doi: 10.1007/BF00878449

  • 94

    Goodwin M. Halvorsen K. T. Jiao L. Knausgård K. M. Martin A. H. Moyano M. et al . (2021) Unlocking the potential of deep learning for marine ecology: overview, applications, and outlook. Available at: http://arxiv.org/abs/2109.14737.

  • 95

    Graham N. (2003). By-catch reduction in the brown shrimp, crangon crangon, fisheries using a rigid separation nordmøre grid (grate). Fish Res.59, 393407. doi: 10.1016/S0165-7836(02)00015-2

  • 96

    Graham N. Jones E. Reid D. G. (2004). Review of technological advances for the study of fish behaviour in relation to demersal fishing trawls. ICES J. Mar. Sci.61(7), 10361043. doi: 10.1016/j.icesjms.2004.06.006

  • 97

    Grauman K. Leibe B. (2011). Visual object recognition. Synthesis Lectures Artif. Intell. Mach. Learn.11, 1180. doi: 10.2200/S00332ED1V01Y201103AIM011

  • 98

    Guidi L. Guerra A. F. Canchaya C. Curry E. Foglini F. Irisson J. O. et al . (2020). “Big data in marine science,“ in Future Science Brief 6 of the European Marine Board, eds AlexanderB.HeymansJ. J.Muñiz PiniellaA.KellettP.CoopmanJ. (Ostend: European Marine Board), 152. doi: 10.5281/ZENODO.3755793

  • 99

    Guo Z. Zhang L. Jiang Y. Niu W. Gu Z. Zheng H. et al . (2020). Few-shot fish image generation and classification (Singapore - U.S. Gulf Coast: 2020 Global Oceans 2020). doi: 10.1109/IEEECONF38699.2020.9389005

  • 100

    Gupta U. Kim Y. G. Lee S. Tse J. Lee H. H. S. Wei G. Y. et al . (2022). Chasing carbon: The elusive environmental footprint of computing. IEEE Micro42, 3747. doi: 10.1109/MM.2022.3163226

  • 101

    Gupta S. Mukherjee P. Chaudhury S. Lall B. Sanisetty H. (2021). DFTNet: Deep fish tracker with attention mechanism in unconstrained marine environments. IEEE Trans. Instrum Meas70, 113. doi: 10.1109/TIM.2021.3109731

  • 102

    Guthrie D. M. (1986). Role of vision in fish behaviour (75–113: The Behaviour of Teleost Fishes). doi: 10.1007/978-1-4684-8261-4_4

  • 103

    Hannah R. W. Jones S. A. (2012). Evaluating the behavioral impairment of escaping fish can help measure the effectiveness of bycatch reduction devices. Fish Res.131–133, 3944. doi: 10.1016/J.FISHRES.2012.07.010

  • 104

    Haro A. Miehls S. Johnson N. S. Wagner C. M. (2020). Evaluation of visible light as a cue for guiding downstream migrant juvenile Sea lamprey. Trans. Am. Fish Soc.149, 635647. doi: 10.1002/tafs.10261

  • 105

    Harpaz R. Tkačik G. Schneidman E. (2017). Discrete modes of social information processing predict individual behavior of fish in a group. Proc. Natl. Acad. Sci. U.S.A.114, 1014910154. doi: 10.1073/PNAS.1703817114

  • 106

    He P. (1993). Swimming speeds of marine fish in relation to fishing gears. ICES Mar. Sci. Symp.196, 183189.

  • 107

    He P. (2010). Behavior of marine fishes : capture processes and conservation challengesWiley-Blackwell, Iowa.

  • 108

    He P. Chopin F. Suuronen P. Ferro R. Lansley J. (2021). Classification and illustrated definition of fishing gears. FAO Fisheries and Aquaculture Technical Paper672 (2021), p. I94. UnitedNations Food and Agriculture Organization (FAO), Rome. doi: 10.4060/cb4966en

  • 109

    Hernández-Serna A. Jiménez-Segura L. F. (2014). Automatic identification of species with neural networks. PeerJ. 2, p.e563. doi: 10.7717/peerj.563

  • 110

    Heydarnejad M. S. Fattollahi M. Khoshkam M. (2017). Influence of light colours on growth and stress response of pearl gourami trichopodus leerii under laboratory conditions. J. Ichthyology6 (57), 908912. doi: 10.1134/S0032945217060054

  • 111

    Hoffman R. R. Mueller S. T. Klein G. Litman J. (2018). Metrics for explainable AI: Challenges and prospects. arXiv preprint arXiv: 2007.00114. doi: 10.48550/arxiv.1812.04608

  • 112

    Holbrook R. I. Burt de Perera T. (2009). Separate encoding of vertical and horizontal components of space during orientation in fish. Anim. Behav.78, 241245. doi: 10.1016/j.anbehav.2009.03.021

  • 113

    Horowitz A. C. Bekoff M. (2015). Naturalizing anthropomorphism: Behavioral prompts to our humanizing of animals. Anthrozoös20(1), 20, 2335. doi: 10.2752/089279307780216650

  • 114

    Hossain E. Alam S. M. S. Ali A. A. Amin M. A. (2016). “Fish activity tracking and species identification in underwater video,” in 2016 5th International Conference on Informatics, Electronics and Vision (ICIEV), Dhaka, Bangladesh. 6266. doi: 10.1109/ICIEV.2016.7760189

  • 115

    Hsiao Y. H. Chen C. C. Lin S. I. Lin F. P. (2014). Real-world underwater fish recognition and identification, using sparse representation. Ecol. Inform23, 1321. doi: 10.1016/j.ecoinf.2013.10.002

  • 116

    Huang K. Han Y. Chen K. Pan H. Zhao G. Yi W. et al . (2021). A hierarchical 3D-motion learning framework for animal spontaneous behavior mapping. Nat. Commun.1 (12), 114. doi: 10.1038/s41467-021-22970-y

  • 117

    Huang D. Zhao D. Wei L. Wang Z. Du Y. (2015). Modeling and analysis in marine big data: Advances and challenges. Math Probl Eng. 2015, pp. 113. doi: 10.1155/2015/384742

  • 118

    Hu J. Zhao D. Zhang Y. Zhou C. Chen W. (2021). Real-time nondestructive fish behavior detecting in mixed polyculture system using deep-learning and low-cost devices. Expert Syst. Appl.178, 115051. doi: 10.1016/j.eswa.2021.115051

  • 119

    Iqbal M. A. Wang Z. Ali Z. A. Riaz S. (2021). Automatic fish species classification using deep convolutional neural networks. Wirel Pers. Commun.116, 10431053. doi: 10.1007/s11277-019-06634-1

  • 120

    Jäger J. Simon M. Jaeger J. Denzler J. Wolff V. Fricke-Neuderth K. et al . (2015). Croatian Fish dataset: Fine-grained classification of fish species in their natural habitat, in AmaralT.MatthewsS.PlötzT.McKennaS.FisherR. (eds.), Proceedings of the Machine Vision of Animals and their Behaviour (MVAB), pp. 6.1-6.7. doi: 10.5244/C.29.MVAB.6

  • 121

    Jahanbakht M. Xiang W. Hanzo L. Azghadi M. R. (2021). Internet Of underwater things and big marine data analytics - a comprehensive survey. IEEE Commun. Surveys Tutorials23, 904956. doi: 10.1109/COMST.2021.3053118

  • 122

    Jalal A. Salman A. Mian A. Shortis M. Shafait F. (2020). Fish detection and species classification in underwater environments using deep learning with temporal information. Ecol. Inform57, 101088. doi: 10.1016/j.ecoinf.2020.101088

  • 123

    Joly A. Goëau H. Glotin H. Spampinato C. Bonnet P. Vellinga W.-P. et al . (2016). LifeCLEF 2016: Multimedia life species identification challenges. In Proceedings of the 2016 International Conference of the Cross-Language Evaluation Forum for European Languages (CLEF), Evora, Portugal. 286310. doi: 10.1007/978-3-319-44564-9_26ï

  • 124

    Jones M. J. Hale R. (2020). Using knowledge of behaviour and optic physiology to improve fish passage through culverts. Fish Fisheries21, 557569. doi: 10.1111/faf.12446

  • 125

    Jones E. G. Summerbell K. O’Neill F. (2008). The influence of towing speed and fish density on the behaviour of haddock in a trawl cod-end. Fish Res.94, 166174. doi: 10.1016/j.fishres.2008.06.010

  • 126

    Jordan L. K. Mandelman J. W. McComb D. M. Fordham S. Carlson J. K. Werner T. B. (2013). Linking sensory biology and fisheries bycatch reduction in elasmobranch fishes: a review with new directions for research. Conserv. Physiol.1, 120. doi: 10.1093/CONPHYS/COT002

  • 127

    Kadri S. Metcalfe N. B. Huntingford F. A. Thorpe J. E. (1991). Daily feeding rhythms in Atlantic salmon in sea cages. Aquaculture92, 219224. doi: 10.1016/0044-8486(91)90023-Z

  • 128

    Kaimmer S. Stoner A. W. (2008). Field investigation of rare-earth metal as a deterrent to spiny dogfish in the pacific halibut fishery. Fish Res.94, 4347. doi: 10.1016/J.FISHRES.2008.06.015

  • 129

    Karlsen J. D. Melli V. Krag L. A. (2021). Exploring new netting material for fishing: The low light level of a luminous netting negatively influences species separation in trawls. ICES J. Mar. Sci.78, 28182829. doi: 10.1093/icesjms/fsab160

  • 130

    Katija K. Orenstein E. Schlining B. Lundsten L. Barnard K. Sainz G. et al . (2021a). FathomNet: A global image database for enabling artificial intelligence in the ocean. arXiv preprint doi: 10.48550/arxiv.2109.14646

  • 131

    Katija K. Roberts P. Daniels J. Lapides A. Barnard K. Risi M. et al . (2021b). Visual tracking of deepwater animals using machine learning-controlled robotic underwater vehicles. In Proceedings of the IEEE/CVF winter conference on applications of computer vision, 860869.

  • 132

    Kay J. Merrifield M. (2021). The fishnet open images database: A dataset for fish detection and fine-grained categorization in fisheries. arXiv preprint arXiv:2106.09178. doi: 10.48550/arxiv.2106.09178

  • 133

    Kim Y.-H. (2003). Numerical modeling of chaotic behavior for small-scale movements of demersal fishes in coastal water. Fisheries Sci.69, 535546. doi: 10.1046/j.0919-9268.2003.00654.x

  • 134

    Kim Y.-H. Wardle C. S. (2003). Optomotor response and erratic response: quantitative analysis of fish reaction to towed fishing gears. Fisheries Research60, 455470. doi: 1016/S0165-7836(02)00114-5

  • 135

    Kim Y. H. Wardle C. S. (2005). Basic modelling of fish behaviour in a towed trawl based on chaos in decision-making. Fish Res.73, 217229. doi: 10.1016/j.fishres.2004.12.003

  • 136

    Kim Y. H. Wardle C. S. An Y. S. (2008). Herding and escaping responses of juvenile roundfish to square mesh window in a trawl cod end. Fisheries Sci.74, 17. doi: 10.1111/j.1444-2906.2007.01490.x

  • 137

    Knausgård K. M. Wiklund A. Sørdalen T. K. Halvorsen K. T. Kleiven A. R. Jiao L. et al . (2021). Temperate fish detection and classification: a deep learning based approach. Appl. Intell. 52(6), 69887001. doi: 10.1007/s10489-020-02154-9

  • 138

    Krag L. A. Madsen N. Karlsen J. D. (2009). A study of fish behaviour in the extension of a demersal trawl using a multi-compartment separator frame and SIT camera system. Fish Res.98, 6266. doi: 10.1016/J.FISHRES.2009.03.012

  • 139

    Krausz B. Bauckhage C. (2011). “Analyzing pedestrian behavior in crowds for automatic detection of congestions,” in 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops), Barcelona, Spain. 144149. doi: 10.1109/ICCVW.2011.6130236

  • 140

    Kunz Y. W. (2006). Review of development and aging in the eye of teleost fish. Neuroembryology Aging4, 3160. doi: 10.1159/000103451

  • 141

    Kyllingstad L. T. Reite K.-J. Haugen J. Ladstein J. (2022) SMARTFISH H2020 D5.3: FishData analysis (Open access revision) (SINTEF Ocean). Available at: https://sintef.brage.unit.no/sintef-xmlui/handle/11250/3013186 (Accessed January 11, 2023).

  • 142

    Løkkeborg S. (1990). Rate of release of potential feeding attractants from natural and artificial bait. Fish Res.8, 253261. doi: 10.1016/0165-7836(90)90026-R

  • 143

    Laan A. Iglesias-Julios M. de Polavieja G. G. (2018). Zebrafish aggression on the sub-second time scale: evidence for mutual motor coordination and multi-functional attack manoeuvres. R Soc. Open Sci.5, 180679. doi: 10.1098/RSOS.180679

  • 144

    Langlois T. Goetze J. Bond T. Monk J. Abesamis R. A. Asher J. et al . (2020). A field and video annotation guide for baited remote underwater stereo-video surveys of demersal fish assemblages. Methods Ecol. Evol.11, 14011409. doi: 10.1111/2041-210X.13470

  • 145

    Laradji I. Saleh A. Rodriguez P. Nowrouzezahrai D. Azghadi M. R. Vazquez D. (2020). Affinity LCFCN: Learning to segment fish with weak supervision. arXiv preprint arXiv:2011.03149. doi: 10.48550/arxiv.2011.03149

  • 146

    Larsen R. B. Herrmann B. Sistiaga M. Brinkhof J. Tatone I. Langård L. (2017). Performance of the Nordmøre grid in shrimp trawling and potential effects of guiding funnel length and light stimulation. Mar. Coast. Fisheries. 9(1), 479492. doi: 10.1080/19425120.2017.1360421

  • 147

    Larsen R. B. Larsen I. (1993). Size selectivity of rigid sorting grids in bottom trawls for Atlantic cod (Gadus morhua) and haddock (Melanogrammus aeglefinus). ICES Mar Sci Symp196, 178182.

  • 148

    LeCun Y. Bengio Y. Hinton G. (2015). Deep learning. Nature7553 (521), 436444. doi: 10.1038/nature14539

  • 149

    Lee D.-J. Schoenberger R. B. Shiozawa D. Xu X. Zhan P. (2004). Contour matching for a fish recognition and migration-monitoring system. Two-and Three-Dimensional Vision Systems for Inspection, Control, and Metrology II, 5606, 3748. doi: 10.1117/12.571789

  • 150

    Lillywhite K. D. Lee D. J. (2013) Robotic vision lab (Brigham Young University, Fish dataset). Available at: http://roboticvision.groups.et.byu.net/Machine_Vision/BYUFish/BYU_Fish.html (Accessed August 1, 2022).

  • 151

    Liu Y. Wang S. Chen Y. Q. (2016). Automatic 3D tracking system for large swarm of moving objects. Pattern Recognit52, 384396. doi: 10.1016/J.PATCOG.2015.11.014

  • 152

    Liu X. Yue Y. Shi M. Qian Z. M. (2019). 3-d video tracking of multiple fish in a water tank. IEEE Access 7, 145049–145059. doi: 10.1109/ACCESS.2019.2945606

  • 153

    Li D. Wang G. Du L. Zheng Y. Wang Z. (2022). Recent advances in intelligent recognition methods for fish stress behavior. Aquac. Eng. 96, 102222. doi: 10.1016/J.AQUAENG.2021.102222

  • 154

    Li D. Wang Z. Wu S. Miao Z. Du L. Duan Y. (2020). Automatic recognition methods of fish feeding behavior in aquaculture: A review. Aquaculture 528. doi: 10.1016/j.aquaculture.2020.735508

  • 155

    Li J. Zhu K. Wang F. Jiang F. (2021). Deep neural network-based real time fish detection method in the scene of marine fishing supervision. J. Intelligent Fuzzy Syst. 41, 4527–4532. doi: 10.3233/JIFS-189713

  • 156

    Logares R. Alos J. Catalan I. Solana A. C. Javier del Ocampo F. (2021). “Oceans of big data and artificial intelligence,” in Oceans. CSIC scientific challenges towards 2030, 163–179. Available at: https://hal.archives-ouvertes.fr/hal-03372264/.

  • 157

    Lomeli M. J. M. Wakefield W. W. (2019). The effect of artificial illumination on Chinook salmon behavior and their escapement out of a midwater trawl bycatch reduction device. Fish Res. 218, 112–119. doi: 10.1016/j.fishres.2019.04.013

  • 158

    Lomeli M. J. M. Wakefield W. W. Herrmann B. Dykstra C. L. Simeon A. Rudy D. M. et al. (2021). Use of artificial illumination to reduce Pacific halibut bycatch in a U.S. West Coast groundfish bottom trawl. Fish Res. 233, 105737. doi: 10.1016/j.fishres.2020.105737

  • 159

    Long L. Johnson Z. V. Li J. Lancaster T. J. Aljapur V. Streelman J. T. et al. (2020). Automatic classification of cichlid behaviors using 3D convolutional residual networks. iScience 23, 101591. doi: 10.1016/j.isci.2020.101591

  • 160

    Lopez-Marcano S. Jinks E. L. Buelow C. A. Brown C. J. Wang D. Kusy B. et al. (2021). Automatic detection of fish and tracking of movement for ecology. Ecol. Evol. 11, 8254–8263. doi: 10.1002/ece3.7656

  • 161

    Lucas S. Berggren P. (2022). A systematic review of sensory deterrents for bycatch mitigation of marine megafauna. Rev. Fish Biol. Fisheries 2022, 1–33. doi: 10.1007/S11160-022-09736-5

  • 162

    Lukas J. Romanczuk P. Klenz H. Klamser P. Arias Rodriguez L. Krause J. et al. (2021). Acoustic and visual stimuli combined promote stronger responses to aerial predation in fish. Behav. Ecol. 32, 1094–1102. doi: 10.1093/BEHECO/ARAB043

  • 163

    Lu H. Li Y. Zhang Y. Chen M. Serikawa S. Kim H. (2017). Underwater optical image processing: a comprehensive review. Mobile Networks Appl. 22, 1204–1211. doi: 10.1007/s11036-017-0863-4

  • 164

    Maia C. M. Volpato G. L. (2013). Environmental light color affects the stress response of Nile tilapia. Zoology 116, 64–66. doi: 10.1016/J.ZOOL.2012.08.001

  • 165

    Måløy H. Aamodt A. Misimi E. (2019). A spatio-temporal recurrent network for salmon feeding action recognition from underwater videos in aquaculture. Comput. Electron. Agric. 167, 105087. doi: 10.1016/j.compag.2019.105087

  • 166

    Malde K. Handegard N. O. Eikvil L. Salberg A. B. (2020). Machine intelligence and the data-driven future of marine science. ICES J. Mar. Sci. 77, 1274–1285. doi: 10.1093/icesjms/fsz057

  • 167

    Mandralis I. Weber P. Novati G. Koumoutsakos P. (2021). Learning swimming escape patterns for larval fish under energy constraints. Phys. Rev. Fluids 6, 093101. doi: 10.1103/PhysRevFluids.6.093101

  • 168

    Manière G. Coureaud G. (2020). Editorial: From stimulus to behavioral decision-making. Front. Behav. Neurosci. 13, 274. doi: 10.3389/fnbeh.2019.00274

  • 169

    Marini S. Fanelli E. Sbragaglia V. Azzurro E. del Rio Fernandez J. Aguzzi J. (2018). Tracking fish abundance by underwater image recognition. Sci. Rep. 8, 1–12. doi: 10.1038/s41598-018-32089-8

  • 170

    Kennelly S. J. Broadhurst M. K. (2021). A review of bycatch reduction in demersal fish trawls. Rev. Fish Biol. Fisheries 31(2), 289–318. doi: 10.1007/s11160-021-09644-0

  • 171

    McClure E. C. Sievers M. Brown C. J. Buelow C. A. Ditria E. M. Hayes M. A. et al. (2020). Artificial intelligence meets citizen science to supercharge ecological monitoring. Patterns 1(7), 100109. doi: 10.1016/j.patter.2020.100109

  • 172

    McIntosh D. Marques T. P. Albu A. B. Rountree R. de Leo F. (2020). Movement tracks for the automatic detection of fish behavior in videos. arXiv preprint arXiv:2011.14070. doi: 10.48550/arXiv.2011.14070

  • 173

    Méhault S. Morandeau F. Simon J. Faillettaz R. Abangan A. Cortay A. et al. (2022). Using fish behavior to design a fish pot: Black seabream (Spondyliosoma cantharus) case study. Front. Mar. Sci. 9. doi: 10.3389/FMARS.2022.1009992

  • 174

    Mellody M. (2015) Robust methods for the analysis of images and videos for fisheries stock assessment: Summary of a workshop. Available at: https://nap.nationalacademies.org/catalog/18986/robust-methods-for-the-analysis-of-images-and-videos-for-fisheries-stock-assessment (Accessed June 29, 2022).

  • 175

    Millot S. Bégout M. L. Chatain B. (2009). Exploration behaviour and flight response toward a stimulus in three sea bass strains (Dicentrarchus labrax L.). Appl. Anim. Behav. Sci. 119, 108–114. doi: 10.1016/J.APPLANIM.2009.03.009

  • 176

    Mortensen L. O. Ulrich C. Olesen H. J. Bergsson H. Berg C. W. Tzamouranis N. et al. (2017). Effectiveness of fully documented fisheries to estimate discards in a participatory research scheme. Fish Res. 187, 150–157. doi: 10.1016/J.FISHRES.2016.11.010

  • 177

    Moustahfid H. Michaels W. Alger B. Gangopadhyay A. Brehmer P. (2020). “Advances in fisheries science through emerging observing technologies,” in Global Oceans 2020: Singapore – U.S. Gulf Coast, Biloxi, MS, USA. doi: 10.1109/IEEECONF38699.2020.9389452

  • 178

    Mujtaba D. F. Mahapatra N. R. (2022). Fish species classification with data augmentation. In 2021 International Conference on Computational Science and Computational Intelligence (CSCI). 1588–1593. doi: 10.1109/CSCI54926.2021.00307

  • 179

    Muñoz-Benavent P. Andreu-García G. Valiente-González J. M. Atienza-Vanacloig V. Puig-Pons V. Espinosa V. (2018). Automatic bluefin tuna sizing using a stereoscopic vision system. ICES J. Mar. Sci. 75, 390–401. doi: 10.1093/ICESJMS/FSX151

  • 180

    Murugaiyan J. S. Palaniappan M. Durairaj T. Muthukumar V. (2021). Fish species recognition using transfer learning techniques. Int. J. Adv. Intelligent Inf. 7, 188–197. doi: 10.26555/ijain.v7i2.610

  • 181

    Myrum E. Norstebo S. A. George S. Pedersen M. Museth J. (2019) An automatic image-based system for detecting wild and stocked fish (NIK: Norsk Informatikkonferanse). Available at: https://ntnuopen.ntnu.no/ntnu-xmlui/handle/11250/2639552 (Accessed July 15, 2022).

  • 182

    Nasreddine K. Benzinou A. (2015). “Shape-based fish recognition via shape space,” in 2015 23rd European Signal Processing Conference (EUSIPCO), Nice, France. 145–149. doi: 10.1109/EUSIPCO.2015.7362362

  • 183

    Nawi N. M. Atomi W. H. Rehman M. Z. (2013). The effect of data pre-processing on optimized training of artificial neural networks. Proc. Technol. 11, 32–39. doi: 10.1016/J.PROTCY.2013.12.159

  • 184

    Negahdaripour S. (2005). Calibration of DIDSON forward-scan acoustic video camera. Proc. MTS/IEEE OCEANS 2, 1287–1294. doi: 10.1109/OCEANS.2005.1639932

  • 185

    Nian R. He B. Yu J. Bao Z. Wang Y. (2013). ROV-based underwater vision system for intelligent fish ethology research. Int. J. Adv. Robot. Syst. 10, 326. doi: 10.5772/56800

  • 186

    Niu B. Li G. Peng F. Wu J. Zhang L. Li Z. (2018). Survey of fish behavior analysis by computer vision. J. Aquac. Res. Dev. 9, 1000534. doi: 10.4172/2155-9546.1000534

  • 187

    Noldus L. P. J. J. Spink A. J. Tegelenbosch R. A. J. (2001). EthoVision: A versatile video tracking system for automation of behavioral experiments. Behav. Res. Methods Instruments Comput. 33(3), 398–414. doi: 10.3758/BF03195394

  • 188

    O’Connell C. P. Stroud E. M. He P. (2014). The emerging field of electrosensory and semiochemical shark repellents: Mechanisms of detection, overview of past studies, and future directions. Ocean Coast. Manag. 97, 2–11. doi: 10.1016/J.OCECOAMAN.2012.11.005

  • 189

    Odling-Smee L. Braithwaite V. A. (2003). The role of learning in fish orientation. Fish Fisheries 4, 235–246. doi: 10.1046/J.1467-2979.2003.00127.X

  • 190

    Okafor E. Schomaker L. Wiering M. A. (2018). An analysis of rotation matrix and colour constancy data augmentation in classifying images of animals. Journal of Information and Telecommunication 2, 465–491. doi: 10.1080/24751839.2018.1479932

  • 191

    Olla B. L. Davis M. W. Rose C. (2000). Differences in orientation and swimming of walleye pollock Theragra chalcogramma in a trawl net under light and dark conditions: concordance between field and laboratory observations. Fish Res. 44, 261–266. doi: 10.1016/S0165-7836(99)00093-4

  • 192

    O’Neill F. G. Feekings J. Fryer R. J. Fauconnet L. Afonso P. (2019). “Discard avoidance by improving fishing gear selectivity: Helping the fishing industry help itself,” in Uhlmann S. Ulrich C. Kennelly S. (eds) The European Landing Obligation. Springer, Cham. 279–296. doi: 10.1007/978-3-030-03308-8_14

  • 193

    O’Neill F. G. Mutch K. (2017). Selectivity in trawl fishing gears. Scottish Mar. Freshw. Sci. 8, 1–85. doi: 10.4789/1890-1

  • 194

    O’Neill F. G. Summerbell K. Barros L. (2018b) ICES WGFTFB 2018 report: Some recent trials with illuminated grids. Available at: https://archimer.ifremer.fr/doc/00586/69766/ (Accessed August 1, 2022).

  • 195

    Ordines F. Massutí E. Guijarro B. Mas R. (2006). Diamond vs. square mesh codend in a multi-species trawl fishery of the western Mediterranean: effects on catch composition, yield, size selectivity and discards. Aquat. Living Resour. 19, 329–338. doi: 10.1051/ALR:2007003

  • 196

    Oteiza P. Odstrcil I. Lauder G. Portugues R. Engert F. (2017). A novel mechanism for mechanosensory-based rheotaxis in larval zebrafish. Nature 547, 445–448. doi: 10.1038/nature23014

  • 197

    Ovchinnikova K. James M. A. Mendo T. Dawkins M. Crall J. Boswarva K. (2021). Exploring the potential to use low cost imaging and an open source convolutional neural network detector to support stock assessment of the king scallop (Pecten maximus). Ecol. Inform. 62, 101233. doi: 10.1016/j.ecoinf.2021.101233

  • 198

    Owen M. A. G. Davies S. J. Sloman K. A. (2010). Light colour influences the behaviour and stress physiology of captive tench (Tinca tinca). Rev. Fish Biol. Fisheries 20, 375–380. doi: 10.1007/s11160-009-9150-1

  • 199

    Packard J. M. Folse L. J. Stone N. D. Makela M. E. Coulson R. N. (2021). Applications of artificial intelligence to animal behavior, in Bekoff M. Jamieson D. (eds), Interpretation and Explanation in the Study of Animal Behavior 2, 147–191. doi: 10.4324/9780429042799-11

  • 200

    Painter K. J. (2021). The impact of rheotaxis and flow on the aggregation of organisms. J. R. Soc. Interface 18, 20210582. doi: 10.1098/RSIF.2021.0582

  • 201

    Palazzo S. Murabito F. (2014). “Fish species identification in real-life underwater images,” in MAED 2014 - Proceedings of the 3rd ACM International Regular and Data Challenge Workshop on Multimedia Analysis for Ecological Data (Association for Computing Machinery), New York, NY, USA. 13–18. doi: 10.1145/2661821.2661822

  • 202

    Pang S. del Coz J. J. Yu Z. Luaces O. Díez J. (2017). Deep learning to frame objects for visual target tracking. Eng. Appl. Artif. Intell. 65, 406–420. doi: 10.1016/J.ENGAPPAI.2017.08.010

  • 203

    Pang J. Liu W. Liu B. Tao D. Zhang K. Lu X. (2022). “Interference distillation for underwater fish recognition,” in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 13188 LNCS. 62–74. doi: 10.1007/978-3-031-02375-0_5

  • 204

    Papadakis V. M. Papadakis I. E. Lamprianidou F. Glaropoulos A. Kentouri M. (2012). A computer-vision system and methodology for the analysis of fish behavior. Aquac. Eng. 46, 53–59. doi: 10.1016/j.aquaeng.2011.11.002

  • 205

    Park Y. Dang L. M. Lee S. Han D. Moon H. (2021). Multiple object tracking in deep learning approaches: A survey. Electronics 10, 2406. doi: 10.3390/ELECTRONICS10192406

  • 206

    Pautsina A. Císař P. Štys D. Terjesen B. F. Espmark Å.M.O. (2015). Infrared reflection system for indoor 3D tracking of fish. Aquac. Eng. 69, 7–17. doi: 10.1016/J.AQUAENG.2015.09.002

  • 207

    Pedersen M. Madsen N. Moeslund T. B. (2021). Video data in marine environments. J. Ocean Technol. 16, 21–30.

  • 208

    Pedersen M. Mohammed A. (2021). Photo identification of individual Salmo trutta based on deep learning. Appl. Sci. 11, 9039. doi: 10.3390/app11199039

  • 209

    Pelletier S. Montacir A. Zakari H. Akhloufi M. (2018). “Deep learning for marine resources classification in non-structured scenarios: Training vs. transfer learning,” in 2018 IEEE Canadian Conference on Electrical & Computer Engineering (CCECE), Quebec, QC, Canada. doi: 10.1109/CCECE.2018.8447682

  • 210

    Pereira T. D. Shaevitz J. W. Murthy M. (2020). Quantifying behavior to understand the brain. Nat. Neurosci. 23, 1537–1549. doi: 10.1038/s41593-020-00734-z

  • 211

    Peterson A. N. (2022) The persistent-pursuit and evasion strategies of lionfish and their prey. Available at: https://escholarship.org/uc/item/89n1p8wt (Accessed August 2, 2022).

  • 212

    Pieniazek R. H. Mickle M. F. Higgs D. M. (2020). Comparative analysis of noise effects on wild and captive freshwater fish behaviour. Anim. Behav. 168, 129–135. doi: 10.1016/j.anbehav.2020.08.004

  • 213

    Pietikäinen M. Hadid A. Zhao G. Ahonen T. (2011). Computer vision using local binary patterns. Springer Science & Business Media, vol. 40. doi: 10.1007/978-0-85729-748-8

  • 214

    Popoola O. P. Wang K. (2012). “Video-based abnormal human behavior recognition – a review,” in IEEE Transactions on Systems, Man and Cybernetics Part C: Applications and Reviews, 42. 865–878. doi: 10.1109/TSMCC.2011.2178594

  • 215

    Popper A. N. Carlson T. J. (1998). Application of sound and other stimuli to control fish behavior. Trans. Am. Fish Soc. 127, 673–707. doi: 10.1577/1548-8659(1998)127<0673:aosaos>2.0.co;2

  • 216

    Popper A. N. Hawkins A. D. (2019). An overview of fish bioacoustics and the impacts of anthropogenic sounds on fishes. J. Fish Biol. 94, 692–713. doi: 10.1111/JFB.13948

  • 217

    Pramunendar R. A. Wibirama S. Santosa P. I. (2019). “Fish classification based on underwater image interpolation and back-propagation neural network,” in 2019 5th International Conference on Science and Technology (ICST), Yogyakarta, Indonesia. 1–6. doi: 10.1109/ICST47872.2019.9166295

  • 218

    Prasetyo E. Suciati N. Fatichah C. (2021). Multi-level residual network VGGNet for fish species classification. J. King Saud Univ. - Comput. Inf. Sci. 34(8), 5286–5295. doi: 10.1016/j.jksuci.2021.05.015

  • 219

    Putland R. L. Mensinger A. F. (2019). Acoustic deterrents to manage fish populations. Rev. Fish Biol. Fish. 29, 789–807. doi: 10.1007/s11160-019-09583-x

  • 220

    Pylatiuk C. Zhao H. Gursky E. Reischl M. Peravali R. Foulkes N. et al. (2019). DIY automated feeding and motion recording system for the analysis of fish behavior. SLAS Technol. 24, 394–398. doi: 10.1177/2472630319841412

  • 221

    Qian Z. M. Cheng X. E. Chen Y. Q. (2014). Automatically detect and track multiple fish swimming in shallow water with frequent occlusion. PloS One 9, e106506. doi: 10.1371/JOURNAL.PONE.0106506

  • 222

    Qin H. Li X. Liang J. Peng Y. Zhang C. (2016). DeepFish: Accurate underwater live fish recognition with a deep architecture. Neurocomputing 187, 49–58. doi: 10.1016/j.neucom.2015.10.122

  • 223

    Qiu C. Zhang S. Wang C. Yu Z. Zheng H. Zheng B. (2018). Improving transfer learning and squeeze-and-excitation networks for small-scale fine-grained fish image classification. IEEE Access 6, 78503–78512. doi: 10.1109/ACCESS.2018.2885055

  • 224

    Rasheed J. (2021). “A sustainable deep learning based computationally intelligent seafood monitoring system for fish species screening,” in 2021 International Conference on Artificial Intelligence of Things (ICAIoT), Nicosia, Turkey. 1–6. doi: 10.1109/ICAIOT53762.2021.00008

  • 225

    Rathi D. Jain S. Indu S. (2017) Underwater fish species classification using convolutional neural network and deep learning. Available at: http://dhruvrathi.me/; http://www.dtu.ac.in/Web/Departments/Electronics/faculty/sindu.php.

  • 226

    Ravanbakhsh M. Shortis M. R. Shafait F. Mian A. Harvey E. S. Seager J. W. (2015). Automated fish detection in underwater images using shape-based level sets. Photogrammetric Rec. 30, 46–62. doi: 10.1111/phor.12091

  • 227

    Raymond E. H. Widder E. A. (2007). Behavioral responses of two deep-sea fish species to red, far-red, and white light. Mar. Ecol. Prog. Ser. 350, 291–298. doi: 10.3354/MEPS07196

  • 228

    Raza K. Hong S. (2020). Fast and accurate fish detection design with improved YOLO-v3 model and transfer learning. Int. J. Advanced Comput. Sci. Appl. 11(2), 7–16. doi: 10.14569/ijacsa.2020.0110202

  • 229

    Redmon J. Divvala S. Girshick R. Farhadi A. (2016). “You only look once: Unified, real-time object detection,” in 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA. 779–788. doi: 10.1109/CVPR.2016.91

  • 230

    Ribeiro M. T. Singh S. Guestrin C. (2016). ““Why should I trust you?”: Explaining the predictions of any classifier,” in Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (Association for Computing Machinery). 1135–1144. doi: 10.1145/2939672.2939778

  • 231

    Rieucau G. de Robertis A. Boswell K. M. Handegard N. O. (2014). School density affects the strength of collective avoidance responses in wild-caught Atlantic herring Clupea harengus: a simulated predator encounter experiment. J. Fish Biol. 85, 1650–1664. doi: 10.1111/jfb.12520

  • 232

    Robbins W. D. Peddemors V. M. Kennelly S. J. (2011). Assessment of permanent magnets and electropositive metals to reduce the line-based capture of Galapagos sharks, Carcharhinus galapagensis. Fish Res. 109, 100–106. doi: 10.1016/J.FISHRES.2011.01.023

  • 233

    Robert M. Cortay A. Morfin M. Simon J. Morandeau F. Deneubourg J. L. et al. (2020). A methodological framework for characterizing fish swimming and escapement behaviors in trawls. PloS One 15, e0243311. doi: 10.1371/journal.pone.0243311

  • 234

    Robinson K. L. Luo J. Y. Sponaugle S. Guigand C. Cowen R. K. (2017). A tale of two crowds: Public engagement in plankton classification. Front. Mar. Sci. 4. doi: 10.3389/FMARS.2017.00082

  • 235

    Rosen S. Holst J. C. (2013). DeepVision in-trawl imaging: Sampling the water column in four dimensions. Fish Res. 148, 64–73. doi: 10.1016/J.FISHRES.2013.08.002

  • 236

    Rosen S. Jörgensen T. Hammersland-White D. Holst J. C. (2013). DeepVision: A stereo camera system provides highly accurate counts and lengths of fish passing inside a trawl. Can. J. Fisheries Aquat. Sci. 70, 1456–1467. doi: 10.1139/CJFAS-2013-0124

  • 237

    Rose C. S. Stoner A. W. Matteson K. (2005). Use of high-frequency imaging sonar to observe fish behaviour near baited fishing gears. Fish Res. 76, 291–304. doi: 10.1016/J.FISHRES.2005.07.015

  • 238

    Rudstam L. G. Magnuson J. J. Tonn W. M. (2011). Size selectivity of passive fishing gear: A correction for encounter probability applied to gill nets. Can. J. Fisheries Aquat. Sci. 41, 1252–1255. doi: 10.1139/F84-151

  • 239

    Ruokonen T. J. Keskinen T. Luoma M. Leskelä A. Suuronen P. (2021). The effect of LED lights on trap catches in Finnish inland fisheries. Fish Manag. Ecol. 28, 211–218. doi: 10.1111/fme.12482

  • 240

    Ryer C. H. Barnett L. A. K. (2006). Influence of illumination and temperature upon flatfish reactivity and herding behavior: Potential implications for trawl capture efficiency. Fish Res. 81, 242–250. doi: 10.1016/J.FISHRES.2006.07.001

  • 241

    Ryer C. H. Olla B. L. (2000). Avoidance of an approaching net by juvenile walleye pollock Theragra chalcogramma in the laboratory: The influence of light intensity. Fish Res. 45, 195–199. doi: 10.1016/S0165-7836(99)00113-7

  • 242

    Ryer C. H. Rose C. S. Iseri P. J. (2010). Flatfish herding behavior in response to trawl sweeps: a comparison of diel responses to conventional sweeps and elevated sweeps. Fishery Bull. 108, 145–154.

  • 243

    Saleh A. Laradji I. H. Konovalov D. A. Bradley M. Vazquez D. Sheaves M. (2020). A realistic fish-habitat dataset to evaluate algorithms for underwater visual analysis. Sci. Rep. 10, 14671. doi: 10.1038/s41598-020-71639-x

  • 244

    Salman A. Jalal A. Shafait F. Mian A. Shortis M. Seager J. et al. (2016). Fish species classification in unconstrained underwater environments based on deep learning. Limnol. Oceanogr. Methods 14, 570–585. doi: 10.1002/lom3.10113

  • 245

    Salman A. Maqbool S. Khan A. H. Jalal A. Shafait F. (2019). Real-time fish detection in complex backgrounds using probabilistic background modelling. Ecol. Inform. 51, 44–51. doi: 10.1016/j.ecoinf.2019.02.011

  • 246

    Salman A. Siddiqui S. A. Shafait F. Mian A. Shortis M. R. Khurshid K. et al. (2020). Automatic fish detection in underwater videos by a deep neural network-based hybrid motion learning system. ICES J. Mar. Sci. 77, 1295–1307. doi: 10.1093/icesjms/fsz025

  • 247

    Santos J. Herrmann B. Otero P. Fernandez J. Pérez N. (2016). Square mesh panels in demersal trawls: does lateral positioning enhance fish contact probability? Aquat. Living Resour. 29, 302. doi: 10.1051/ALR/2016025

  • 248

    Santos J. Herrmann B. Stepputtis D. Kraak S. B. M. Gökçe G. Mieske B. (2020). Quantifying the performance of selective devices by combining analysis of catch data and fish behaviour observations: Methodology and case study on a flatfish excluder. ICES J. Mar. Sci. 77, 2840–2856. doi: 10.1093/ICESJMS/FSAA155

  • 249

    Sarriá D. del Río J. Lázaro A. M. Aguzzi J. et al. (2009). Studying the behaviour of Norway lobster using RFID and infrared tracking technologies. OCEANS 2009-EUROPE, Bremen, Germany, pp. 1–4. doi: 10.1109/OCEANSE.2009.5278280

  • 250

    Sawada K. Takahashi H. Takao Y. Watanabe K. Horne J. K. McClatchie S. et al. (2004). “Development of an acoustic-optical system to estimate target-strengths and tilt angles from fish aggregations,” in Ocean ‘04 - MTS/IEEE Techno-Ocean ‘04: Bridges across the Oceans - Conference Proceedings, Kobe, Japan. 1, 395–400. doi: 10.1109/OCEANS.2004.1402949

  • 251

    Schaerf T. M. Dillingham P. W. Ward A. J. W. (2017). The effects of external cues on individual and collective behavior of shoaling fish. Sci. Adv. 3, e1603201. doi: 10.1126/sciadv.1603201

  • 252

    Schwarz A. L. (1985). The behavior of fishes in their acoustic environment. Environ. Biol. Fish. 13, 3–15. doi: 10.1007/BF00004851

  • 253

    Schwarz A. L. Greer G. L. (2011). Responses of Pacific herring, Clupea harengus pallasi, to some underwater sounds. Can. J. Fisheries Aquat. Sci. 41, 1183–1192. doi: 10.1139/F84-140

  • 254

    Sengupta E. Garg D. Choudhury T. Aggarwal A. (2018). “Techniques to eliminate human bias in machine learning,” in 2018 International Conference on System Modeling & Advancement in Research Trends (SMART), Moradabad, India. 226–230. doi: 10.1109/SYSMART.2018.8746946

  • 255

    Shafait F. Mian A. Shortis M. Ghanem B. Culverhouse P. F. Edgington D. et al. (2016). Fish identification from videos captured in uncontrolled underwater environments. ICES J. Mar. Sci. 73, 2737–2746. doi: 10.1093/icesjms/fsw106

  • 256

    Shah S. Z. H. Rauf H. T. Lali I. Bukhari S. A. C. Khalid M. S. Farooq M. et al. (2019). Fish-Pak: Fish species dataset from Pakistan for visual features based classification. Data in Brief 27, 104565. doi: 10.17632/N3YDW29SBZ.3

  • 257

    Sharber N. G. Carothers S. W. Sharber J. P. de Vos J. C. House D. A. (1994). Reducing electrofishing-induced injury of rainbow trout. N. Am. J. Fish. Manag. 14, 340–346. doi: 10.1577/1548-8675(1994)014<0340:REIIOR>2.3.CO;2

  • 258

    Shaw R. F. (2004). Arithmetic operations in a binary computer. Rev. Sci. Instruments 21, 687. doi: 10.1063/1.1745692

  • 259

    Shcherbakov D. Knörzer A. Hilbig R. Haas U. Blum M. (2012). Near-infrared orientation of Mozambique tilapia Oreochromis mossambicus. Zoology 115, 233–238. doi: 10.1016/J.ZOOL.2012.01.005

  • 260

    Siddiqui S. A. Salman A. Malik M. I. Shafait F. Mian A. Shortis M. R. et al. (2018). Automatic fish species classification in underwater videos: Exploiting pre-trained deep neural network models to compensate for limited labelled data. ICES J. Mar. Sci. 75, 374–389. doi: 10.1093/icesjms/fsx109

  • 261

    Simon J. Kopp D. Larnaud P. Vacherot J. P. Morandeau F. Lavialle G. et al. (2020). Using automated video analysis to study fish escapement through escape panels in active fishing gears: Application to the effect of net colour. Mar. Policy 116, 103785. doi: 10.1016/j.marpol.2019.103785

  • 262

    Simons T. Lee D. J. (2021). Efficient binarized convolutional layers for visual inspection applications on resource-limited FPGAs and ASICs. Electronics 10, 1511. doi: 10.3390/ELECTRONICS10131511

  • 263

    Sinhuber M. van der Vaart K. Ni R. Puckett J. G. Kelley D. H. Ouellette N. T. (2019). Three-dimensional time-resolved trajectories from laboratory insect swarms. Sci. Data 6, 1–8. doi: 10.1038/sdata.2019.36

  • 264

    Skinner B. F. (2010). The generic nature of the concepts of stimulus and response. The Journal of General Psychology 12, 40–65. doi: 10.1080/00221309.1935.9920087

  • 265

    Sokolova M. Mompó Alepuz A. Thompson F. Mariani P. Galeazzi R. Krag L. A. (2021). A deep learning approach to assist sustainability of demersal trawling operations. Sustainability 13, 12362. doi: 10.3390/su132212362

  • 266

    Southworth L. K. Ratcliffe F. C. Bloor I. S. M. Emmerson J. Watson D. Beard D. et al. (2020). Artificial light improves escapement of fish from a trawl net. J. Mar. Biol. Assoc. United Kingdom 100, 267–275. doi: 10.1017/S0025315420000028

  • 267

    Spampinato C. Giordano D. di Salvo R. Chen-Burger Y. H. Fisher R. B. Nadarajan G. (2010). “Automatic fish classification for underwater species behavior understanding,” in ARTEMIS’10 - Proceedings of the 1st ACM Workshop on Analysis and Retrieval of Tracked Events and Motion in Imagery Streams, Firenze, Italy. 45–50. doi: 10.1145/1877868.1877881

  • 268

    Spampinato C. Palazzo S. Boom B. van Ossenbruggen J. Kavasidis I. di Salvo R. et al. (2014). Understanding fish behavior during typhoon events in real-life underwater environments. Multimed. Tools Appl. 70, 199–236. doi: 10.1007/s11042-012-1101-5

  • 269

    Spangler G. Collins J. (2011). Lake Huron fish community structure based on gill-net catches corrected for selectivity and encounter probability. North Am. J. Fisheries Manage. 12(3), 585–597.

  • 270

    Stewart P. A. M. (2001). A review of studies of fishing gear selectivity in the Mediterranean. FAO COPEMED Report No. 9, Aberdeen, UK, 57 pp.

  • 271

    Stienessen S. C. Parrish J. K. (2013). The effect of disparate information on individual fish movements and emergent group behavior. Behav. Ecol. 24, 1150–1160. doi: 10.1093/BEHECO/ART042

  • 272

    Stuart I. G. Zampatti B. P. Baumgartner L. J. (2008). Can a low-gradient vertical-slot fishway provide passage for a lowland river fish community? Mar. Freshw. Res. 59, 332–346. doi: 10.1071/MF07141

  • 273

    Sung M. Yu S. C. Girdhar Y. (2017). “Vision based real-time fish detection using convolutional neural network,” in OCEANS 2017 - Aberdeen, Aberdeen, UK. 1–6. doi: 10.1109/OCEANSE.2017.8084889

  • 274

    Torres A. Abril A. M. Clua E. E. G. (2020). A time-extended (24 h) baited remote underwater video (BRUV) for monitoring pelagic and nocturnal marine species. J. Mar. Sci. Eng. 8, 208. doi: 10.3390/jmse8030208

  • 275

    Underwood M. J. Utne Palm A. C. Øvredal J. T. Bjordal Å. (2021). The response of mesopelagic organisms to artificial lights. Aquaculture and Fisheries 6, 519–529. doi: 10.1016/j.aaf.2020.05.002

  • 276

    Valletta J. J. Torney C. Kings M. Thornton A. Madden J. (2017). Applications of machine learning in animal behaviour studies. Anim. Behav. 124, 203–220. doi: 10.1016/j.anbehav.2016.12.005

  • 277

    van Gerven M. Bohte S. (2017). Editorial: Artificial neural networks as models of neural information processing. Front. Comput. Neurosci. 11. doi: 10.3389/fncom.2017.00114

  • 278

    Vaswani A. Shazeer N. Parmar N. Uszkoreit J. Jones L. et al. (2017). Attention is all you need. In Advances in Neural Information Processing Systems. 5998–6008.

  • 279

    Villon S. Iovan C. Mangeas M. Claverie T. Mouillot D. Villéger S. et al. (2021). Automatic underwater fish species classification with limited data using few-shot learning. Ecol. Inform. 63, 101320. doi: 10.1016/j.ecoinf.2021.101320

  • 280

    Borges P. V. K. Conci N. Cavallaro A. (2013). Video-based human behavior understanding: A survey. IEEE Transactions on Circuits and Systems for Video Technology 23, 1993–2008. doi: 10.1109/TCSVT.2013.2270402

  • 281

    Viscido S. V. Parrish J. K. Grünbaum D. (2004). Individual behavior and emergent properties of fish schools: a comparison of observation and theory. Mar. Ecol. Prog. Ser. 273, 239–249. doi: 10.3354/MEPS273239

  • 282

    Vogel C. (2016) Sélectivité des engins de pêche [Fishing gear selectivity]. Available at: https://archimer.ifremer.fr/doc/00317/42869/ (Accessed June 29, 2022).

  • 283

    Walsh S. J. Engås A. Ferro R. Fonteyne R. van Marlen B. (2002). To catch or conserve more fish: the evolution of fishing technology in fisheries science. ICES Marine Science Symposia. Report. doi: 10.17895/ices.pub.8887

  • 284

    Walsh S. J. Godø O. R. Michalsen K. (2004). Fish behaviour relevant to fish catchability. ICES J. Mar. Sci. 61, 1238–1239. doi: 10.1016/J.ICESJMS.2004.08.004

  • 285

    Wang J. H. Lee S. K. Lai Y. C. Lin C. C. Wang T. Y. Lin Y. R. et al. (2020). Anomalous behaviors detection for underwater fish using AI techniques. IEEE Access 8, 224372–224382. doi: 10.1109/ACCESS.2020.3043712

  • 286

    Wang G. Muhammad A. Liu C. Du L. Li D. et al. (2021). Automatic recognition of fish behavior with a fusion of RGB and optical flow data based on deep learning. Animals 11, 2774. doi: 10.3390/ani11102774

  • 287

    Wang X. Ouyang J. Li D. Zhang G. (2019). Underwater object recognition based on deep encoding-decoding network. J. Ocean Univ. China 18, 376–382. doi: 10.1007/s11802-019-3858-x

  • 288

    Watson J. (2013). “Subsea imaging and vision: An introduction,” in Watson J. Zielinski O. (eds.), Subsea Optics and Imaging. Amsterdam, Netherlands: Elsevier, 17–34. doi: 10.1533/9780857093523.1.17

  • 289

    Watson J. W. Kerstetter D. W. (2006). Pelagic longline fishing gear: A brief history and review of research efforts to improve selectivity. Mar. Technol. Soc. J. 40, 6. doi: 10.4031/002533206787353259

  • 290

    Wei Y. Duan Y. An D. (2022). Monitoring fish using imaging sonar: Capacity, challenges and future perspective. Fish and Fisheries 23(6), 1347–1370.

  • 291

    Weissburg M. J. (2016). The fluid dynamical context of chemosensory behavior. Biol. Bull. 198, 188–202. doi: 10.2307/1542523

  • 292

    Widder E. A. Robison B. H. Reisenbichler K. R. Haddock S. H. D. (2005). Using red light for in situ observations of deep-sea fishes. Deep Sea Res. I Oceanogr. Res. Pap. 52, 2077–2085. doi: 10.1016/J.DSR.2005.06.007

  • 293

    Williams K. Lauffenburger N. Chuang M.-C. Hwang J.-N. Towler R. (2016). Automated measurements of fish within a trawl using stereo images from a camera-trawl device (CamTrawl). Methods Oceanography 17, 138–152. doi: 10.1016/j.mio.2016.09.008

  • 294

    Wu Z. Hristov N. I. Kunz T. H. Betke M. (2009). Tracking-reconstruction or reconstruction-tracking? Comparison of two multiple hypothesis tracking approaches to interpret 3D object motion from several camera views. 2009 Workshop on Motion and Video Computing, WMVC ‘09. doi: 10.1109/WMVC.2009.5399245

  • 295

    Xia M. Chen X. Lang H. Shao H. Williams D. Gazivoda M. et al. (2022). Features and always-on wake-up detectors for sparse acoustic event detection. Electronics 11, 478. doi: 10.3390/ELECTRONICS11030478

  • 296

    Xu Y. Liu X. Cao X. Huang C. Liu E. Qian S. et al. (2021). Artificial intelligence: A powerful paradigm for scientific research. Innovation 2, 100179. doi: 10.1016/J.XINN.2021.100179

  • 297

    Xu W. Matzner S. (2018). “Underwater fish detection using deep learning for water power applications,” in Proceedings - 2018 International Conference on Computational Science and Computational Intelligence, CSCI 2018, Las Vegas, NV, USA. 313–318. doi: 10.1109/CSCI46756.2018.00067

  • 298

    Xu J. Sang W. Dai H. Lin C. Ke S. Mao J. et al. (2022). A detailed analysis of the effect of different environmental factors on fish phototactic behavior: Directional fish guiding and expelling technique. Animals 12(3), 240. doi: 10.3390/ani12030240

  • 299

    Yan H. Y. Anraku K. Babaran R. P. (2010). Hearing in marine fish and its application in fisheries. Behav. Mar. Fishes: Capture Processes Conserv. Challenges, 45–64. doi: 10.1002/9780813810966.CH3

  • 300

    Yan Z. Bi Y. Xue B. Zhang M. (2021). “Automatically extracting features using genetic programming for low-quality fish image classification,” in 2021 IEEE Congress on Evolutionary Computation, CEC 2021 - Proceedings, Kraków, Poland, 2015–2022. doi: 10.1109/CEC45853.2021.9504737

  • 301

    Yang L. Liu Y. Yu H. Fang X. Song L. Li D. et al. (2020). Computer vision models in intelligent aquaculture with emphasis on fish detection and behavior analysis: A review. Arch. Comput. Methods Eng. 28, 2785–2816. doi: 10.1007/s11831-020-09486-2

  • 302

    Yang X. Zhang S. Liu J. Gao Q. Dong S. Zhou C. (2021b). Deep learning for smart fish farming: applications, opportunities and challenges. Rev. Aquac. 13(1), 66–90. doi: 10.1111/raq.12464

  • 303

    Yochum N. Stone M. Breddermann K. Berejikian B. A. Gauvin J. R. Irvine D. J. (2021). Evaluating the role of bycatch reduction device design and fish behavior on Pacific salmon (Oncorhynchus spp.) escapement rates from a pelagic trawl. Fish Res. 236, 105830. doi: 10.1016/J.FISHRES.2020.105830

  • 304

    York R. A. Patil C. Darrin Hulsey C. Anoruo O. Streelman J. T. Fernald R. D. (2015). Evolution of bower building in Lake Malawi cichlid fish: Phylogeny, morphology, and behavior. Front. Ecol. Evol. 3. doi: 10.3389/fevo.2015.00018

  • 305

    Yuan H. Zhang S. Chen G. Yang Y. (2020). Underwater image fish recognition technology based on transfer learning and image enhancement. J. Coast. Res. 105, 124–128. doi: 10.2112/JCR-SI105-026.1

  • 306

    Yu X. Wang Y. An D. Wei Y. (2021). Identification methodology of special behaviors for fish school based on spatial behavior characteristics. Comput. Electron. Agric. 185, 106169. doi: 10.1016/j.compag.2021.106169

  • 307

    Zhang L. Li W. Liu C. Zhou X. Duan Q. (2020). Automatic fish counting method using image density grading and local regression. Comput. Electron. Agric. 179, 105844. doi: 10.1016/J.COMPAG.2020.105844

  • 308

    Zhao J. Bao W. Zhang F. Zhu S. Liu Y. Lu H. et al. (2018). Modified motion influence map and recurrent neural network-based monitoring of the local unusual behaviors for fish school in intensive aquaculture. Aquaculture 493, 165–175. doi: 10.1016/J.AQUACULTURE.2018.04.064

  • 309

    Zhou C. Xu D. Chen L. Zhang S. Sun C. Yang X. et al. (2019). Evaluation of fish feeding intensity in aquaculture using a convolutional neural network and machine vision. Aquaculture 507, 457–465. doi: 10.1016/J.AQUACULTURE.2019.04.056

  • 310

    Zhou C. Zhang B. Lin K. Xu D. Chen C. Yang X. et al. (2017). Near-infrared imaging to quantify the feeding behavior of fish in aquaculture. Comput. Electron. Agric. 135, 233–241. doi: 10.1016/j.compag.2017.02.013

  • 311

    Zhuang P. Wang Y. Qiao Y. (2021). Wildfish++: A comprehensive fish benchmark for multimedia research. IEEE Trans. Multimedia 23, 3603–3617. doi: 10.1109/TMM.2020.3028482

Keywords

fisheries, gear technology, underwater observation systems, deep learning, fish behavior tracking

Citation

Abangan AS, Kopp D and Faillettaz R (2023) Artificial intelligence for fish behavior recognition may unlock fishing gear selectivity. Front. Mar. Sci. 10:1010761. doi: 10.3389/fmars.2023.1010761

Received

03 August 2022

Accepted

31 January 2023

Published

23 February 2023

Volume

10 - 2023

Edited by

Lyne Morissette, M–Expertise Marine, Canada

Reviewed by

Abdullah-Al Arif, Yokohama City University, Japan; Hongsheng Bi, College Park, United States

Copyright

*Correspondence: Robin Faillettaz

This article was submitted to Marine Fisheries, Aquaculture and Living Resources, a section of the journal Frontiers in Marine Science

Disclaimer

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
