Article
Electromyography-Based Biomechanical Cybernetic Control of a
Robotic Fish Avatar
Manuel A. Montoya Martínez 1,† , Rafael Torres-Córdoba 1 , Evgeni Magid 2,3 and Edgar A. Martínez-García 1, *,†
1 Laboratorio de Robótica, Institute of Engineering and Technology, Universidad Autónoma de Ciudad Juárez,
Ciudad Juárez 32310, Mexico; [email protected] (M.A.M.M.); [email protected] (R.T.-C.)
2 Institute of Information Technology and Intelligent Systems, Kazan Federal University, Kazan 420008, Russia;
[email protected]
3 HSE Tikhonov Moscow Institute of Electronics and Mathematics, HSE University, Moscow 101000, Russia
* Correspondence: [email protected]
† These authors contributed equally to this work.
Abstract: This study introduces a cybernetic control and architectural framework for a robotic fish
avatar operated by a human. The behavior of the robot fish is influenced by the electromyographic
(EMG) signals of the human operator, triggered by stimuli from the surrounding objects and scenery.
A deep artificial neural network (ANN) with perceptrons classifies the EMG signals, discerning
the type of muscular stimuli generated. The research unveils a fuzzy-based oscillation pattern
generator (OPG) designed to emulate functions akin to a neural central pattern generator, producing
coordinated fish undulations. The OPG generates swimming behavior as an oscillation function,
decoupled into coordinated step signals, right and left, for a dual electromagnetic oscillator in the
fish propulsion system. Furthermore, the research presents an underactuated biorobotic mechanism
of the subcarangiform type comprising a two-solenoid electromagnetic oscillator, an antagonistic
musculoskeletal elastic system of tendons, and a multi-link caudal spine composed of helical springs.
The biomechanics dynamic model and control for swimming, as well as the ballasting system for
submersion and buoyancy, are deduced. This study highlights the utilization of EMG measurements
encompassing sampling time and µ-volt signals for both hands and all fingers. The subsequent
feature extraction resulted in three types of statistical patterns, namely, Ω, γ, λ, serving as inputs
for a multilayer feedforward neural network of perceptrons. The experimental findings quantified
controlled movements, specifically caudal fin undulations during forward, right, and left turns, with
a particular emphasis on the dynamics of caudal fin undulations of a robot prototype.

Keywords: biorobotics; cybernetics; neural network; robot fish; EMG signals; robotic avatar; dynamic control
within the system. The application domains of cybernetic control span engineering, biology,
and psychology, with the goal of enabling robots to interact with humans in more intuitive
ways [6]. This involves adapting their actions and responses based on human feedback and
behavior, cultivating a more seamless and responsive human–robot interaction.
Cybernetic biorobotics, at its core, is an interdisciplinary frontier that harmonizes
principles from cybernetics, biology, and robotics. Its primary mission is the exploration and
development of robots or robotic systems intricately inspired by the marvels of biological
organisms. This field is driven by the ambition to conceive robots capable of mimicking
and integrating the sophisticated principles and behaviors observed in living entities.
Researchers draw inspiration from the intricate control systems of biological organisms, and this
synthesis yields robots with adaptive and intelligent behaviors that mirror the intricacies found
in the natural world. Bioinspired
robotics, a central focus within this discipline, involves distilling the fundamental principles
and behaviors intrinsic to biological entities and skillfully incorporating them into the
design and control of robotic systems. It has the potential to advance the development
of robots endowed with locomotion and manipulation capabilities akin to animals, as
well as robots capable of adapting to dynamic environments or interacting with humans
in more natural and intuitive ways [7]. Moreover, research in cybernetic biorobotics can
offer valuable insights into comprehending biological systems, fostering advancements in
disciplines like neuroscience and biomechanics. Furthermore, remote cybernetic robots
may rely on haptic systems as essential interfaces. A haptic system, characterized by its
ability to provide users with a sense of touch or tactile feedback through force, vibration,
or other mechanical means, comprises a haptic interface and a haptic rendering system.
Collaboratively, these components simulate touch sensations, enabling users to engage
with virtual or remote environments in a tactile manner [8].
This research introduces a control and sensing architecture that integrates a cybernetic
scheme based on the recognition of electromyographic control signals, governing a range of
locomotive behaviors in a robotic fish. Conceptually, the human operator receives feedback
signals from the sensors of the biorobotic avatar, conveying information about its remote
environment. The proposed approach stands out due to its key features and contributions,
which include:
1. The exposition of an innovative conceptual cybernetic fish avatar architecture.
2. The creation of an EMG data filtering algorithm, coupled with a method for extract-
ing, classifying, and recognizing muscular patterns using a deep ANN, serves as a
cybernetic interface for the governance of the fish avatar.
3. The development of a fuzzy-based oscillation pattern generator (OPG) designed to
generate periodic oscillation patterns around the fish’s caudal fin. These coordinated
oscillations are decoupled into right and left step functions, specifically crafted to input
into a lateral pair of electromagnetic coils, thereby producing undulating swimming
motions of the robot fish.
4. The conception of a bioinspired robotic fish mechanism characterized by the incorporation of
underactuated elements propelled by serial links featuring helical springs.
This innovative design is empowered by a dual solenoid electromagnetic oscillator
and a four-bar linkage, reflecting a novel approach to bioinspired robotics.
5. The derivation of closed-form control laws for both the undulation of the underactu-
ated caudal multilink dynamics and the ballasting system.
Section 2 provides a comprehensive discussion of the comparative analysis of the
current state of the art. Section 3 provides a detailed description of the proposed architecture
of the cybernetic system model. Section 4 presents an approach for filtering electromyography
(EMG) data and delves into an in-depth discussion of a classifier based on a deep ANN for the
recognition of hand-motion EMG stimuli patterns. Section 5 presents
the development of a fuzzy-based oscillation pattern generator. Section 6 details the robot’s
mechanism parts and its dynamic model. Section 7 focuses on the development of a feed-
back control for the fish’s ballasting system. Finally, Section 8 provides the concluding
remarks of the research study.
As delineated in Table 1, the present study introduces distinctive elements that set it
apart from the recognized relevant literature. However, it is noteworthy to acknowledge
that various multidisciplinary domains may exhibit commonalities. Across these diverse
topics, shared elements encompass robotic avatars, teleoperation, telepresence, immersive
human–robot interfaces, as well as haptic or cybernetic systems in different application
domains. In this research, the fundamental principle of a robotic avatar entails controlling
its swimming response to biological stimuli from the human operator. The human controller
is able to gain insight into the surrounding world of the robotic fish avatar through a haptic
interface. This interface allows the human operator to yield biological electromyography
stimuli as the result of their visual and skin impressions (e.g., pressure, temperature,
heading vibrations). The biorobotic fish generates its swimming locomotive behavior,
which is governed by EMG stimuli yielded in real-time in the human. Through a neuro-
fuzzy controller, the neuronal part (cybernetic observer) classifies the type of human EMG
reaction, and the fuzzy part determines the swimming behavior.
Figure 1. Cybernetic robotic avatar system architecture. Signal electrodes (green circles) and ground
electrodes (red circles) are experimentally positioned on the Flexor Digitorum Superficialis, Flexor
Digitorum Profundus, and Flexor Carpi muscles.
Essentially, there are six haptic feedback sensory inputs of interest for the human,
representing the observable state of the avatar robot: Eulerian variables, including angular
and linear displacements and their higher-order derivatives; biomechanical caudal motion;
hydraulic pressure; scenario temperature; and passive vision. Figure 2 left provides an
illustration of the geometric distribution of the sensing devices.
The instrumented robot is an embodiment of the human submerged in water, featuring
an undulatory swimming mechanical body imbued with muscles possessing underactuated
characteristics. These features empower the biorobotic avatar to execute movements and
swim in its aquatic surroundings.
The observation models aim to provide insights into how these sensory perceptions
are conveyed to the haptic helmet with haptic devices, including a wheel reaction mecha-
nism. A comprehensive schema emerges wherein the haptic nexus, bolstered by pivotal
$$\omega_\alpha = \frac{d\alpha_\iota}{dt} + \dot{\alpha}_g + \frac{1}{d_\alpha}\int_t a_\alpha\,dt, \qquad (1)$$

$$\omega_\beta = \frac{d\beta_\iota}{dt} + \dot{\beta}_g + \frac{1}{d_\beta}\int_t a_\beta\,dt, \qquad (2)$$

$$\omega_\gamma = \frac{d\gamma_\iota}{dt} + \dot{\gamma}_g + \frac{1}{d_\gamma}\int_t a_\gamma\,dt. \qquad (3)$$
Within this context, the tangential accelerations experienced by the robot body are
denoted aα,β,γ [m/s²]. Additionally, the angular velocities measured by the gyroscopes
are represented by α̇, β̇, γ̇ [rad/s]. Correspondingly, the inclinometers provide angle
measurements denoted α, β, γ [rad]. These measurements collectively contribute to the
comprehensive observability and characterization of the robot’s dynamic behavior and
spatial orientation.
Furthermore, the oscillations of the caudal tail are reflections of the dynamics of the
underactuated spine. These dynamics are captured by quantifying encoder pulses, denoted
ηt , which provide precise angular positions for each vertebra. Given that real-time angular
measurements of the vertebrae are desired, higher-order data are prioritized. Consequently,
derivatives are computed by initiating from the Taylor series to approximate the angle of
each vertebral element with respect to time, denoted t.
$$\phi_i(t_2) \approx \phi_i(t_1) + \phi_i^{(1)}(t_2 - t_1), \qquad (4b)$$

dropping $\phi_i^{(1)}$ off as a state variable, the first-order derivative ($\phi^{(1)}(t) \equiv \dot{\phi}(t)$) is given by

$$\dot{\phi}(t) = \frac{\phi_2 - \phi_1}{t_2 - t_1}, \qquad (4c)$$
and assuming a vertebra’s angular measurement model in terms of the encoder’s pulses η
with resolution R, then it is stated that by substituting the pulses encoder model into the
angular speed function for the first vertebra,
$$\dot{\phi}_1 = \frac{2\pi}{R}\frac{\eta_2}{(t_2 - t_1)} - \frac{2\pi}{R}\frac{\eta_1}{(t_2 - t_1)} = \frac{2\pi}{R}\,\frac{\eta_2 - \eta_1}{t_2 - t_1}. \qquad (5a)$$
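As a brief illustration of Equations (4c) and (5a), the following sketch computes a vertebra's angular speed from two consecutive encoder pulse counts; the encoder resolution and pulse values are assumed for the example only.

```cpp
#include <cstdio>

// Minimal sketch of Equations (4c) and (5a): vertebra angular speed from two
// consecutive encoder pulse counts eta1, eta2 taken at times t1, t2, with an
// encoder of resolution R pulses per revolution. Values are illustrative.
static double vertebra_angular_speed(long eta1, long eta2,
                                     double t1, double t2, double R)
{
    const double two_pi = 2.0 * 3.14159265358979323846;
    return (two_pi / R) * static_cast<double>(eta2 - eta1) / (t2 - t1);  // [rad/s]
}

int main() {
    // e.g., a 2048-pulse encoder counting 37 pulses over 10 ms
    std::printf("phi_dot = %.3f rad/s\n",
                vertebra_angular_speed(1200, 1237, 0.000, 0.010, 2048.0));
    return 0;
}
```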
electrodes were systematically adjusted, and the results from each trial were compared.
Upon data analysis, it was discerned that the most effective electrode placement is on the
ulnar nerve, situated amidst the flexor digitorum superficialis, flexor digitorum
profundus, and flexor carpi ulnaris muscles. A series of more than ten experiments was executed
for each planned stimulus or action involving hands, allowing a 2s interval between each
action, including the opening and closing of hands, as well as the extension and flexion of
the thumb, index, middle, ring, and little fingers. The data were measured by a g.MOBIlab+
device with two-channel electrodes and quantified in microvolts per second [µV/s], as
depicted in Figure 3.
The data acquired from the electromyogram often exhibit substantial noise, attributed
to both the inherent nature of the signal and external vibrational factors. To refine the data
quality by mitigating this noise, a filtering process is essential.
Figure 3. Experimental raw EMG data (from left to right): (a) Left and right hand, left and right
thumb. (b) Left and right index, left and right middle. (c) Left and right ring, left and right little.
In this context, a second-order Notch filter was utilized. This filter is tailored to target
specific frequencies linked to noise, proving particularly effective in eliminating electrical
interferences and other forms of stationary noise [35]. A Notch filter is a band-rejection
filter to greatly reduce interference caused by a specific frequency component or a narrow
band signal. Hence, in the Laplace space, the second-order filter is represented by the
analog Laplace domain transfer function:
$$H(s) = \frac{s^2 + \omega_o^2}{s^2 + 2 s \xi \omega_o + \omega_o^2}, \qquad (6)$$
where ω0 signifies the cut angular frequency targeted for elimination, and 2ξ signifies the
damping factor or filter quality, determining the bandwidth. Consequently, we solve to
obtain its solution in the physical variable space. The bilinear transformation relates the
variable s from the Laplace domain to the variable zk in the Z domain, considering T as the
sampling period, and is defined as follows:
$$s \doteq \frac{1}{T}\,\frac{z_k - 1}{z_k + 1}. \qquad (7)$$
Upon substituting the previous expression into the transfer function of the analog Notch
filter and algebraically simplifying, the following transfer function in the Z domain is
obtained, redefining the notation as υt = zk just to meet equivalence with the physical
variable:
$$h(\upsilon_t) = \frac{1 - 2\cos(\omega_0 t)\,\upsilon_{t-1} + \upsilon_{t-2}}{1 - 2\xi\cos(\omega_0 t)\,\upsilon_{t-1} + \xi\,\upsilon_{t-2}}. \qquad (8)$$
The second-order Notch filter h(υ) was employed on raw EMG data to alleviate noise
resulting from electrical impedance and vibrational electrode interference, with parameters
set at ω0 = 256 Hz and ξ = 0.1 and results depicted in Figure 4.
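For reference, the following sketch applies the Notch filter of Equation (8) as a direct-form difference equation; the normalization of ω0 to radians per sample and the boundary handling are assumptions of this illustration, not the exact implementation used on the acquisition system.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Minimal sketch of the second-order Notch filter of Equation (8), written as
// a direct-form difference equation. Here w0 is the notch frequency expressed
// in radians per sample (2*pi*f0/fs) and xi is the damping/quality parameter.
std::vector<double> notch_filter(const std::vector<double>& x, double w0, double xi)
{
    std::vector<double> y(x.size(), 0.0);
    const double c = 2.0 * std::cos(w0);
    for (std::size_t n = 0; n < x.size(); ++n) {
        const double x1 = (n >= 1) ? x[n - 1] : 0.0;
        const double x2 = (n >= 2) ? x[n - 2] : 0.0;
        const double y1 = (n >= 1) ? y[n - 1] : 0.0;
        const double y2 = (n >= 2) ? y[n - 2] : 0.0;
        // numerator 1 - 2cos(w0) z^-1 + z^-2; denominator 1 - 2*xi*cos(w0) z^-1 + xi z^-2
        y[n] = x[n] - c * x1 + x2 + xi * c * y1 - xi * y2;
    }
    return y;
}
```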
Subsequently, while other studies explored time and frequency feature extraction,
as seen in [36], in the present context, by utilizing the outcomes of the Notch filter, our
data undergo processing through three distinct filters or linear envelopes. This serves as a
secondary spatial filter and functions as a pattern extraction mechanism. These include a
filter for average variability, one for linear variability, and another for average dispersion.
Each filter serves a specific purpose, enabling the analysis of different aspects of the signal.
Consider n as the number of measurements constituting a single experimental stimulus, and
let N represent the entire sampled data space obtained from multiple measurements related
to the same stimulus. Furthermore, denote v̂i as the ith EMG measurement of an upper
limb, measured in microvolts (µV). From such statements, the following Propositions 1–3
are introduced as new data patterns.
Proposition 1 (Filter γ). The γ pattern refers to a statistical linear envelope described by the
difference of a local mean v̂k in a window of samples and the statistical mean υ̂i of all samples in an
experiment.
$$\gamma(v_k) = \hat{v}_i - \frac{1}{n}\sum_{k=1}^{n} v_k. \qquad (9)$$
Proposition 2 (Filter λ). The λ(vk ) pattern refers to a statistical linear envelope denoted by
the difference of a local mean v̂k in a window of samples and the statistical mean υ̂i of the whole
population of experiments of the same type,
$$\lambda(v_k) = \hat{v}_i - \frac{1}{N_k}\sum_{k=1}^{N_k} \hat{v}_k. \qquad (10)$$
Proposition 3 (Filter Ω). The Ω pattern refers to a statistical linear envelope denoted by the
difference of statistical means between the population of one experiment v̂i and the whole population
of numerous experiments of the same type:
$$\Omega(v_k) = \frac{1}{n}\sum_{k=1}^{n} v_k - \frac{1}{N}\sum_{k=1}^{N} v_k. \qquad (11)$$
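The three linear envelopes of Propositions 1–3 can be sketched directly from their definitions; in the sketch below, the windowing of samples into one experiment and one pooled population is an assumption of the illustration.

```cpp
#include <vector>

// Minimal sketch of the statistical linear envelopes of Propositions 1-3.
// Here v_hat_i is the current EMG sample, window holds the n samples of one
// experimental stimulus, and population holds the pooled samples of all
// experiments of the same type.
static double mean(const std::vector<double>& x)
{
    if (x.empty()) return 0.0;
    double s = 0.0;
    for (double v : x) s += v;
    return s / static_cast<double>(x.size());
}

// Filter gamma (Eq. 9): sample minus the mean of its own experiment window.
double filter_gamma(double v_hat_i, const std::vector<double>& window)
{ return v_hat_i - mean(window); }

// Filter lambda (Eq. 10): sample minus the mean of the whole population.
double filter_lambda(double v_hat_i, const std::vector<double>& population)
{ return v_hat_i - mean(population); }

// Filter Omega (Eq. 11): experiment mean minus population mean.
double filter_omega(const std::vector<double>& window,
                    const std::vector<double>& population)
{ return mean(window) - mean(population); }
```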
Hence, let the vector ⃗δ ∈ R3 such that δk = (Ωk , γk , λk )⊤ and represent filtered data
points in the Ωγλ-space. This study includes a brief data preprocessing as a method to
improve the multi-class separability and data scattering reduction in pattern extraction.
Three distinctive patterns (γ, λ, and Ω) converge in Figure 5. This illustration
exclusively features patterns associated with sequences of muscular stimuli from both the
right and left hands. For supplementary stimulus plots, refer to Appendix A at the end of
this manuscript.
Figure 5. Components of hand pattern space: filters γ, λ, and Ω. (a) Filters applied to the left hand.
(b) Filters applied to the right hand.
From numerous laboratory experiments, over 75% of the sampled raw data fall within
the range of one standard deviation. Consider the vector ⃗σ ∈ R3 such that the standard
deviation vector ⃗σ = (σΩ , σγ , σλ )⊤ encompasses the three spatial components by its norm.
$$\sigma = \sqrt{\frac{1}{N}\sum_{k=1}^{N}\left(\begin{pmatrix}\Omega_k\\ \gamma_k\\ \lambda_k\end{pmatrix} - \begin{pmatrix}\mu_\Omega\\ \mu_\gamma\\ \mu_\lambda\end{pmatrix}\right)^{2}}. \qquad (12)$$
Definition 1 (Discrimination condition). Consider the scalar value δj as preprocessed EMG data
located within a radius of magnitude κd times the standard deviation ∥σ ∥:
$$\delta_j = \begin{cases}\delta_k, & \kappa_d\,\|\sigma\| \le \|\vec{\delta}\| \\ 0, & \kappa_d\,\|\sigma\| > \|\vec{\delta}\|\end{cases} \qquad (13)$$
Hence, consider the recent Definition 1 in the current scenario with κd = 1.0, which
serves as a tuning discrimination factor. Therefore, the norm lh represents the distance
between the frame origin and any class in the Ωγλ-space. This distance is adaptively
calculated based on the statistics of each EMG class.
$$l_h = 2\sqrt{(\kappa_\Omega \sigma_\Omega)^2 + (\kappa_\gamma \sigma_\gamma)^2 + (\kappa_\lambda \sigma_\lambda)^2}, \qquad (14)$$
where the coefficients κΩ , κγ , and κλ are smooth adjustment parameters to set separa-
bility along the axes. Hence, relocating each class center to a new position is stated by
Proposition 4.
$$\mu_\Omega^{+} = \mu_\Omega + \zeta_\Omega\, l_h, \qquad (15a)$$

$$\mu_\gamma^{+} = \mu_\gamma + \zeta_\gamma\, l_h, \qquad (15b)$$

and

$$\mu_\lambda^{+} = \mu_\lambda + \zeta_\lambda\, l_h, \qquad (15c)$$
where ζ Ωγλ are coarse in-space separability factors. The mean values µΩγλ are the actual class
positions obtained from the linear envelopes Ω(υk ), γ(υk ), and λ(υk ).
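A minimal sketch of the class-separation step of Equations (14) and (15a)–(15c) is given below; the smooth (κ) and coarse (ζ) separability factors shown are illustrative tuning values.

```cpp
#include <cmath>

// Minimal sketch of Equations (14) and (15a)-(15c): the adaptive distance l_h
// from the per-class standard deviations and the relocated class centres mu+.
struct ClassStats  { double muOmega, muGamma, muLambda;    // class centre
                     double sOmega,  sGamma,  sLambda; };  // standard deviations
struct ClassCentre { double omega, gamma, lambda; };

ClassCentre relocate(const ClassStats& c,
                     double kO = 1.0, double kG = 1.0, double kL = 1.0,
                     double zO = 0.5, double zG = 0.5, double zL = 0.5)
{
    // Eq. (14): adaptive distance in the Omega-gamma-lambda space
    const double lh = 2.0 * std::sqrt((kO * c.sOmega)  * (kO * c.sOmega)
                                    + (kG * c.sGamma)  * (kG * c.sGamma)
                                    + (kL * c.sLambda) * (kL * c.sLambda));
    // Eq. (15a)-(15c): shifted class centres
    return { c.muOmega + zO * lh, c.muGamma + zG * lh, c.muLambda + zL * lh };
}
```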
Thus, by following the step-by-step method outlined earlier, Figure 6 showcases the
extracted features of the EMG data, representing diverse experimental muscular stim-
uli. These results hold notable significance in the research, as they successfully achieve
the desired class separability and data scattering, serving as crucial inputs for the multi-
layer ANN.
Figure 6. EMG stimuli pattern space (γ, λ, Ω). (a) Left hand classes. (b) Right hand classes.
Henceforth, the focus lies on identifying and interpreting the EMG patterns projected
in the γλΩ-space, as illustrated in Figure 6. The subsequent part of this section delves into
the architecture and structure of the deep ANN employed as a classifier, providing a de-
tailed account of the training process. Additionally, this section highlights the performance
metrics and results achieved by the classifier, offering insights into its effectiveness. Despite
the challenges posed by nonlinearity, multidimensionality, and extensive datasets, various
neural network structures were configured and experimented with. These configurations
involved exploring different combinations of hidden layers, neurons, and the number of
outputs in the ANN.
A concise comparative analysis was performed among three distinct neural network
architectures: the feedforward multilayer network, the convolutional network, and the
competing self-organizing map. Despite notable configuration differences, efforts were
made to maintain similar features, such as 100 training epochs, five hidden layers (except
for the competing structure), and an equal number of input and output neurons (three
input and four output neurons). The configuration parameters and results are presented in
Table 2. This study delves deeper into the feedforward multilayer ANN due to its superior
classification rate. Further implementations with enhanced features are planned in the
C/C++ language as a compiled program.
Measures                      Feedforward Multilayer    Convolutional    Competing Self-Organizing Map
Training epochs               100                       100              100
Classification rate           69.85%                    24.17%           23.88%
Training time rate ¹          1                         0.9046           0.9861
Number of hidden layers       5                         5                1
Neurons per hidden layer      20                        20               32
Neurons per output layer      4                         4                4
Total neurons                 104                       104              32

¹ The training dataset comprised 103,203 samples. The training time rate was set to 1 as the comparative reference.
To achieve the highest success rate in accurate data classification through experi-
mentation, the final ANN was designed with perceptron units, as depicted in Figure 7.
It featured three inputs corresponding to the three EMG patterns γ, λ, Ω and included
12 hidden layers, each with 20 neurons. The supervised training process, conducted on a
standard-capability computer, took approximately 20–30 min, resulting in nearly 1% error
in pattern classification.
However, in the initial stages of the classification computations, with some mistuned
adaptive parameters, the classification error was notably higher, even with much deeper
ANN structures, such as 100 hidden layers with 99 neurons per layer. To facilitate the
implementation, this work utilized the multilayer feedforward architecture, increased the training
to 300 epochs, and implemented it in C/C++ using the Fast Artificial Neural Networks (FANN)
library, which generates very fast binary code once the ANN is trained.
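As an illustration of this implementation path, the following sketch trains and runs a feedforward perceptron network with the FANN C library; the abbreviated layer configuration, file names, and training tolerances are assumptions of the example, not the exact values used for the final 12-hidden-layer classifier.

```cpp
#include <fann.h>

// Minimal sketch of training and running the perceptron classifier with the
// FANN C library. Five hidden layers of 20 neurons are shown for brevity.
int main(void)
{
    /* 3 inputs (Omega, gamma, lambda) -> hidden layers -> 4 binary outputs */
    struct fann *ann = fann_create_standard(7, 3, 20, 20, 20, 20, 20, 4);

    fann_set_activation_function_hidden(ann, FANN_SIGMOID_SYMMETRIC);
    fann_set_activation_function_output(ann, FANN_SIGMOID);

    /* 300 training epochs, as stated in the text; data file name is assumed */
    fann_train_on_file(ann, "emg_patterns.data", 300, 10, 0.001f);
    fann_save(ann, "emg_classifier.net");

    /* recognition of one preprocessed EMG feature vector */
    fann_type input[3] = {0.12f, -0.35f, 0.08f};
    fann_type *out = fann_run(ann, input);   /* four outputs thresholded to a 4-bit code */
    (void)out;

    fann_destroy(ann);
    return 0;
}
```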
In the training process of this research, about 50 datasets from separate experiments for
each type of muscular stimulus were collectively stored, each comprising several thousand
muscular repetitions. A distinct classification label was assigned a priori for each class
type within the pattern space. To demonstrate the reliability of the approach, 16 different
stimuli per ANN were established for classification and recognition, resulting in the ANN
having four combinatory outputs, each with two possible states. Figure 8 depicts mixed
sequences encompassing all types of EMG stimuli, with the ANN achieving a 100% correct
classification rate.
Figure 8. Sequence of mixed EMG stimuli over time and ANN’s decimal output with 100% classifica-
tion success. (a) Right limb. (b) Left limb.
Moreover, Table 3 delineates the mapping relationship between the ANN’s input,
represented by the EMG stimuli, and the ANN’s output linked to a swimming behavior for
controlling the robotic avatar.
Figure 9. Swimming behavior fuzzy sets. (a) Crisp input (ANN’s output). (b) Input of robot’s thrust
velocity observation. (c) Input of robot’s angular speed observation.
Definition 2 (Input fuzzy sets). The output of the artificial neural network (ANN) corresponds
to the fuzzy input, denoted yC , and only when it falls within the crisp set C = [0, 1, . . . , 15], the
membership in the crisp set is referred to as µy (yC ).
$$\mu_y(y_C) = \begin{cases} 0, & y_C \notin C \\ 1, & y_C \in C. \end{cases} \qquad (16a)$$
In relation to the sensor observations of the biorobot, the thrusting velocity v [cm/s] and angular
velocity ω [rad/s] exhibit S-shaped sets modeled by sigmoid membership functions. This modeling
approach is applied consistently to both types of input variables, capturing their extreme-sided
characteristics. Let µs, f (v) define the sets labeled “stop” and “fast” in relation to the thrusting
velocity v, elucidated by
$$\mu_{s,f}(v) = \frac{1}{1 + e^{\pm v \mp a}}. \qquad (16b)$$
Likewise, for the sets designated “left–fast” (lf) and “right–fast” (rf) concerning the angular
velocity ω, the S-shaped sets are modeled as

$$\mu_{lf,rf}(\omega) = \frac{1}{1 + e^{\pm \omega \mp b}}. \qquad (16c)$$
In addition, the rest of the sets in between are any of the kth Gauss membership functions
(“slow”, “normal”, and “agile”) for the robot’s thrusting velocity with parametric mean v̄ and
standard deviation σ_{v_k},

$$\mu_k(v) = e^{-\frac{(\bar{v}-v)^2}{2\sigma_{v_k}^2}}, \qquad (16d)$$

and for its angular velocity (“left–slow”, “no turn”, and “right–slow”), with parametric mean ω̄
and standard deviation σ_{ω_k},

$$\mu_k(\omega) = e^{-\frac{(\bar{\omega}-\omega)^2}{2\sigma_{\omega_k}^2}}. \qquad (16e)$$
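The input membership functions of Equations (16b)–(16e) can be sketched as follows; the offsets and (mean, deviation) pairs used here are illustrative tuning values rather than the parameters of the reported controller.

```cpp
#include <cmath>

// Minimal sketch of the input membership functions of Equations (16b)-(16e):
// sigmoid sets for the extreme labels and Gauss sets for the intermediate labels.
static double sigmoid_set(double x, double offset, double sign)
{
    // mu(x) = 1 / (1 + exp(sign*(x - offset))); sign > 0 decays, sign < 0 rises
    return 1.0 / (1.0 + std::exp(sign * (x - offset)));
}

static double gauss_set(double x, double mean, double sigma)
{
    const double d = mean - x;
    return std::exp(-(d * d) / (2.0 * sigma * sigma));     // Eq. (16d)/(16e)
}

// Example thrusting-velocity sets, v in cm/s
double mu_stop(double v)   { return sigmoid_set(v,  2.0, +1.0); }   // Eq. (16b)
double mu_fast(double v)   { return sigmoid_set(v, 18.0, -1.0); }
double mu_slow(double v)   { return gauss_set(v,  5.0, 2.0); }
double mu_normal(double v) { return gauss_set(v, 10.0, 2.0); }
double mu_agile(double v)  { return gauss_set(v, 15.0, 2.0); }
```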
Therefore, the reasoning rules articulated in the inference engine have been devised by
applying inputs derived from Table 3, specifically tailored to generate desired outputs that
align with the oscillation periods T [s] of the fish’s undulation frequency (see Figure 10).
Definition 3 (v, ω = any). For any linguistic value v representing sensor observations of the
fish’s thrusting velocity,
v ≐ any = stop or slow or normal or agile or fast.
Likewise, for any linguistic value ω representing sensor observations of the fish’s angular
velocity,
ω ≐ any = left-fast or left-slow or no turn or right-slow or right-fast.
Therefore, the following inference rules describe the essential robot fish swimming behavior.
sets with distinct concurrent values correspond to the periods of three distinct periodic
functions: forward (f), right (r), and left (l), as subsequently defined by Equations (24a),
(24b), and (24c).
Figure 10. Three identical output fuzzy sets depicting the oscillation period of the caudal tail’s
undulation.
The inference engine’s rules dictate the application of fuzzy operators to assess the
terms involved in fuzzy decision-making. As for the thrusting velocity fuzzy sets, where
v = any was previously established and by applying Definition 3, the fuzzy operator is
described by
$$\mu_v^{\max} = \max_{\mu_k \in v}\left(\mu_{stop}, \mu_{slow}, \mu_{normal}, \mu_{agile}, \mu_{fast}\right).$$
Likewise, the angular velocity fuzzy sets, where previously ω = any was stated, by
applying the second part of Definition 3, the following fuzzy operator is described by
$$\mu_\omega^{\max} = \max_{\mu_k \in \omega}\left(\mu_{left\text{-}fast}, \mu_{left\text{-}slow}, \mu_{no\,turn}, \mu_{right\text{-}slow}, \mu_{right\text{-}fast}\right).$$
Essentially, the fuzzification process applies in the same manner to the rest of the inference
rules, according to the following Proposition.
Proposition 5 (Combined rules µi∗ ). The general fuzzy membership expression for the ith infer-
ence rule that combines multiple inference propositions is
In any crisp set, each µi attains a distinct value of 1, irrespective of the corresponding inference
rule indexed by i. This value aligns with the ith entry in the neural network outputs outlined in
Table 3. Additionally, k represents a specific fuzzy set associated with the same input.
Remark 1 (Proposition 5 example). Let us consider rule i = 1, where yC1 = “sink” and either
(v = 10.0 cm/s or ω = 0.0 rad/s). Thus, articulated in the context of the resulting fuzzy operator,
$$\mu_1^* = \min_{\mu_k \in y_C \cap v \cup \omega}\left(\mu_{sink}(y_C),\; \max_{v \cup \omega}\mu_{v,\omega}\right) = \min_{\mu_k \in y_C \cap v \cup \omega}(1, 1) = 1.$$
Here, µv (10.0) = 1 and µω (0.0) = 1. Based on the earlier proposition, the resulting µ1∗ = 1,
and given that rule 1 indicates an output period T = “too-slow”, its inverse outcome T (µ1∗ ) = 6.0
s. This outcome is entirely accurate, because the fish’s swim undulation will slow down up to 6 s,
which is the slowest period oscillation.
Moving forward, during the defuzzification process, the primary objective is to attain
an inverse solution. The three output categories for periods T include “forward”, “right”,
and “left”, all sharing identical output fuzzy sets of T (Figure 10). Nevertheless, the output
fuzzy sets consist of two categories of distributions: Gauss and sigmoid distribution sets,
as outlined in Definition 4. Regarding the Gauss distributions, their functional form is
specified by:
Definition 4 (Output fuzzy sets). The membership functions for both extreme-sided output sets
are defined as “fast” with µ f and “too slow” with µts , such that
$$\mu_{f,ts}(T) = \frac{1}{1 + e^{\pm T_k \mp c_k}}. \qquad (18)$$
Here, T [s] denotes the period of time for oscillatory functions, with the slope direction
determined by its sign. The parameter c represents an offset, and k is the numerical index of a
specific set.
Furthermore, the membership functions for three intermediate output sets are defined as “agile”
with µ a , “normal” with µn , and “slow” with µs , such that:
$$\mu_{a,n,s}(T) = e^{-\frac{(\bar{T}_k - T)^2}{2\sigma_{T_k}^2}}. \qquad (19)$$
Here, T̄k represents the mean value, and σTk denotes the standard deviation of set k, with k
serving as the numeric index of a specific set.
Hence, exclusively for the jth category among the output fuzzy sets affected by the
fuzzy inference rule essential for estimating the value of T, the centroid method is employed
for defuzzification through the following expression:
$$T_{f,r,l} = \frac{\sum_j \mu_j^*\, T_j(\mu_j^*)}{\sum_j \mu_j^*}, \qquad (22)$$
or more specifically,
$$T_{f,r,l} = \frac{\mu_f^* T(\mu_f^*) + \mu_a^* T(\mu_a^*) + \mu_n^* T(\mu_n^*) + \mu_s^* T(\mu_s^*) + \mu_{ts}^* T(\mu_{ts}^*)}{\mu_f^* + \mu_a^* + \mu_n^* + \mu_s^* + \mu_{ts}^*}. \qquad (23)$$
For terms T (µ f ) and T (µts ), the inverse membership function (20) is applicable,
whereas for the remaining sets in the jth category, the inverse membership (21) is applied.
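A minimal sketch of the centroid defuzzification of Equations (22)–(23) is shown below; since the inverse memberships (20)–(21) are defined elsewhere, the inverse sigmoid and Gauss functions and their set parameters are illustrative stand-ins.

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdio>

// Minimal sketch of the centroid defuzzification of Equations (22)-(23).
static double clamp_mu(double mu) { return std::min(std::max(mu, 1e-6), 1.0 - 1e-6); }

// Inverse of mu = 1 / (1 + exp(s (T - c)))
static double inv_sigmoid(double mu, double c, double s)
{ return c + std::log(1.0 / clamp_mu(mu) - 1.0) / s; }

// Inverse (one branch) of mu = exp(-(Tm - T)^2 / (2 sigma^2))
static double inv_gauss(double mu, double Tm, double sigma)
{ return Tm - sigma * std::sqrt(-2.0 * std::log(clamp_mu(mu))); }

// mu*_j firing strengths for the sets {fast, agile, normal, slow, too-slow}
double defuzzify_period(const std::array<double, 5>& mu)
{
    const double T[5] = {
        inv_sigmoid(mu[0], 1.0, +1.0),   // "fast"     (short period)
        inv_gauss(mu[1], 2.0, 0.5),      // "agile"
        inv_gauss(mu[2], 3.5, 0.5),      // "normal"
        inv_gauss(mu[3], 5.0, 0.5),      // "slow"
        inv_sigmoid(mu[4], 6.0, -1.0)    // "too slow" (longest period, ~6 s)
    };
    double num = 0.0, den = 0.0;
    for (int j = 0; j < 5; ++j) { num += mu[j] * T[j]; den += mu[j]; }   // Eq. (23)
    return (den > 1e-9) ? num / den : 0.0;   // defuzzified period T_{f,r,l} [s]
}

int main() {
    std::printf("T = %.3f s\n", defuzzify_period({0.0, 0.1, 0.8, 0.1, 0.0}));
    return 0;
}
```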
Reference [38] reported a CPG model to control a robot fish’s motion in swimming
and crawling and let it perform different motions influenced by sensory input from light,
water, and touch sensors. The study of [39] reported a CPG controller with proprioceptive
sensory feedback for an underactuated robot fish. Oscillators and central pattern generators
(CPGs) are closely related concepts. Oscillators are mathematical or physical systems
exhibiting periodic behavior and are characterized by the oscillation around a stable
equilibrium point (limit cycle). In the context of CPGs, these are neural networks that utilize
oscillators that create positive and negative feedback loops, allowing for self-sustaining
oscillations and the generation of rhythmic patterns, particularly implemented in numerous
robotic systems [40]. CPGs are neural networks found in the central nervous system of
animals (e.g., fish swimming [41]) that generate rhythmic patterns of motor activity and
are responsible for generating and coordinating optimized [42] repetitive movements.
The present research proposes a different approach from the basic CPG model, and as
a difference from other wire-driven robot fish motion approaches [43], this study introduces
three fundamental undulation functions: forward, right-turning, and left-turning. These
functions are derived from empirical measurements of the robot’s caudal fin oscillation
angles. However, a distinctive behavioral undulation swim is achieved by blending these
three oscillation functions, each incorporating corresponding estimation magnitudes de-
rived from the fuzzy controller outputs. The formulation of each function involves fitting
empirical data through Fourier series. As a difference from other approaches on CPG
parameter adjustment [44], the preceding fuzzy outputs obtained from (23) to estimate time
periods T f , Tr , Tl play a pivotal role in parameterizing the time periods for the periodic
oscillation functions, as outlined in Proposition 6.
$$\psi_f(\phi, T_f) = 0.0997 + 0.3327\cos\!\left(\tfrac{2\pi}{T_f}\phi\right) - 0.1297\sin\!\left(\tfrac{2\pi}{T_f}\phi\right) - 0.5760\cos\!\left(\tfrac{2\pi}{T_f}2\phi\right) + 0.3701\sin\!\left(\tfrac{2\pi}{T_f}2\phi\right) - 0.1431\cos\!\left(\tfrac{2\pi}{T_f}3\phi\right) + 0.1055\sin\!\left(\tfrac{2\pi}{T_f}3\phi\right) - 0.0870\cos\!\left(\tfrac{2\pi}{T_f}4\phi\right) + 0.06323\sin\!\left(\tfrac{2\pi}{T_f}4\phi\right) - 0.0664\cos\!\left(\tfrac{2\pi}{T_f}5\phi\right) - 0.0664\sin\!\left(\tfrac{2\pi}{T_f}5\phi\right). \qquad (24a)$$
Likewise, the undulation pattern for right-turn motion is given by the function
$$\psi_r(\phi, T_r) = 0.3324 + 0.1915\cos\!\left(\tfrac{2\pi}{T_r}\phi\right) + 0.0622\sin\!\left(\tfrac{2\pi}{T_r}\phi\right) + 0.4019\cos\!\left(\tfrac{2\pi}{T_r}2\phi\right) + 0.2920\sin\!\left(\tfrac{2\pi}{T_r}2\phi\right) - 0.264\cos\!\left(\tfrac{2\pi}{T_r}3\phi\right) - 0.3634\sin\!\left(\tfrac{2\pi}{T_r}3\phi\right) - 0.0459\cos\!\left(\tfrac{2\pi}{T_r}4\phi\right) - 0.1413\sin\!\left(\tfrac{2\pi}{T_r}4\phi\right) + 0.0665\sin\!\left(\tfrac{2\pi}{T_r}5\phi\right). \qquad (24b)$$
Finally, the undulation pattern for left-sided turning motion is established by the expression
$$\psi_l(\phi, T_l) = -0.1994 + 0.125\cos\!\left(\tfrac{2\pi}{T_l}\phi\right) - 0.0622\sin\!\left(\tfrac{2\pi}{T_l}\phi\right) + 0.3354\cos\!\left(\tfrac{2\pi}{T_l}2\phi\right) - 0.292\sin\!\left(\tfrac{2\pi}{T_l}2\phi\right) - 0.3305\cos\!\left(\tfrac{2\pi}{T_l}3\phi\right) + 0.3634\sin\!\left(\tfrac{2\pi}{T_l}3\phi\right) - 0.1124\cos\!\left(\tfrac{2\pi}{T_l}4\phi\right) + 0.1413\sin\!\left(\tfrac{2\pi}{T_l}4\phi\right) - 0.0664\cos\!\left(\tfrac{2\pi}{T_l}5\phi\right) + 0.2659\sin\!\left(\tfrac{2\pi}{T_l}5\phi\right). \qquad (24c)$$
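For illustration, the forward undulation of Equation (24a) can be evaluated directly from its Fourier coefficients, as sketched below; the same scheme applies to Equations (24b) and (24c), and the sampled period T_f is an assumed value.

```cpp
#include <cmath>
#include <cstdio>

// Minimal sketch of the forward undulation function psi_f(phi, T_f) of
// Equation (24a), evaluated from its Fourier coefficients.
static double psi_f(double phi, double Tf)
{
    const double w = 2.0 * 3.14159265358979323846 / Tf;   // fundamental frequency
    return 0.0997
         + 0.3327 * std::cos(w * phi)       - 0.1297  * std::sin(w * phi)
         - 0.5760 * std::cos(w * 2.0 * phi) + 0.3701  * std::sin(w * 2.0 * phi)
         - 0.1431 * std::cos(w * 3.0 * phi) + 0.1055  * std::sin(w * 3.0 * phi)
         - 0.0870 * std::cos(w * 4.0 * phi) + 0.06323 * std::sin(w * 4.0 * phi)
         - 0.0664 * std::cos(w * 5.0 * phi) - 0.0664  * std::sin(w * 5.0 * phi);
}

int main() {
    const double Tf = 2.0;   // oscillation period [s] from the fuzzy controller (assumed)
    for (double phi = 0.0; phi <= 4.0; phi += 0.5)
        std::printf("phi=%.2f  psi_f=%.4f rad\n", phi, psi_f(phi, Tf));
    return 0;
}
```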
The forward, right-turn, and left-turn undulation approaches based on the findings of Proposition 6
are illustrated in Figure 11. Additionally, a novel combined oscillation pattern
emerges by blending these three patterns (25), each assigned distinct numerical weights
through the neuro-fuzzy controller.
The Boolean [0, 1] values derived from (26) are illustrated in Figure 11a–c as sequences
of step signals for both the right (R) and left (L) solenoids of the robot. Each plot maintains
a uniform time scale, ensuring vertical alignment for clarity. In contrast to the work
presented in [45] focusing on the swimming modes and gait transition of a robotic fish,
the current study, as depicted in Figure 11, introduces a distinctive context. The three
oscillatory functions, ψ f ,r,l , are displayed both overlapped and separated, highlighting
their unique decoupled step signals. Assuming ξ = 0 for all ϕ in each case, in Figure 11a,
ψr = ψl ≈ 0, with Tr,l ≥ 6; in Figure 11b, ψ f = ψl ≈ 0, with T f ,l ≥ 6; and in Figure 11c,
ψ f = ψr ≈ 0, with T f ,r ≥ 6. For a more comprehensive understanding, Figure 12b presents
the electromechanical components of the caudal motion oscillator.
Figure 11. Oscillation functions for the caudal tail (measured in radians) synchronized with dual
Boolean coil step patterns (dimensionless), presented on a consistent time scale: (a) Forward undula-
tions. (b) Right-turn undulations. (c) Left-turn undulations.
Figure 12. Model of the robot fish mechanism, illustrating: (a) Top view of the musculoskeletal
system. (b) Top view of the robot’s head with antagonistic muscle-based electromagnetic oscillator.
(c) Side view of the ballast system device positioned beneath the robot’s head.
Upon restitution contraction, this stored energy propels the rotation of the link rs , which
constitutes the first vertebra of the fish.
Furthermore, the caudal musculoskeletal structure, comprising four links (ℓ1 , ℓ2 , ℓ3 , ℓ4 )
and three passive joints (θ1 , θ2 , θ3 ), facilitates a sequential rotary motion transmitted from
link 1 to link 4. This transmission is accompanied by an incremental storage of energy in
each helical spring that is serially connected. Consequently, the last link (link 4) undulates
with significantly greater mechanical advantage. In summary, a single electrical pulse in
any coil is sufficient to induce a pronounced undulation in the swimming motion of the
robot’s skeleton.
As for the ballasting control device situated beneath the floor of the electromechanical
oscillator, activation occurs only when either of two possible outputs from the artificial
neural network (ANN) is detected: when yC equals “sink” or “buoyancy”. However, the
fuzzy nature of these inputs results in a gradual slowing down of the fish’s undulation to
its minimum speed. Additionally, both actions are independently regulated by a dedicated
feedback controller overseeing the ballasting device.
Now, assuming knowledge of the input train of electrical step signals sr,l applied to
the coils, let us derive the dynamic model of the biorobot, starting from the electromagnetic
oscillator and extending to the motion transmitted to the last caudal link. Thus, as illustrated
in Figure 12a,b, the force f [N] of the solenoid’s magnetic field oscillator is established on
either side (right, denoted R, or left, denoted L),
$$f = \frac{B^2 A}{2\mu_o}. \qquad (27)$$
In this context, A [m2 ] represents the area of the solenoid’s pole. The symbol µo denotes the
magnetic permeability of air, expressed as µo = 4π × 10−7 H/m (henries per meter). Hence,
the magnetic field B (measured in Teslas) at one extreme of a solenoid is approximated by:
$$B = \frac{\mu_o i N}{l}, \qquad (28)$$
where i represents the coil current [A], N is the number of wire turns in a coil, and l denotes
the coil length [m]. Furthermore, a coil’s current is described by the following linear
differential equation as a function of time t (in seconds), taking into account a potential
difference v (in volts):
$$i = \frac{1}{L}\int_0^T v\,dt + i_0. \qquad (29)$$
Here, L represents the coil’s inductance (measured in henries, H) with an initial current
condition denoted i0 . Additionally, the coil’s induction model is formulated by:
$$L = \frac{\mu_o N^2 A}{l}. \qquad (30)$$
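A short numeric sketch of Equations (27), (28), and (30) follows; the coil geometry and drive current are illustrative values, not the prototype's parameters.

```cpp
#include <cstdio>

// Minimal sketch of the electromagnetic oscillator relations of Equations
// (27), (28), and (30): field B at the solenoid pole, pull force f, and coil
// inductance L.
int main() {
    const double pi  = 3.14159265358979323846;
    const double mu0 = 4.0e-7 * pi;   // magnetic permeability of air [H/m]
    const double N   = 600.0;         // wire turns
    const double l   = 0.03;          // coil length [m]
    const double r   = 0.005;         // core radius [m]
    const double A   = pi * r * r;    // pole area [m^2]
    const double i   = 1.2;           // coil current [A]

    const double B    = mu0 * i * N / l;          // Eq. (28) [T]
    const double f    = B * B * A / (2.0 * mu0);  // Eq. (27) [N]
    const double Lind = mu0 * N * N * A / l;      // Eq. (30) [H]

    std::printf("B = %.4f T, f = %.4f N, L = %.6f H\n", B, f, Lind);
    return 0;
}
```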
In essence, this study states that both lateral solenoids exhibit linear motion character-
ized by an oscillator force f os . This force is expressed as:
$$f_{os} = \frac{i^2 L^2}{2\mu_o N^2 A} \qquad (31)$$
and due to the linear impacts of solenoids on both sides R and L (refer to Figure 12a), the
first bar of the oscillator mechanism generates a torque, expressed as:
It is theorized that the restitution/elongation force along the muscle is denoted f m (R or L),
with this force being transmitted from the electromechanical oscillator to the antagonistic
muscle in the opposite direction (refer to Figure 12b). This implies that the force generated
from the linear motion solenoid in the oscillator’s right-sided coil, denoted f osR , is applied at
point R and subsequently reflected towards point L with an opposite direction, represented
as − f os L . Similarly, conversely from the oscillator’s left-sided solenoid, the force f os L is
applied at point L and transmitted to point R as − f os L . For f os L applied in L,
$$f_{m_R} = \frac{-f_{os_L}}{\cos(\alpha_R)}; \qquad f_{m_L} = \frac{-f_{os_R}}{\sin(\alpha_L)}. \qquad (33)$$
Hence, the angles α R,L assume significance as the forces acting along the muscles f m
differ, resulting in distinct instant elongations xm (t). Consequently, the four-bar trapezoid-
shaped oscillator mechanism manifests diverse inner angles, namely, θ1,2 , β, and γ1,2 , as
illustrated in Figure 12b.
Thus, prior to deriving an analytical solution for α, it is imperative to formulate a
theoretical model for the muscle. In this study, a Hill’s model is adopted, as depicted in
Figure 12a (on the right side). The model incorporates a serial element SE (overdamped), a
contractile element CE (critically damped), and a parallel element PE (critically damped),
each representing distinct spring-mass-damper systems.
The generalized model for the antagonistic muscle is conceptualized in terms of the
restitution force, and it is expressed as:
f m = f SE − ( f CE + f PE ). (34)
Furthermore, through the independent solution of each element within the system in terms
of elongations, the SE model can be expressed as:
Here, s1,2 represent arbitrary constants representing the damping amplitude. The
terms λ1,2 denote the root factors,
$$\lambda_{1,2} = -\frac{c_{SE}}{m_w} \pm \frac{1}{2}\sqrt{\left(\frac{c_{SE}}{m_w}\right)^{2} - 4\,\frac{k_{SE}}{m_w}}. \qquad (37)$$
Here, the factors λ1,2 are expressed in relation to the damping coefficient cSE (in kg/s)
and the elasticity coefficient k SE (in kg/s²).
Similarly, for the contractile element CE, its elongation is determined by:
$$x_{CE}(t) = (c_1 + c_2 t)\,e^{-\frac{c_{CE}}{m_w}t}. \qquad (38)$$
With amplitude factors c1,2 and damping coefficient cCE , a similar expression is ob-
tained for the parallel element PE:
$$x_{PE}(t) = (p_1 + p_2 t)\,e^{-\frac{c_{PE}}{m_w}t}. \qquad (39)$$
With amplitude factors p1,2 and damping coefficient c PE , the next step involves substituting
these functional forms into the general muscle model,
$$f_m(t) = m_w\left(\frac{d^2}{dt^2}x_{SE}(t) + \frac{d^2}{dt^2}x_{CE}(t) + \frac{d^2}{dt^2}x_{PE}(t)\right) \qquad (40)$$
$$f_m(t) = \frac{s_1\lambda_1^2}{m_w}e^{\lambda_1 t} + \frac{s_2\lambda_2^2}{m_w}e^{\lambda_2 t} + c_1\frac{c_{CE}^2}{m_w}e^{-\frac{c_{CE}}{m_w}t} - 2c_2 c_{CE}e^{-\frac{c_{CE}}{m_w}t} + c_2\frac{c_{CE}^2}{m_w}t\,e^{-\frac{c_{CE}}{m_w}t} + p_1\frac{c_{PE}^2}{m_w}e^{-\frac{c_{PE}}{m_w}t} - 2p_2 c_{PE}e^{-\frac{c_{PE}}{m_w}t} + p_2\frac{c_{PE}^2}{m_w}t\,e^{-\frac{c_{PE}}{m_w}t}. \qquad (41)$$
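The muscle force of Equation (40) can be sketched by differentiating the element elongations twice, as below; the SE elongation x_SE = s1 e^(λ1 t) + s2 e^(λ2 t) implied by the root factors of Equation (37) and all numeric parameters are assumptions of the illustration.

```cpp
#include <cmath>
#include <cstdio>

// Minimal sketch of the Hill-type muscle force of Equation (40),
// f_m(t) = m_w (x''_SE + x''_CE + x''_PE), with the element elongations of
// Equations (38)-(39) and an exponential SE elongation implied by Eq. (37).
struct HillMuscle {
    double mw;               // equivalent muscle mass [kg]
    double s1, s2, l1, l2;   // SE amplitudes and root factors (Eq. 37)
    double c1, c2, cCE;      // CE amplitudes and damping [kg/s]
    double p1, p2, cPE;      // PE amplitudes and damping [kg/s]

    // Second derivative of (a1 + a2*t)*exp(-c/mw*t) (critically damped element)
    double ddx_crit(double a1, double a2, double c, double t) const {
        const double a = c / mw;
        return std::exp(-a * t) * (a * a * (a1 + a2 * t) - 2.0 * a * a2);
    }
    double force(double t) const {
        const double ddxSE = s1 * l1 * l1 * std::exp(l1 * t)
                           + s2 * l2 * l2 * std::exp(l2 * t);
        return mw * (ddxSE + ddx_crit(c1, c2, cCE, t) + ddx_crit(p1, p2, cPE, t));
    }
};

int main() {
    HillMuscle m{0.05, 0.01, -0.01, -8.0, -2.0, 0.005, 0.002, 0.4, 0.004, 0.001, 0.3};
    for (double t = 0.0; t <= 0.5; t += 0.1)
        std::printf("t=%.1f s  f_m=%.6f N\n", t, m.force(t));
    return 0;
}
```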
Proposition 7 (Muscle force model). The solution to the muscle force model, based on a Hill’s
approach, is derived as a time-dependent function f m (t) encompassing its three constituent elements
(serial, contractile, and parallel). This formulation is expressed as:
The arc displacement at point R or L is given by s1 = ros θos . Consequently, the rotation
angle of the input oscillator is expressed as:
$$\theta_{os} = \frac{s_1}{r_{os}}. \qquad (44)$$
$$\theta_{os} = \frac{1}{r_{os}}\iint_t \ddot{y}\,dt^2. \qquad (45)$$
Here, ÿ denotes the linear acceleration of either point R or L along the robot’s Y axis.
By replacing the solenoid’s mass-force formulation,
$$\theta_{os} = \frac{1}{m\,r_{os}}\iint_t f_{os}\,dt^2 = \frac{f\,t^2}{2 m\, r_{os}}. \qquad (46)$$
$$s_2 = \frac{r_s\, f_{os}\, t^2}{2 m\, r_{os}^2}. \qquad (47)$$
Without loss of generality, the inner angle θ1 of the oscillator mechanism (refer to
Figure 12b) is derived as:
$$\theta_1 = \frac{\pi}{2} \pm \theta_{os}. \qquad (48)$$
Initially, when the oscillator bars are aligned with respect to the X axis, an angular
displacement denoted by θ1 occurs as a result of the transfer of motion from the solenoid’s
tangential linear motion to the input bar. Similarly, in the output bar, the corresponding
angular displacement is represented by θ2 ,
θ2 = π ± ( θ + ∆ θ ). (49)
Here, ∆θ signifies a minute variation resulting from motion perturbation along the various
links of the caudal spine. The selection of the ± operator depends on the robot’s side,
whether it is denoted R or L. As part of the analysis strategy, the four-bar oscillator was
geometrically simplified to half a trapezoid for the purpose of streamlining deductions
(refer to Figure 12b). Within this reduced mechanism, two triangles emerge. One triangle is
defined by the parameters ros , ℓ, d, while the other is characterized by xm (t), ℓ, rs , where ℓ
serves as the hypotenuse and the sides d, ros , and rs remain constant. Consequently, the
instantaneous length of the hypotenuse is deduced as follows:
$$\ell^2 = r_{os}^2 + d^2 - 2 d\, r_{os}\cos(\theta_1). \qquad (50)$$
Upon determining the value of ℓ, the inner angle γ1 can be derived as follows:
Therefore, by isolating γ1 ,
$$\gamma_1 = \arccos\!\left(\frac{r_{os}^2 - \ell^2 - d^2}{-2 d \ell}\right). \qquad (52)$$
Until this point, given the knowledge of γ1 and θ2 , it is feasible to determine the inner
complementary angle γ2 through the following process:
γ2 = θ2 − γ1 . (53)
Subsequently, the angle formed by the artificial muscle and the output bar can be
established according to the following principle:
$$\frac{\sin(\gamma_2)}{x_m} = \frac{\sin(\beta)}{\ell}. \qquad (54)$$
Thus, the inner angle β is
$$\beta = \arcsin\!\left(\frac{\ell}{x_m}\sin(\gamma_2)\right), \qquad (55)$$
or, alternatively, an approximation of the muscle length is
$$x_m = \ell\,\frac{\sin(\gamma_2)}{\sin\beta}. \qquad (56)$$
This is the mechanism through which the input bar transmits a force f m1 , as defined in
expression (33), from the tangent f os to the output bar, achieving a mechanical advantage
denoted f m2 ,
$$f_{m_2} = f_{m_1}\,\frac{r_{os}}{r_s}. \qquad (57)$$
Definition 5 (Angles α R,L ). The instantaneous angle α, expressed as a function of the inner angles
of the oscillator, is introduced by:
α R,L = β R,L − θ1R,L . (58)
It is noteworthy that, owing to the inertial system of the robot, the longitudinal force
output component f_y aligns with the input force f_os in direction. Consequently, for a
right-sided force, we have α_R ≐ β_R − θ_{1R}, where:
$$f_{x_R} = f_{os_L}\,\frac{r_{os}}{r_s}\,\frac{\sin(\alpha_R)}{\cos(\alpha_R)} \qquad (59a)$$
and
$$f_{y_R} = f_{os_R}\,\frac{r_{os}}{r_s}. \qquad (59b)$$
Likewise, for the left-sided α_L ≐ β_L − θ_{1L},
$$f_{x_L} = f_{os_R}\,\frac{r_{os}}{r_s}\,\frac{\sin(\alpha_L)}{\cos(\alpha_L)} \qquad (60a)$$
as well as
$$f_{y_L} = f_{os_L}\,\frac{r_{os}}{r_s}. \qquad (60b)$$
In this scenario, an inverse solution is only applicable for f xR,L , with no necessity for
determining f yR,L . Consequently, the mechanical advantage transferred between the input
and output bars can be expressed by a simplified coefficient
$$\kappa \doteq \frac{r_{os}}{r_s}. \qquad (61)$$
$$\frac{\sin(\beta - \theta_1)}{\cos(\beta - \theta_1)} \equiv \tan(\beta - \theta_1) \qquad (62)$$
can substitute and streamline the ensuing system of nonlinear equations by solving them
simultaneously. Additionally, let θ1 be defined as:
$$\theta_{1_{R,L}} = \frac{\pi}{2} \pm \frac{r_s\, f_{os_{R,L}}\, t^2}{2 m\, r_{os}^2}. \qquad (63)$$
Hence, the simultaneous nonlinear system is explicitly presented solely for the force
components along the X axis:
$$f_{x_R} = \kappa\, f_{os_L} \tan(\beta_R - \theta_{1_R}) \qquad (64a)$$

and

$$f_{x_L} = \kappa\, f_{os_R} \tan(\beta_L - \theta_{1_L}). \qquad (64b)$$
$$\beta_{R_{t+1}} = \beta_{R_t} - \frac{f_{x_R}\frac{\partial f_{x_L}}{\partial \beta_L} - f_{x_L}\frac{\partial f_{x_R}}{\partial \beta_L}}{\frac{\partial f_{x_R}}{\partial \beta_R}\frac{\partial f_{x_L}}{\partial \beta_L} - \frac{\partial f_{x_R}}{\partial \beta_L}\frac{\partial f_{x_L}}{\partial \beta_R}} \qquad (65a)$$

and

$$\beta_{L_{t+1}} = \beta_{L_t} - \frac{f_{x_L}\frac{\partial f_{x_R}}{\partial \beta_R} - f_{x_R}\frac{\partial f_{x_L}}{\partial \beta_R}}{\frac{\partial f_{x_R}}{\partial \beta_R}\frac{\partial f_{x_L}}{\partial \beta_L} - \frac{\partial f_{x_R}}{\partial \beta_L}\frac{\partial f_{x_L}}{\partial \beta_R}}. \qquad (65b)$$
$$\frac{\partial f_{x_R}}{\partial \beta_R} = \kappa\, f_{os_R}\,\frac{-\theta_{1_R}}{\cos^2(\beta_R - \theta_{1_R})}, \qquad (66a)$$

$$\frac{\partial f_{x_R}}{\partial \beta_L} = 0, \qquad (66b)$$

$$\frac{\partial f_{x_L}}{\partial \beta_R} = 0, \qquad (66c)$$

$$\frac{\partial f_{x_L}}{\partial \beta_L} = \kappa\, f_{os_L}\,\frac{-\theta_{1_L}}{\cos^2(\beta_L - \theta_{1_L})}. \qquad (66d)$$
Therefore, by subsequently organizing and algebraically simplifying,

$$\beta_{R_{t+1}} = \beta_{R_t} - \frac{f_{x_R}\,\frac{\partial f_{x_L}}{\partial \beta_L}}{\frac{\partial f_{x_R}}{\partial \beta_R}\,\frac{\partial f_{x_L}}{\partial \beta_L}} = \beta_{R_t} + \frac{\tan(\beta_R - \theta_{1_R})\cos^2(\beta_R - \theta_{1_R})}{\theta_{1_R}} \qquad (67a)$$

and

$$\beta_{L_{t+1}} = \beta_{L_t} - \frac{f_{x_L}\,\frac{\partial f_{x_R}}{\partial \beta_R}}{\frac{\partial f_{x_R}}{\partial \beta_R}\,\frac{\partial f_{x_L}}{\partial \beta_L}} = \beta_{L_t} + \frac{\tan(\beta_L - \theta_{1_L})\cos^2(\beta_L - \theta_{1_L})}{\theta_{1_L}}. \qquad (67b)$$
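A minimal sketch of the recursion of Equations (67a)–(67b) is given below; the initial guesses, θ1 values, and stopping tolerance are illustrative assumptions.

```cpp
#include <cmath>
#include <cstdio>

// Minimal sketch of the iterative update of Equations (67a)-(67b) for the
// inner angles beta_R and beta_L.
static double iterate_beta(double beta, double theta1,
                           int max_iter = 50, double tol = 1e-9)
{
    for (int k = 0; k < max_iter; ++k) {
        const double d = beta - theta1;
        const double step = std::tan(d) * std::cos(d) * std::cos(d) / theta1;  // Eq. (67)
        beta += step;
        if (std::fabs(step) < tol) break;
    }
    return beta;
}

int main() {
    const double half_pi = 1.5707963267948966;
    const double theta1R = half_pi + 0.15;   // from Eq. (63), assumed instant value
    const double theta1L = half_pi - 0.15;
    std::printf("beta_R = %.4f rad, beta_L = %.4f rad\n",
                iterate_beta(1.90, theta1R), iterate_beta(1.20, theta1L));
    return 0;
}
```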
Thus, given that the torque of the trapezoid’s second bar is τs = f s rs (see Figure 12b),
we establish a torque–angular moment equivalence, denoted τs ≡ M1 . Leveraging this
equivalence and the prior understanding of the torque τs acting on the second bar of
the trapezoid, mechanically connected to the first link ℓ1 , we affirm their shared angular
moment. Consequently, the general expression for the tangential force f k applied at the
end of each link ℓk is:
$$f_k = \frac{M_k}{\ell_k}. \qquad (69)$$
Yet, considering the angular moment Mk for each helical-spring joint, supporting the
mass of the successive links, let us introduce equivalent inertial moments, starting with
Iε 1 = I1 + I2 + I3 + I4 . Subsequently, we define Iε 2 = I2 + I3 + I4 , Iε 3 = I3 + I4 , and finally,
I4 . Thus, in the continuum of the caudal spine, the transmission of energy to each link is
contingent upon the preceding joints, as established by
Each helical spring, connecting pairs of vertebrae, undergoes an input force f = −kx,
directly proportional to the angular spring deformation indicated by elongation x. Here, k
[kg m2 /s2 ] represents the stiffness coefficient. External forces result in an angular moment,
given by τ = −kθ, where torque serves as an equivalent variable to angular momentum,
such that Iα = −kθ. Consequently, when expressing the formula as a linear second-order
differential equation, we have:
$$\ddot{\theta} + \frac{k}{I}\theta = \ddot{\theta}_{L_k}. \qquad (71)$$
Here, θ̈ Lk represents undulatory accelerations arising from external loads or residual
motions along the successive caudal links, which are detectable through encoders and
IMUs. Assuming an angular frequency ω = √(k/I), a period p = 2π√(I/k), and moments of
inertia expressed as I_k = r_k² m_k, the general equation is formulated as follows:
Mk = Iε k θ̈k , (72)
By algebraically extending, omitting terms, and rearranging for all links in the caudal
spine, we arrive at the following matrix-form equation:
$$\begin{pmatrix} M_1 \\ M_2 \\ M_3 \\ M_4 \end{pmatrix} = \begin{pmatrix} I_{\varepsilon_1} & 0 & 0 & 0 \\ 0 & I_{\varepsilon_2} & 0 & 0 \\ 0 & 0 & I_{\varepsilon_3} & 0 \\ 0 & 0 & 0 & I_{\varepsilon_4} \end{pmatrix} \cdot \begin{pmatrix} \ddot{\theta}_{L_1} \\ \ddot{\theta}_{L_2} \\ \ddot{\theta}_{L_3} \\ \ddot{\theta}_{L_4} \end{pmatrix} - \begin{pmatrix} k_1\theta_1 \\ k_2\theta_2 \\ k_3\theta_3 \\ k_4\theta_4 \end{pmatrix}. \qquad (74)$$
Hence, in accordance with Expression (69), the tangential forces exerted on all the
caudal links of the robotic fish are delineated by the following expression:
$$\begin{pmatrix} f_{l_1} \\ f_{l_2} \\ f_{l_3} \\ f_{l_4} \end{pmatrix} = \begin{pmatrix} \frac{I_{\varepsilon_1}}{\ell_1} & 0 & 0 & 0 \\ 0 & \frac{I_{\varepsilon_2}}{\ell_2} & 0 & 0 \\ 0 & 0 & \frac{I_{\varepsilon_3}}{\ell_3} & 0 \\ 0 & 0 & 0 & \frac{I_{\varepsilon_4}}{\ell_4} \end{pmatrix} \cdot \begin{pmatrix} \ddot{\theta}_{L_1} \\ \ddot{\theta}_{L_2} \\ \ddot{\theta}_{L_3} \\ \ddot{\theta}_{L_4} \end{pmatrix} - \begin{pmatrix} \frac{k_1}{\ell_1} & 0 & 0 & 0 \\ 0 & \frac{k_2}{\ell_2} & 0 & 0 \\ 0 & 0 & \frac{k_3}{\ell_3} & 0 \\ 0 & 0 & 0 & \frac{k_4}{\ell_4} \end{pmatrix} \cdot \begin{pmatrix} \theta_1 \\ \theta_2 \\ \theta_3 \\ \theta_4 \end{pmatrix} \qquad (75)$$
Alternatively, the last expression can be denoted as the following control law:
$$f = M\ddot{\theta}_L - Q\theta_t \equiv \frac{M(\dot{\theta}_{t_2} - \dot{\theta}_{t_1})}{t_2 - t_1} - Q\theta_t, \qquad (76)$$

$$\dot{\theta}_{L_{t+1}} = \dot{\theta}_{L_t} + \frac{M^{-1}}{t_2 - t_1}\left(f + Q\theta_t\right). \qquad (77)$$
Finally, for feedback control, both equations are simultaneously employed within a
computational recursive scheme, and angular observations are frequently derived from
sensors on both joints: encoders and IMUs.
Figure 13. Ballasting system of the robot fish. (a) Detailed 3D model of the robot fish with the
ballast device positioned beneath its floor. (b) Components of the basic ballasting device designed for
modeling and control purposes.
The core operational functions of the ballasting device involve either filling its con-
tainer chamber with water to achieve submergence or expelling water from the container
to attain buoyancy. Both actions entail the application of a linear force for manipulating
a plunger or hydraulic cylindrical piston, thereby controlling water flow through either
suction or exertion. Consequently, the volume of the liquid mass fluctuates over time,
contingent upon a control reference or desired level marked H, along with quantifying a
filling rate u(t) and measuring the actual liquid level h(t).
Hence, we can characterize the filling rate u(t) as the change in volume V with respect
to time, expressed as
$$\frac{dV(t)}{dt} = u(t), \qquad (78)$$
and assuming a cylindrical plunger chamber with radius r and area A = πr2 , the volume
is expressed as
V (t) = Ah(t). (79)
Here, h(t) represents the actual position of the plunger due to the incoming hydraulic
mass volume. Consequently, the filling rate can also be expressed as
To solve the aforementioned equation, we employ the integrating factor method, such
that
$$\dot{h}(t) + \frac{k}{A}h(t) = \frac{k}{A}H. \qquad (82)$$
In this instance, the integrating factor is determined as follows:
$$\mu(t) = e^{\int \frac{k}{A}\,dt} = e^{\frac{kt}{A}}. \qquad (83)$$
Thus, by applying the integrating factor, we effectively reduce the order of derivatives
in the subsequent steps,
$$e^{\frac{kt}{A}}\dot{h}(t) + \frac{k}{A}e^{\frac{kt}{A}}h(t) = \frac{k}{A}He^{\frac{kt}{A}}. \qquad (84)$$
Through algebraic simplification of the left side of the aforementioned expression, the
following result is determined:
$$\left(h(t)\,e^{\frac{kt}{A}}\right)' = \frac{k}{A}He^{\frac{kt}{A}}. \qquad (85)$$
Following this, by integrating both sides of the equation with respect to time,
$$\int_t \left(h(t)\,e^{\frac{kt}{A}}\right)' dt = \int_t \frac{k}{A}He^{\frac{kt}{A}}\,dt, \qquad (86)$$
where the expression on the left side undergoes a transformation into
$$h(t)\,e^{\frac{kt}{A}} = \frac{k}{A}H\int_t e^{\frac{kt}{A}}\,dt, \qquad (87)$$
and the right side of the equation, once solved, transforms into
$$h(t)\,e^{\frac{kt}{A}} = He^{\frac{kt}{A}} + c. \qquad (88)$$
Now, to obtain the solution for h(t), it is isolated by rearranging the term $e^{\frac{kt}{A}}$:

$$h(t) = H + c\,e^{-\frac{kt}{A}}. \qquad (89)$$
For initial conditions where h(t0 ) = 0 indicates the plunger is completely inside the
contained chamber at the initial time t0 = 0 s, the integration constant c is determined as
0 = H + ce0 . (90)
$$f_e = m(t)\frac{dv}{dt} + f_k + \rho_a A, \qquad (92)$$
where f k is the friction force of the piston in the cylindrical piston, and ρ a A refers to the
water pressure at that depth over the piston’s entry area. The instantaneous mass considers
the piston’s mass me and the liquid mass of the incoming water m a :
m(t) = me + m a , (93)
where the water density is $\delta_a = \frac{m_a}{V_a}$ and $V_a = \pi r^2 h(t)$, thus completing the mass model:

$$m(t) = m_e + \delta_a \pi r^2 H\left(1 - e^{-\frac{kt}{A}}\right). \qquad (94)$$
Therefore, the force required to pull/push the plunge device is stated by the control
law, given as
$$f_e = \left(m_e + \delta_a \pi r^2 H\left(1 - e^{-\frac{kt}{A}}\right)\right)\frac{dv}{dt} + f_k + \rho_a A. \qquad (95)$$
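The ballast behavior can be sketched numerically from the fill response implied by Equations (89)–(90), the mass model (94), and the control law (95); every numeric parameter below is an illustrative assumption rather than an identified value of the prototype.

```cpp
#include <cmath>
#include <cstdio>

// Minimal sketch of the ballast dynamics: fill level h(t) = H(1 - e^{-kt/A}),
// liquid mass of Eq. (94), and the plunger force of Eq. (95).
int main() {
    const double pi   = 3.14159265358979323846;
    const double r    = 0.012;           // chamber radius [m]
    const double A    = pi * r * r;      // piston area [m^2]
    const double H    = 0.08;            // desired fill level [m]
    const double k    = 2.0e-5;          // filling rate constant [m^2/s]
    const double me   = 0.05;            // plunger mass [kg]
    const double da   = 1000.0;          // water density [kg/m^3]
    const double fk   = 0.3;             // piston friction force [N]
    const double rhoA = 1.5e4;           // water pressure over the piston [Pa]
    const double dvdt = 0.01;            // commanded plunger acceleration [m/s^2]

    for (double t = 0.0; t <= 100.0; t += 20.0) {
        const double h  = H * (1.0 - std::exp(-k * t / A));            // Eq. (89)-(90)
        const double m  = me + da * pi * r * r * H
                             * (1.0 - std::exp(-k * t / A));            // Eq. (94)
        const double fe = m * dvdt + fk + rhoA * A;                     // Eq. (95)
        std::printf("t=%6.1f s  h=%.4f m  m=%.4f kg  f_e=%.3f N\n", t, h, m, fe);
    }
    return 0;
}
```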
Finally, two sensory aspects were considered for feedback in the ballast system control
and depth estimation. Firstly, the ballast piston is displaced by a worm screw with an
advance parameter L p [m] and a motor, whose sensory measurement is taken at motor
rotations ϕ p by a pulses η̂t encoder of angular resolution R, where ϕ p = 2π
R ηt . This allows
for the instantaneous position of the piston to be measured by the observation ĥt ,
2L p π
ĥt = L p ϕ p = η̂t . (96)
R
Secondly, the robot’s aquatic depth estimation, denoted as yt , is acquired using a
pressure sensor as previously described in Figure 2. This involves measuring the variation
in pressure according to Pascal’s principle between two consecutive measurements taken
at different times.
f t − f t−1 − m f g = 0, (97)
where the instantaneous fluid force at a certain depth is expressed in terms of the measured
pressure ρ̂. Additionally, with the fish body of averaged area A_f subjected to the fluid force f_t,
and the robot of mass m_f,
f t = ρ̂ a A f . (98)
Assuming an ideal fluid, substituting the fluid forces in terms of measurable pressures, and
expressing the fish mass in terms of its density δ_f and volume V_f, which describes the weight of
the robot's body, it follows
ρ2 A f − ρ1 A f − δa Vf g = 0, (99)
where y1 and y2 represent the surface depth and the actual depth of the robot, respectively,
such that y = y2 − y1 denotes the total depth. Therefore,
and by determining the pressure in the liquid as a function of depth, we select the level of
the liquid’s free surface such that ρ1 ≡ ρ A , where ρ A represents the atmospheric pressure,
resulting in
ρ̂ − ρ A − δ f yg = 0. (101)
Establishing ρ̂ ≐ ρ₂ as the water pressure sensor measurement, the feedback robot fish depth
estimation is given by
$$y(\hat{\rho}) = \frac{\hat{\rho} - \rho_A}{\delta_a\, g}. \qquad (102)$$
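A minimal sketch of the depth feedback of Equation (102) follows; the pressure reading used is an assumed value.

```cpp
#include <cstdio>

// Minimal sketch of the depth feedback of Equation (102): depth from the gauge
// between the measured pressure and the atmospheric pressure.
int main() {
    const double rho_hat = 116325.0;   // measured absolute pressure [Pa], assumed
    const double rho_A   = 101325.0;   // atmospheric pressure [Pa]
    const double delta_a = 1000.0;     // water density [kg/m^3]
    const double g       = 9.81;       // gravity [m/s^2]

    const double y = (rho_hat - rho_A) / (delta_a * g);   // Eq. (102)
    std::printf("estimated depth y = %.3f m\n", y);       // ~1.529 m
    return 0;
}
```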
sophisticated deep learning artificial neural network structures to achieve superior signal
recognition. This advancement promises to unlock new levels of immersive interaction
and control, paving the way for transformative applications in fields such as virtual reality,
human-robot interaction, and neurotechnology.
Figure A1. Components of thumb pattern space: filters γ, λ, and Ω. (a) Filters applied to the left
thumb. (b) Filters applied to the right thumb.
Figure A2. Components of index finger pattern space: filters γ, λ, and Ω. (a) Filters applied to the
left index finger. (b) Filters applied to the right index finger.
Figure A3. Components of middle finger pattern space: filters γ, λ, and Ω. (a) Filters applied to the
left middle finger. (b) Filters applied to the right middle finger.
Figure A4. Components of ring finger pattern space: filters γ, λ, and Ω. (a) Filters applied to the left
ring finger. (b) Filters applied to the right ring finger.
Figure A5. Components of little finger pattern space: filters γ, λ, and Ω. (a) Filters applied to the left
little finger. (b) Filters applied to the right little finger.
References
1. Pan, Y.; Steed, A. A Comparison of Avatar-, Video-, and Robot-Mediated Interaction on Users’ Trust in Expertise. Front. Robot. AI
2016, 6, 12. [CrossRef]
2. Tanaka, K.; Nakanishi, H.; Ishiguro, H. Comparing Video, Avatar, and Robot Mediated Communication: Pros and Cons of
Embodiment. In Collaboration Technologies and Social Computing. CollabTech; Yuizono, T., Zurita, G., Baloian, N., Inoue, T., Ogata,
H., Eds.; Communications in Computer and Information Science; Springer: Berlin/Heidelberg, Germany, 2014.
3. Khatib, O.; Yeh, X.; Brantner, G.; Soe, B.; Kim, B.; Ganguly, S.; Stuart, H.; Wang, S.; Cutkosky, M.; Edsinger, A.; et al. Ocean One: A
Robotic Avatar for Oceanic Discovery. IEEE Robot. Autom. Mag. 2016, 23, 20–29. [CrossRef]
4. Baba, J.; Song, S.; Nakanishi, J.; Yoshikawa, Y.; Ishiguro, H. Local vs. Avatar Robot: Performance and Perceived Workload of
Service Encounters in Public Space. Front. Robot. AI 2021, 8, 778753. [CrossRef]
5. Wiener, N. Cybernetics. Bull. Am. Acad. Arts Sci. 1950, 3, 2–4. [CrossRef]
6. Novikov, D.A. Cybernetics: From Past to Future; Springer: Berlin, Germany, 2015.
7. Tamburrini, G.; Datteri, E. Machine Experiments and Theoretical Modelling: From Cybernetic Methodology to Neuro-Robotics.
Mind Mach. 2005, 15, 335–358. [CrossRef]
8. Skogerson, G. Embodying robotic art: Cybernetic cinematics. IEEE MultiMedia 2001, 8, 4–7. [CrossRef]
9. Beer, S. What is cybernetics? Kybernetes 2002, 31, 209–219. [CrossRef]
10. Fradkov, A.L. Application of cybernetic methods in physics. Physics-Uspekhi 2005, 48, 103. [CrossRef]
11. Romano, D.; Benelli, G.; Kavallieratos, N.G.; Athanassiou, C.G.; Canale, A.; Stefanini, C. Beetle-robot hybrid interaction: Sex,
lateralization and mating experience modulate behavioural responses to robotic cues in the larger grain borer Prostephanus
truncatus (Horn). Biol. Cybern. 2020, 114, 473–483. [CrossRef] [PubMed]
12. Park, S.; Jung, Y.; Bae, J. An interactive and intuitive control interface for a tele-operated robot (AVATAR) system. Mechatronics
2018, 55, 54–62. [CrossRef]
13. Escolano, C.; Antelis, J.M.; Minguez, J. A telepresence mobile robot controlled with a noninvasive brain–computer interface. IEEE
Trans. Syst. Man Cybern. Part B (Cybern.) 2011, 8, 793–804. [CrossRef]
14. Moniruzzaman, M.; Rassau, A.; Chai, D.; Islam, S.M. Teleoperation methods and enhancement techniques for Mobile Robots: A
comprehensive survey. Robot. Auton. Syst. 2022, 150, 103973. [CrossRef]
15. A Perspective on Robotic Telepresence and Teleoperation Using Cognition: Are We There Yet? Available online: https://arxiv.
org/abs/2203.02959 (accessed on 10 August 2023).
16. Williamson, R. MIT SoFi: A Study in Fabrication, Target Tracking, and Control of Soft Robotic Fish. Bachelor’s Thesis, Mas-
sachusetts Institute of Technology, Cambridge, MA, USA, February 2022.
17. Mi, J.; Sun, Y.; Wang, Y.; Deng, Z.; Li, L.; Zhang, J.; Xie, G. Gesture recognition based teleoperation framework of robotic fish.
In Proceedings of the 2016 IEEE International Conference on Robotics and Biomimetics, Qingdao, China, 3–7 December 2016;
pp. 1–6.
18. An Avatar Robot Overlaid with the 3D Human Model of a Remote Operator. Available online: https://arxiv.org/abs/2303.02546
(accessed on 14 September 2023).
19. Pang, G.; Yang, G.; Pang, Z. Review of Robot Skin: A Potential Enabler for Safe Collaboration, Immersive Teleoperation, and
Affective Interaction of Future Collaborative Robots. IEEE Trans. Med. Robot. Bion. 2021, 3, 681–700. [CrossRef]
20. Schwarz, M.; Lenz, C.; Rochow, A.; Schreiber, M.; Behnke, S. NimbRo avatar: Interactive immersive telepresence with force-
feedback telemanipulation. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Praque,
Czech Republic, 27 September–1 October 2021.
21. Li, H.; Nie, X.; Duan, D.; Li, Y.; Zhang, J.; Zhou, M.; Magid, E. An Admittance-Controlled Amplified Force Tracking Scheme for
Collaborative Lumbar Puncture Surgical Robot System. Int. J. Med. Robot. Comp. Assis. Surg. 2022, 18, e2428. [CrossRef] [PubMed]
22. Kristoffersson, A.; Coradeschi, S.; Loutfi, A. A Review of Mobile Robotic Telepresence. Adv. Hum.-Comp. Interact. 2013, 2013,
902316. [CrossRef]
23. Ryu, H.X.; Kuo, A.D. An optimality principle for locomotor central pattern generators. Sci. Rep. 2021, 11, 13140. [CrossRef]
24. Ijspeert, A.J. Central pattern generators for locomotion control in animals and robots: A Review. Neural Netw. 2008, 21, 642–653.
[CrossRef]
25. Nassour, J.; Henaff, P.; Ouezdou, F.B.; Cheng, G. Multi-layered multi-pattern CPG for adaptive locomotion of humanoid robots.
Biol. Cybern. 2014, 108, 291–303. [CrossRef]
26. Yu, J.; Wang, M.; Dong, H.; Zhang, Y.; Wu, Z. Motion Control and Motion Coordination of Bionic Robotic Fish: A Review. J. Biol.
Eng. 2018, 15, 579–598. [CrossRef]
27. Mulder, M.; Pool, D.M.; Abbink, D.A.; Boer, E.R.; Zaal, P.M.T.; Drop, F.M.; van der El, K.; Van Paassen, M.M. Manual Control Cybernetics:
State-of-the-Art and Current Trends. IEEE Tran. Hum.-Mach. Syst. 2017, 48, 1–18. [CrossRef]
28. Kastalskiy, I.; Mironov, V.; Lobov, S.; Krilova, N.; Pimashkin, A.; Kazantsev, V. A Neuromuscular Interface for Robotic Devices
Control. Comp. Math. Meth. Med. 2018, 2018, 8948145. [CrossRef]
29. Inami, M.; Uriu, D.; Kashino, Z.; Yoshida, S.; Saito, H.; Maekawa, A.; Kitazaki, M. Cyborgs, Human Augmentation, Cybernetics,
and JIZAI Body. In Proceedings of the AHs 2022: Augmented Humans, Chiba, Japan, 13–15 March 2022; pp. 230–242.
30. De la Rosa, S.; Lubkull, M.; Stephan, S.; Saulton, A.; Meilinger, T.; Bülthoff, H.; Cañal-Bruland, R. Motor planning and control:
Humans interact faster with a human than a robot avatar. J. Vis. 2015, 15, 52. [CrossRef]
31. Osawa, H.; Sono, T. Tele-Nininbaori: Intentional Harmonization in Cybernetic Avatar with Simultaneous Operation by Two-
persons. In Proceedings of the HAI’21: Proceedings of the 9th International Conference on Human-Agent Interaction, Virtual
Event, Japan, 9–11 November 2021; pp. 235–240.
32. Wang, G.; Chen, X.; Han, S. Central pattern generator and feedforward neural network-based self-adaptive gait control for a
crab-like robot locomoting on complex terrain under two reflex mechanisms. Int. J. Adv. Robot. Syst. 2017, 14, 1729881417723440.
[CrossRef]
33. Aymerich-Franch, L.; Petit, D.; Ganesh, G.; Kheddar, A. Object Touch by a Humanoid Robot Avatar Induces Haptic Sensation in
the Real Hand. J. Comp.-Med. Commun. 2017, 22, 215–230. [CrossRef]
34. Talanov, M.; Suleimanova, A.; Leukhin, A.; Mikhailova, Y.; Toschev, A.; Militskova, A.; Lavrov, I.; Magid, E. Neurointerface
implemented with Oscillator Motifs. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems,
Prague, Czech Republic, 27 September–1 October 2021; pp. 4150–4155.
35. Ladrova, M.; Martinek, R.; Nedoma, J.; Fajkus, M. Methods of Power Line Interference Elimination in EMG Signals. Trans. Tech.
Publ. 2019, 40, 64–70. [CrossRef]
36. Tigrini, A.; Verdini, F.; Fioretti, S.; Mengarelli, A. On the decoding of shoulder joint intent of motion from transient EMG: Feature
evaluation and classification. IEEE Trans. Med. Robot. Bion. 2023, 5, 1037–1044. [CrossRef]
37. Ivancevic, V.; Beagley, N. Brain-like functor control machine for general humanoid biodynamics. Intl. J. Math. Math. Sci. 2005,
2005, 171485. [CrossRef]
38. Crespi, A.; Lachat, D.; Pasquier, A.; Ijspeert, A.J. Controlling swimming and crawling in a fish robot using a central pattern
generator. Auton. Robot. 2008, 25, 3–13. [CrossRef]
39. Manduca, G.; Santaera, G.; Dario, P.; Stefanini, C.; Romano, D. Underactuated Robotic Fish Control: Maneuverability and
Adaptability through Proprioceptive Feedback. In Biomimetic and Biohybrid Systems. Living Machines 2023; Meder, F., Hunt, A.,
Margheri, L., Mura, A., Mazzolai, B., Eds.; LNCS; Springer: Berlin, Germany, 2023; p. 14157. [CrossRef]
40. Pattern Generators for the Control of Robotic Systems. arXiv 2015, arXiv:1509.02417.
41. Uematsu, K. Central nervous system underlying fish swimming [A review]. In Bio-Mechanisms of Swimming and Flying; Kato, N.,
Kamimura, S., Eds.; Springer: Tokyo, Japan, 2008; pp. 103–116.
42. Ki-In, N.; Chang-Soo, P.; In-Bae, J.; Seungbeom, H.; Jong-Hwan, K. Locomotion generator for robotic fish using an evolutionary
optimized central pattern generator. In Proceedings of the IEEE International Conference on Robotics and Biomimetics, Tianjin,
China, 14–18 December 2010.
43. Xie, F.; Zhong, Y.; Kwok, M.F.; Du, R. Central Pattern Generator Based Control of a Wire-driven Robot Fish. In Proceedings of the
IEEE International Conference on Information and Automation, Wuyishan, China, 11–13 August 2018; pp. 475–480.
44. Wang, M.; Yu, J.; Tan, M. Parameter Design for a Central Pattern Generator Based Locomotion Controller. In Proceedings of the
ICIRA 2008, Wuhan, China, 15–17 October 2008; pp. 352–361.
45. Wang, W.; Guo, J.; Wang, Z.; Xie, G. Neural controller for swimming modes and gait transition on an ostraciiform fish robot. In
Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Wollongong, NSW, Australia,
9–12 July 2013; pp. 1564–1569.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.