Article
Developing a Modern Greenhouse Scientific Research
Facility—A Case Study
Davor Cafuta 1,2,*, Ivica Dodig 1,2,*, Ivan Cesar 1 and Tin Kramberger 1
Abstract: Multidisciplinary approaches in science are still rare, especially in completely different
fields such as agronomy science and computer science. We aim to create a state-of-the-art floating
ebb and flow system greenhouse that can be used in future scientific experiments. The objective is
to create a self-sufficient greenhouse with sensors, cloud connectivity, and artificial intelligence for
real-time data processing and decision making. We investigated various approaches and proposed
an optimal solution that can be used in future research on plant growth in floating ebb and
flow systems. A novel microclimate pocket-detection solution is proposed using an automatically
guided suspended platform sensor system. Furthermore, we propose a methodology for replacing
sensor data knowledge with artificial intelligence for plant health estimation. Plant health estimation
allows longer ebb periods and increases the nutrient level in the final product. With intelligent design
and the use of artificial intelligence algorithms, we will reduce the cost of plant research and increase
the usability and reliability of research data. Thus, our newly developed greenhouse would be more
suitable for plant growth research and production.
Keywords: internet of things; artificial intelligence; sensors; smart agriculture; cloud computing

1. Introduction

Advances in computing technologies based on embedded systems with the recent development in smart sensors are leading to cost-effective solutions for the Internet of Things (IoT). The Internet of Things is an essential component of smart home systems, smart transportation, healthcare, and smart agronomy. In any production environment, especially in agronomy, Internet of Things devices enable efficient planning and resource allocation, providing economic benefits and increasing competitiveness in the market [1,2]. The continuous fusion of computing and agronomy science opened a new field called precision agriculture, leading to higher crop yield within the greenhouse facility [3]. An innovative approach using IoT as a data source and deep learning as a decision maker can optimize the greenhouse environment, such as temperature, humidity and nutrients [4]. By monitoring the growing process in the greenhouse, better quality of food, cosmetic products and medicinal substances can be achieved by increasing the plant nutrient levels [5].

According to related work in greenhouse design, the sensors and their location inside the greenhouse are essential components since some parts of the greenhouse contain microclimate pockets. The sensors are organized in several combinations of horizontal, vertical and hybrid arrangements to detect and eliminate microclimate pockets [6]. Additionally, the camera positioning system should be flexible enough to allow precise and diverse image acquisition for successful deep learning model training. Image quality, especially noise levels, can reduce the deep learning model precision [7].

In this paper, we present the system architecture and design of a modern scientific greenhouse research facility for the purpose of the Croatian Science Foundation's Project
Urtica—BioFuture. Several sensor nodes are proposed in different locations: nutrient so-
lution, environmental (inside and outside the greenhouse), and sensor nodes for energy
efficiency and power supply. They are connected via a dedicated central node. As a main
contribution, we propose a novel system architecture concept for automated sensor positioning using a suspended platform to measure accurate environmental data at any available position and to achieve the best possible automated hybrid arrangement for microclimate pocket isolation. Using these measurements, microclimate pockets will be detected and
isolated. The proposed positioning system enables precise image acquisition from multiple
angles, thus resulting in image data diversity. Image diversity plays an important role in
deep learning model regularization.
Sensor sensing techniques and communication technologies are also considered in
this paper. Precise sampling techniques are used resulting in Big Data due to the scientific
nature of this data acquisition. To enable a steady flow of this data, a constant power supply
and uninterrupted connection are essential. The data is stored locally and continuously
synchronized with the cloud service.
The cloud will provide plant health calculations according to sensor data analysis.
As an alternative to this calculation, we propose a methodology that uses a deep learning method based on RGB camera images, chlorophyll leaf images, and thermal camera images to estimate plant health. Such a methodology can lead to equivalently precise, yet more affordable solutions applicable in production. Common benefits of deep learning models and Big Data
mining are proactive alerting and monitoring systems or autonomous decision making,
which are particularly useful in smart agriculture [8]. Additionally, combining visual data
such as images and using sensors to train the corresponding deep neural network model
based on visual information proves essential for building an affordable smart agriculture
system [9]. Visual information analysis reduces the monitoring complexity and overall
price while maintaining the precision achieved with the sensor cluster.
This calculation of plant health is used in the project to optimize ebb timing periods.
The decision when to make a phase change is a key issue in the project. The goal is to
achieve extended ebb periods for higher plant nutrient levels while avoiding plant wilting.
This is the main challenge to be addressed in the upcoming project experiment.
We wrote this paper as part of Croatian Science Foundation’s Project Urtica—
BioFuture [10]. The project focuses on the development of a modern greenhouse research
facility as a quality basis for future research at the Faculty of Agricultural Sciences, Uni-
versity of Zagreb, Croatia, with the support in computer sciences from Zagreb University
of Applied Sciences. This project focuses on the nutritional and functional Urtica Dioica
(common nettle) values in modern hydroponic cultivation techniques [10].
This paper is organized as follows. Related work on existing greenhouse solutions
is discussed in Section 2. Then, key highlights of the system architecture and design are
presented in Section 3. In this section, a detailed description of the sensors and data acquisition follows, highlighting the greenhouse layout where a new sensor positioning scheme is proposed to capture all microclimate pockets. Later, cloud communication and storage are described. Finally, the cost of the system is approximated. In Section 4, we present an experiment with a model of the suspended platform. The paper is concluded in Section 5,
where the advantages of our proposed system and suspended platform are discussed, and
finally some future research directions are given.
2. Related Work
The sensor system is a crucial element of smart agriculture. In greenhouse cultivation,
especially in the laboratory environment, any value in an experiment can be significant.
The majority of current greenhouse solutions use sensors in different stages of farming for
information gathering, effective monitoring and decision making. The main drawback of
these greenhouse solutions is the lack of sensory diversification.
Wei et al. [11] presented a review of the current development of technologies and
methods in aquaponics. In the greenhouse environment, water quality, environmental data
and nutrient information are involved in intelligent monitoring and control. The paper
summarizes intelligent, intensive, accurate and efficient aquaponics concepts that we used as a starting point for our greenhouse development.
2.1. Sensors
In modern scientific greenhouse research experiments, a vast number of different sensors must be used to reduce the possibility of inadequate research results. The large number of sensors is needed to account for location-specific influences within the greenhouse and to detect the different factors influencing plant growth. Due to the nature of any
scientific development, it is of great importance to keep the expenses within the project
limits. Therefore, experimenting with expensive and complicated sensors may be uneco-
nomical in such projects. Additionally, it can be challenging to apply such an environment
to production facilities [12]. Various sensors are essential for science-based approaches to
smart and precision agriculture. These sensors include environmental, power supply (for
energy efficiency), nutrient solution sensors, and sensors that determine the chlorophyll
content of plants [13].
Almost all environmental variables (temperature, humidity, amount of light in com-
mon and individual spectral regions, atmospheric pressure and air quality) in the green-
house system can be used as sensed data. Due to the specific requirements of the green-
house experiment, different types of environmental variables need to be monitored, and
thus different sensor values need to be measured [13]. Many different combinations are sampled based on experience and experimental parameters: temperature, humidity, CO2 concentration, illumination, and illuminance limited to a specific part of the spectrum. Other sensors measure barometric pressure and specific gas concentrations (oxygen, nitrogen, ozone) [13,14].
In addition to environmental sensors, there are other sensors that are used to increase
the environmental and energy consumption awareness of the project (green IT solutions), resulting in an economic advantage. For this purpose, power supply sensors are
used to determine the energy footprint of the greenhouse. The building strategy of the mod-
ern greenhouse is focused on equipment, sensors and processes that are energy efficient.
Bersani et al. [15] wrote an article on precision and sustainable agriculture approaches that
focuses on the current advanced technological solution to monitor, track and control green-
house systems to make production more sustainable. Pentikousis et al. [16] discuss the communication environment in which the sensors transmit their data and propose server-side data aggregation methods. In addition, the article presents sustainable approaches to
achieve near-zero energy consumption while eliminating water and pesticide use.
In production greenhouses, environmental and power supply sensors are used as part
of monitoring control processes to delay or accelerate decisions about opening windows,
blinds, or switching thermal processes such as cooling or heating. An example of the
monitoring and control system is presented in [17,18]. The collected data can be processed
using hybrid AI methods [19] or by applying mathematical models [20]. With the usage of
the monitoring and control system, a zero-energy footprint can be achieved. In addition,
the power supply sensors can be used as an alert medium for a power outage warning,
which may cause irreparable damage and loss of scientific research data. As presented
in [21], in case of main power failure, adaptive power management can be implemented to
extend backup power supply lifespan.
In greenhouses, the power supply is used for nutrient delivery to the plants and for maintaining a proper level of nutrient solution in the floating system. Therefore, a solution level sensor is used to monitor the level of solution in the floating system [11]. Nutrient solution sensors are used to determine the properties of the nutrient solution. The most common properties measured in the nutrient solution are temperature, dissolved oxygen, total dissolved solids (TDS), and hydrogen potential (pH) values [11].
In the hydroponic floating system, the root of the plant is partially immersed or
sprayed in the nutrient solution and in most cases lies in a growing medium. This growing
medium draws moisture from the nutrient solution. The moisture content can be measured
with a soil hygrometer (a humidity detection sensor) [22], which is inserted into the growing medium. The sensor consists of an EC probe and a soil resistance meter. It is used to measure the electrical resistance of the soil, which is an indicator of soil salinity. The salt
concentration of the nutrient solution can change over time, affecting the sensor reading.
Therefore, the differential values of the sensor over time are more relevant than directly
measured results [23].
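For illustration, a minimal Python sketch of this differential approach is given below; the raw values and variable names are assumptions made for the example, not readings from the actual probe.

```python
# Illustrative sketch: work with differences between consecutive soil-moisture/EC
# readings instead of absolute values, since the slowly changing salinity of the
# nutrient solution shifts the sensor baseline over time. Values are hypothetical.
from typing import List

def differential_readings(samples: List[float]) -> List[float]:
    """Return the change between consecutive raw readings."""
    return [b - a for a, b in zip(samples, samples[1:])]

raw = [512, 518, 530, 529, 560]        # hypothetical ADC values from the probe
print(differential_readings(raw))      # [6, 12, -1, 31] -> trends, not absolutes
```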
A well-balanced plant nutrient growing solution results in a healthier plant. The
plant health can be observed by monitoring the visual physiognomy of the plant, and
this system can also be used to analyze and detect plant diseases or crop damage [24].
Nutrient solution should be inspected and changed frequently to enhance the elimination
of phytopathogens [25].
Visual monitoring ranges from custom-made devices such as LeafSpec [26,27] to the use of a standard camera combined with a microcontroller or processor board [28–30], or a smartphone camera [31–33]. Papers propose monitoring plants with different types of cameras: a standard spectral camera, an infrared camera, a thermal imaging camera, or a color component camera.
A hyperspectral imaging and spectroscopy camera system is used in [34,35] to obtain better results. There is also an example of a custom-made system in [36]. The camera can observe the plant as a whole or just a part of it, such as the leaves. The acquisition context is also distinguished by image precision: the image can be taken in a precise position with little background noise, or from a distance with a somewhat unpredictable background and viewing conditions.
In contrast to camera systems, RGB color sensors are used in [37,38]. An RGB color sensor or infrared sensor provides a direct numerical value for a specific detail of the observed area.
Repetitive results can be avoided using sensor data sampling techniques. Similar measurements of the
neighboring locality can be excluded if the difference is below the context-specific threshold,
which depends on the required data quality. Another approach in the sampling procedure
assumes a small hysteresis around the last measurement result. If the result remains within
the given frame, it is discarded since no change is detected [47]. There is also a proposal that
small anomalies can be discarded [6]. By using the algorithm proposed by Kochhar et al. [6], the sensor sampling frequency can be maximized to capture specific events while redundant data is discarded.
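A minimal Python sketch of such a hysteresis-based sampling filter is shown below; the threshold value is an illustrative assumption and would depend on the required data quality.

```python
# Illustrative deadband sampling sketch: a new measurement is stored only if it
# leaves a small hysteresis window around the last stored value; otherwise it is
# discarded as "no change detected". The threshold is an assumed example value.
class DeadbandSampler:
    def __init__(self, threshold: float):
        self.threshold = threshold
        self.last_stored = None

    def accept(self, value: float) -> bool:
        """Return True if the value should be stored."""
        if self.last_stored is None or abs(value - self.last_stored) > self.threshold:
            self.last_stored = value
            return True
        return False

sampler = DeadbandSampler(threshold=0.2)           # e.g. 0.2 degC for temperature
readings = [21.0, 21.1, 21.05, 21.4, 21.38, 22.0]
stored = [r for r in readings if sampler.accept(r)]
print(stored)                                       # [21.0, 21.4, 22.0]
```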
Data acquisition, processing, and sampling require computational power in the form
of data processing and storage. A computing board equipped with a microcontroller, or a processor with an operating system, is essential to link the sensor data and the database. The
database can be available on-site or through a connection to a remote database in the cloud.
Depending on the requirements, each system can be based on microcontrollers, a processor
board with an operating system, or a hybrid system.
Microcontrollers provide better connection interface options with sensors. Most of
them are equipped with multiple connection interfaces such as I2C, SPI or UART. The
most commonly used microcontrollers are based on Arduino. The most popular Arduino
compatible boards include Arduino UNO, Arduino Yun, Arduino Nano, Arduino Mega,
ESP8266, ESP32, Intel Galileo Gen 2, Intel Edison, Beagle Bone Black and Electric Imp
003 [48].
Microcontrollers provide direct analogue input interfaces as they are equipped with
analogue-to-digital converters. However, they lack storage, multithreading and multipro-
cessing capabilities. Rabadiya et al. [49] proposed a system implemented using ESP8266
and Arduino support. There are also multiple papers using Arduino boards for data
processing in greenhouses [13,50,51].
An approach opposite to microcontrollers is processor boards with an operating system. The most common operating systems are specific Linux distributions without a graphical interface. Such environments offer local database storage with multithreading and multiprocessing capabilities. The most popular processor boards that run an operating system are the Raspberry Pi, Orange Pi, Banana Pi, and Odroid. However, these boards have fewer pins than microcontroller boards. They have I2C, SPI, and UART interfaces, but they lack analogue input pins equipped with analogue-to-digital converters. These types of boards usually have higher power
requirements and dimensions. There are hybrid solutions based on a microcontroller board
with a tiny OS (e.g., RTOS, MicroPython) [23].
In multiple papers, a combination of microcontrollers and processor boards is pro-
posed to reduce power requirements and provide multiple analogue sensor interfaces.
Systems with lower power requirements are usually based on solar or battery powered
concepts [52].
In a combination system, a node consists of a set of microcontrollers that provide
sensor interfaces to processor boards that aggregate and send data to the cloud. Each node
collects data from multiple sensors connected via interfaces. The nodes can be connected
to power, battery or be solar powered. In a combined system, a central node based on the
processor board node is required [52].
There is a need for interconnections between the nodes to enable communication.
These connections can be classified into wireless and wired. There are multiple wireless
standards available for IoT devices. The proposed wireless connection depends on the
availability of the microcontroller or processor board interface, the power requirements, the required connection bandwidth, the communication distance, and the common
obstacles in the communication [53]. The connection protocols vary from Bluetooth and
WiFi to GSM, radio (NRF) or ZigBee [6,54].
There are also mobile network protocols such as GPRS, 3G, 4G and 5G [55]. A custom protocol can be devised, but it is not a standard solution due to incompatibility with other systems. When wireless communication is used, higher node power consumption is expected.
In contrast, a wired connection may use the connecting wire to supply power. The best-known protocol is Power over Ethernet (PoE). However, there are other options that are not standardized. A wired connection can also be backed by an uninterruptible power supply (UPS),
which ensures system availability in the event of a power failure. The UPS also provides
information about a power failure or low UPS battery to the nodes. This information can be
used to gracefully shut down all nodes and alert maintenance personnel in a timely manner.
Each communication is composed of a physical link layer and a logical link layer. The
physical link layer can be used as a wired or wireless link. Above the physical layer is a
logical layer in the form of a communication protocol. In most simple solutions, a specific
protocol can be programmed specifically for the solution at hand. In most cases, standard
networking protocols and addressing are used, such as the Internet Protocol (IP). IoT devices also have standardized domain-specific protocols. The most commonly used specific protocol is Message Queuing Telemetry Transport (MQTT) [50]. Besides the specific IoT protocols, standard web service communication protocols such as HTTP, HTTPS, and SOAP are common.
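As an illustration only, a sensor node could publish a reading over MQTT as sketched below using the paho-mqtt client; the broker address, topic name and payload fields are assumptions, not the project's actual configuration.

```python
# Hedged sketch of a sensor node publishing one reading over MQTT (paho-mqtt,
# 1.x-style constructor). Broker address, topic and payload fields are assumed.
import json
import time

import paho.mqtt.client as mqtt

client = mqtt.Client(client_id="env-node-01")         # hypothetical node id
client.connect("192.168.1.10", 1883, keepalive=60)    # assumed local broker
client.loop_start()

reading = {"ts": time.time(), "temp_c": 22.4, "rh_pct": 61.0}
client.publish("greenhouse/env/node01", json.dumps(reading), qos=1)

client.loop_stop()
client.disconnect()
```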
When working with publicly available services, it is necessary to pay attention to
security. In any network architecture, there is a risk of cybersecurity threats. To make
a system more secure, Astillo et al. [56] proposed a lightweight specification-based dis-
tributed detection to efficiently and effectively identify the misbehavior of heterogeneous
embedded IoT nodes in a closed-loop smart greenhouse agriculture system.
Diverse image data is required for successful training and validation of the computer vision neural model. Specific sensors
can be used to provide numerical data in correlation with the obtained images [37,38].
Figure 1. System design and physical architecture scheme. The image describes the organization of
major greenhouse nodes with short descriptions. All nodes are interconnected through the local area
network and communicate with the cloud via the wide area network.
Measurements taken at different positions are used to detect microclimate pockets. Using the result data in time intervals, specific microclimate pockets can be identified, and their variations estimated.
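A simplified sketch of how such interval data could be screened for microclimate pockets is given below; the grid layout, threshold and temperature values are illustrative assumptions.

```python
# Illustrative sketch: flag grid positions whose average temperature deviates from
# the greenhouse-wide mean by more than a threshold, treating them as candidate
# microclimate pockets. Positions and temperatures are hypothetical example data.
from statistics import mean

def find_pockets(grid_means: dict, threshold_c: float = 1.0) -> dict:
    overall = mean(grid_means.values())
    return {pos: t for pos, t in grid_means.items() if abs(t - overall) > threshold_c}

# position -> mean temperature over a time interval
grid = {(0, 0): 22.1, (0, 1): 22.3, (1, 0): 24.0, (1, 1): 22.2}
print(find_pockets(grid))   # {(1, 0): 24.0} -> a warmer pocket near that position
```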
Internal environmental and leaf sensor nodes in the laboratory greenhouse are placed
together on a suspended platform. The suspended platform is used to detect microclimate
pockets as their position changes, and cameras simultaneously capture images of the plants.
Using an automatic guidance system for the suspended platform, plant images are captured
in time frames and uploaded to the cloud. At the same time, the real data is measured and
linked to the images in a database. This technique can be used for data preparation for
the AI learning process and later as a verification technique. Additionally, microclimate
pockets can be discovered by analyzing this data.
Figure 2. The proposed suspended platform concept. The suspended platform uses a six-degree-of-
freedom cable-suspended robot for positioning. Cable-positioning systems can be easily applied in
different greenhouse layouts since they provide large ranges of motion.
The nodes of the external environmental sensors outside the greenhouse should be
placed in an optimal position, e.g., above the roof or in a more remote location without the
influence of internal factors of the greenhouse. In our case, one external environmental node
is sufficient because the greenhouse is directly exposed to the sun without any obstacles.
If the greenhouse has a specific orientation or obstacles that partially block part of the
greenhouse during the day, multiple sensor nodes would be a mandatory solution.
The nutrient sensor node immersed in the prepared solution is placed on the floating platform inside the holding tank. Nutrient sensor nodes immersed in the nutrient solution used for plant growth are placed on the floating platform within the floating system. Nutrient sensors require special treatment due to sediment formation on the probes. pH and oxygen probes should not be continuously immersed in the nutrient solution. After a successful measurement, the probes must be removed from the nutrient solution and immersed in clean distilled water before being used in the same or a different nutrient solution.
The cleaning process of the probes can be done manually or automatically using the robotic
arm concept. We propose using high-quality probes that can be immersed in the nutrient
solution for extended periods of time without negatively affecting the measurement results.
An exception to database storage is images, which are stored in the local file system. Images are not stored in the database because the database engine is not capable of handling large blobs. The path and file name are placed in the local database table instead of the result data to track the images stored in the file system.
A script queries the sensors and stores the results in the local database along with the current timestamp. The nodes are synchronized with an atomic clock daily to ensure accurate timestamps. All scripts are adjusted to discard values that deviate significantly from the threshold estimated during test measurement periods. Repetitive values are not recorded because they take up space in the database and would slow down query execution. Their absence from the database does not affect the final result, as the system assumes that the value has not changed during the queried period.
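The local-storage behaviour described above can be sketched as follows; the table and column names are assumptions made for illustration, not the project's actual schema.

```python
# Minimal sketch of the local-storage idea: readings go into a SQLite table with a
# timestamp, unchanged values are skipped, and images stay on the file system with
# only their path recorded in the database. Schema names are assumptions.
import sqlite3
import time

db = sqlite3.connect("node.db")
db.execute("CREATE TABLE IF NOT EXISTS readings (ts REAL, sensor TEXT, value REAL)")
db.execute("CREATE TABLE IF NOT EXISTS images (ts REAL, path TEXT)")

last_value = {}

def store_reading(sensor: str, value: float) -> None:
    if last_value.get(sensor) == value:          # repetitive value -> not recorded
        return
    last_value[sensor] = value
    db.execute("INSERT INTO readings VALUES (?, ?, ?)", (time.time(), sensor, value))
    db.commit()

def store_image(path: str) -> None:
    # only the path is stored; the image itself stays in the local file system
    db.execute("INSERT INTO images VALUES (?, ?)", (time.time(), path))
    db.commit()

store_reading("air_temp_c", 22.4)
store_image("/data/images/2021-03-20T12-00-00_rgb.jpg")
```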
Without a graceful shutdown, local data may be corrupted and the operating system will not boot. Each node acknowledges the orderly shutdown request
and starts the shutdown process. After losing network connectivity with a sensor node, the energy efficiency and power supply node knows that a graceful shutdown has been completed on that sensor node. After determining that all nodes have completed the shutdown process, the energy efficiency and power supply node shuts itself down. This process must begin in time, before the complete power failure of the UPS, for it to complete successfully. The shutdown period must be extended as the UPS batteries lose capacity over time.
The energy efficiency and power supply node informs maintenance personnel with alerts of the following priorities: fatal, technical, and anomaly. Fatal faults, such as a power supply failure, are immediately sent to maintenance personnel. Technical and anomaly faults are collected and presented to cloud users upon connecting. Technical faults are associated with the technical system architecture and maintenance. Anomaly faults are linked to outlier sensor readings. Low-priority faults may be escalated. For example, an incorrect sensor reading is an anomaly fault. If multiple anomaly warnings are repeatedly detected within a short period of time, the anomaly fault is elevated to a technical fault. If an anomaly is detected over an extended period of time, it is upgraded to a major fault because it may indicate equipment failure and require intervention.
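A simplified sketch of these escalation rules is given below; the time windows and repeat counts are placeholders rather than project-specified values.

```python
# Simplified sketch of the alert escalation rules described above; the time
# windows and repeat limit are placeholders, not project-specified values.
import time

WINDOW_SHORT = 10 * 60       # repeated anomalies within 10 min -> technical fault
WINDOW_LONG = 24 * 3600      # anomaly persisting for a day -> major fault
REPEAT_LIMIT = 5

anomaly_log = {}             # sensor name -> timestamps of anomaly warnings

def classify(sensor: str) -> str:
    """Record an anomaly warning for a sensor and return the fault priority."""
    now = time.time()
    events = anomaly_log.setdefault(sensor, [])
    events.append(now)
    if now - events[0] > WINDOW_LONG:
        return "major"       # long-lasting anomaly, possible equipment failure
    if len([t for t in events if now - t < WINDOW_SHORT]) >= REPEAT_LIMIT:
        return "technical"   # repeated anomalies within a short period
    return "anomaly"
```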
All sensor data is collected and stored in the local database of each sensor node. The energy efficiency and power supply node holds information about the other sensor nodes and its local sensor data. This data needs to be transferred to the cloud for detailed analysis.
The quality and diversity of the collected data is an important factor for training and fitting the deep learning model. Calculating plant health only from
multiple statically placed camera devices significantly reduces the implementation cost of
the greenhouse.
Even without the sensor node system, it is possible to detect a malfunction of the
system based on the calculation of plant health over a period of time. During this period,
sudden deviations in the plant health calculation will alert the researchers, since they indicate either a possible problem with the proposed calculation model or serious problems within the greenhouse system, such as nutrient solution level, temperature, or artificial lighting
error. Ultimately, the images processed with the deep neural network model should be
sufficient to replace the sensor node system for determining plant health in the production
greenhouse.
During training, the target plant health values are obtained from a reliable sensor source. For image processing, the deep learning model consists of
a backbone based on convolutional neural networks using one of the proven backbone
architectures such as ResNet [96], Inception [97], DenseNet [98] or an efficient concept of
backbone network scaling [99]. Since the plant health value is a single number, the model
contains a regression head with MSE loss function. Due to catastrophic forgetting, small
periodic model updates are not easy to achieve. Therefore, we tended to use large periodic
updates over a longer period of time. We leave a detailed analysis of the model update
time frame to future work.
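For illustration, the described model shape can be sketched in PyTorch as follows, with ResNet-18 standing in for any of the proven backbones; this is an illustrative sketch of the architecture, not the project's final model.

```python
# Hedged PyTorch sketch: a convolutional backbone (ResNet-18 as a stand-in for any
# proven architecture) with a single-output regression head trained with MSE loss
# against the sensor-derived plant health value. Shapes and values are illustrative.
# Requires torchvision >= 0.13 for the weights= argument.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=None)            # any proven backbone would do
model.fc = nn.Linear(model.fc.in_features, 1)    # regression head: a single number

criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

images = torch.randn(8, 3, 224, 224)             # a batch of RGB plant images
health = torch.rand(8, 1)                        # plant health targets from sensors

optimizer.zero_grad()
loss = criterion(model(images), health)
loss.backward()
optimizer.step()
```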
4. Experimental Findings
In every greenhouse the temperature and the humidity are measured. These two
sensors form the minimum measurement setup, although each specific greenhouse might
require a specific set of sensors. Each sensor from the set provides specific values that depend
on the element measured. Very often, the elements measured depend on the measurement
position and corresponding spatial variations. For instance, the temperature next to a
window or door, next to a glass wall or in a corner in the shade will report different results.
The measurement differences acquired this way form microclimate pockets.
Therefore, the sensor positioning within the greenhouse is extremely important, since
our primary goal is to locate and isolate the microclimate pockets. Variations measured in
the microclimate pockets affect plant health and should be included in the calculations and
data analysis. This requires automated sensor positioning as opposed to the horizontal or
vertical fixed positioning. Related research focuses on autonomous vehicles, conveyors and
drones to find and isolate microclimate pockets. We have proposed the suspended platform
architecture that allows flexible spatial positioning, covering all three spatial dimensions
as can be seen in Figure 5. Additionally, the flexible positioning concept is essential to
ensure diverse plant image acquisition to improve deep learning model applicability to
new and unseen environments.
The suspended platform is equipped with the internal sensor node. The sensor node consists of CO2, temperature, humidity, pressure and multichannel gas sensors, ultraviolet (UV) and visible light sensors, a visible light sensor with an IR cut filter, and an RGB color sensor to detect leaf color. Additionally, an RGB camera and a thermal camera are attached to collect images. During installation, the sensors are mounted so that they do not obstruct the cameras' field of view. Due to the suspended platform positioning concept, sensor node heat output or sunlight blockage is not an issue, since the time spent at a single location is short. This makes our proposed suspended platform very flexible and precise while not being invasive to the plant or its environment.
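As an example of how one of the node's environmental sensors could be read, a minimal sketch is given below; it assumes a Raspberry Pi with the smbus2 and RPi.bme280 Python packages and an illustrative I2C address, which may differ from the actual wiring.

```python
# Minimal sketch for reading an assumed BME280 temperature/humidity/pressure sensor
# on a Raspberry Pi, using the smbus2 and RPi.bme280 packages. The I2C bus number
# and address depend on the actual wiring of the sensor node.
import smbus2
import bme280

I2C_BUS = 1        # default I2C bus on a Raspberry Pi
ADDRESS = 0x76     # assumed sensor address (0x76 or 0x77)

bus = smbus2.SMBus(I2C_BUS)
calibration = bme280.load_calibration_params(bus, ADDRESS)
sample = bme280.sample(bus, ADDRESS, calibration)

print(sample.timestamp, sample.temperature, sample.humidity, sample.pressure)
```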
In previous articles, we found mainly targeted measurements of microclimatic points
based on the specific orientation of the greenhouse or some specific parts such as curtains.

Figure 5. Suspended platform model. View from above and below of the mounted internal sensor node. Suspended platform model during experimental positioning—test of cameras and platform stability during image acquisition.
For a proposed square size of 50 × 50 cm, we can conclude that the allowable deviation
of the suspended platform positioning is equal to half the side of the square: 25 cm. The
suspended platform is implemented as a cable-driven parallel robot. Essentially, it is a
set of at least 6 cables that are wound and unwound by winches and connect a frame and
a platform. By synchronously adjusting the length of the various cables, the load can be
moved smoothly over a wide area of the footprint, with control and stability in all 6 degrees
of freedom.
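A simplified inverse-kinematics sketch of this idea is given below: for a desired platform position, each winch must pay out a cable whose length equals the distance from its frame anchor to the platform. The anchor coordinates are illustrative assumptions, not the actual greenhouse geometry.

```python
# Simplified inverse-kinematics sketch for a cable-suspended platform, treating the
# platform as a single point. Anchor coordinates (in cm) are illustrative only.
import math

FRAME_ANCHORS = [                      # six winch positions on the greenhouse frame
    (0, 0, 300), (800, 0, 300), (800, 800, 300),
    (0, 800, 300), (0, 400, 300), (800, 400, 300),
]

def cable_lengths(platform_pos):
    """Cable length for every winch when the platform centre is at platform_pos."""
    return [math.dist(platform_pos, anchor) for anchor in FRAME_ANCHORS]

# move the platform over the centre of a 50 x 50 cm grid square, 150 cm above ground
print([round(length, 1) for length in cable_lengths((425.0, 375.0, 150.0))])
```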
To confirm the concept and determine the variations in positioning, we propose an
experiment to build a model of the suspended platform and to test its positioning abilities.
Three laser pointers are mounted on the platform, and the platform is guided by hand through the wires to specific positions. Each laser pointer covers one axis: the display on the right wall, the display on the end wall, and the display on the ground. Each time the suspended platform was moved, we marked the previous point and measured the deviation of the new position from the previous point. Through several cycles of guiding, we reduced the results to an acceptable average of 2.7% ± 2% deviation in positioning after a full grid positioning cycle and before the next calibration. The measured deviation allows for a grid size of slightly over 800 cm between opposite grid sectors. Additionally, in our laboratory surroundings we tested the possibility of positioning in diverse locations, especially near the corners of the laboratory. Precise height positioning of the suspended platform is also satisfactory, which enables closer leaf inspection. With the model experiment, we established that it is possible to cover the laboratory floor area except for the corners.
As a comparison, the positioning deviation for a similar process in the field of 3D printing spans up to 1.5%, with an isolated outlier of 9.4% [79]. We believe that better results can be achieved after motorization and additional calibration informed by test experience.
The network cable provides power and secures the network connection to the internal sensor node located on the platform. Although we designed the cable to be flexible, it was obvious during positioning that it affects the balance of the suspended platform, since the results were slightly better in the experiment without the Ethernet cable. After a few tests, we abandoned the use of the Ethernet cable and decided to use a wireless connection instead. In the presented figure, only the power cable is provided as a connection to the suspended platform.
Our experimental testing on the model also showed that flax cable is more suitable than thin rope as a wire guide. There is a possibility of powering the internal sensor node through two of the wires from which the platform is suspended. This solution will be tested later by replacing two wires with thin steel cables.
To be able to reset its positioning, the device should have a zero point. Placing the zero point at an edge is extremely impractical. For this reason, we propose installing a 3-axis sensor and securing a point in the greenhouse where the unique position of the suspended platform can be confirmed. The sensor could be implemented via ultrasound or laser. In addition, such a sensor could detect obstacles during the movement of the platform and stop the operation of the device, i.e., avoid the obstacle by repositioning along another axis. With this experimental finding, we can conclude that the suspended platform can be used to detect microclimate pockets and to provide diverse and high-quality close-up plant images for deep learning model training, as presented in Figure 5.
5. Conclusions
With the growing market for organic food production, there is demand for greenhouse cultivation in sterile, pesticide- and fertilizer-free environments, which is hard to find in our surroundings. The integration of IoT devices into non-computational domains provides
the opportunity to obtain Big Data analytics of every measurable section of an internal
greenhouse process. Such analysis with deep learning models provides valuable insights
and scientific knowledge [100].
The main goal of this paper was to present a state-of-the-art scientific greenhouse
research facility that can be used during and after Project Urtica-BioFuture. In this paper,
we have analyzed related work to gain knowledge about the most commonly used sensors
and greenhouse equipping projects in precision agriculture. A detailed sensor node system
architecture to cover all internal greenhouse processes and to obtain Big Data, which is
subsequently analyzed in the cloud, is presented. The system architecture is presented to
describe the design of the components and their interconnection.
The collected data is synchronized with the cloud in real time, which enables addi-
tional calculation in the cloud. A deep neural model will be trained on sensor data to
estimate plant health from RGB camera images only. This is one of the primary Project
Urtica-BioFuture goals. The trained model can be used as a replacement for the sensor
system to make the greenhouse system more energy and cost efficient in the production
environment.
Microclimatic influences can become a problem in measurement evaluation. To detect
microclimates, different layouts for sensor organization are proposed. In this paper, we
propose an automated hybrid sensor layout based on a suspended platform to detect
microclimate pockets. The proposed layout covers the greenhouse area and allows precise
positioning throughout the greenhouse. In addition, it allows camera positioning above
the plants thus enabling better plant coverage.
The automated hybrid layout with the suspended platform offers the advantage of positioning the sensor node above the plant growing area in all axes. With the introduction of the system, we eliminate the problems of fixed horizontal and vertical layouts, the cost of conveyor systems, the floor clearance and obstacle problems of automated robotic vehicles, and the need to compensate sensor readings for drone propulsion effects. In addition, the proposed suspended platform is powered through its wires, eliminating the need for battery replacement and recharging.
To validate the concept, we conducted a simple experiment by building a model of
the suspended platform. In this experiment, we verified positioning errors to confirm the
use of the system according to the proposed grid system over the plant growing area in
the greenhouse. During the experiment, we also identified problems that arose and made suggestions for addressing them. We believe that this paper will enable us to collect better plant images for AI training, detect microclimate pockets, and enable their elimination. This would make
the proposed greenhouse system more effective and provide a novel starting point for the
Urtica-BioFuture project.
For future work, we propose several possible avenues. A detailed analysis of the
microclimate pockets in the greenhouse to obtain a mathematical model describing their
influence in the surrounding areas of the greenhouse is worth considering. With analyses of
the collected data, the sensor system can be further optimized by eliminating or introducing
an additional sensor to replace the sensor group. The deep neural network model can be
further optimized to provide exact mathematical model for plant health calculations by
collecting additional training data from multiple greenhouses and different plant crops.
With this approach a sensor data network simplification with the introducing of a deep
neural network model will be achieved.
Author Contributions: Conceptualization, D.C. and I.D.; methodology, D.C. and I.D.; software, I.C.
and T.K.; validation, I.D., T.K. and I.C.; formal analysis, D.C. and I.C.; investigation, D.C., I.C. and I.D.;
resources, D.C., I.D., I.C. and T.K.; data curation, T.K. and I.C.; writing—original draft preparation,
D.C. and I.D.; writing—review and editing, I.C. and T.K.; visualization, T.K. and D.C.; supervision,
D.C.; project administration, D.C.; funding acquisition, D.C., I.D., I.C. and T.K. All authors have read
and agreed to the published version of the manuscript.
Funding: This research was funded by Croatian Science Foundation, grant number IP-2019-04. The
APC was funded by University of Applied Sciences, Zagreb, Croatia.
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: Not applicable.
Acknowledgments: This paper is a part of the Project Urtica-BioFuture. The authors would like to
thank all project participants for their cooperation and support.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Masoud, S.; Chowdhury, B.D.B.; Son, Y.J.; Kubota, C.; Tronstad, R. Simulation based optimization of resource allocation and
facility layout for vegetable grafting operations. Comput. Electron. Agric. 2019, 163, 104845. [CrossRef]
2. Maksimovic, M. Greening the Future: Green Internet of Things (G-IoT) as a Key Technological Enabler of Sustainable Development.
In Internet of Things and Big Data Analytics Toward Next-Generation Intelligence. Studies in Big Data; Springer: Cham, Switzerland,
2018; Volume 30. [CrossRef]
3. Somov, A.; Shadrin, D.; Fastovets, I.; Nikitin, A.; Matveev, S.; Hrinchuk, O. Pervasive Agriculture: IoT-Enabled Greenhouse for
Plant Growth Control. IEEE Pervasive Comput. 2018, 17, 65–75. [CrossRef]
4. Guillen, M.A.; Llanes, A.; Imbernon, B.; Martinez-Espana, R.; Bueno-Crespo, A.; Cano, J.-C.; Cecilia, J.M., Performance evaluation
of edge-computing platforms for the prediction of low temperatures in agriculture using deep learning. J. Supercomput. 2021.
[CrossRef]
5. Ayaz, M.; Ammad-Uddin, M.; Sharif, Z.; Mansour, A.; Aggoune, E.M. Internet-of-Things (IoT)-Based Smart Agriculture: Toward
Making the Fields Talk. IEEE Access 2019, 7, 129551–129583. [CrossRef]
6. Kochhar, A.; Kumar, N. Wireless sensor networks for greenhouses: An end-to-end review. Comput. Electron. Agric. 2019, 163,
104877. [CrossRef]
7. Kramberger, T.; Potočnik, B. LSUN-Stanford Car Dataset: Enhancing Large-Scale Car Image Datasets Using Deep Learning for
Usage in GAN Training. Appl. Sci. 2020, 10, 4913. [CrossRef]
8. Ghosh, A.; Chakraborty, D.; Law, A. Artificial intelligence in Internet of things. CAAI Trans. Intell. Technol. 2018, 3, 208–218.
[CrossRef]
9. Story, D.; Kacira, M. Design and implementation of a computer vision-guided greenhouse crop diagnostics system. Mach. Vis.
Appl. 2015, 26, 495–506. [CrossRef]
10. URTICA—BioFuture. 2020. Available online: http://urtica.agr.hr/en/naslovnica-english/ (accessed on 25 December 2020).
11. Wei, Y.; Li, W.; An, D.; Li, D.; Jiao, Y.; Wei, Q. Equipment and Intelligent Control System in Aquaponics: A Review. IEEE Access
2019, 7, 169306–169326. [CrossRef]
12. Saiz-Rubio, V.; Rovira-Más, F. From Smart Farming towards Agriculture 5.0: A Review on Crop Data Management. Agronomy
2020, 10, 207. [CrossRef]
13. Miranda, J.; Ponce, P.; Molina, A.; Wright, P. Sensing, smart and sustainable technologies for Agri-Food 4.0. Comput. Ind. 2019,
108, 21–36. [CrossRef]
14. Wei, L.Y.; Sheng-Kai, T.; Jyun-Kai, L.; Ta-Hsien, H. Developing Smart Home Applications. Mob. Netw. Appl. 2020. [CrossRef]
15. Bersani, C.; Ouammi, A.; Sacile, R.; Zero, E. Model Predictive Control of Smart Greenhouses as the Path towards Near Zero
Energy Consumption. Energies 2020, 13, 3647. [CrossRef]
16. Oliver, P.; Kostas, B.; Calvo, R.A.; Papavassiliou, S. (Eds.) Mobile Networks and Management; Springer: Berlin/Heidelberg, Germany,
2010. [CrossRef]
17. Wang, L.; Wang, B. Construction of greenhouse environment temperature adaptive model based on parameter identification.
Comput. Electron. Agric. 2020, 174, 105477. [CrossRef]
18. Subahi, A.F.; Bouazza, K.E. An Intelligent IoT-Based System Design for Controlling and Monitoring Greenhouse Temperature.
IEEE Access 2020, 8, 125488–125500. [CrossRef]
19. Castañeda-Miranda, A.; Castaño, V. Smart frost measurement for anti-disaster intelligent control in greenhouses via embedding
IoT and hybrid AI methods. Measurement 2020, 164, 108043. [CrossRef]
20. Villarreal-Guerrero, F.; Pinedo-Alvarez, A.; Flores-Velázquez, J. Control of greenhouse-air energy and vapor pressure deficit with
heating, variable fogging rates and variable vent configurations: Simulated effectiveness under varied outside climates. Comput.
Electron. Agric. 2020, 174, 105515. [CrossRef]
21. Vamvakas, P.; Tsiropoulou, E.E.; Vomvas, M.; Papavassiliou, S. Adaptive power management in wireless powered commu-
nication networks: A user-centric approach. In Proceedings of the 2017 IEEE 38th Sarnoff Symposium, Newark, NJ, USA,
18–20 September 2017. [CrossRef]
22. DFRobot, Gravity: Analog Capacitive Soil Moisture Sensor-Corrosion Resistant SEN-0193. 2019. Available online: https:
//www.dfrobot.com/product-1385.html (accessed on 14 December 2020).
23. Angelopoulos, C.M.; Filios, G.; Nikoletseas, S.; Raptis, T. Keeping Data at the Edge of Smart Irrigation Networks: A Case Study in
Strawberry Greenhouses. Comput. Netw. 2019, 167, 107039. [CrossRef]
24. Dong, Z.; Men, Y.; Liu, Z.; Li, J.; Ji, J. Application of chlorophyll fluorescence imaging technique in analysis and detection of
chilling injury of tomato seedlings. Comput. Electron. Agric. 2020, 168, 105109. [CrossRef]
25. Malewski, T.; Brzezińska, B.; Belbahri, L.; Oszako, T. Role of avian vectors in the spread of Phytophthora species in Poland. Eur. J.
Plant Pathol. 2019, 155, 1363–1366. [CrossRef]
26. Wang, L.; Jin, J.; Song, Z.; Wang, J.; Zhang, L.; Rehman, T.U.; Ma, D.; Carpenter, N.R.; Tuinstra, M.T. LeafSpec: An accurate and
portable hyperspectral corn leaf imager. Comput. Electron. Agric. 2020, 169, 105209. [CrossRef]
27. Ma, D.; Wang, L.; Zhang, L.; Song, Z.; Rehman, T.U.; Jin, J. Stress Distribution Analysis on Hyperspectral Corn Leaf Images for
Improved Phenotyping Quality. Sensors 2020, 20, 3659. [CrossRef] [PubMed]
28. De Castro, A.; Madalozzo, G.A.; Trentin, N.S.; Costa, R.C.; Calvete, E.O.; Spalding, L.E.S.; Rieder, R. BerryIP embedded: An
embedded vision system for strawberry crop. Comput. Electron. Agric. 2020, 173, 105354. [CrossRef]
29. Yu, Z.; Ustin, S.L.; Zhang, Z.; Liu, H.; Zhang, X.; Meng, X.; Cui, Y.; Guan, H. Estimation of a New Canopy Structure Parameter for
Rice Using Smartphone Photography. Sensors 2020, 20, 4011. [CrossRef] [PubMed]
30. Ranjeeta, A.; Cheng, L.; Kirby, K.; Krishna, N. A low-cost smartphone controlled sensor based on image analysis for estimating
whole-plant tissue nitrogen (N) content in floriculture crops. Comput. Electron. Agric. 2020, 169, 105173. [CrossRef]
31. Hassanijalilian, O.; Igathinathane, C.; Doetkott, C.; Bajwa, S.; Nowatzki, J.; Esmaeili, S.A.H. Chlorophyll estimation in soybean
leaves infield with smartphone digital imaging and machine learning. Comput. Electron. Agric. 2020, 174, 105433. [CrossRef]
32. Chung, S.; Breshears, L.E.; Yoon, J. Smartphone near infrared monitoring of plant stress. Comput. Electron. Agric. 2018, 154, 93–98.
[CrossRef]
33. Tao, M.; Huang, X.; Liu, C.; Deng, R.; Liang, K.; Qi, L. Smartphone-based detection of leaf color levels in rice plants. Comput.
Electron. Agric. 2020, 173, 105431. [CrossRef]
34. Danh, D.N.H.; Vincent, P.; Chi, P.; Rocio, V.; Khoa, D.; Christian, N. Night-based hyperspectral imaging to study association of
horticultural crop leaf reflectance and nutrient status. Comput. Electron. Agric. 2020, 173, 105458. [CrossRef]
35. Liu, B.; Yue, Y.; Li, R.; Shen, W.; Wang, K. Plant Leaf Chlorophyll Content Retrieval Based on a Field Imaging Spectroscopy
System. Sensors 2014, 14, 19910–19925. [CrossRef] [PubMed]
36. Pérez-Patricio, M.; Aguilar-González, A.; Camas-Anzueto, J.L.; Navarro, N.A.M.; Grajales-Coutiño, R. An FPGA-based smart
camera for accurate chlorophyll estimations. Int. J. Circuit Theory Appl. 2018, 46, 1663–1674. [CrossRef]
37. Brambilla, M.; Romano, E.; Buccheri, M.; Cutini, M.; Toscano, P.; Cacini, S.; Massa, D.; Ferri, S.; Monarca, D.; Fedrizzi, M.; et al.
Application of a low-cost RGB sensor to detect basil (Ocimum basilicum L.) nutritional status at pilot scale level. Precis. Agric. 2020.
[CrossRef]
38. Ye, X.; Abe, S.; Zhang, S.; Yoshimura, H. Rapid and non-destructive assessment of nutritional status in apple trees
using a new smartphone-based wireless crop scanner system. Comput. Electron. Agric. 2020, 173, 105417. [CrossRef]
39. Kangji, L.; Zhengdao, S.; Wenping, X.; Xu, C.; Hanping, M.; Gang, T. A fast modeling and optimization scheme for greenhouse
environmental system using proper orthogonal decomposition and multi-objective genetic algorithm. Comput. Electron. Agric.
2020, 168, 105096. [CrossRef]
40. Chen, X. Research on Data Interpolation Model with Missing Data for Intelligent Greenhouse Control. In Proceedings of the 2019
International Conference on Robots & Intelligent System (ICRIS), Haikou, China, 15–16 June 2019. [CrossRef]
41. Wu, H.; Li, Q.; Zhu, H.; Han, X.; Li, Y.; Yang, B. Directional sensor placement in vegetable greenhouse for maximizing target
coverage without occlusion. Wirel. Netw. 2020, 26, 4677–4468. [CrossRef]
42. Atefi, A.; Ge, Y.; Pitla, S.; Schnable, J. In vivo human-like robotic phenotyping of leaf traits in maize and sorghum in greenhouse.
Comput. Electron. Agric. 2019, 163, 104854. [CrossRef]
43. Geng, X.; Zhang, Q.; Wei, Q.; Zhang, T.; Cai, Y.; Liang, Y.; Sun, X. A Mobile Greenhouse Environment Monitoring System Based
on the Internet of Things. IEEE Access 2019, 7, 135832–135844. [CrossRef]
44. Ma, D.; Carpenter, N.; Maki, H.; Rehman, T.U.; Tuinstra, M.R.; Jin, J. Greenhouse environment modeling and simulation for
microclimate control. Comput. Electron. Agric. 2019, 162, 134–142. [CrossRef]
45. Uyeh, D.D.; Ramlan, F.W.; Mallipeddi, R.; Park, T.; Woo, S.; Kim, J.; Kim, Y.; Ha, Y. Evolutionary Greenhouse Layout Optimization
for Rapid and Safe Robot Navigation. IEEE Access 2019, 7, 88472–88480. [CrossRef]
46. Nissimov, S.; Goldberger, J.; Alchanatis, V. Obstacle detection in a greenhouse environment using the Kinect sensor. Comput.
Electron. Agric. 2015, 113, 104–115. [CrossRef]
47. Bontsema, J.; van Henten, E.J.; Gieling, T.H.; Swinkels, G.L.A.M. The effect of sensor errors on production and energy consumption
in greenhouse horticulture. Comput. Electron. Agric. 2011, 79, 63–66. [CrossRef]
48. Pratim, R.P. Internet of Things for Smart Agriculture: Technologies, Practices and Future Direction. J. Ambient. Intell. Smart
Environ. 2017, 9, 395–420. [CrossRef]
49. Kinjal, A.R.; Patel, B.S.; Bhatt, C.C. Smart Irrigation: Towards Next Generation Agriculture. In Internet of Things and Big Data
Analytics Toward Next-Generation Intelligence; Springer: Cham, Switzerland, 2018; Volume 30. [CrossRef]
50. Mishra, B.; Kertesz, A. The Use of MQTT in M2M and IoT Systems: A Survey. IEEE Access 2020. [CrossRef]
51. Dobrescu, R.; Merezeanu, D.; Mocanu, S. Context-aware control and monitoring system with IoT and cloud support. Comput.
Electron. Agric. 2019, 160, 91–99. [CrossRef]
52. Yang, J.; Liu, M.; Lu, J.; Miao, Y.; Hossain, M.A.; Alhamid, M.F. Botanical Internet of Things: Toward Smart Indoor Farming by
Connecting People, Plant, Data and Clouds. Mob. Netw. Appl. 2018, 23, 188–202. [CrossRef]
53. Akyildiz, I.F.; Su, W.; Sankarasubramaniam, Y.; Cayirci, E. Wireless sensor networks: A survey. Comput. Netw. 2002, 38, 393–422.
[CrossRef]
54. Zhou, Y.; Duan, J. Design and Simulation of a Wireless Sensor Network Greenhouse-Monitoring System Based on 3G Network
Communication. Int. J. Online Eng. (iJOE) 2016, 12, 48. [CrossRef]
55. Zhou, Y.; Xie, Y.; Shao, L. Simulation of the Core Technology of a Greenhouse-Monitoring System Based on a Wireless Sensor
Network. Int. J. Online Eng. (iJOE) 2016, 12, 43–47. [CrossRef]
56. Astillo, P.V.; Kim, J.; Sharma, V.; You, I. SGF-MD: Behavior Rule Specification-Based Distributed Misbehavior Detection of
Embedded IoT Devices in a Closed-Loop Smart Greenhouse Farming System. IEEE Access 2020, 8, 196235–196252. [CrossRef]
57. Kocian, A.; Massa, D.; Cannazzaro, S.; Incrocci, L.; Milazzo, P.; Chessa, S.; Ceccanti, C. Dynamic Bayesian network for crop
growth prediction in greenhouses. Comput. Electron. Agric. 2020, 160, 105167. [CrossRef]
58. Lekbangpong, N.; Muangprathub, J.; Srisawat, T.; Wanichsombat, A. Precise Automation and Analysis of Environmental Factor
Effecting on Growth of St. John’s Wort. IEEE Access 2019, 7, 112848–112858. [CrossRef]
59. Chen, M.; Yang, J.; Zhu, X.; Wang, X.; Liu, M.; Song, J. Smart Home 2.0: Innovative Smart Home System Powered by Botanical IoT
and Emotion Detection. Mob. Netw. Appl. 2017, 22, 1159–1169. [CrossRef]
60. Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90. [CrossRef]
61. Liakos, K.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine Learning in Agriculture: A Review. Sensors 2018, 18, 2674.
[CrossRef]
62. Yao, C.; Zhang, Y.; Zhang, Y.; Liu, H. Application of Convolutional Neural Network in Classification of High Resolution Agricultural
Remote Sensing Images. In International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Proceedings
of the 2017 ISPRS Geospatial Week 2017, Wuhan, China, 18–22 September 2017; Copernicus Publications: Göttingen, Germany, 2017;
Volume XLII-2/W7, pp. 989–992. [CrossRef]
63. Boulent, J.; Foucher, S.; Théau, J.; St-Charles, P. Convolutional Neural Networks for the Automatic Identification of Plant Diseases.
Front. Plant Sci. 2019, 10, 941. [CrossRef] [PubMed]
64. Singh, A.K.; Ganapathysubramanian, B.; Sarkar, S.; Singh, A. Deep Learning for Plant Stress Phenotyping: Trends and Future
Perspectives. Trends Plant Sci. 2018, 23, 883–898. [CrossRef] [PubMed]
65. Zhang, J.; Zhu, Y.; Zhang, X.; Ye, M.; Yang, J. Developing a Long Short-Term Memory (LSTM) based model for predicting water
table depth in agricultural areas. J. Hydrol. 2018, 561, 918–929. [CrossRef]
66. An, J.; Li, W.; Li, M.; Cui, S.; Yue, H. Identification and Classification of Maize Drought Stress Using Deep Convolutional Neural
Network. Symmetry 2019, 11, 256. [CrossRef]
67. Zhuang, S.; Wang, P.; Jiang, B.; Li, M.; Gong, Z. Early detection of water stress in maize based on digital images. Comput. Electron.
Agric. 2017, 140, 461–468. [CrossRef]
68. Zhu, N.; Liu, X.; Liu, Z.; Hu, K.; Wang, Y.; Tan, J.; Huang, M.; Zhu, Q.; Ji, X.; Jiang, Y.; et al. Deep learning for smart agriculture:
Concepts, tools, applications, and opportunities. Int. Agric. Biol. Eng. 2018, 11, 32–44. [CrossRef]
69. Mora-Fallas, A.; Goëau, H.; Joly, A.; Bonnet, P.; Mata-Montero, E. Segmentación de instancias para detección automática de
malezas y cultivos en campos de cultivo. Revista Tecnología En Marcha 2020, 33, 13–17. [CrossRef]
70. Sa, I.; Ge, Z.; Dayoub, F.; Upcroft, B.; Perez, T.; McCool, C. DeepFruits: A Fruit Detection System Using Deep Neural Networks.
Sensors 2016, 16, 1222. [CrossRef] [PubMed]
71. Kang, H.; Chen, C. Fruit Detection and Segmentation for Apple Harvesting Using Visual Sensor in Orchards. Sensors 2019,
19, 4599. [CrossRef]
72. Chen, S.W.; Shivakumar, S.S.; Dcunha, S.; Das, J.; Okon, E.; Qu, C.; Taylor, J.; Kumar, V. Counting Apples and Oranges With Deep
Learning: A Data-Driven Approach. IEEE Robot. Autom. Lett. 2017, 2, 781–788. [CrossRef]
73. Rodrigues, E.R.; Oliveira, I.; Cunha, R.; Netto, M. DeepDownscale: A Deep Learning Strategy for High-Resolution Weather
Forecast. In Proceedings of the 2018 IEEE 14th International Conference on e-Science (e-Science), Amsterdam, The Netherlands,
29 October–1 November 2018; pp. 415–422. [CrossRef]
74. Wibisono, M.N.; Ahmad, A.S. Weather forecasting using Knowledge Growing System (KGS). In Proceedings of the 2017 2nd
International Conferences on Information Technology, Information Systems and Electrical Engineering (ICITISEE), Yogyakarta,
Indonesia, 1–2 November 2017; pp. 35–38. [CrossRef]
75. Rousseau, D.; Dee, H.; Pridmore, T. Imaging Methods for Phenotyping of Plant Traits. In Phenomics in Crop Plants: Trends, Options
and Limitations; Kumar, J., Pratap, A., Kumar, S., Eds.; Springer: New Delhi, India, 2015. [CrossRef]
76. Kalaji, H.M.; Schansker, G.; Brestic, M.; Bussotti, F.; Calatayud, A.; Ferroni, L.; Goltsev, V.; Guidi, L.; Jajoo, A.; Li, P.; et al. Frequently
asked questions about chlorophyll fluorescence, the sequel. In Photosynthesis Research; Springer: Dordrecht, The Netherlands,
2017; Volume 132, pp. 13–66. [CrossRef]
77. Made in China, Ningbo Peacefair Electronic Technology Co., Ltd. PZEM004T, Single Phase TTL Modbus Electric Power Meter.
2019. Available online: https://peacefair.en.made-in-china.com/product/zygxPIcSbuhV/China-Peacefair-Pzem-004t-Single-
Phase-Ttl-Modbus-Electric-Power-Meter.html (accessed on 14 December 2020).
78. Greene, B.R.; Segales, A.R.; Waugh, S.; Duthoit, S.; Chilson, P.B. Considerations for temperature sensor placement on rotary-wing
unmanned aircraft systems. Atmos. Meas. Tech. 2018, 11, 5519–5530. [CrossRef]
79. Barnett, E.; Gosselin, C. Large-scale 3D printing with a cable-suspended robot. Addit. Manuf. 2015, 7, 27–44. [CrossRef]
80. Bosch-Sensortec, BME280 Combined Humidity and Pressure Sensor, Version 1.6 BST-BME280-DS002-15. 2018. Available
online: https://www.bosch-sensortec.com/media/boschsensortec/downloads/datasheets/bst-bme280-ds002.pdf (accessed
on 14 December 2020).
81. DF-Robot, Gravity Analog Infrared CO2 Sensor for Arduino SKU SEN0219. 2020. Available online: https://www.dfrobot.com/
product-1549.html (accessed on 14 December 2020).
82. Sparkfun-Vishay Semiconductors VEML6075, Datasheet VEML6075, Document Number: 84304. 2016. Available online: https:
//cdn.sparkfun.com/assets/3/c/3/2/f/veml6075.pdf (accessed on 14 December 2020).
83. Vishay Semiconductors VEML7700, Datasheet VEML7700, Document Number: 84286. 2019. Available online: https://www.
vishay.com/docs/84286/veml7700.pdf (accessed on 14 December 2020).
84. Seeed Studio the IoT Hardware Enabler, Groove Sensor, Groove Gas Sensor V2 (Multichannel). 2019. Available online: https:
//wiki.seeedstudio.com/Grove-Multichannel-Gas-Sensor-V2/ (accessed on 14 December 2020).
85. Raspberry PI, Accessories, PI NoIR Camera v2. 2016. Available online: https://www.raspberrypi.org/products/pi-noir-camera-
v2/?resellerType=home (accessed on 14 December 2020).
86. LEPTON FLIR, LWIR Micro Thermal Camera Module 2.5. 2015. Available online: https://lepton.flir.com/wp-content/uploads/
2015/06/lepton-2pt5-datasheet-04195.pdf (accessed on 14 December 2020).
87. DFRobot, Waterproof DS18B20 Digital Temperature Sensor for Arduino SEN-0198. 2019. Available online: https://www.dfrobot.
com/product-689.html (accessed on 14 December 2020).
88. DFRobot, Gravity: Analog TDS Sensor/Meter SEN-0244. 2019. Available online: https://www.dfrobot.com/product-1662.html
(accessed on 14 December 2020).
89. DFRobot, Gravity: Analog Spear Tip pH Sensor/Meter Kit SEN-0249. 2019. Available online: https://www.dfrobot.com/
product-1668.html (accessed on 14 December 2020).
90. DFRobot, Gravity: Analog pH Sensor/Meter Kit V2 SEN-0237A. 2019. Available online: https://www.dfrobot.com/product-16
28.html (accessed on 14 December 2020).
91. DFRobot, Gravity: Analog Turbidity Sensor For Arduino SEN-0189. 2019. Available online: https://www.dfrobot.com/product-
1394.html (accessed on 14 December 2020).
92. DFRobot, TCS3200 RGB Color Sensor for Arduino SEN-0101. 2019. Available online: https://www.dfrobot.com/product-540.
html (accessed on 14 December 2020).
93. DFRobot, TOF Sense Laser Ranging Sensor (5m) SEN-0337. 2019. Available online: https://www.dfrobot.com/product-2004.html
(accessed on 14 December 2020).
94. Li, Y.; Manoharan, S. A performance comparison of SQL and NoSQL databases. In Proceedings of the 2013 IEEE Pacific
Rim Conference on Communications, Computers and Signal Processing (PACRIM), Victoria, BC, Canada, 27–29 August 2013;
pp. 15–19. [CrossRef]
95. Cukrov, M.; Jerončić, L.; Prelogović, L. Utjecaj Kontroliranog Vodnog Stresa na Sadržaj Bioaktivnih Spojeva u Hidroponskom Uzgoju
Rikole (Eruca Sativa Mill.) i Špinata (Spinacia oleracea L.); Graduate Paper Awarded with Rector Award; Faculty of Agronomy,
University of Zagreb: Zagreb, Croatia, 2017.
96. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on
Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [CrossRef]
97. Szegedy, C.; Ioffe, S.; Vanhoucke, V.; Alemi, A. Inception-v4, Inception-ResNet and the Impact of Residual Connections on
Learning. arXiv 2016, arXiv:1602.07261.
98. Huang, G.; Liu, Z.; Maaten, L.V.D.; Weinberger, K.Q. Densely Connected Convolutional Networks. In Proceedings of the 2017
IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 2261–2269.
[CrossRef]
99. Mingxing, T.; Quoc, V.L. EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks. arXiv 2020,
arXiv:1905.11946v5.
100. Marjani, M.; Nasaruddin, F.; Gani, A.; Karim, A.; Hashem, I.A.T.; Siddiqa, A.; Yaqoob, I. Big IoT Data Analytics: Architecture,
Opportunities, and Open Research Challenges. IEEE Access 2017, 5, 5247–5261. [CrossRef]