eISSN: 2574-8092

International Robotics & Automation Journal

Research Article Volume 10 Issue 2

Radar based intelligence network for field security and remote detection

Mert Demir

Department of Computer Programming, Izmir Kavram Vocational School, Türkiye

Correspondence: Mert Demir, Lecturer, Izmir Kavram Vocational School, Izmir, Türkiye, 4449134

Received: June 15, 2024 | Published: June 25, 2024

Citation: Demir M. Radar based intelligence network for field security and remote detection. Int Rob Auto J. 2024;10(2):63-68. DOI: 10.15406/iratj.2024.10.00284


Abstract

Today, security applications are an increasingly pressing need and have become indispensable in many fields and sectors. Determining the locations of threat elements in advance is one of the primary security needs, and security experts attach importance to the use of advanced technologies for this purpose. Tools such as unmanned aerial vehicles (UAVs), satellite imaging systems, and thermal cameras play a critical role in monitoring and locating threat elements in the field. However, these methods are generally affected by weather conditions and other environmental factors. In addition, these systems are operated by personnel, so they are field monitoring applications whose success depends on the human factor. In this study, a security network built on the radar-based field awareness approach is presented: a strategic approach to detecting and predicting the movements of field elements and taking the necessary precautions. A network-based approach to securing a large area and determining the locations of its elements is demonstrated with an application built on a multi-agent robot group. The solutions presented in the study are intended as an alternative to systems that have high operating costs and are affected by weather conditions.

Keywords: field security, remote detection, multi-agent systems, motion detection, artificial intelligence

Introduction

In today's world, security is of increasing importance in almost every field. In both the civilian and military sectors, security practices and strategies for dealing with threats highlight the need to protect the assets of individuals, organizations, and nations. In this context, detecting the locations of threat elements in advance constitutes one of the cornerstones of modern security management and defense strategies. In an age of increasing conflict, security is interpreted as freedom from harm, threats, and negative situations, and as the maintenance of public order.1–3 Proliferating weapons and ongoing conflicts and wars do not contribute positively to security.1,4,5 Indeed, unsafe environments force people to migrate, and moving masses can in turn widen unsafe environments and raise new concerns.6–8 Migration following such negative events has increased over the years (Figure 1), and the available data suggest that mass movements will continue to increase worldwide. Mass movements are also an important factor that triggers unrest and insecurity. Security measures should therefore be applied outward from sparsely populated areas toward more crowded ones, because densely populated urban environments are more vulnerable to threats due to crowding.9,10

Figure 1 Change in increasing migration over the years (MPI; Migration Policy Institute).

Security experts attach great importance to the use of the latest technologies in identifying and tracking the locations of threat elements. Tools such as UAVs, satellite imaging systems, thermal cameras and radars play a critical role in monitoring and locating threat elements in the field. However, the use of these technologies also brings with it some difficulties and limitations. These limitations often give rise to topics of debate and research.

Among the methods used in field observation, UAVs have become popular in many areas in recent years.11–14 Unmanned aerial vehicles are increasingly used in earth observation, exploration, agriculture, and surveillance missions (Figure 2), with a wide range of applications in military, civil, agricultural, environmental, and scientific fields.15,16 However, a number of adverse situations encountered by UAVs during earth observation can affect the effectiveness and reliability of these systems.17 UAVs are very sensitive to weather: wind, heavy rainfall, fog, and storms can seriously degrade their flight capabilities, disrupt flight plans, and reduce the accuracy of earth observation data. UAV field observation can also be problematic for other reasons, including acquisition costs, limited flight times, GPS outages, and hostile vehicles threatening the UAVs themselves. In manned field observation, meanwhile, the success of observation towers and cameras placed in the field or at the border depends on the attention and skill of the personnel operating them.

Figure 2 A UAV used for field observation.

Apart from UAVs, there are also ground-deployed surveillance tools with lower operating costs. Thermal cameras and night vision cameras are used today in many areas, such as earth observation, security, and military operations.17 These cameras use special technologies to capture images in low light or complete darkness (Figure 3), and thermal cameras in particular are frequently used in border observation.18 However, thermal and night vision cameras face their own limitations and downsides. They can be affected by adverse weather such as rain, snow, fog, or dense dust; dense fog can greatly limit thermal visibility, and snow, raindrops, or dust particles can contaminate the optical lenses and reduce image quality. The prices of high-resolution thermal cameras also remain much higher than those of conventional cameras. Moreover, these technologies are easy to counter: camouflaged clothing and umbrellas are among the methods used by terrorists and migrants attempting to cross borders without being caught by these cameras. Thermal cameras work on the principle of detecting the infrared radiation emitted by warm objects, but this radiation is very low in energy, and a very thin material or fabric cover is enough to block its emission and detection.

Figure 3 An observation made with a thermal camera in the field.

Observation tools are indispensable in security missions. With increasing conflict, protecting and monitoring borders has become a matter of concern in many countries, and efforts are intensifying.19–22 The studies examined cover country-specific problems such as smuggling, terrorism, illegal immigration, and espionage, and countries on different continents have developed analyses and approaches specific to their own geographies.23–26 Reviewing these studies shows the need for an effective solution that gathers all of these requirements under one roof.

In this study, considering the deficiencies in existing work, a fully automatic field observation network is proposed that removes the human factor and uses swarm-style group communication. The radar-based field awareness approach is a significant advance in security management, remote sensing, remote measurement, and threat detection: it offers the ability to predict the movements of elements in the field and to take and verify the necessary security measures. The approach goes beyond traditional methods through an application carried out with a group of robots operating as multi-agents. It represents a new way, unaffected by weather conditions, to secure large areas and to determine the locations of threat elements more precisely.

Material and methods

Material

To ensure field security and detect field elements, an observation robot with a 7 km communication range was produced, based on a Mini PC T6 computer with 4 GB of RAM, a 1.92 GHz processor, a 1920x1080 resolution camera, and software with a motion detection feature. The module can establish point-to-point communication as well as swarm communication between modules (Figure 4). The modules are installed vertically, dug into the ground in the field. Power consumption is around 220 mA when all hardware of the system is running. Powered by two 7000 mAh LiFePO4 batteries, the modules recharge from their onboard solar panels for night use; a rough runtime estimate is sketched after Figure 4. LiFePO4 batteries were preferred for their short charging time, high capacity, and long discharge time compared to traditional lithium-ion batteries.27−30 Solar panels are a clean and increasingly common energy source, making them an ideal solution for low-power systems in land applications.31,32 The body of the system is circular, 6 cm in diameter, and houses all electronic equipment and control systems; this design also keeps the module from being noticed from afar. The radar network module is 1 m high and scans the terrain with the Doppler radar on its body while detecting movements with the camera at the top of the module (Figure 4).

Figure 4 Field observation module diagram.
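For scale, the stated power figures imply that a module can run for roughly two and a half days on battery alone. A minimal back-of-the-envelope sketch in Python, under assumptions not stated in the text (the two cells feed the load in parallel; converter losses and daytime solar charging are ignored):

```python
# Back-of-the-envelope runtime for one module, using the figures above.
# Assumptions (not from the text): the two cells are wired in parallel,
# and converter losses and daytime solar charging are ignored.

capacity_mah = 2 * 7000   # two LiFePO4 cells, 7000 mAh each
draw_ma = 220             # average draw with all hardware running

runtime_h = capacity_mah / draw_ma
print(f"Runtime without solar input: {runtime_h:.0f} h (~{runtime_h/24:.1f} days)")
# -> about 64 h, i.e. the module can bridge several sunless nights.
```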

The camera can scan a 360-degree area from the top of the module (Figure 5). During night hours, when the camera cannot be used, or in weather conditions where optical observation is not possible, a Doppler radar operating in the 10.525 GHz band, located in the middle of the module body, is used instead. The Doppler radar detects the presence of objects in the field using the principle of transmission and reception of electromagnetic waves and, like the camera on top of the module, can scan a 360-degree area from inside the body. Both the camera and the Doppler radar are steered by servo motors. The body height can be increased depending on geographical conditions and terrain features.

Figure 5 Observation module with radar and camera.

Methods

Detection of a target element in the field begins when one of the modules with the motion detection feature detects a moving object (Figure 6). During night periods when the camera cannot be used, or in bad weather conditions, the process instead begins with detection of the field element by the Doppler radar.

Figure 6 Detection of people moving in the field.

In order to determine the location of an element detected in the field, it must be observed by at least two observation modules (M). The first stage is to determine the coordinate of the field element, $T(x, y)$; the coordinate of the module that detects this element, $M(x, y)$; and the observation angle $\alpha_1$ between them. That is, the first step is to find the linear vector formed by the coordinates of the detected target element and the observation module (1):

$\tan(\alpha_1)\left( T_x - M_{x1} \right) = \left( T_y - M_{y1} \right)$    (1)

Accordingly, the first line vector ($Y_1$) can be written as equation (2):

$Y_1 = \tan(\alpha_1) X + M_{y1} - \tan(\alpha_1) M_{x1}$    (2)

At this stage, the observation module that detects the object broadcasts to the other observation modules in the field so that they can also detect the target. The broadcast includes the planar line vector and the coordinate detection angle produced by the module that first detected the movement. Other observation modules within communication range that receive this information turn their cameras or radars in different directions to find the moving object. Any other module that detects the moving object produces a new coordinate line vector from its own point of observation. The coordinate of the target is then deduced by combining the received coordinate vector information with the vector the module produces itself (3):

$Y_2 = \tan(\alpha_2) X + M_{y2} - \tan(\alpha_2) M_{x2}$    (3)

As a result of these inferences, the intersection point of the two vectors gives the coordinates of the field element detected by the two field observation modules. Accordingly, the latitudinal x-coordinate ($T_x$) of the detected field element is calculated as follows (4):

$T_x = \left( M_{y2} - M_{y1} + \tan(\alpha_1) M_{x1} - \tan(\alpha_2) M_{x2} \right) / \left( \tan(\alpha_1) - \tan(\alpha_2) \right)$    (4)

After the $T_x$ coordinate information is obtained, the longitudinal y-coordinate ($T_y$) of the field element can be found as follows (5, 6):

$T_y = \tan(\alpha_1)\, T_x + M_{y1} - \tan(\alpha_1) M_{x1}$    (5)

or

$T_y = \tan(\alpha_2)\, T_x + M_{y2} - \tan(\alpha_2) M_{x2}$    (6)

can be shown as (Figure 7).

Figure 7 Detection of field element (T) as a result of observations of field modules.
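A minimal computational sketch of this triangulation, implementing equations (4) and (5) directly; the function and variable names are illustrative, angles are taken in degrees, and module positions are assumed to be known planar coordinates:

```python
import math

def locate_target(m1, alpha1_deg, m2, alpha2_deg):
    """Intersect the two bearing lines of equations (2) and (3).

    m1, m2     -- (x, y) coordinates of the two observation modules
    alpha*_deg -- observation angles of the modules, in degrees

    Returns the target coordinate (Tx, Ty) per equations (4) and (5),
    or None when the bearings are parallel and no unique fix exists.
    """
    t1 = math.tan(math.radians(alpha1_deg))
    t2 = math.tan(math.radians(alpha2_deg))
    if math.isclose(t1, t2):
        return None  # parallel lines of sight: a third module is needed
    (mx1, my1), (mx2, my2) = m1, m2
    tx = (my2 - my1 + t1 * mx1 - t2 * mx2) / (t1 - t2)   # equation (4)
    ty = t1 * tx + my1 - t1 * mx1                        # equation (5)
    return tx, ty

# Two modules 100 m apart, both sighting the same target:
print(locate_target((0, 0), 45, (100, 0), 135))  # -> (50.0, 50.0)
```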

After obtaining the $T_x$ and $T_y$ coordinate information, the exact coordinates of the element detected in the field are known. As more modules join the process, the real-time movement and direction of the field element can be monitored. Since the terrain cannot be observed continuously with a camera under the differing light levels of day and night, the scanning mode is switched between camera and radar according to the ambient light level: if the camera image falls below a certain light level, terrain scanning is started with radar instead of the camera. The brightness level ($B_L$) of the environment is measured by taking the brightness of each color pixel in the camera image and deriving the overall ambient brightness. Accordingly, the light level of the camera image is determined as the ratio of the summed per-pixel averages of the color (Red-Green-Blue) values ($P_R$, $P_G$, $P_B$) to the total number of camera pixels (7):

$B_L = \dfrac{\sum \left( P_R + P_G + P_B \right) / 3}{P_x \times P_y}$    (7)
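As a worked illustration of equation (7), a short sketch assuming the camera frame arrives as an RGB array; the final division by 255, which scales $B_L$ to a 0-1 range for thresholding, is an addition for convenience rather than part of the paper's formula:

```python
import numpy as np

def brightness_level(frame: np.ndarray) -> float:
    """Ambient brightness B_L per equation (7).

    frame -- RGB camera frame as a (height, width, 3) uint8 array.

    Sums the per-pixel average of the R, G and B channels and divides
    by the total pixel count Px * Py, then scales to [0, 1].
    """
    per_pixel_avg = frame.astype(np.float64).sum(axis=2) / 3.0
    b_l = per_pixel_avg.sum() / (frame.shape[0] * frame.shape[1])
    return b_l / 255.0  # 0.0 = black frame, 1.0 = fully saturated white

# A mid-gray 1920x1080 test frame comes out near 0.5:
print(brightness_level(np.full((1080, 1920, 3), 128, dtype=np.uint8)))
```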

During the day, the brightness of the environment changes with the position of the sun and atmospheric conditions. As the brightness of the camera image decreases toward dark, motion detection begins to fail, so a brightness threshold was determined (Figure 8).

Figure 8 Change of brightness in the camera image depending on the time of day.

When the brightness falls below the 25-30% threshold, the camera can no longer reliably track the movements of objects in the field, so the observation modules continue scanning the area with Doppler radar instead of their cameras. In the morning hours, when camera brightness again permits motion measurement, the system switches back from Doppler radar to camera tracking mode (Figure 9); a sketch of this switching logic follows the figure.

Figure 9 System diagram.
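The text does not state exactly how the 25-30% figure is applied; one plausible reading is as a hysteresis band, so that a module does not oscillate between modes at dusk and dawn. A sketch under that assumption:

```python
def select_scan_mode(b_l: float, current_mode: str) -> str:
    """Switch between camera and Doppler radar scanning.

    b_l          -- normalized brightness from brightness_level()
    current_mode -- "camera" or "radar"

    Treats the 25-30% figure as a hysteresis band (an assumption):
    drop to radar below 0.25, return to camera above 0.30, and keep
    the current mode in between to avoid rapid mode flapping.
    """
    if b_l < 0.25:
        return "radar"
    if b_l > 0.30:
        return "camera"
    return current_mode
```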

Results and discussion

In this study, focusing on the difficulties and limitations of the technologies used in security applications and field observation, a fully automatic field scanning observation network is proposed that combines multiple observation capabilities without the human factor. The proposed system was developed using a Mini PC T6 computer-based observation robot with a camera, a motion detection feature, and a communication range of 7 km. The detection rate ($D_R$) for a 1x1 m object as a function of distance (m) is shown in Figure 10. The observation range of the system with the camera is 1 km, while the detection range with the Doppler radar is around 40 m; this range can be increased by using more powerful Doppler radars.

Figure 10 Graph of detection rate depending on distance.

The measurements were obtained with a standard 1920x1080 resolution wide-angle camera in open terrain without obstacles; better quality and longer-range observation is possible with higher resolution cameras. The developed system consists of modules with motion detection features, a communication group created over the radar network, and robots that act as elements of a multi-agent system, collectively sharing communication facilities. Each radar network module scans for and detects the movements of field elements with its Doppler radar and camera. The modules communicate and cooperate with each other to determine the coordinates of detected elements, a case of multi-agent elements that share the same environment working together. The field observation robots, with their communication capabilities, are designed to monitor the movements of field elements in real time and to support security measures. The motion detection software on the mini computer inside each module performs real-time image processing on the camera feed to detect the presence of movement (a sketch of one such approach follows Figure 11). By placing modules along a border at regular intervals, security control can be provided along the border line and the locations of field elements determined remotely (Figure 11).

Figure 11 A model of the observation network created by the use of observation modules in the field (red circles).
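The paper does not specify the motion detection algorithm running on the mini PC; a common low-cost choice for such hardware is grayscale frame differencing. A purely illustrative OpenCV sketch along those lines, with all thresholds chosen arbitrarily:

```python
import cv2

def detect_motion(prev_gray, frame, min_area=500, diff_thresh=25):
    """Frame-differencing motion check (illustrative, not the paper's code).

    prev_gray   -- previous grayscale frame, or None on the first call
    frame       -- current BGR camera frame
    min_area    -- smallest contour area (pixels) treated as motion
    diff_thresh -- per-pixel intensity change treated as significant

    Returns (moved, gray); the caller feeds gray back in as prev_gray.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (21, 21), 0)  # suppress sensor noise
    if prev_gray is None:
        return False, gray
    delta = cv2.absdiff(prev_gray, gray)
    mask = cv2.threshold(delta, diff_thresh, 255, cv2.THRESH_BINARY)[1]
    mask = cv2.dilate(mask, None, iterations=2)  # merge nearby blobs
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    moved = any(cv2.contourArea(c) >= min_area for c in contours)
    return moved, gray
```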

The modules, which primarily use cameras for detection, perform field scans with their radars at night and when weather conditions render camera observation insufficient. The results of the study show that the proposed fully automatic field observation network can effectively detect, track, and locate field elements. The system copes with adverse conditions thanks to the radar network module, which is unaffected by weather, and the motion detection features of the modules (Figure 12). In addition, the communicating field observation robots achieve more precise detection and monitoring by sharing the coordinates of field elements with each other. The aim of the study is to offer an alternative to today's expensive observation methods, such as manned observation, thermal camera observation, and unmanned aerial vehicles, which often have vulnerabilities.

Figure 12 Observation module used in the field.

Conclusion

In this study, a camera-based field observation and measurement module with Doppler radar support and a motion detection feature is proposed as an alternative to today's thermal camera and lidar field observation systems, which are affected by weather conditions such as fog and haze. The results obtained are satisfactory. In addition to surveillance, the system is suitable for remotely detecting the location of a target object, creating a secure area for military purposes in the field, and police station and property protection duties. A single module can measure and monitor an area larger than 3 km² in camera mode and 5000 m² in radar mode in open, flat terrain, based on the camera and radar limits. Radar support operating in the 10.525 GHz band, which is not affected by weather conditions, multiple observation capability, intelligence acquisition, and remote location calculation are the key contributions emphasized in the study. In particular, as seen in Figure 9, since the observation ability of today's camera equipment degrades below a certain brightness level and in bad weather conditions, solutions based on the cooperation of multiple robot groups are recommended in this study.

Acknowledgments

None.

Conflicts of interest

There is no conflict of interest.

References

  1. Jarvis L, Holland J. Security: a critical introduction. Palgrave Macmillan, Basingstoke. 2014.
  2. Czuryk M, Kostrubiec J. The legal status of local self-government in the field of public security. Studia nad Autorytaryzmem i Totalitaryzmem. 2019;41(1):33–47.
  3. Kostrubiec J. The role of public order regulations as acts of local law in the performance of tasks in the field of public security by local self-government in Poland. Lex Localis - Journal of Local Self-Government. 2021;19:111–129.
  4. Cherep AV, Leshchenko AA. The essential characteristics of the recovery after armed conflicts in the countries of the world. The problems of economy. 2022.
  5. Djeddah C, Pm S. The impact of armed conflict on children: a threat to public health. Contribution of the World Health Organization to the United Nations Study on the Impact of Armed Conflict on Children. 1997.
  6. Canatan K. Immigration perceptions and attitudes of european societies: a sociological approach. Istanbul University Journal of Sociology. 2013;27:317–332.
  7. Yemen Öcal A. A study on conflict-related migrations within the context of public policy. Examining the social and economic impacts of conflict-induced migration. 2019. 24 p.
  8. Gryshova I, Kofman B, Petrenko O. Migration cultures and their outcomes for national security. Journal of Security and Sustainability Issues. 2019;8(3):18.
  9. Beqaj B. Public space, public interest and challenges of urban transformation. IFAC-Papers online. 2016;49(29):320–324.
  10. Minton A. The paradox of safety and fear: Security in public space. Architectural Design. 2018;88(3):84–91.
  11. Biswas S, Anavatti SG, Garratt MA. Path planning and task assignment for multiple UAVs in dynamic environments. Unmanned aerial vehicles, Advances in Nonlinear Dynamics and Chaos (ANDC). 2021;81–102.
  12. Srivastava D, Rakesh Kumar S, Gayathri N, et al. Security aspects and UAVs in socialized regions. Security in IoT Social Networks Intelligent Data-Centric Systems. 2021;229–245.
  13. Kumar GP, Sridevi B. Development of efficient swarm intelligence algorithm for simulating two-dimensional orthomosaic for terrain mapping using cooperative unmanned aerial vehicles. Intelligent Data-Centric Systems. 2020;75–93.
  14. Laghari AA, Awais Khan J, Rashid Ali L, et al. Unmanned aerial vehicles: A review. Cognitive Robotics. 2023;3:8–22.
  15. Chamola V, Pavan K, Aayush Agarwal, et al. Comprehensive review of unmanned aerial vehicle attacks and neutralization techniques. Ad Hoc Netw. 2021;111:102324.
  16. Velagapudi P, Owens S, Scerri P, et al. Environmental factors affecting situation awareness in unmanned aerial vehicles. Proceedings of AIAA. 2009.
  17. Gade R, Moeslund TB. Thermal cameras and applications: a survey. Machine vision and applications. 2014;25:245–262.
  18. ALshukri D, Vidhya Lavanya R, Sumesh EP, et al. Intelligent border security intrusion detection using IoT and embedded systems. 2019 4th MEC International Conference on Big Data and Smart City (ICBDSC). 2019;1–3 p.
  19. Glouftsios G. Governing border security infrastructures: Maintaining large-scale information systems. Security Dialogue. 2020;52(5):452–470.
  20. Upreti Y. Issues in border security of Nepal. Nepal Journals Online. 2021;4(1):152–160.
  21. Fauzan F, Abdullah K, Ahmad MZ. Border security problems in the waters of the Natuna Islands: Between national boundaries and illegal fishing. Journal of International Relations. 2019;3(2):94–114.
  22. Oliveira Martins B, Lidén K, Jumbert MG. Border security and the digitalisation of sovereignty: insights from EU borderwork. European Security. 2022;31(3):475–494.
  23. Koslowski R, Schulzke M. Drones along borders: Border security UAVs in the United States and the European Union. International Studies Perspectives. 2018;19(4):305–324.
  24. Akhilesh MK, Ratnakar MJ, Sandesh MR, et al. Border security system using arduino, Ultrasonic Sensors and IoT. International Research Journal of Engineering and Technology (IRJET). 2020;7(5):3293–3299.
  25. Agbiboa DE. Borders that continue to bother us: the politics of cross-border security cooperation in Africa’s Lake Chad Basin. Commonwealth & Comparative Politics. 2017;55(4):403–425.
  26. Watkins J. Bordering borderscapes: Australia’s use of humanitarian aid and border security support to immobilise asylum seekers. Geopolitics. 2017;22:958–983.
  27. Xu Y, Zhang B, Ge Z, et al. Advances and perspectives towards spent LiFePO4 battery recycling. Journal of Cleaner Production. 2023;434:140077.
  28. Li J, Ma Z. Past and present of LiFePO4: From fundamental research to industrial applications. Chem. 2019;5(1):3–6.
  29. Tseng Y, Huang H, Chen L, et al. Characteristic research on lithium iron phosphate battery of power type. MATEC Web of Conferences. 2018;185:0004.
  30. Wang WX, Wu Y. An overview of recycling and treatment of spent LiFePO4 batteries in China. Resources Conservation and Recycling. 2017;127:233–243.
  31. Parthiban R, Ponnambalam P. An enhancement of the solar panel efficiency: A comprehensive review. Front Energy Res. 2022;10:937155.
  32. Ali ZN, Shekhar S, Singh G. Review paper on solar panel. Journal of emerging technologies and innovative research. 2020;7(7):800–804.
Creative Commons Attribution License

©2024 Demir. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and building upon the work non-commercially.