Fusion Camera Project (Internship)

Description

Situational awareness (SA) is a critical factor for successful decision-making in a wide range of sectors. In the military context, high SA is associated with an accurate perception of environmental elements, which helps in quickly assessing threats and projecting what is likely to occur in the immediate future. Hence, many technology companies are bringing digital technology into the military domain, with the main objective of increasing the situational awareness of ground forces.

One of these companies is Microflown AVISA. Microflown AVISA’s mission is to provide complete 3D situational awareness by measuring acoustic particle velocity with its proprietary sensor, the Acoustic Multi-Mission Sensor (AMMS). The AMMS is an acoustic detection sensor capable of detecting and localizing Rockets, Artillery and Mortars (RAM) and Small Arms Fire (SAF). One of the new products that Microflown intends to introduce to the market shortly is the Castle system, which comprises an array of AMMS sensors and integrates new devices that increase the accuracy and decrease the error rate of the captured data.
In the Castle system, some problems and uncertainties can arise when dealing with close-threat situations. They are twofold: first, the system has difficulty differentiating between audible threats and noise; second, the operator, who occupies a multi-threat hostile environment, lacks a view of the surroundings because of the restricted field of view and therefore responds slowly instead of quickly.
To rectify this problem and allow the operator to process the Castle information quickly and utilize it effectively to react to the threat sources, an imaging system should be integrated with the Castle system that provides a view of the threat source by converting geographic coordinates into pixel coordinates.
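The core of such a conversion can be sketched as follows. This is not the project's actual code: it assumes a camera with a known heading and horizontal field of view, uses a simple linear mapping (ignoring lens distortion), and all function and parameter names are illustrative.

```cpp
#include <cmath>

// Map a threat's azimuth (e.g. as reported by the acoustic sensor)
// onto a pixel column of the camera image. A linear angle-to-pixel
// mapping is assumed for clarity; a real camera model would also
// account for lens projection and distortion.
int azimuthToPixelColumn(double threatAzimuthDeg,   // bearing of the threat, degrees
                         double cameraHeadingDeg,   // bearing of the camera's optical axis
                         double horizontalFovDeg,   // camera horizontal field of view
                         int imageWidthPx) {
    // Signed angle from the optical axis, normalised to (-180, 180].
    double offset = std::fmod(threatAzimuthDeg - cameraHeadingDeg + 540.0, 360.0) - 180.0;
    // Fraction across the image: -FOV/2 maps to column 0, +FOV/2 to the last column.
    double fraction = offset / horizontalFovDeg + 0.5;
    if (fraction < 0.0) fraction = 0.0;  // clamp threats outside the field of view
    if (fraction > 1.0) fraction = 1.0;
    return static_cast<int>(fraction * (imageWidthPx - 1) + 0.5);
}
```

For example, a threat bearing 100° seen by a camera heading 90° with a 90° field of view lands slightly right of the image centre.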
The main requirements of the Fusion Camera system proposed by Microflown are: first, the system should be based on the client-server model and hence consist of two applications — one running on the server platform (AMR) and the other on the client platform. Second, the system must comprise the AMR platform, which is based on the TX6Q-1036 Computer-On-Module board, and the Orlaco cameras as image providers. Third, the programming language is C++, and the GUI of the client application should be written in QML (Qt Modeling Language).
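Pairing a QML GUI with a C++ backend is usually done in Qt by exposing a QObject-derived class to the QML engine. The sketch below shows the standard pattern; the class and property names are illustrative, not taken from the project, and the code requires a Qt build environment (qmake or CMake with AUTOMOC).

```cpp
#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <QQmlContext>
#include <QObject>
#include <QString>

class Backend : public QObject {
    Q_OBJECT
    // Exposes 'status' to QML; bindings re-evaluate when statusChanged fires.
    Q_PROPERTY(QString status READ status NOTIFY statusChanged)
public:
    QString status() const { return m_status; }
    void setStatus(const QString &s) {
        if (s != m_status) { m_status = s; emit statusChanged(); }
    }
signals:
    void statusChanged();
private:
    QString m_status;
};

int main(int argc, char *argv[]) {
    QGuiApplication app(argc, argv);
    Backend backend;
    QQmlApplicationEngine engine;
    // Make the C++ object visible to QML as "backend"; QML can then bind:
    //   Text { text: backend.status }
    engine.rootContext()->setContextProperty("backend", &backend);
    engine.load(QUrl(QStringLiteral("qrc:/main.qml")));
    return app.exec();
}

#include "main.moc"  // needed when the Q_OBJECT class lives in main.cpp
```

With this pattern the QML frontend stays purely declarative while all application logic remains in C++.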

Hardware setup: AMR, Orlaco cameras, switch, AMMS and weather station


The integration between the Fusion Camera and Castle systems

The research questions and challenges addressed during this project were as follows:

  • Regarding the Orlaco camera attributes (resolution, encoding format, bitrate, frame rate), which values can be chosen to grant the best performance and compatibility of the camera with the system as a whole? (This question is answered by the Orlaco Camera Quality Test.)
  • Developing software for an embedded system (the AMR) requires running, debugging, and analyzing the software on the target to test its reliability, so a cross-compiler must be configured and automated to carry out these functions. Hence the questions arise: how can the cross-compiler be prepared together with the Qt and OpenCV libraries, and how can Qt be employed to simplify these processes into one click? (These questions are answered by the research on Cross-Compiling Qt and OpenCV for Embedded System Development.)
  • If the frontend (GUI) of the client application is developed in QML and the backend in C++, how can the interaction between those two ends be achieved?
  • When an event is detected by the Castle system, an interval of time elapses between the instant of detection by the Castle system and the instant of acquisition by the Fusion Camera system. Moreover, the difference between the speeds of sound and light is enormous, so the information retrieved by the camera and by the AMMS sensor is not captured at the same timestamp. That is why frames should be buffered for a certain period of time, but the question is: for how long should the frames be buffered?
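The last question can be reasoned about as follows: sound from a threat at distance d reaches the AMMS roughly d / 343 s after the light reaches the camera, plus whatever detection latency the Castle system adds, so the frame showing the event was captured that long before the acoustic detection. A minimal sketch of this sizing calculation (the latency and frame-rate figures are assumptions, not project values):

```cpp
#include <cmath>

// Speed of sound in air at roughly 20 °C; an approximation for sizing only.
constexpr double kSpeedOfSoundMps = 343.0;

// Seconds between the camera seeing the event and the acoustic detection,
// i.e. how far back in time the frame buffer must reach.
double requiredBufferSeconds(double threatDistanceM, double processingLatencyS) {
    return threatDistanceM / kSpeedOfSoundMps + processingLatencyS;
}

// Number of frames the buffer must hold at a given frame rate.
int requiredBufferFrames(double threatDistanceM, double processingLatencyS, double fps) {
    return static_cast<int>(
        std::ceil(requiredBufferSeconds(threatDistanceM, processingLatencyS) * fps));
}
```

For instance, under these assumptions a threat 343 m away with 0.5 s of detection latency at 30 fps requires at least 45 buffered frames; the buffer should be sized for the farthest detection range the system supports.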

Overview of the achievements:

The Server application in the Fusion Camera system continuously captures 360-degree pictures of the surroundings, and when an event occurs, it handles retrieving the event’s data from the Castle system, whereupon the data is processed and integrated into the captured pictures. The pictures are then sent to the Client application to be displayed.

A quick overview of the achievements that laid the foundation of this system:

  • Orlaco Camera Quality Test: reveals the Orlaco camera attributes most compatible with the current system.
  • Cross-Compiling Qt and OpenCV for Embedded Systems research: shows how an ARM cross-build toolchain can be prepared to ease development for the embedded target.
  • Client Fusion Camera application: an application that runs on a Windows platform and makes a TCP connection to receive data from the Server Fusion Camera application.
  • Server Fusion Camera application: an application that captures from four Orlaco cameras over UDP, processes the data received from the Castle system, integrates it into the corresponding pictures, and sends them immediately to the Client application when it is connected.
Client application main window


On the left: the main window with an informative dialog in case of an error. On the right: a dialog window that allows the client to connect to a specific server.


Server application console (when the Server application runs with a default option)


Error detection system (when the Server application runs with a custom option and an invalid value has been entered)


Details
  • Date: November 7, 2020
  • Categories: Algorithm, C++, Cross-compiling, Linux, Qt & QML
  • Client: Microflown