
EETE FEBRUARY 2013

AUDIO & VIDEO ELECTRONICS

A fundamental challenge in the development of any video processing product is the complexity and diversity of the imagery that the product must process across a large range of applications. Experience has shown that developing a video processing function with testing confined to only a limited data set can introduce significant programme risk, as the discovery of ‘corner case problems’ late in the development may necessitate substantial rework. Consequently, RFEL performed a series of trials using various cameras and platforms, with imagery gathered at different times of the day and under various weather conditions. The data gathered was sufficiently diverse to give confidence that the stabilisation design would be fit for purpose for land, maritime and airborne applications.

Several contrasting approaches can be used for electronic image stabilisation. The first, and most popular, is the use of prominent image features to generate frame-to-frame flow vectors. Typically, this approach involves feature detection and tracking of these features between frames. If the frame-to-frame movement is assumed to be low and high detection thresholds are used, then the implementation can be relatively simple. However, performance and robustness when operating with diverse imagery can be poor.

An alternative approach, adopted by RFEL, is to process image frames on a tile basis in the spatial frequency domain. Such an approach determines the stabilisation corrections through the analysis of much more scene information and can operate effectively even when the scene contains no high-contrast prominent features. Consequently, the spatial frequency approach lends itself to a more robust and accurate stabilisation solution, albeit at the price of significantly higher processing complexity.

Fig. 2: Stabilised image set. The rotation correction can be readily gauged from the edges of the image frame.
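The article does not detail RFEL's spatial-frequency algorithm, but a well-known member of that family is phase correlation, which recovers a frame-to-frame translation from the phase of the cross-power spectrum of two tiles. The following NumPy sketch (function name and details are illustrative, not RFEL's implementation) shows the basic idea:

```python
import numpy as np

def phase_correlate(ref, cur):
    """Estimate the integer (dy, dx) shift of `cur` relative to `ref`
    from the phase of the cross-power spectrum."""
    F_ref = np.fft.fft2(ref)
    F_cur = np.fft.fft2(cur)
    cross = F_cur * np.conj(F_ref)
    cross /= np.abs(cross) + 1e-12      # keep phase information only
    corr = np.fft.ifft2(cross).real     # sharp peak at the shift position
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map wrap-around peak positions to signed shifts
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))
```

Because the estimate uses the whole tile's spectrum rather than a handful of detected features, it remains usable on low-contrast scenes — the property the article attributes to the spatial-frequency approach.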
In terms of the derived requirements, the RFEL stabilisation function was specified to deliver a stable image under the most demanding of applications, covering driving aids for military vehicles, diverse airborne platforms, targeting systems and remote border security cameras. Furthermore, the algorithm design was required to stabilise images subjected to two-dimensional translation and rotation from both static and moving platforms. It is envisaged that the stabilisation function will support many different physical equipment installations; as such, the centre of rotation could be within the camera or external to it, and the stabilisation algorithm must be able to cope with either case.

The stabilisation function was required to provide real-time correction at frame-rates of up to 150Hz and at resolutions of up to 1080p, for various imaging devices including both daylight and infrared cameras. For example, a 1080p colour camera operating at 8 bits per pixel with a frame-rate of 60Hz necessitates operation with an input data rate of about 1 Gbit/s. An FPGA-based hardware implementation provides the computational resources needed to process input data rates of several gigabits per second with the selected spatial frequency stabilisation method. However, even with the inherent processing power of an FPGA, the implementation has to be carefully tailored to satisfy the stringent latency and power consumption constraints. Given that the stabilisation function is likely to be only one component of a larger processing suite, it was also necessary to minimise the number of gates and external memory accesses used.

The level of stabilisation accuracy achieved under a very diverse and demanding range of evaluation test data was typically within ±1 pixel, even when subjected to random frame-to-frame displacements of up to ±25 pixels in the x and y directions and with a frame-to-frame rotational variation of up to ±5°.
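The quoted 1 Gbit/s figure follows directly from the frame geometry; a quick back-of-envelope check (assuming 8 bits per pixel, as in the article's example):

```python
# Input data rate for the 1080p example: 1920x1080 pixels,
# 8 bits per pixel, 60 frames per second.
width, height = 1920, 1080
bits_per_pixel = 8
fps = 60

bits_per_second = width * height * bits_per_pixel * fps
print(f"{bits_per_second / 1e9:.3f} Gbit/s")   # prints "0.995 Gbit/s"
```

At the upper requirement of 150Hz the same arithmetic gives roughly 2.5 Gbit/s, which is why the article speaks of processing "several gigabits per second".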
The performance of the stabilisation function is illustrated in figures 1 and 2, using a small number of frames from a daylight camera. The performance of the stabilisation design is shown using five consecutive frames from a sequence, when subjected to a random frame-to-frame rotation as large as ±1° around the centre of the image. The design has proven to be extremely flexible and can be used for both static and moving camera platforms.

Although this capability can be delivered through an FPGA-only implementation, further capability and performance can be achieved through additional software functions hosted on the ARM multicore processors embedded in the latest FPGAs. The stabilisation design was originally implemented on a development platform for Xilinx’s new Zynq-7000 All Programmable SoC, which hosts a dual-core ARM Cortex-A9 MPCore processor. This development board allowed early revisions of the design to be matured against the target device’s resource and processing constraints. The processor was accelerated by exploiting RFEL’s existing IP Core components, which reside in the fabric of the FPGA and have been optimised and tested over the last 10 years.

A specific hardware design was also undertaken that provides the stabilisation IP Core, together with other video processing functions, in a fully integrated custom hardware system-on-module. This module can interface with many different standards, such as analogue video, CameraLink and GigE-based protocols such as GigEVision.

RFEL’s video image stabilisation processing capability is now available as an IP Core, optimised for FPGA. The fully integrated hardware system-on-module that incorporates the stabilisation function will be available in Q2/2013 and may be ruggedised to military standards. This stabilisation system offers exemplary performance even when the camera is subjected to extreme unwanted shifts and rotations.
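The article does not describe how the estimated corrections are applied to the video. As a purely illustrative sketch (not RFEL's implementation), an estimated shift and rotation about the image centre can be undone by inverse-mapping each output pixel back into the input frame; here with nearest-neighbour sampling in plain NumPy:

```python
import numpy as np

def stabilise_frame(frame, dy, dx, angle_deg, centre=None):
    """Resample `frame` at coordinates rotated by `angle_deg` about the
    centre and offset by (dy, dx), undoing an estimated frame motion.
    Nearest-neighbour sampling; border pixels are clamped."""
    h, w = frame.shape
    cy, cx = centre if centre is not None else ((h - 1) / 2, (w - 1) / 2)
    a = np.deg2rad(angle_deg)
    cos_a, sin_a = np.cos(a), np.sin(a)
    ys, xs = np.mgrid[0:h, 0:w]
    # rotate output coordinates about the centre, then add the shift back
    yr = cy + (ys - cy) * cos_a - (xs - cx) * sin_a + dy
    xr = cx + (ys - cy) * sin_a + (xs - cx) * cos_a + dx
    yi = np.clip(np.rint(yr).astype(int), 0, h - 1)
    xi = np.clip(np.rint(xr).astype(int), 0, w - 1)
    return frame[yi, xi]
```

The `centre` parameter mirrors the requirement noted earlier that the centre of rotation may lie within the camera or external to it; a real-time FPGA implementation would of course stream this warp rather than build full coordinate grids.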
When this capability is coupled with a low power and low latency implementation, the design becomes highly suited to military and security applications, as well as more demanding commercial applications. In addition, the IP Core can be readily integrated with existing processor hardware with negligible impact on size and weight.

