18 June 2001
Air Force Research Laboratory, September 2000
Comptek Amherst Systems, Inc. delivers a reconfigurable vision system based on the biomimetic concept of foveal vision.
Comptek Amherst Systems, Inc. recently delivered a novel vision system to the Munitions Directorate under an ongoing Phase II Small Business Innovation Research effort. This real-time, non-mechanical, reconfigurable vision system acquires and tracks objects based on the biological concept of foveal vision.1 The camera allows wide field-of-view surveillance with localized high resolution in an area of interest. This electronically reconfigurable system reduces the typical image-processing data throughput bottleneck to allow real-time target acquisition and tracking in one compact system. The design is small and inexpensive, and it eliminates the need for large power supplies and mechanical gaze-steering mechanisms.
This vision system is beneficial to any bandwidth-constrained application that requires simultaneous wide field-of-view surveillance, localized high resolution, and fast frame rates. Applications that benefit both the military and commercial sectors include search and rescue, intruder detection and surveillance, and autonomous security robots. The system could also be used for wireless networked cameras, particularly in military applications such as network-centric warfare and remote surveillance. State-of-the-art seekers for precision munitions and unmanned aerial vehicles (UAVs) are military-specific applications, while assembly-line part inspection and quality control benefit the commercial sector.
Electro-optic sensors are evolving from single-color 128 × 128 arrays to multicolor 1000 × 1000 arrays. This increase in sensor size produces a data rate that on-board processors cannot handle. Biological vision systems avoid this problem by incorporating a small area of high resolution (fovea) within a wide field of view having a lower resolution. This biologically-inspired vision system is able to track targets over a wide field of view (60° for the lens currently being tested) at a 30 Hz video rate using a personal computer (PC) with no special digital signal processing (DSP) equipment. The hybrid tracker can locate and track even partially occluded targets. The vision system was successfully used to detect and track human faces1 and a model train on a simple circular track.
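As a rough illustration of the bandwidth argument, the short sketch below compares the pixel rate of a full-resolution 1000 × 1000 readout with a foveal readout consisting of a coarse wide-field image plus one small high-resolution window. The specific window sizes are hypothetical and not taken from the system described here.

```python
# Back-of-the-envelope pixel-rate comparison (illustrative numbers only).
# Full-resolution 1000 x 1000 readout at 30 Hz versus a foveal readout:
# a decimated wide-field image plus one small high-resolution window,
# both re-read every frame.

FRAME_RATE_HZ = 30

full_array = 1000 * 1000                      # pixels per full-resolution frame
foveal_readout = (250 * 250) + (128 * 128)    # coarse periphery + one window

print(f"full-array rate : {full_array * FRAME_RATE_HZ / 1e6:.1f} Mpixel/s")
print(f"foveal rate     : {foveal_readout * FRAME_RATE_HZ / 1e6:.1f} Mpixel/s")
print(f"reduction       : {full_array / foveal_readout:.0f}x")
```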
The prototype reconfigurable vision system consists of three main hardware components: a foveal camera, power supply, and host PC. The camera contains a visible imager and lens along with support electronics and interface connectors. The imager uses a complementary metal oxide semiconductor (CMOS) 128 × 128 reconfigurable multi-resolution active pixel sensor developed at the National Aeronautics and Space Administration (NASA) Jet Propulsion Laboratory (JPL).2 The imager features direct pixel access for reading high-resolution subsets of the imager array and on-chip pixel averaging to achieve a wide field of view at lower resolutions. The high-resolution subsets become dynamic windows that track moving objects. Resolution reduction occurs directly on the image sensor and requires minimal overhead, eliminating the traditional imager readout bottleneck. Noise suppression also occurs on-chip. The imager is located behind a C-mount camera lens with a 60° field of view. The camera lens is interchangeable; however, a tradeoff exists between field of view and resolution. Support electronics and interface connectors are industry standard commercial off-the-shelf (COTS) components.
The foveal camera consumes very little power. The 60-mW camera can be powered for several hours by a single 9-volt battery or indefinitely by a standard AC power adapter.
The third component of the prototype foveal vision system is a Windows NT® host PC. Targeting algorithms, video processing, and configuration control all execute on the Pentium® with no additional processors. The PC houses two commercial add-on interface boards. A digital input/output (I/O) board (National Instruments PCI-DIO-32HS) relays real-time control signals from the PC to the camera for imager configuration control. A high-speed analog-to-digital (A/D) board (National Instruments PCI-MIO-16E-1) digitizes and stores the video stream for access by the processor and video display hardware. Drivers for camera interfacing also reside on the PC; the interface performs control signal generation and video data acquisition.
The prototype foveal vision system incorporates three software modules implemented as real-time priority programs in Windows NT®: a Detection and Tracking Algorithm (DTA) module, a Control module, and a Data Acquisition module. The DTA module processes imagery and generates window-of-interest requests. The Control module converts window requests into digital control signals that reconfigure the imager (updated at the sensor frame rate). The Data Acquisition module collects output from the A/D board and sends the requested window to the DTA. The DTA then generates the next window request.
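The following sketch outlines the closed-loop frame cycle described above. The module and type names (Window, configure_imager, read_window, process) are illustrative placeholders, not the actual Comptek software interfaces.

```python
# Minimal sketch of the per-frame closed loop: configure imager, acquire the
# requested window, process it, and issue the next window-of-interest request.
# All interfaces shown are hypothetical stand-ins for the real modules.

from dataclasses import dataclass

@dataclass
class Window:
    x: int          # upper-left corner, sensor pixels
    y: int
    size: int       # window edge length, sensor pixels
    octave: int     # 0 = foveal, 1 = perifoveal, 2 = peripheral

def run_tracking_loop(dta, control, daq, frames):
    """One iteration per sensor frame: reconfigure, acquire, process."""
    request = Window(x=0, y=0, size=128, octave=2)   # start wide and coarse
    for _ in range(frames):
        control.configure_imager(request)            # digital I/O to camera
        image = daq.read_window()                     # digitized video from A/D board
        request = dta.process(image, request)         # next window-of-interest
```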
The DTA processes incoming imagery and establishes a window of interest. The algorithm uses three levels of resolution: peripheral (two octaves below maximum imager resolution), perifoveal (one octave below maximum imager resolution), and foveal (maximum imager resolution). Figure 1 is an example of an image with all three resolutions: peripheral resolution in the background and two windows of interest, one with perifoveal resolution and one with foveal resolution. Peripheral resolution exploits fast frame rates and a wide field of view to determine the initial region of interest. Perifoveal resolution is an intermediate resolution used to refine region-of-interest properties after a region of interest is detected. Foveal resolution allows fine-tuning of the region of interest and subsequent extraction of a template for tracking.
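The sketch below illustrates the octave relationship among the three resolution levels, assuming a 128 × 128 imager with 2 × 2 on-chip pixel averaging per octave; it is a simplified reading of the scheme, not code from the DTA itself.

```python
# Octave-based resolution levels for a hypothetical 128 x 128 imager.
# Each octave of on-chip averaging cuts the pixels read out by 4x while
# the field of view stays the same; foveal readouts are normally small
# sub-windows at full resolution rather than the whole array.

MAX_RES = 128   # full imager resolution (foveal)

LEVELS = {
    "peripheral": 2,   # two octaves below maximum: 32 x 32 readout
    "perifoveal": 1,   # one octave below maximum: 64 x 64 readout
    "foveal":     0,   # maximum imager resolution
}

for name, octave in LEVELS.items():
    pixels = MAX_RES >> octave
    print(f"{name:10s}: {pixels} x {pixels} pixels read out")
```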
The DTA combines centroid-based and correlation-based techniques to detect and track moving targets. The centroid-based mode acquires a reliable target template; the algorithm then shifts to the correlation-based mode. If the correlation-based mode loses track of the target, the algorithm returns to the centroid-based mode to establish another target template. This hybrid tracking algorithm keeps the target position stable in the window of interest even in the presence of background clutter.
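A minimal sketch of the hybrid switching logic follows. The acquire and correlate callables and the confidence threshold are hypothetical stand-ins for the DTA internals, which are not detailed in this article.

```python
# Hybrid centroid/correlation tracker as a two-state step function.
# acquire_fn and correlate_fn are supplied by the caller; both names and
# the 0.7 threshold are illustrative assumptions.

ACQUIRE, TRACK = "acquire", "track"

def hybrid_step(state, frame, template, acquire_fn, correlate_fn, threshold=0.7):
    """Advance the tracker by one frame; return (state, template, position)."""
    if state == ACQUIRE:
        position, template = acquire_fn(frame)        # centroid-based acquisition
        if template is None:                          # no reliable template yet
            return ACQUIRE, None, None
        return TRACK, template, position              # template locked; switch modes
    position, score = correlate_fn(frame, template)   # correlation-based tracking
    if score < threshold:                             # track lost in clutter
        return ACQUIRE, None, None                    # fall back to acquisition
    return TRACK, template, position
```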
Aircraft, munitions, and UAVs that can extract regions of interest from large field-of-view sensors will greatly ease the job of both human operators and automatic target recognition algorithms. Today, a pilot must monitor several distributed sensors to identify incoming missile threats. With a biologically-inspired vision system, the pilot can respond more quickly when a single sensor autonomously identifies an incoming missile threat and warns the pilot. Biologically-inspired vision systems can conceptually aid all types of fire-and-forget munitions in much the same way. The Small Smart Bomb, a 250-lb, fixed-target munition concept, can use a reconfigurable vision system in the terminal seeker to increase munition accuracy by reducing navigation errors. A future air superiority missile can use a biologically-inspired vision system to combine guidance and fuzing functions into an integrated, compact, and precise system. Reconfigurable vision systems are also quite valuable for the latest anti-materiel munitions, which must survey a wide field of view while gleaning enough information about specific objects on the ground to reliably classify them as targets.
The next generation of this reconfigurable vision system is currently under development at Comptek Amherst Systems, Inc. and will incorporate many upgrades. The CMOS active pixel sensor will be increased to a 256 × 256 array. A/D conversion of the sensor output will take place near the chip inside the camera housing, reducing noise and eliminating the need for the A/D board in the host PC. The digital I/O board will also become unnecessary, since all communications with the host PC will be carried over a standard Ethernet interface. The system will also be optimized to increase frame rates up to 1 kHz (for scenarios with sufficient light and small readout windows). DSP-based COTS boards could easily be added to implement more complex tracking algorithms. Finally, Comptek Amherst Systems, Inc. and NASA JPL are adding the ability to simultaneously output up to three individually reconfigurable windows from the same exposure.
This closed-loop targeting system is beneficial to any bandwidth-constrained application that requires simultaneous wide field-of-view surveillance, localized high resolution, and fast frame rates. The ability to switch modes of operation on a frame-by-frame basis gives this system a robustness and error-recovery capability that traditional mechanically-based vision systems have not achieved.
Mr. David Stack and Dr. Cesar Bandera of Comptek Amherst Systems, Inc. and Dr. Guang Yang, Christopher Wrigley, and Dr. Bedabrata Pain of NASA's Jet Propulsion Laboratory performed this research. This article is a summary of previously published research.1 Mr. Nick Rummelt and Mr. Paul McCarley of the Munitions Directorate (MN) are collaborating researchers and contributing authors. This research effort is part of MN's biomimetic technology thrust.3
Figure 1. Example of three-window foveal imagery: peripheral resolution (wide field-of-view surveillance), perifoveal resolution (target detection), and foveal resolution (target tracking)
Capt. Amanda S. Birch of the Air Force Research Laboratory's Munitions Directorate wrote this article. For more information, contact TECH CONNECT at (800) 203-6451 or visit the web site at http://www.afrl.af.mil/techconn/index.htm.
Reference document MN-00-01.
References
1. Stack, David J., Cesar Bandera, Christopher Wrigley, and Bedabrata Pain. "A Real-Time Reconfigurable Foveal Target Acquisition and Tracking System." Proc. SPIE, vol. 3692-34, April 1999.
2. Pain, B., S. E. Kemeny, R. Panicacci, L. Matthies, and E. R. Fossum. "Multiresolution Image Sensor." IEEE Trans. on Circuits and Systems for Video Technology, vol. 7, no. 4 (1997): 575-583.
3. McCarley, Paul. "Advances in Biologically Inspired On/Near Sensor Processing." Proc. SPIE, vol. 3698, April 1999.
The 2000 Defense Manufacturing Conference (DMC '00) will be held 27-30 November 2000, at the Tampa Convention Center and the Tampa Marriott Waterside Hotel in Tampa, Florida. DMC '00 is hosted by the US Air Force and the Joint Defense Manufacturing Technology Panel. Based on the theme "Foundation for Global Security," the agenda is structured to provide the 1,000 participants with an overview of defense manufacturing and sustainment, as well as detailed technical discussions relating to the various initiatives and technology areas currently being pursued. Attendees will be presented with the status of government and industry programs, and with a vision for the future of defense manufacturing and sustainment. DMC '00 will also feature over 120 government and industry exhibits and a poster session that will highlight the latest technology thrusts. In recent years, the exhibits and the poster session have become an integral conference feature, providing a setting that encourages face-to-face discussion between exhibitors and conference attendees.
About the Author:
Captain Amanda "Mandy" Sue Birch is currently the Executive Officer to the Commander, Munitions Directorate, Air Force Research Laboratory, Air Force Materiel Command, Eglin Air Force Base, Florida. She assists the commander in the direction of more than 400 military, civilian, and contractor personnel who develop, integrate, and transition the science and technology for air-launched munitions. The directorate performs research on precision guidance, missile guidance and control, computational mechanics, smart sub-munitions, warheads, and explosives.
Captain Birch was commissioned through the United States Air Force Academy in 1996 as a distinguished graduate. She holds a Master of Science degree in Mechanical Engineering from the Massachusetts Institute of Technology, where she was a National Science Foundation Graduate Research Fellow. During her assignment at the Munitions Directorate, Captain Birch also served in the Seeker Image and Signal Processing Branch, where she evaluated autonomous target acquisition algorithms and demonstrated advanced biomimetic seeker technologies. She also served as a lead engineer in the Guidance Simulation Branch, where she modeled and simulated flight performance capabilities of new munition concepts.