A System Approach to Adaptive Multi-modal Sensor Designs
Project Status: Active (Started: 2008)
Recently, a great deal of effort has been put into adaptive and tunable multimodal sensor designs to address the challenging problems of detecting and identifying targets in highly cluttered, dynamic scenes. While these efforts have created, or will soon create, novel adaptive multimodal sensors, the resulting systems have not yet met the expectations of real-world applications. Historically, the development of such a sensor system has begun with overall specifications, followed by specifications for the individual elements, then component development, system integration, and testing. For complex systems, this process is slow, expensive and inflexible because of the large number of requirements, constraints and design options that must be resolved.
Therefore, we propose to demonstrate an iterative system approach to adaptive multimodal sensor designs. This approach integrates tools we have developed for the physics-based simulation of complex scenes and targets, and for sensor modeling, with a workflow management system that supports the integration of hardware and software modules. The goal is to reduce development time and system cost while achieving better results through an iterative process of simulation, evaluation and refinement of critical elements.
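The simulate-evaluate-refine cycle described above can be illustrated with a minimal sketch. All names here (simulate_sensor, evaluate, refine, the aperture parameter, and the performance threshold) are hypothetical placeholders for illustration only, not part of the project's actual tools:

```python
# Minimal sketch of an iterative design loop: simulate a candidate design,
# evaluate its performance against a requirement, and refine if it falls
# short. All functions and parameters are illustrative placeholders.

def simulate_sensor(design):
    """Stand-in for a physics-based scene/sensor simulation. Performance
    is modeled as closeness of one design parameter to an optimum that
    the designer does not know in advance."""
    optimum = 0.7
    return 1.0 - abs(design["aperture"] - optimum)

def evaluate(score, threshold=0.95):
    """Check simulated performance against a specified requirement."""
    return score >= threshold

def refine(design, score, step=0.05):
    """Naive refinement: nudge the parameter, keep the better direction."""
    candidate = dict(design, aperture=design["aperture"] + step)
    if simulate_sensor(candidate) > score:
        return candidate
    return dict(design, aperture=design["aperture"] - step)

def design_loop(design, max_iters=50):
    """Iterate simulation, evaluation and refinement until the
    requirement is met or the iteration budget is exhausted."""
    for _ in range(max_iters):
        score = simulate_sensor(design)
        if evaluate(score):
            return design, score
        design = refine(design, score)
    return design, simulate_sensor(design)

final_design, final_score = design_loop({"aperture": 0.2})
```

In a real system each stage would be far heavier (a scene/sensor simulation, a scenario-specific metric suite, a human-in-the-loop design revision), but the control flow, and the way it avoids committing to a full build before the design converges, is the same.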
As a case study, we use a peripheral-fovea design as an example of how trade-offs can be made within a system context. This design is inspired by biological vision systems and aims at real-time imaging with a hyperspectral/range fovea and a panoramic peripheral view. The design and the related data exploitation algorithms will be simulated and evaluated in our general framework. The results of this project will be an optimized design for the peripheral-fovea structure and a system model for how sensor systems can be developed within a simulation context. These results will then be available as tools for the design of other sensor systems.
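The core trade-off in a peripheral-fovea design is between resolution and data volume: a small region of interest is captured at full detail while the wide field of view is subsampled. A toy sketch of that sampling scheme (not the project's actual design; the function and its parameters are illustrative) might look like:

```python
# Illustrative peripheral-fovea sampling: keep a small high-resolution
# foveal window at full detail, subsample the panoramic periphery, and
# thereby cut the total pixel readout. Hypothetical example code.
import numpy as np

def peripheral_fovea_sample(frame, fovea_center, fovea_size, periphery_step=4):
    """Split one image band into a full-resolution fovea and a
    subsampled periphery.

    frame          : 2-D array (one band of a panoramic image)
    fovea_center   : (row, col) of the fovea centre
    fovea_size     : side length of the square foveal window
    periphery_step : subsampling stride outside the fovea
    """
    r, c = fovea_center
    half = fovea_size // 2
    fovea = frame[r - half:r + half, c - half:c + half].copy()
    periphery = frame[::periphery_step, ::periphery_step].copy()
    return fovea, periphery

frame = np.arange(256 * 256, dtype=np.float32).reshape(256, 256)
fovea, periphery = peripheral_fovea_sample(frame, (128, 128), 32)
# Data volume: 32*32 + 64*64 pixels instead of the full 256*256.
```

In the actual design the fovea would carry the hyperspectral/range channels and the periphery the panoramic view, so the trade-off to optimize is between foveal size, spectral depth, and peripheral coverage under a fixed bandwidth budget.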
The PIs and other researchers at CCNY and RIT will combine their complementary expertise (data simulation and data management at RIT; sensor design and data exploitation at CCNY) to produce a system approach for adaptive multimodal sensor designs. The combined hyperspectral data/sensor simulation and management tools will support detailed system simulation with synthetic image data from virtual instruments. These data can then be used to evaluate design trade-offs, image processing algorithms and sensor fusion, using performance metrics that can be specified for different scenarios.
AFOSR 2011 Summer Faculty Fellowship Program, Multimodal Layered Sensing for Modeling and Detection, Zhigang Zhu (PI), Edgardo Molina (PhD Student), T. Wang (PhD Student), 05/23/2011 - 07/30/2011
AFOSR DISCOVERY CHALLENGE THRUSTS (DCTs), Award #FA9550-08-1-0199, A System Approach to Adaptive Multi-modal Sensor Designs, 04/01/2008 - 03/31/2012
PI: Professor Zhigang Zhu (CCNY)
Co-PI: Professor Harvey Rhody (RIT)
DURIP, Award #W911NF-08-1-0531, Equipment for Networked Autonomous Mobile Systems, 09/26/2008-09/25/2009
PI: Professor Jizhong Xiao (EE)
Co-PI: Professor Zhigang Zhu (CS)