Laboratory for Imaging Algorithms and Systems (LIAS)

Overview

The Laboratory for Imaging Algorithms and Systems was founded in 2001 to develop imaging systems and algorithms and carry them to the point of prototype use. LIAS operates four airborne imaging systems. Imagery from these sensors is processed into georeferenced images for integration and display with standard geographic information systems (GIS). The systems include:

WASP: High-performance visible/IR (RGB, SWIR, MWIR, LWIR) mapping system

WASP LITE: Low-cost, lightweight 7-band imaging system with reconfigurable spectral filters

MISI: 70-band visible, 10-band infrared imaging spectrometer

CAMMS: Compact, lightweight mapping system for light aircraft or UAV applications

Imaging algorithms are developed by the group, often in collaboration with the DIRS laboratory, for the extraction of information from remote sensing imagery. A camera geometry calibration laboratory with a unique imaging cage has been constructed for the calibration of infrared and visible camera equipment. An airborne system software architecture has been developed to manage real-time collection, processing and delivery of image and sensor information to end users such as emergency responders.

The LIAS Ground Station Laboratory develops sensors that can be deployed on the surface or in the water to measure environmental parameters for environmental and emergency response monitoring. The ground sensor systems and the airborne systems can be used together to provide an integrated view of the environment using imagery and sensor data.

The WASP system is unique in its ability to rapidly collect imagery in both visible and infrared bands and to process and deliver that imagery to users in various locations in real time. The airborne data processing (ADP) architecture supports sensors, algorithms and communications as modules in a network environment. This enables distributed processing between remote and ground stations and immediate response to users. The uniqueness of this framework led Leica Geosystems, Inc. to name the LIAS laboratory a Center of Excellence in Photogrammetry and Remote Sensing.
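The modular ADP idea described above (sensors, algorithms and communication links plugged into a common framework) can be sketched roughly as follows. All class and method names here are hypothetical illustrations, not the actual ADP interfaces:

```python
# Hypothetical sketch of a modular airborne data processing pipeline:
# sensors, algorithms, and delivery links are interchangeable stages.
# Names are illustrative only, not the real ADP API.

class Module:
    """Base class: a pipeline stage that transforms a data record."""
    def process(self, record):
        raise NotImplementedError

class SensorModule(Module):
    """Stands in for a sensor producing band-tagged imagery frames."""
    def __init__(self, band):
        self.band = band
    def process(self, record):
        record["band"] = self.band
        return record

class GeoRefModule(Module):
    """Stands in for the on-the-fly geo-referencing step."""
    def process(self, record):
        record["georeferenced"] = True
        return record

class DownlinkModule(Module):
    """Stands in for network delivery to a ground station."""
    def __init__(self, sink):
        self.sink = sink
    def process(self, record):
        self.sink.append(record)
        return record

def run_pipeline(modules, record):
    """Pass a record through each module in order."""
    for m in modules:
        record = m.process(record)
    return record

ground_station = []
pipeline = [SensorModule("LWIR"), GeoRefModule(), DownlinkModule(ground_station)]
run_pipeline(pipeline, {"frame": 1})
# ground_station now holds one geo-referenced LWIR frame record
```

Because each stage shares the same interface, modules can run on different machines in a network and be recombined per mission, which is the property that enables distributed processing between airborne and ground stations.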

LIAS is made up of a lead faculty member, associated research faculty and staff, and laboratory staff. The faculty lead is Harvey Rhody, and associated faculty and research staff include Eli Saber, John Kerekes, John Schott, Vincent Amuso, Soheil Dianat, David Messinger, Carl Salvaggio, Bob Kremens, and Don Light. The LIAS staff includes Don McKeown, Jason Faulring, Bob Krzaczek, Bill Hoagland and Colleen Desimone. Graduate students working on laboratory-supported or related projects are Xiaofeng Fan (Rhody) and Prudhvi Gurram (Saber).

LIAS is housed on the second floor of the IT Collaboratory building where it moved in 2006. The current facilities include about 10,000 square feet of laboratory space for the construction and calibration of equipment and the development of software systems.

Major Projects in 2008

SOFIA Data Cycle System (DCS)—This year saw the continuation of the grant for the development of the DCS for the Stratospheric Observatory for Infrared Astronomy (SOFIA). This project began several years ago to develop a complete observation management system for general investigators at this major NASA observatory. The DCS was successfully delivered for integration this year. Bob Krzaczek is the lead software architect on this project, which is undergoing the final stages of integration at NASA. The SOFIA Data Cycle System is a novel component of the observatory and will be in daily use when the observatory begins flying in late 2008. Figure 1 shows the layout of the SOFIA observatory on a specially modified Boeing 747.

Figure 1: NASA’s SOFIA Observatory

Integrated Sensing Systems Initiative (ISSI)—ISSI is a program that follows on the highly successful WASP program. The WASP system, originally conceived as a new airborne system for the detection and mapping of wildfires in support of the US Forest Service, was successfully demonstrated as a tool for disaster response and law enforcement during 2007. A key objective is to demonstrate the integration of WASP imagery information products into the disaster response workflow. Demonstration participants included New York State Electric and Gas (NYSEG), Monroe County Office of Emergency Management, New York State Department of Environmental Conservation, and the New York State Police.

RIT conducted a demonstration flight over a river area that had flooded the previous year, when a flood map had been built over a three-day period. In the RIT demonstration, the same flood area, covering over 40 linear miles, was mapped in less than two hours. Figure 2 shows a mosaic of the RIT image map overlaid on a LANDSAT image base map. The inset shows the detailed resolution available in the mosaic. The resolution of the satellite image base map is only about 50 ft, while the mosaic imagery from the RIT system is about 1 ft.

Figure 2: RIT system maps a flood zone in 2 hours that took 3 days by conventional means

Development of the WASP system has progressed to include the use of a digital data downlink that transfers geo-referenced imagery from the aircraft to a ground station in real time, that is, as the imagery is collected. Thus the WASP system has demonstrated multi-band visible and infrared imaging, on-the-fly geo-referencing of the imagery to a map, and real-time delivery of the imagery to a remote ground station.
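The geometry behind on-the-fly geo-referencing can be sketched in simplified form: given the aircraft position, altitude, and a pixel's look angle from nadir, intersect the view ray with a flat ground plane. A real system uses a full camera model, IMU attitude (roll, pitch, yaw), and terrain elevation; the function and parameters below are illustrative assumptions only.

```python
import math

# Simplified geo-referencing sketch: intersect a pixel's view ray
# with a flat ground plane at z = 0. Real airborne systems account
# for camera calibration, aircraft attitude, and terrain relief.

def pixel_to_ground(aircraft_x, aircraft_y, altitude_m,
                    look_angle_x_rad, look_angle_y_rad):
    """Map a look angle from nadir to a ground-plane coordinate."""
    ground_x = aircraft_x + altitude_m * math.tan(look_angle_x_rad)
    ground_y = aircraft_y + altitude_m * math.tan(look_angle_y_rad)
    return ground_x, ground_y

# A nadir-pointing pixel maps straight down to the aircraft position.
print(pixel_to_ground(1000.0, 2000.0, 1500.0, 0.0, 0.0))  # (1000.0, 2000.0)
```

Applying this mapping to every pixel as a frame arrives, using the navigation solution recorded at exposure time, is what allows the imagery to be draped onto a map before it reaches the ground station.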

Technology Transfer Projects—LIAS, under sponsorship from the New York Office for Science, Technology, and Innovation (NYSTAR) and in collaboration with Geospatial Systems Inc. (GSI), has delivered to GSI a very lightweight and compact integrated airborne mapping camera system suitable for installation in small unmanned aerial vehicles (UAVs). This system, called CAMMS, is the prototype for a new product introduced by GSI called the Tactical Airborne Mapping and Surveillance System (TAMSS), as seen in Figure 3.

Figure 3: GSI TAMSS Developed with support from RIT

NGA University Research Initiative (NURI)—The goal of this three-year NURI project is the development of semi-automated tools for the construction of models, such as those used in DIRSIG, from a variety of imagery and sensor data. We are in the third year of this project and have developed techniques to register imagery from visible and infrared airborne camera systems (WASP and WASP Lite) as well as lidar imagery and some line-scanner hyperspectral imagery (MISI). This past year we integrated feature extraction tools and the ability to construct 3D models, as shown in Figure 4.

Figure 4: Automated feature extraction tools developed under the NURI project
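The multi-modal registration work described above builds on mutual-information matching (the MMI algorithm in the group's publications). As a rough sketch of the underlying measure, the fragment below computes mutual information between two quantized "images" given as flat intensity lists; a real registration algorithm would search over geometric transforms for the one that maximizes this score. The function here is an illustrative assumption, not the published algorithm.

```python
import math
from collections import Counter

# Mutual information between two equally sized, quantized intensity
# sequences. Registration methods maximize this over candidate
# alignments; here we only compute the score itself.

def mutual_information(a, b):
    """MI in bits, from joint and marginal intensity histograms."""
    n = len(a)
    pa = Counter(a)            # marginal histogram of image a
    pb = Counter(b)            # marginal histogram of image b
    pab = Counter(zip(a, b))   # joint histogram of aligned pixels
    mi = 0.0
    for (x, y), c in pab.items():
        pxy = c / n
        mi += pxy * math.log2(pxy / ((pa[x] / n) * (pb[y] / n)))
    return mi

# Identical (perfectly co-registered) images give MI equal to the
# image entropy; statistically independent images give MI near zero.
img = [0, 0, 1, 1, 2, 2]
print(mutual_information(img, img))  # log2(3) ≈ 1.585
```

Because mutual information depends only on the statistical relationship between intensities, not on the intensities matching directly, it works across modalities such as visible versus infrared, which is why it suits WASP-to-WASP Lite registration.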

Group Publications

S.R. Lach, S.D. Brown and J.P. Kerekes, "Semi-Automated DIRSIG Scene Modeling from 3D LIDAR and Passive Imaging Sources", SPIE Laser Radar Technology & Applications XI, Defense & Security Symposium, Orlando, FL, April 2006.

Stephen R. Lach and John P. Kerekes, "Multisource Data Processing for Semi-Automated Radiometrically-Correct Scene Simulation", URBAN/URS, Paris, April 2007.

Xiaofeng Fan and Harvey Rhody, "A Harris Corner Label Enhanced MMI Algorithm for Multi-Modal Airborne Image Registration", 2nd International Conference on Computer Vision Theory and Applications (VISAPP 2007), Barcelona, March 8-11, 2007.

Prudhvi Gurram, Eli Saber and Harvey Rhody, "A Novel Triangulation Method for Building Parallel-Perspective Stereo Mosaics", Electronic Imaging Symposium, San Jose, CA, January 2007.

Seminars and Presentations

Stephen R. Lach, Semi-Automated DIRSIG Scene Modeling from 3D LIDAR and Passive Imaging Sources, February 7, 2007 (Digital Imaging and Remote Sensing seminar).

Harvey Rhody, Automated Imagery Analysis & Scene Modeling, NARP Symposium, September 13-15, 2006, Washington, DC