Planetary environments are among the most hazardous, remote, and uncharted in the solar system. They are also critical to the search for life, human exploration, resource extraction, infrastructure, and science. These applications represent a prime unexploited opportunity for automated modeling, yet robots remain under-utilized for this purpose. There is an urgent need to explore, document, and evaluate these spaces with robots, and to do so more accurately and efficiently than the current state of the art allows.
This thesis introduces Lumenhancement: the use of active illumination and intensity imaging, informed by optical domain knowledge, to enhance geometric modeling. While planetary environments are among the most challenging for robots, they share distinctive appearance constraints that can be exploited for sensing. Their barren, dry, dark, and rocky nature enables a variety of physics-based vision techniques that are not applicable in other field environments. Synergistic integration of calibrated imagery with traditional range sensing yields models with increased accuracy, sample density, and readability. Because this work leverages prevalent existing illumination, such as sunlight, along with common imaging sensors and post-processing capability, it promises broad significance.
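One concrete example of such an appearance constraint is the Lambertian image irradiance equation, I = albedo · max(0, n·l), which ties observed intensity directly to surface geometry and lighting. The sketch below is illustrative only, not the thesis implementation; the function names are mine, and the Lambertian assumption is an idealization that barren, dry, rocky surfaces only approximately satisfy.

```python
import numpy as np

def lambertian_intensity(normal, light_dir, albedo=1.0):
    """Predicted intensity of a Lambertian facet under a distant light source."""
    n = np.asarray(normal, dtype=float)
    l = np.asarray(light_dir, dtype=float)
    n = n / np.linalg.norm(n)
    l = l / np.linalg.norm(l)
    # Clamp at zero: facets turned away from the light receive no direct flux.
    return albedo * max(0.0, float(np.dot(n, l)))

def recover_albedo(intensity, normal, light_dir):
    """Invert the shading model for albedo when geometry is known,
    e.g., supplied by a range sensor. Returns None for unlit facets."""
    shading = lambertian_intensity(normal, light_dir, albedo=1.0)
    return intensity / shading if shading > 0.0 else None
```

With known sun direction and range-derived normals, this kind of inversion lets intensity images contribute per-point material information to a geometric model.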
Contributions from this thesis extend the state of the art in several ways. The discussion is anchored by experimental characterization of the material and geometric properties that govern appearance in the planetary domain. Material reflectance characterization using gonioreflectometry has produced the first empirical BRDF database of planetary materials. Studies of surface geometry have resulted in the first expansive database of comparative range sensor performance. The correctness of common vision assumptions in this domain, their implications for intensity-image techniques, and their relevance to other domains are addressed. Novel methods for range and image fusion are devised to enhance and optimize aspects of model quality in the context of these principles, including geometric super-resolution, image-directed optimal sampling, and material classification. New possibilities for visualizing Lumenhanced models are also presented. Finally, implementations on mobile mapping robots and field experimentation at a coal mine and a moon-yard are documented.
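To give a flavor of image-guided geometric super-resolution, the sketch below upsamples a coarse depth map using a high-resolution intensity image as a guide, in the style of joint bilateral upsampling. This is a generic stand-in rather than the thesis method; the function name, parameters, and weighting scheme are all assumptions made for illustration.

```python
import numpy as np

def joint_bilateral_upsample(depth_lo, image_hi, scale, sigma_s=2.0, sigma_r=0.1):
    """Fill a high-res depth map by averaging coarse depth samples, weighted by
    spatial closeness and by intensity similarity in the guide image."""
    H, W = image_hi.shape
    out = np.zeros((H, W))
    rad = int(np.ceil(2 * sigma_s))  # neighborhood radius in low-res pixels
    for y in range(H):
        for x in range(W):
            yc, xc = y / scale, x / scale  # position in the low-res grid
            num = den = 0.0
            for j in range(int(yc) - rad, int(yc) + rad + 1):
                for i in range(int(xc) - rad, int(xc) + rad + 1):
                    if not (0 <= j < depth_lo.shape[0] and 0 <= i < depth_lo.shape[1]):
                        continue
                    # Guide intensity at the coarse sample's high-res location.
                    gy = min(int(j * scale), H - 1)
                    gx = min(int(i * scale), W - 1)
                    w_s = np.exp(-((j - yc) ** 2 + (i - xc) ** 2) / (2 * sigma_s ** 2))
                    w_r = np.exp(-(image_hi[y, x] - image_hi[gy, gx]) ** 2
                                 / (2 * sigma_r ** 2))
                    num += w_s * w_r * depth_lo[j, i]
                    den += w_s * w_r
            out[y, x] = num / den if den > 0.0 else 0.0
    return out
```

The range term `w_r` is what injects image detail into the geometry: depth discontinuities tend to co-occur with intensity edges, so the interpolation avoids blurring across them.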