We have developed an SI-traceable narrow-band tunable radiance source based on an optical parametric oscillator (OPO) and an integrating sphere for the calibration of spectroradiometers. The source is calibrated with a reference detector over the ultraviolet/visible spectral range with an uncertainty of <1%. As a case study, a CubeSat spectroradiometer has been calibrated for radiance over its operating range from 370 nm to 480 nm. To validate the results, the instrument has also been calibrated with a traditional setup based on a diffuser and an FEL lamp. Both routes show good agreement within the combined measurement uncertainty. The OPO-based approach could be an interesting alternative to the traditional method, not only because of reduced measurement uncertainty, but also because it directly allows for wavelength calibration and characterization of the instrumental spectral response function and stray light effects, which could reduce calibration time and cost.
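The statement that both calibration routes agree within the combined measurement uncertainty is commonly quantified with a normalized error, E_n = (x_1 - x_2) / sqrt(U_1^2 + U_2^2), where |E_n| <= 1 indicates agreement. The following minimal Python sketch illustrates this check; the responsivity values, uncertainties, and variable names are illustrative assumptions, not results from this work.

```python
import math

def normalized_error(x1: float, U1: float, x2: float, U2: float) -> float:
    """Normalized error E_n between two independent results.

    x1, x2 are measured values and U1, U2 their expanded (k=2)
    uncertainties; |E_n| <= 1 is the usual criterion for agreement
    within the combined measurement uncertainty.
    """
    return (x1 - x2) / math.sqrt(U1 ** 2 + U2 ** 2)

# Hypothetical radiance responsivities of one channel (arbitrary units),
# with assumed expanded uncertainties -- not values from the paper.
r_opo, U_opo = 1.002, 0.010   # OPO / integrating-sphere route
r_fel, U_fel = 0.995, 0.020   # FEL lamp / diffuser route

en = normalized_error(r_opo, U_opo, r_fel, U_fel)
print(f"E_n = {en:+.2f} -> {'agreement' if abs(en) <= 1 else 'disagreement'}")
```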
Calibration of spectral imaging instruments is a prerequisite for many applications, in particular in the field of Earth observation. In this contribution we will present a novel traceability route for calibrating spectral imaging instruments, based on a tunable radiance source that is referenced to a primary detector standard.
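One way a tunable narrow-band source enables wavelength calibration and characterization of the instrumental spectral response function (ISRF) is to step the source across a detector channel's passband and record the normalized signal at each set wavelength. The sketch below fits a Gaussian ISRF to such a scan; the wavelength grid, noise level, and Gaussian line-shape assumption are illustrative, not taken from the work described above.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(wl, amp, center, fwhm):
    """Gaussian line shape parameterized by its FWHM."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return amp * np.exp(-0.5 * ((wl - center) / sigma) ** 2)

# Hypothetical scan: set wavelengths of the tunable source (nm) and the
# dark-corrected, source-power-normalized signal of one detector channel.
set_wavelengths = np.arange(438.0, 442.1, 0.2)
signal = gaussian(set_wavelengths, 1.0, 440.03, 1.2)
signal += np.random.default_rng(0).normal(0.0, 0.01, set_wavelengths.size)

# Fit a Gaussian ISRF: the fitted center gives the wavelength calibration
# for this channel, and the FWHM its spectral resolution.
popt, _ = curve_fit(gaussian, set_wavelengths, signal, p0=[1.0, 440.0, 1.0])
print(f"center = {popt[1]:.3f} nm, FWHM = {popt[2]:.3f} nm")
```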
Three-dimensional (3D) reconstruction has become a fundamental technology in applications ranging from cultural heritage preservation and robotics to forensics and virtual reality. As these applications grow in complexity and realism, the quality of the reconstructed models becomes increasingly critical. Among the many factors that influence reconstruction accuracy, the lighting conditions at capture time remain among the most influential, yet widely neglected, variables. This review provides a comprehensive survey of classical and modern 3D reconstruction techniques, including Structure from Motion (SfM), Multi-View Stereo (MVS), Photometric Stereo, and recent neural rendering approaches such as Neural Radiance Fields (NeRFs) and 3D Gaussian Splatting (3DGS), while critically evaluating their performance under varying illumination conditions. We describe how lighting-induced artifacts such as shadows, reflections, and exposure imbalances compromise reconstruction quality and how different approaches attempt to mitigate these effects. Furthermore, we identify fundamental gaps in current research, including the lack of standardized lighting-aware benchmarks and the limited robustness of state-of-the-art algorithms in uncontrolled environments. By synthesizing knowledge across fields, this review aims to provide a deeper understanding of the interplay between lighting and reconstruction and to outline future research directions, emphasizing the need for adaptive, lighting-robust solutions in 3D vision systems.