Significant advances have been made in developing normal-incidence-sensitive quantum-dot infrared photodetectors (QDIPs) whose spectral responses are tunable through the applied bias voltage. This tunability makes it possible to build a spectral imaging system in the infrared range around a single QDIP, without any dispersive optical element in front of the detector. To achieve such adaptivity, algorithms must be developed to find an optimized set of operating bias voltages that maximizes the spectral content of the output data while reducing data redundancy. In this paper, we introduce a new, general definition of the signal-to-noise ratio (SNR) in spectral space, based on a recently developed geometrical model of spectral imaging. With this SNR definition, a scene-independent set of bias voltages is first selected to maximize the average SNR of the sensor. Bias voltages can then be added to or removed from this set. This dynamic optimization is carried out throughout the imaging process, so that a balance between information content and data volume is maintained at all times.
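The abstract does not spell out the selection procedure, but the following sketch illustrates one plausible organization of the bias-set optimization: a greedy pass that adds bias voltages while they raise the average spectral-space SNR, followed by a pruning pass that drops voltages contributing mostly redundant information. The function and parameter names (`select_biases`, `average_snr`, `min_gain`) and the greedy strategy itself are illustrative assumptions, not the paper's method; the SNR definition derived from the geometrical spectral imaging model is represented only by the placeholder callable `snr_of_subset`.

```python
# Illustrative sketch only: greedy selection of QDIP bias voltages that maximizes
# an average spectral-space SNR. snr_of_subset(frozenset_of_biases) stands in for
# the paper's geometric SNR definition, which is not reproduced here.

def average_snr(bias_set, snr_of_subset):
    """Placeholder for the spectral-space SNR averaged over the selected biases."""
    return snr_of_subset(frozenset(bias_set))

def select_biases(candidate_biases, snr_of_subset, min_gain=0.01):
    """Greedily add the bias giving the largest SNR improvement, then drop any
    bias whose removal costs less than min_gain (hypothetical threshold)."""
    selected = []
    remaining = list(candidate_biases)

    # Forward pass: add biases while they still improve the average SNR.
    while remaining:
        current = average_snr(selected, snr_of_subset) if selected else 0.0
        best_bias, best_snr = None, current
        for b in remaining:
            trial = average_snr(selected + [b], snr_of_subset)
            if trial > best_snr + min_gain:
                best_bias, best_snr = b, trial
        if best_bias is None:
            break
        selected.append(best_bias)
        remaining.remove(best_bias)

    # Backward pass: prune biases whose absence barely changes the average SNR,
    # mirroring the abstract's "add or remove" dynamic adjustment.
    for b in list(selected):
        without = [x for x in selected if x != b]
        if without and (average_snr(selected, snr_of_subset)
                        - average_snr(without, snr_of_subset)) < min_gain:
            selected = without
    return selected
```

In a running imager, a routine of this kind could be re-invoked periodically with updated SNR estimates, so that the active bias set tracks the trade-off between information content and data volume described above.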