This page describes three hyperspectral images, which we refer to as the Blue, Yellow, and Blue_Y images. They were taken in a special laboratory room [5, 6] where the illumination was under computer control. The first two images (Blue and Yellow) are of the same set of surfaces under two different illuminants, one bluish and the other yellowish. The third image (Blue_Y) was acquired under the same illuminant as the Blue image; for this image, however, the gray background surface was replaced with a yellow one. The light reaching the camera from the yellow background under the bluish illuminant has (almost) the same CIE XYZ tristimulus coordinates as the light reaching the camera from the gray surface under the yellowish illuminant. You can see a plot of the bluish and yellowish illuminant spectra by clicking here.
These images were acquired by my colleagues and me as part of a psychophysical study of color constancy. You are welcome to use these images, but you should cite us, just as you would when you use data from any scientific publication. For these images, in addition to citing the web site, you might also consider citing Brainard et al.:
Each hyperspectral image consists of 31 monochromatic image planes, corresponding to wavelengths between 400 and 700 nm (inclusive) in 10 nm steps.
Each monochromatic image is stored in a raw format as follows.
For a listing of a MATLAB function that will read the individual images into a matrix variable, click here. (This function has only been tested on the Macintosh platform; a byte-order problem may crop up if you use it on other platforms.)
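As an illustration of the byte-order issue, here is a minimal Python sketch of reading one raw plane. The pixel type (16-bit unsigned), big-endian byte order, absence of a file header, and the function name are all assumptions made for illustration -- adjust them to match the actual raw format described here.

```python
# Minimal sketch of reading one raw monochromatic plane. The dtype string
# ">u2" (big-endian 16-bit unsigned) and the headerless layout are
# assumptions, not the documented format -- check the format description.
import numpy as np

def read_raw_plane(path, height, width, dtype=">u2"):
    """Read one plane; an explicit byte order in the dtype makes the
    result identical on any host platform."""
    data = np.fromfile(path, dtype=np.dtype(dtype))
    return data.reshape(height, width)
```

Specifying the byte order explicitly in the dtype (rather than relying on the host's native order) is what avoids the portability problem noted above.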
The monochromatic images in the archive are named according to the convention ROOT0001, ROOT0002, ..., ROOT0031, where ROOT is BLUE, YELLOW, or BLUE_Y depending on the image archive. These individual images correspond to 400 nm, 410 nm, ..., 700 nm respectively. In the archive, the monochromatic images are stored in the subdirectory RAW.
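The naming convention above can be sketched in code. This is a minimal Python example; the function name and the relative RAW path are our own choices, not part of the archive.

```python
# Map the plane index (0001 .. 0031) to its wavelength (400 nm .. 700 nm,
# 10 nm steps) and build the file path under the RAW subdirectory.
import os

def plane_files(root, raw_dir="RAW"):
    """Return (wavelength_nm, filename) pairs for one image archive."""
    pairs = []
    for i in range(1, 32):                 # planes 0001 .. 0031
        wavelength = 400 + 10 * (i - 1)    # 400, 410, ..., 700 nm
        pairs.append((wavelength, os.path.join(raw_dir, f"{root}{i:04d}")))
    return pairs

files = plane_files("BLUE")                # e.g. (400, "RAW/BLUE0001"), ...
```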
The individual monochromatic images were taken using different f-stops and different exposure durations. To produce a calibrated hyperspectral image, each individual image must be scaled by a calibration factor. The calibration factors were determined by comparing the image data at a reference location to direct radiometric measurements (PhotoResearch PR-650) of the light coming from this location. For the two images here, the reference location was the white paper centered on the large gray cardboard. The white paper was a Munsell matte N 9.5/ paper. To obtain an estimate of the illuminant incident at the reference location, multiply the reference spectrum by 1.12. The calibration factors and reference spectrum are provided as calibration.mtxt (Macintosh text format) and calibration.utxt (UNIX text format).
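The calibration step above can be sketched as follows. The array shapes and function names are assumptions on our part; the per-plane factors and the reference spectrum would come from calibration.mtxt/calibration.utxt.

```python
# Sketch of the calibration described above: scale each raw plane by its
# calibration factor, and estimate the illuminant at the reference
# location by scaling the reference spectrum by 1.12.
import numpy as np

def calibrate(raw_planes, factors):
    """raw_planes: (31, H, W) raw data; factors: 31 per-plane factors."""
    factors = np.asarray(factors, dtype=float)
    return raw_planes * factors[:, None, None]

def illuminant_estimate(reference_spectrum):
    """Illuminant incident at the reference location (the white paper)."""
    return 1.12 * np.asarray(reference_spectrum, dtype=float)
```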
To view a hyperspectral image, it is useful to reduce it to an RGB format. The RGB images are useful for obtaining a sense of the appearance of the image, but the values should not be used for calculations -- they are specific to a particular monitor and rendering procedure. For calculations, the underlying spectral data should be used. For the images shown, the rendering to RGB was done in two basic steps. First, we used the hyperspectral images together with the CIE 1931 color matching functions to calculate the XYZ tristimulus values at each image location. We then used monitor calibration information (for an Apple 20" color monitor that was available in our lab) to compute RGB image values that produce an image that is pixel-by-pixel metameric to the XYZ image derived from the hyperspectral image. Some regions of the hyperspectral image were out of gamut -- producing the appropriate monitor metamer would require negative power on one or more guns, or more light than the monitor could produce. These regions were brought into gamut using a combination of scaling and clipping before gamma correction. The particular monitor data used for gamma correction has a relatively high threshold before variation in input values has any effect on the light output. Because of this and the gamut mapping, the rendered images on this page can appear washed out. We have since developed procedures for producing nicer-looking RGB images; please contact David Brainard if you are interested in knowing more about this issue.
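The first rendering step (spectra to CIE XYZ) can be sketched as below. The color matching functions must be supplied separately, sampled at the same 31 wavelengths as the image planes; treating the tristimulus integral as a simple sum over 10 nm bins, and omitting any normalization constant, are simplifications on our part.

```python
# Sketch of the spectra-to-XYZ step: approximate the tristimulus
# integrals as a sum over the 31 wavelength samples (400:10:700 nm).
import numpy as np

def spectra_to_xyz(hsi, cmf, d_lambda=10.0):
    """hsi: (31, H, W) calibrated radiance; cmf: (31, 3) columns are
    the xbar, ybar, zbar color matching functions."""
    # Contract the wavelength axis; result has shape (3, H, W).
    return np.tensordot(cmf, hsi, axes=([0], [0])) * d_lambda
```

The second step (XYZ to RGB via the monitor calibration, gamut mapping, and gamma correction) depends on monitor-specific data and is not sketched here.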
Each image archive contains the raw images, the files calibration.mtxt and calibration.utxt, and a TIFF file containing the rendered RGB image.
The archives are in UNIX tar format. For Macintosh users, this format is easily unpacked using the Stuffit application.
1. The images were acquired by placing interference filters in front of the CCD camera lens. We have since discovered that this arrangement leads to small artifacts in the individual images. These artifacts arise because the interference filters contain small pinholes and the spatial structure of the light coming through these pinholes shows up in some of the individual images. The effect is most pronounced for short wavelength images. On the whole these artifacts are quite small and they are not visible in the rendered RGB images. We have since improved our camera design to greatly reduce this problem.
2. There seems to be a small amount of geometric shift and distortion between the individual monochromatic images. This may be due to camera movement between exposures or to variations in the overall optical system that arise as one interference filter is substituted for the next. On the whole these effects are small, but for some purposes they may be important. We have not attempted geometric correction of the individual images. We did place two grid targets in the images however. It should be straightforward to write software that extracts the locations of these targets and uses this information to evaluate and perhaps correct for geometric image distortions.
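As one way to evaluate the shifts mentioned above, here is a minimal phase-correlation sketch in Python. This is our own illustration, not the authors' procedure, and it assumes the distortion between two planes is a pure integer translation (real inter-plane distortion may also include scaling or warping, for which the grid targets would be needed).

```python
# Estimate the integer (row, col) shift of plane b relative to plane a
# by phase correlation: the normalized cross-power spectrum of the two
# planes has an inverse FFT that peaks at the translation.
import numpy as np

def estimate_shift(a, b):
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    cross = np.conj(A) * B
    cross /= np.abs(cross) + 1e-12          # normalized cross-power spectrum
    corr = np.fft.ifft2(cross).real
    idx = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices past the midpoint wrap around to negative shifts.
    return tuple(i if i <= s // 2 else i - s for i, s in zip(idx, corr.shape))
```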
3. We have not yet fully characterized the optical MTF of our camera system. Certainly, however, chromatic aberrations mean that the MTF varies as a function of wavelength.
J. E. Farrell, J. M. Kraft, M. D. Rutherford, J. D. Tietz, and P. L. Vora helped with camera design, camera calibration, and/or image acquisition. The work was supported primarily by a philanthropic gift from the Hewlett-Packard Corporation.