I am attempting to process multispectral images from a MicaSense RedEdge using Pix4D 4.0.25. Typically I accept the defaults for Ag. Multispectral and let the data process. The resulting output appears to be 32-bit instead of the 16-bit output from my camera. Is there a setting to force the output to 16-bit depth?
The camera output is usually in 16-bit format, since the digital counts are stored as integers, whereas the results from step 3 are saved as floats since they contain the transparency level as well. At the moment this option is set as the default and it cannot be changed by the user.
Thank you for the note. So, to undo this integer-to-float conversion, I should be able to convert the data directly from 32-bit float to 32-bit integer and then convert the bit depth, correct?
Unfortunately, it is not expected to work that directly. It is more a matter of how we export the reflectance map, which is 32-bit float by default. I am not familiar with how exactly you could change the bit depth, or whether you can, using third-party software.
The reflectance values are not assigned by projecting the entire image, but pixel by pixel, by computing a weighted average that favours the nadir image. Hence the bit depth of the reflectance map is not tied to the bit depth of the input images, no matter the size of the latter.
I hope this gives you some answers.
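If 16-bit integers are needed downstream, one third-party workaround (a sketch, not a Pix4D feature) is to rescale the 0-1 float reflectance values to the full unsigned 16-bit range yourself, for example with NumPy:

```python
import numpy as np

# Hypothetical rescaling: map 0-1 float reflectance onto the full
# unsigned 16-bit range. NO_DATA handling is omitted for brevity.
reflectance = np.array([[0.0, 0.25], [0.5, 1.0]], dtype=np.float32)
clipped = np.clip(reflectance, 0.0, 1.0)          # guard against values > 1
as_uint16 = np.round(clipped * 65535).astype(np.uint16)
```

The same rescaling can also be done on the exported GeoTIFF with GDAL's `gdal_translate` (`-ot UInt16 -scale`), but keep in mind the rescaled values are no longer direct reflectance fractions.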
Thanks for your notes. I just reprocessed some of my data with a slightly modified version of Ag. Multispectral. I ticked the option for merging the GeoTIFF outputs and found that the data are now in 16 bit unsigned. Is this conversion documented? I did not see it in the manual.
The rules for data conversion should be the following:
- mosaic creation follows the input data type. In the case of multispectral cameras, it is very likely to be 16 bits-per-pixel input, 16 bits-per-pixel output (this depends on the exact camera model). In the case of RGB cameras, the output is mapped to RGBA by default (i.e. an alpha channel is added) even though the input images may not have transparency. This means RGB mosaics are 8 bits per channel * 4 channels = 32 bits per pixel, or 8 bits per channel * 3 channels = 24 bits per pixel if you check the option “no transparency”.
- DSM and reflectance maps are single-channel, real-valued images. The DSM stores the height of each pixel in Float32 format; the reflectance map stores a value that should be between 0 and 1 by definition. There is no transparency (i.e. alpha channel), as invalid pixels are marked with a special NO_DATA value set in the header of the GeoTIFF file (you can ask your GIS software or the GDAL tools to query this value for you in case you are curious about it). These files are therefore 32 bits per channel * 1 channel = 32 bits per pixel.
- index images are originally real-valued images obtained by combining the reflectances. Hence, they are also real-valued, 32 bits.
- coloured index files, as exported from the index calculator, are colour mappings of these real-valued index maps. The colour-mapped images are created so that they can be opened by regular image viewers (unlike real-valued images), hence they are RGBA: 8 bits per channel * 4 channels = 32 bits per pixel.
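As a minimal sketch of the NO_DATA convention described above (the sentinel value of -10000.0 is an assumption for illustration; query the actual value from your file's header, e.g. with `gdalinfo`), invalid pixels can be masked out before computing statistics:

```python
import numpy as np

# Assumed NO_DATA sentinel for illustration; the real value is stored in
# the GeoTIFF header (query it with e.g. `gdalinfo` or your GIS software).
NO_DATA = -10000.0
reflectance = np.array([[0.2, NO_DATA], [0.8, 0.4]], dtype=np.float32)
valid = np.ma.masked_equal(reflectance, NO_DATA)  # hide invalid pixels
mean_reflectance = float(valid.mean())            # stats over valid pixels only
```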
Are you seeing deviations from these conversion rules?
Thank you for your notes.
Can you tell me if the 32-bit values in the reflectance map created from processing the raw imagery from the MicaSense RedEdge (Blue, Green, Red, Red Edge, and NIR bands) represent the fraction of measured reflectance from 0 to 1 (i.e. 0-100%)?
Yes, that is correct: the values represent the measured reflectance from 0 to 1, which one can interpret as 0-100%.
I take the 0 to 1 to mean that all values should be between 0 and 1. For my bands, I have three which go beyond 1: red (0.024 to 1.152), red edge (0.025 to 1.152), and NIR (0.088 to 1.382). I would assume this means those bands were incorrectly calibrated, but is there any other reason why the values might go beyond 1?
Hey Crop science Ncsu,
All values should normally be between 0 and 1. If you have a value of 1.15, that means you have 115% reflectance, which is scientifically not possible. However, if you have snow on the ground, a car, or other outlier objects in the field, reflectance values sometimes get skewed. I have had reflectance values above 100% before, and I was able to fix most of those by using the Camera only calibration method. Moreover, inaccurate reflectance values may be caused by over- or under-exposed calibration images, so make sure the calibration images make sense as well.
Thank you for your notes! After your comments, I decided to threshold my reflectance images and found that only the NIR band has enough values beyond 1 to warrant concern.
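A simple way to quantify this kind of thresholding (a sketch with made-up sample values) is to count the fraction of pixels above 1 in each band:

```python
import numpy as np

# Sample NIR reflectance values (made up for illustration).
nir = np.array([0.9, 1.05, 1.38, 0.7, 0.95], dtype=np.float32)
over = int(np.count_nonzero(nir > 1.0))  # pixels exceeding 100% reflectance
fraction_over = over / nir.size          # share of out-of-range pixels
```

A small `fraction_over` suggests isolated outlier pixels (cars, roofs, specular glints); a large one points at a calibration problem.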
For carrying out Camera only calibration, did you use the method outlined here by Pix4D?
Is there a method for quantitatively validating over or under exposure of calibration images?
Hey Crop science Ncsu,
I haven’t used the MTP method that you highlighted; the Camera only (Step 3) option is still automatic processing. If you are concerned about not enough images being calibrated, you may try Alternative calibration and a 0.5 image scale (Step 1). I have only used MTPs a few times, and only when other methods don’t work, since it is time consuming. As for quantitatively validating over- or under-exposure of calibration images, there is really no threshold that I apply; it is a visual inspection, although I always try to pick the calibration image that has the most variation.
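For what it’s worth, one possible quantitative check (my own assumption, not an official Pix4D or MicaSense method) is to flag a 16-bit calibration image when too many pixels sit at the extremes of the dynamic range:

```python
import numpy as np

# Heuristic exposure check (an assumption, not an official method): flag a
# 16-bit calibration image when more than 1% of pixels are saturated or black.
def exposure_flags(img, sat_frac=0.01, dark_frac=0.01):
    saturated = np.count_nonzero(img >= 65535) / img.size
    dark = np.count_nonzero(img == 0) / img.size
    return saturated > sat_frac, dark > dark_frac

calib = np.full((100, 100), 30000, dtype=np.uint16)  # well-exposed interior
calib[:5, :] = 65535                                 # 5% of rows saturated
over_exposed, under_exposed = exposure_flags(calib)
```

The 1% thresholds are arbitrary starting points; tune them against images you have already judged good or bad by eye.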
Selim, thank you for that!
I have processed the images in Pix4D using Ag. Multispectral. The reflectance tiles were then merged in ArcGIS into a composite band raster. I used the display settings that MicaSense recommends (see here), but my RGB image appears brown when it should be green (see below). Is there some step I am missing? Possibly the red reflectance data was poorly calibrated?
To obtain values between 0 and 1 using the RedEdge, one is advised to use a radiometric calibration target.
@Selim thank you for your answers. Indeed, since reflectance factors can reach values beyond 1, especially for strongly forward-reflecting surfaces such as snow, shiny objects, etc., I believe that in this case you could see this behaviour over the roof or some parts of the asphalt. You might find useful information in the paper “Reflectance quantities in optical remote sensing—definitions and case studies”.
@John to check if this is the case, you could generate the index map for each band and, under Color Map and Prescription, set the min to 0 and the max to 1, and make sure the Clamped option is not selected. This way, the pixels outside this range will not be displayed and are therefore easier to identify. Another way is to look at the statistics: if the max is very high and the average is pretty low, then most probably there are some isolated pixels with these high values.
If this is not the case, the presence of reflectance values above 1 could be a sign of problems in the image acquisition or image processing (for this it would be useful if you posted the quality report, the .p4d file, and the log file here).
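The statistics check described above can be sketched as follows (synthetic values: one outlier pixel of 1.38 among ordinary reflectances around 0.3):

```python
import numpy as np

# Synthetic band: 9999 ordinary pixels around 0.3 plus one 1.38 outlier.
band = np.concatenate([np.full(9999, 0.3, dtype=np.float32),
                       np.array([1.38], dtype=np.float32)])
band_max = float(band.max())
band_mean = float(band.mean())
# A high max combined with a much lower mean points to isolated outliers,
# not a band-wide calibration problem.
isolated_outliers = band_max > 1.0 and band_mean < 0.5
```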
Before going further, I would like to make sure that the concept of corrections is well defined. Camera only is not a calibration method but a radiometric correction method. In our software you can apply 3 radiometric corrections:
* camera corrections (vignetting, dark current, ISO, etc.); more info here.
* sun irradiance corrections: this option is available when a sunshine sensor has been used during the image acquisition; for more have a look at this article.
* sun angle corrections, in case you have a rigid geometry between the sensor and the camera.
On top of that, you can use a radiometric calibration target to calibrate your images; more info in this article.
As for the RGB composite, it would be better to ask MicaSense which is the best ratio for doing a composite using RedEdge imagery :-). Also, you might find the following video useful, but keep in mind that it is third-party software, so we have not tested it:
Hey Crop science Ncsu,
Like Ina mentioned above, Camera only would affect the reflectance values, but it wouldn’t affect the image calibration, since image calibration happens at the Initial Processing step and reflectance values/radiometric calibration happen at the DSM, Orthomosaic and Index step. It is highly recommended to use calibration targets to get accurate reflectance values. As for the reflectance factor values, they would be something like Blue 0.56, Green 0.56, Red 0.55, NIR 0.50 and Red Edge 0.54, but you should contact MicaSense and provide your calibration target number, since the values are specific to each target and may differ slightly.
As for the discoloration of RGB’s and how to make an RGB composite from a MicaSense RedEdge camera please take a look at the following link:
I mostly use QGIS for creating an RGB composite orthomosaic from the MicaSense RedEdge, but I have also created an orthomosaic using ArcMap. You may get better results in QGIS. Even after applying these corrections, your RGB from the MicaSense RedEdge may not look exactly like an RGB orthomosaic from an actual RGB camera; your soil may still look discolored. Moreover, in QGIS there are other settings under Layer Properties/Style; you may try changing the Brightness, Saturation and Contrast settings.
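For reference, the kind of min/max stretch that QGIS or ArcGIS applies for display can be approximated in NumPy (a sketch assuming co-registered bands with reflectance values in 0-1; the band arrays here are made up):

```python
import numpy as np

# Percentile stretch per band, then stack into an 8-bit RGB composite.
def stretch(band, lo_pct=2, hi_pct=98):
    lo, hi = np.percentile(band, [lo_pct, hi_pct])
    return np.clip((band - lo) / (hi - lo), 0.0, 1.0)

# Made-up reflectance bands standing in for the merged GeoTIFF tiles.
red = np.linspace(0.02, 0.30, 100).astype(np.float32)
green = np.linspace(0.04, 0.40, 100).astype(np.float32)
blue = np.linspace(0.01, 0.20, 100).astype(np.float32)
rgb8 = (np.stack([stretch(red), stretch(green), stretch(blue)], axis=-1)
        * 255).astype(np.uint8)
```

A 2-98% percentile stretch discards extreme pixels (e.g. reflectance above 1) before mapping to display range, which is one reason a stretched composite can look more natural than raw band values.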
Furthermore, I am guessing you are talking about the top right corner, which seems to be the only vegetated area from what I can see, or are you talking about the entire orthomosaic?