Setting Validity Mask with Python Script

I’m trying to set a validity mask for the KMeansClassification application from a Python script, but I get a ‘Floating point exception (core dumped)’ (full log below) when the application is set up as follows:

	# Assumes the OTB Python bindings are on the PYTHONPATH
	import otbApplication

	kmeans = otbApplication.Registry.CreateApplication("KMeansClassification")
	kmeans.SetParameterString("in", chmname+"_clip.tif")     # input image
	kmeans.SetParameterInt("ts", 1000)                       # training set size
	kmeans.SetParameterInt("ram", 2048)                      # RAM limit (MB)
	kmeans.SetParameterInt("nc", inputNC)                    # number of classes
	## setting the validity mask
	kmeans.SetParameterString("vm", "vm.tif")
	kmeans.SetParameterString("out", chmname+"_kmeans.tif")
	kmeans.SetParameterOutputImagePixelType("out", 1)        # output pixel type
	kmeans.ExecuteAndWriteOutput()

When I omit kmeans.SetParameterString("vm", "vm.tif") the script runs fine. If I supply the same validity mask on the command line with otbcli_KMeansClassification it also runs fine. Incidentally, despite setting kmeans.SetParameterInt("ram", 2048), the application still uses the default RAM limit of 128 MB. I suspect I’m overlooking something with regard to how the parameters ought to be set, but any advice would be appreciated.
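For what it’s worth, here is a small sketch of how I try to verify that the parameters were actually registered before execution; the getter names (GetParameterInt, GetParameterString) are what I assume the bindings expose, so treat this as illustrative rather than definitive:

	# Sketch: inspect parameter values before running (assumes these getters exist in the bindings)
	print("ram:", kmeans.GetParameterInt("ram"))      # set to 2048, yet the log reports 128 MB
	print("vm :", kmeans.GetParameterString("vm"))    # expected "vm.tif"
	print("nc :", kmeans.GetParameterInt("nc"))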

log:

2019-03-07 12:48:48 (INFO): No kwl metadata found in file CT_KK_CHM_clip.tif
2019-03-07 12:48:48 (INFO): No kwl metadata found in file vm.tif
2019-03-07 12:48:48 (INFO): Estimated memory for full processing: 430.541MB (avail.: 128 MB), optimal image partitioning: 4 blocks
2019-03-07 12:48:48 (INFO): Estimation will be performed in 5 blocks of 7706x1303 pixels
2019-03-07 12:48:48 (INFO): Estimated memory for full processing: 430.541MB (avail.: 128 MB), optimal image partitioning: 4 blocks
2019-03-07 12:48:48 (INFO): Estimation will be performed in 6 blocks of 3552x3552 pixels
2019-03-07 12:48:50 (INFO): Estimated memory for full processing: 574.067MB (avail.: 128 MB), optimal image partitioning: 5 blocks
2019-03-07 12:48:50 (INFO): Estimation will be performed in 6 blocks of 7706x1086 pixels
2019-03-07 12:48:50 (INFO): No kwl metadata found in file CT_KK_CHM_clip.tif
2019-03-07 12:48:50 (INFO): Estimated memory for full processing: 574.067MB (avail.: 128 MB), optimal image partitioning: 5 blocks
2019-03-07 12:48:50 (INFO): Estimation will be performed in 6 blocks of 7706x1086 pixels
2019-03-07 12:48:52 (INFO): Estimated memory for full processing: 1196.03MB (avail.: 128 MB), optimal image partitioning: 10 blocks
2019-03-07 12:48:52 (INFO): File CT_KK_CHM_kmeans.tif will be written in 11 blocks of 7706x592 pixels
Floating point exception (core dumped)

Hi,
Could you provide us with some extract of your inputs so that we can reproduce the bug?
From what you said I cannot see what went wrong…
Also, what versions of Python and OTB are you using?
Antoine

Hi Antoine,
I’ve created a subset of the data and it is available to download here.

I’m using Python 2.7:

Python 2.7.15rc1 (default, Nov 12 2018, 14:31:15)
[GCC 7.3.0] on linux2

The OTB Python bindings are from 6.6, but I’m using version 6.7 on the CLI.

Thanks in advance.
Daniel.

I resolved the issue by generating a validity mask with the ManageNoData application and then using BandMath to remove the no-data areas. I’m still curious to understand what caused the issue above, and why the RAM setting from kmeans.SetParameterInt("ram", 2048) was not picked up.
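Roughly what the workaround looks like in Python (a sketch from memory; the parameter keys, the "buildmask" mode and the BandMath expression are what I believe I used, and the output file names are just placeholders, so check them against the application docs):

	# Build a validity mask from the no-data value of the input
	nodata = otbApplication.Registry.CreateApplication("ManageNoData")
	nodata.SetParameterString("in", chmname+"_clip.tif")
	nodata.SetParameterString("mode", "buildmask")   # 0 for no-data pixels, 1 elsewhere
	nodata.SetParameterString("out", "vm.tif")
	nodata.ExecuteAndWriteOutput()

	# Zero out the no-data areas by multiplying the image with the mask
	bandmath = otbApplication.Registry.CreateApplication("BandMath")
	bandmath.SetParameterStringList("il", [chmname+"_clip.tif", "vm.tif"])
	bandmath.SetParameterString("exp", "im1b1*im2b1")
	bandmath.SetParameterString("out", chmname+"_masked.tif")
	bandmath.ExecuteAndWriteOutput()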

Hi Daniel,
Sorry for the late reply…
I didn’t manage to reproduce your issue with the 6.7 bindings. Did you reproduce the bug with the CLI from 6.7?
And just to be clear, when you say you are using the bindings from 6.6, you are also using the library from 6.6, right?

There is definitely a problem with the RAM parameter, though…
Antoine