LargeScaleMeanShift

Context

  • Segmentation of Spot7 image subset.

Configuration setup

My system: Windows 10
Version of the OTB: 7.1
I installed the OTB with: the binaries

Description of my issue

  • When I run my segmentation I get a “bad allocation” error: Caught std::exception during application execution: bad allocation
  • I set the RAM limit with: -ram 25000
  • I have 32 GB of available RAM
  • The segmentation uses all my available RAM and crashes before generating the shapefile.
  • Image size is 15360 x 12875 pixels x 4 bands, UInt16

=> Segmentation command: otbcli_LargeScaleMeanShift -in image -spatialr 7 -ranger 10 -minsize 100 -mode.vector.out image_seg.shp -ram 25000
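For reference, the uncompressed size of the input can be estimated with simple shell arithmetic. This is only the raw pixel buffer, not OTB's working memory, which holds several intermediate images and is considerably larger:

```shell
# 15360 x 12875 pixels, 4 bands, 2 bytes per sample (UInt16)
bytes=$((15360 * 12875 * 4 * 2))
echo "raw image: $((bytes / 1024 / 1024)) MB"   # → raw image: 1508 MB
```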

What should I do?

Thanks
Mohamed.

Hello Mohamed. I am having a similar issue with LargeScaleMeanShift. Is the issue the same as the one I describe below?

‘LargeScaleMeanShift’ has failed with return status -1. Please refer to ‘LargeScaleMeanShift’ documentation and check log tab.

The crash appears to happen in the ‘Computing Stats on input image’ stage.

The log states:

(FATAL) LargeScaleMeanShift: Caught std::exception during application execution: bad allocation


Hi Mark, and thanks for the reply.
I don’t think so. Here is the full error message:

Estimated memory for full processing: 9806.98MB (avail.: 256 MB), optimal image partitioning: 39 blocks
2020-10-07 17:55:24 (INFO): Estimation will be performed in 40 blocks of 15360x322 pixels
Computing stats on input image …: 100% [**************************************************] (19s)
2020-10-07 18:09:56 (FATAL) LargeScaleMeanShift: Caught std::exception during application execution: bad allocation

Dear @kadirim,
Your issue looks very similar to what @wkcmark describes here. I will try to reproduce your bug, to understand what happens.
Best regards.
Julien.


@julienosman if you need any further info on this let me know. I really need to get this up and running again.

Does the procedure work with a subset of the image? You could try cropping a much smaller portion and check whether it runs then. If not, there is a deeper problem; if it runs through, it might “simply” be a memory problem.
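For example, OTB’s ExtractROI application can produce such a subset (the filenames and the 2000 x 2000 window below are just placeholders, adjust them to your data):

```shell
# Crop a 2000 x 2000 pixel window from the upper-left corner of the image
otbcli_ExtractROI -in image.tif -startx 0 -starty 0 -sizex 2000 -sizey 2000 -out subset.tif

# Rerun the segmentation on the subset with the same parameters
otbcli_LargeScaleMeanShift -in subset.tif -spatialr 7 -ranger 10 -minsize 100 -mode.vector.out subset_seg.shp
```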

Dear @oldi
It seems to be a memory problem, even with 32 GB of RAM.
I will try with Grass, following @wkcmark’s advice.
Regards

@oldi yes, it works on smaller areas for me. However, in the previous version it was running fine with big datasets.

It could also be due to errors at the image boundary. You can try to make a much larger subset, with some decent distance to the image boundary.

@julienosman did you ever get anywhere with this? I’ve run into the error again.