No intersection with tile

Hi,

I want to use S1Tiling on an HPC cluster, so I run the Docker container through Singularity. It is kind of working, but it says "INFO - No intersection with tile 31TEL" even though the products actually intersect the tile.

I use images downloaded “manually” from PEPS, with the 31TEL area as bounding box. I’m doing that because I can’t download from the HPC.

The SRTM tiles are detected correctly.
I tried with different tiles: same problem.
I also tried a very small tile_to_product_overlap_ratio (0.001).
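
For reference, this is roughly the sanity check I would expect to pass (a minimal sketch using shapely, just a manual test, nothing S1Tiling asks for; both bounding boxes are approximate values for illustration, not the exact footprints):

from shapely.geometry import box

# Approximate WGS84 bounding boxes (illustrative values, not exact footprints)
tile_31tel = box(2.6, 45.0, 4.0, 46.1)   # rough extent of MGRS tile 31TEL
product = box(1.9, 44.8, 5.0, 46.5)      # rough extent of one of the S1 products

intersection = tile_31tel.intersection(product)
ratio = intersection.area / tile_31tel.area  # presumably what tile_to_product_overlap_ratio is compared against
print("overlap ratio: {:.3f}".format(ratio)) # anything above 0.001 should be kept
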
Here is the log. If someone can help…

Thank you!

INFO - OTB version detected on the system is 7.4.0
DEBUG - Running S1Tiling with:
DEBUG - [Paths]
DEBUG - - geoid_file                     : /tmp/S1TilingEnv/lib/python3.6/site-packages/s1tiling/resources/Geoid/egm96.grd
DEBUG - - output                         : /data/data_out
DEBUG - - s1_images                      : /data/raw
DEBUG - - srtm                           : /data/share/SRTM
DEBUG - - tmp                            : /data/tmp
DEBUG - [DataSource]
DEBUG - - download                       : False
DEBUG - - first_date                     : 2022-01-01
DEBUG - - last_date                      : 2022-01-30
DEBUG - - polarisation                   : VV VH
DEBUG - - roi_by_tiles                   : ALL
DEBUG - [Processing]
DEBUG - - calibration                    : sigma
DEBUG - - mode                           : debug logging
DEBUG - - nb_otb_threads                 : 4
DEBUG - - nb_parallel_processes          : 2
DEBUG - - orthorectification_gridspacing : 40.0
DEBUG - - output_spatial_resolution      : 10.0
DEBUG - - ram_per_process                : 4096
DEBUG - - remove_thermal_noise           : True
DEBUG - - srtm_shapefile                 : /tmp/S1TilingEnv/lib/python3.6/site-packages/s1tiling/resources/shapefile/srtm_tiles.gpkg
DEBUG - - tile_to_product_overlap_ratio  : 0.001
DEBUG - - tiles                          : ['31TEL']
DEBUG - - tiles_shapefile                : /tmp/S1TilingEnv/lib/python3.6/site-packages/s1tiling/resources/shapefile/Features.shp
DEBUG - [Mask]
DEBUG - - generate_border_mask           : True
DEBUG - prod: S1A_IW_GRDH_1SDV_20220105T174755_20220105T174820_041329_04E9D0_FA7C.SAFE
DEBUG -   KEEP S1A_IW_GRDH_1SDV_20220105T174755_20220105T174820_041329_04E9D0_FA7C.SAFE /// True == 2022-01-01 <= 2022-01-05 <= 2022-01-30
DEBUG - prod: S1A_IW_GRDH_1SDV_20220107T173146_20220107T173211_041358_04EAC4_2590.SAFE
DEBUG -   KEEP S1A_IW_GRDH_1SDV_20220107T173146_20220107T173211_041358_04EAC4_2590.SAFE /// True == 2022-01-01 <= 2022-01-07 <= 2022-01-30
DEBUG - prod: S1A_IW_GRDH_1SDV_20220112T173952_20220112T174017_041431_04ED29_01B7.SAFE
DEBUG -   KEEP S1A_IW_GRDH_1SDV_20220112T173952_20220112T174017_041431_04ED29_01B7.SAFE /// True == 2022-01-01 <= 2022-01-12 <= 2022-01-30
DEBUG - prod: S1A_IW_GRDH_1SDV_20220117T174755_20220117T174820_041504_04EF85_8377.SAFE
DEBUG -   KEEP S1A_IW_GRDH_1SDV_20220117T174755_20220117T174820_041504_04EF85_8377.SAFE /// True == 2022-01-01 <= 2022-01-17 <= 2022-01-30
DEBUG - prod: S1A_IW_GRDH_1SDV_20220119T173145_20220119T173210_041533_04F07C_BCFC.SAFE
DEBUG -   KEEP S1A_IW_GRDH_1SDV_20220119T173145_20220119T173210_041533_04F07C_BCFC.SAFE /// True == 2022-01-01 <= 2022-01-19 <= 2022-01-30
DEBUG - prod: S1A_IW_GRDH_1SDV_20220124T173952_20220124T174017_041606_04F2F6_EF7B.SAFE
DEBUG -   KEEP S1A_IW_GRDH_1SDV_20220124T173952_20220124T174017_041606_04F2F6_EF7B.SAFE /// True == 2022-01-01 <= 2022-01-24 <= 2022-01-30
DEBUG - prod: S1A_IW_GRDH_1SDV_20220129T174754_20220129T174819_041679_04F573_8F7A.SAFE
DEBUG -   KEEP S1A_IW_GRDH_1SDV_20220129T174754_20220129T174819_041679_04F573_8F7A.SAFE /// True == 2022-01-01 <= 2022-01-29 <= 2022-01-30
INFO - Requested tiles: ['31TEL']
INFO - The following tiles will be process: ['31TEL']
DEBUG - Check SRTM tile for 31TEL
INFO - SRTM ok
INFO - Check SRTM coverage for 31TEL
INFO - 0 images to process on ['31TEL'] tiles
INFO - Required SRTM tiles: ['N45E002', 'N46E002', 'N45E004', 'N46E004', 'N46E003', 'N45E003']
DEBUG - Create temporary SRTM diretory (/data/tmp/tmpppbjkw82) for needed tiles ['N45E002', 'N46E002', 'N45E004', 'N46E004', 'N46E003', 'N45E003']
DEBUG - - cp /data/share/SRTM/N45E002.hgt <-- /data/tmp/tmpppbjkw82/N45E002.hgt
DEBUG - - cp /data/share/SRTM/N46E002.hgt <-- /data/tmp/tmpppbjkw82/N46E002.hgt
DEBUG - - cp /data/share/SRTM/N45E004.hgt <-- /data/tmp/tmpppbjkw82/N45E004.hgt
DEBUG - - cp /data/share/SRTM/N46E004.hgt <-- /data/tmp/tmpppbjkw82/N46E004.hgt
DEBUG - - cp /data/share/SRTM/N46E003.hgt <-- /data/tmp/tmpppbjkw82/N46E003.hgt
DEBUG - - cp /data/share/SRTM/N45E003.hgt <-- /data/tmp/tmpppbjkw82/N45E003.hgt
DEBUG - new StepFactory(AnalyseBorders) -> app=
DEBUG - new StepFactory(Calibration) -> app=SARCalibration
DEBUG - new StepFactory(BorderCutting) -> app=ResetMargin
DEBUG - new StepFactory(OrthoRectification) -> app=OrthoRectification
DEBUG - new StepFactory(Concatenation) -> app=Synthetize
DEBUG - new StepFactory(BuildBorderMask) -> app=BandMath
DEBUG - new StepFactory(SmoothBorderMask) -> app=BinaryMorphologicalOperation
DEBUG - Remove ['S1Processor-worker-0.debug.log', 'S1Processor-worker-1.debug.log', 'S1Processor-worker-0.warning.log', 'S1Processor-worker-1.warning.log']
distributed.http.proxy - INFO - To route to workers diagnostics web server please install jupyter-server-proxy: python -m pip install jupyter-server-proxy
distributed.worker - INFO -       Start worker at:      tcp://127.0.0.1:33883
distributed.worker - INFO -          Listening to:      tcp://127.0.0.1:33883
distributed.worker - INFO -          dashboard at:            127.0.0.1:34585
distributed.worker - INFO - Waiting to connect to:      tcp://127.0.0.1:35013
distributed.worker - INFO - -------------------------------------------------
distributed.worker - INFO -               Threads:                          1
distributed.worker - INFO -                Memory:                   33.58 GB
distributed.worker - INFO -       Local Directory: /dask-worker-space/worker-w0pqb935
distributed.worker - INFO - -------------------------------------------------
distributed.worker - INFO -         Registered to:      tcp://127.0.0.1:35013
distributed.worker - INFO - -------------------------------------------------
distributed.core - INFO - Starting established connection
distributed.worker - INFO -       Start worker at:      tcp://127.0.0.1:46641
distributed.worker - INFO -          Listening to:      tcp://127.0.0.1:46641
distributed.worker - INFO -          dashboard at:            127.0.0.1:37713
distributed.worker - INFO - Waiting to connect to:      tcp://127.0.0.1:35013
distributed.worker - INFO - -------------------------------------------------
distributed.worker - INFO -               Threads:                          1
distributed.worker - INFO -                Memory:                   33.58 GB
distributed.worker - INFO -       Local Directory: /dask-worker-space/worker-rum6_7t6
distributed.worker - INFO - -------------------------------------------------
distributed.worker - INFO -         Registered to:      tcp://127.0.0.1:35013
distributed.worker - INFO - -------------------------------------------------
distributed.core - INFO - Starting established connection
distributed.worker - INFO - Starting Worker plugin <distributed.client._WorkerSetupPlugin object at 0-f26f35f2-8078-4746-84da-b34ec8a76385
distributed.worker - INFO - Starting Worker plugin <distributed.client._WorkerSetupPlugin object at 0-fadb7e96-bff3-449a-92be-da391c224fcb
INFO - Processing tile 31TEL (1/1)
INFO - Using images already downloaded, as per configuration request
INFO - Downloading images related to 31TEL took 0.00021183100761845708sec
DEBUG - Test intersections of 31TEL
INFO - Intersecting raster list w/ 31TEL took 1.1696779500052799sec
INFO - No intersection with tile 31TEL
INFO - Processing of tile 31TEL took 1.1741652900236659sec
INFO - Execution report: no error detected
INFO -  -> Nothing has been executed
INFO - Stopping worker at tcp://127.0.0.1:33883
distributed.worker - INFO - Stopping worker at tcp://127.0.0.1:33883
INFO - Stopping worker at tcp://127.0.0.1:46641
distributed.worker - INFO - Stopping worker at tcp://127.0.0.1:46641
DEBUG - Cleaning temporary SRTM diretory (<TemporaryDirectory '/data/tmp/tmpppbjkw82'>)

Hi,

I’m testing it on my machine over the same time range. 17 products are being downloaded.

S1Tiling then keeps, and expects, the directory hierarchy introduced by eodag. (Eventually I’ll have to simplify that part.) That means that in the {s1_images} directory, the products are stored as:

data_raw/ -- my s1_images
├── S1A_IW_GRDH_1SDV_20220104T060017_20220104T060042_041307_04E90C_472B
│   └── S1A_IW_GRDH_1SDV_20220104T060017_20220104T060042_041307_04E90C_472B.SAFE
│       ├── annotation
│       │   ├── calibration
...
│           └── s1-product-preview.xsd
└── S1A_IW_GRDH_1SDV_20220104T060042_20220104T060107_041307_04E90C_70D8
    └── S1A_IW_GRDH_1SDV_20220104T060042_20220104T060107_041307_04E90C_70D8.SAFE
        ├── annotation
...
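
Concretely, that means the products are discovered two levels deep, roughly like this (a sketch of the expected pattern, not the actual S1Tiling lookup code):

from pathlib import Path

s1_images = Path("/data/raw")
# eodag layout: <product>/<product>.SAFE
for safe in sorted(s1_images.glob("*/*.SAFE")):
    print(safe.name)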

Have you kept the same organization?

The eodag hierarchy was indeed the problem. In fact, accepting a direct .SAFE folder would be a better match for products downloaded from other sources.
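
For anyone landing here with manually downloaded products: wrapping each .SAFE folder in a directory of the same name is enough to reproduce the eodag layout (a minimal sketch, assuming the .SAFE folders sit directly in s1_images):

from pathlib import Path

s1_images = Path("/data/raw")               # the s1_images directory from the configuration
for safe in sorted(s1_images.glob("*.SAFE")):
    product_dir = s1_images / safe.stem     # product name without the .SAFE suffix
    product_dir.mkdir(exist_ok=True)
    safe.rename(product_dir / safe.name)    # -> <product>/<product>.SAFE, as eodag does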

Thank you!

This actually relates to a pending eodag issue: Extract SAFE products without intermediate directory · Issue #251 · CS-SI/eodag · GitHub

At the moment, I’d rather not convolute the S1Tiling code base, if I can avoid it, to support both the pathnames imposed by eodag and simplified pathnames. It also seems there are differences with some other providers.
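
If it ever becomes worth doing, accepting both layouts could be as simple as globbing twice (purely a sketch of the idea, not something currently implemented in S1Tiling):

from pathlib import Path
from typing import List

def find_safe_products(s1_images: Path) -> List[Path]:
    """Accept both <product>/<product>.SAFE (eodag) and a plain <product>.SAFE."""
    nested = s1_images.glob("*/*.SAFE")   # eodag hierarchy
    direct = s1_images.glob("*.SAFE")     # manually downloaded products
    return sorted(set(nested) | set(direct))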

Anyway, I also hope we can simplify the situation.