
check datacube explorer #14

Open
palmoreck opened this issue Aug 20, 2020 · 3 comments

@palmoreck

See: https://github.com/opendatacube/datacube-explorer

@palmoreck

In setup.py, change the ODC version to 1.7, and in datacube-explorer/cubedash/_utils.py comment out everything related to from datacube.index.eo3 import is_doc_eo3 (the eo3 helpers only exist in ODC 1.8), like:

def prepare_dataset_formatting(
    dataset: Dataset,
    include_source_url=False,
    include_locations=False,
) -> CommentedMap:
    # comment out all the original (is_doc_eo3-related) statements and add:
    doc = dataset.metadata_doc
    return prepare_document_formatting(
        doc,
        # Label old-style datasets as old-style datasets.
        doc_friendly_label="EO1 Dataset",
        include_source_url=include_source_url,
    )

When cubedash-gen --init --all is executed, the following comes out:

Traceback (most recent call last):
  File "/shared_volume/datacube-explorer/cubedash/generate.py", line 96, in generate_report
    force_dataset_extent_recompute=recreate_dataset_extents,
  File "/shared_volume/datacube-explorer/cubedash/summary/_stores.py", line 202, in refresh_product
    recompute_all_extents=force_dataset_extent_recompute,
  File "/shared_volume/datacube-explorer/cubedash/summary/_extents.py", line 269, in refresh_product
    engine, product, force_update_all=recompute_all_extents
  File "/shared_volume/datacube-explorer/cubedash/summary/_extents.py", line 363, in _populate_missing_dataset_extents
    changed = engine.execute(query).rowcount
  File "/usr/local/lib/python3.6/dist-packages/sqlalchemy/engine/base.py", line 2237, in execute
    return connection.execute(statement, *multiparams, **params)
  File "/usr/local/lib/python3.6/dist-packages/sqlalchemy/engine/base.py", line 1011, in execute
    return meth(self, multiparams, params)
  File "/usr/local/lib/python3.6/dist-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File "/usr/local/lib/python3.6/dist-packages/sqlalchemy/engine/base.py", line 1130, in _execute_clauseelement
    distilled_params,
  File "/usr/local/lib/python3.6/dist-packages/sqlalchemy/engine/base.py", line 1317, in _execute_context
    e, statement, parameters, cursor, context
  File "/usr/local/lib/python3.6/dist-packages/sqlalchemy/engine/base.py", line 1511, in _handle_dbapi_exception
    sqlalchemy_exception, with_traceback=exc_info[2], from_=e
  File "/usr/local/lib/python3.6/dist-packages/sqlalchemy/util/compat.py", line 182, in raise_
    raise exception
  File "/usr/local/lib/python3.6/dist-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
    cursor, statement, parameters, context
  File "/usr/local/lib/python3.6/dist-packages/sqlalchemy/engine/default.py", line 593, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.UndefinedFunction) function st_geomfromgeojson(jsonb) does not exist
LINE 1: ...tial, projection, valid_data}') IS NOT NULL) THEN ST_GeomFro...
                                                             ^
HINT:  No function matches the given name and argument types. You might need to add explicit type casts.

Although in the PostgreSQL DB on antares_datacube:

antares_datacube=# SELECT ST_AsText(ST_GeomFromGeoJSON('{"type":"Point","coordinates":[-48.23456,20.12345]}')) As wkt;
            wkt            
---------------------------
 POINT(-48.23456 20.12345)
(1 row)

works... (likely because the manual test passes a text literal, while the failing query passes jsonb; older PostGIS releases only define ST_GeomFromGeoJSON(text), and the jsonb overload was added in PostGIS 3.0).
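
A minimal sketch (not from the original thread) to confirm which signatures of st_geomfromgeojson the database actually has, assuming psycopg2 is available and the connection parameters below are adjusted to the antares_datacube instance:

import psycopg2

# Hypothetical connection parameters; adjust to the antares_datacube setup.
conn = psycopg2.connect(dbname="antares_datacube", user="postgres", host="localhost")
with conn, conn.cursor() as cur:
    cur.execute("SELECT PostGIS_Version();")
    print("PostGIS version:", cur.fetchone()[0])
    # List the argument types of every st_geomfromgeojson overload;
    # if no jsonb variant shows up, the query built by cubedash will fail.
    cur.execute(
        "SELECT pg_get_function_identity_arguments(oid) "
        "FROM pg_proc WHERE proname = 'st_geomfromgeojson';"
    )
    for (args,) in cur.fetchall():
        print("st_geomfromgeojson(" + args + ")")
conn.close()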

Another part of the errors:

Traceback (most recent call last):
  File "/shared_volume/datacube-explorer/cubedash/generate.py", line 96, in generate_report
    force_dataset_extent_recompute=recreate_dataset_extents,
  File "/shared_volume/datacube-explorer/cubedash/summary/_stores.py", line 202, in refresh_product
    recompute_all_extents=force_dataset_extent_recompute,
  File "/shared_volume/datacube-explorer/cubedash/summary/_extents.py", line 269, in refresh_product
    engine, product, force_update_all=recompute_all_extents
  File "/shared_volume/datacube-explorer/cubedash/summary/_extents.py", line 328, in _populate_missing_dataset_extents
    columns = {c.name: c for c in _select_dataset_extent_columns(product)}
  File "/shared_volume/datacube-explorer/cubedash/summary/_extents.py", line 379, in _select_dataset_extent_columns
    md_type, default_crs=_default_crs(dt)
  File "/shared_volume/datacube-explorer/cubedash/summary/_extents.py", line 84, in get_dataset_extent_alchemy_expression
    get_dataset_srid_alchemy_expression(md, default_crs),
  File "/shared_volume/datacube-explorer/cubedash/summary/_extents.py", line 160, in get_dataset_srid_alchemy_expression
    f"CRS expected in form of 'EPSG:1234'. Got: {default_crs!r}"
NotImplementedError: CRS expected in form of 'EPSG:1234'. Got: 'PROJCS["unnamed",GEOGCS["WGS 84",DATUM["unknown",SPHEROID["WGS84",6378137,6556752.3141]],PRIMEM["Greenwich",0],UNIT["degree",0.0174532925199433]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["standard_parallel_1",17.5],PARAMETER["standard_parallel_2",29.5],PARAMETER["latitude_of_origin",12],PARAMETER["central_meridian",-102],PARAMETER["false_easting",2500000],PARAMETER["false_northing",0]]'

Check:

opendatacube/datacube-explorer#30

opendatacube/datacube-explorer#127

opendatacube/datacube-explorer#143

If this cannot be solved, use datacube 1.8, but take #13 into account.

@palmoreck

Use the following commands to partially solve some of the errors reported in #14 (comment):

1) Clone and reset to the commit below (because I'm using the ODC 1.7 version and because of the errors reported above):

cd /shared_volume && git clone https://github.com/opendatacube/datacube-explorer.git
cd datacube-explorer && git reset --hard 47a9d269b90acd68ec181df6ad92689041928452

That commit corresponds to https://github.com/opendatacube/datacube-explorer/releases/tag/2.1.9

Then change the ODC version in setup.py to 1.7 (as this is the version I'm using). A sketch of that kind of pin:
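
A minimal sketch only, assuming datacube is declared in install_requires; the exact requirement string and the rest of the upstream setup.py may differ and stay as they are:

# setup.py (sketch): pin ODC to the 1.7 series.
from setuptools import setup, find_packages

setup(
    name="datacube-explorer",
    packages=find_packages(),
    install_requires=[
        "datacube>=1.7,<1.8",  # hypothetical pin; keep the other upstream requirements unchanged
    ],
)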

2) In the PostgreSQL DB:

update agdc.dataset 
    set metadata=jsonb_set(metadata, '{creation_dt}', metadata->'extent'->'to_dt') 
    where metadata->>'creation_dt' is null;

According to opendatacube/datacube-explorer#45. This is better handled if the creation_dt field is incorporated in templates/landsat_espa.yaml and templates/srtm_cgiar.yaml; then ingestion/landsat_espa.py and ingestion/srtm_cgiar.py will also change. A sketch of that idea follows.
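
A hypothetical sketch (not from the repo) of how a prepare/ingestion script could set creation_dt while building the dataset document, mirroring what the jsonb_set statement above does after the fact; the field layout assumed here follows the old eo metadata style:

from datetime import datetime, timezone

def ensure_creation_dt(doc: dict) -> dict:
    # Fall back to the acquisition end time, as the SQL update above does,
    # or to "now" if the document has no extent at all.
    if not doc.get("creation_dt"):
        doc["creation_dt"] = doc.get("extent", {}).get(
            "to_dt", datetime.now(timezone.utc).isoformat()
        )
    return doc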

An option would also be to change the CRS of INEGI LCC2 to EPSG:4326. For this change, edit:

ingestion/ls8_espa_mexico.yaml

ingestion/srtm_cgiar_mexico.yaml

maybe using ls7_espa_colombia.yaml as an example for this change.

Check as an example: old-prep-scripts/radiometrics_prepare.py

Using "EPSG:4326" could also solve #13, so check this option (a quick CRS check is sketched below).

3) Install gunicorn, as referred to as an option in the datacube-explorer docs (datacube-explorer#run):

pip install gunicorn    
#then execute:
gunicorn -b '0.0.0.0:8080' -w 4 cubedash:app

For the first tests I used:

pip show datacube-explorer
Name: datacube-explorer
Version: 2.1.10.dev0+g47a9d26.d20200915
Summary: UNKNOWN
Home-page: https://github.com/opendatacube/datacube-explorer
Author: Geoscience Australia
Author-email: [email protected]
License: UNKNOWN
Location: /shared_volume/datacube-explorer
Requires: cachetools, click, datacube, fiona, flask, Flask-Caching, flask-themes, geoalchemy2, geographiclib, jinja2, pyorbital, pyproj, python-dateutil, python-rapidjson, shapely, simplekml, sqlalchemy, structlog, dataclasses
Required-by:

@palmoreck commented Sep 15, 2020

Related to #14 (comment), some screenshots:

[Screenshot 2020-09-15 11:30:54 a.m.]
Not sure why ls8_espa_scene is not displayed.

ref: ingestion/ls8_espa_mexico.yaml#L33. srtm_cgiar_mosaic is correctly displayed:

[Screenshot 2020-09-15 11:31:01 a.m.]

[Screenshot 2020-09-15 11:31:08 a.m.]

ref: ingestion/srtm_cgiar_mexico.yaml#L28

Also, I don't know whether the output_type name in ingestion/srtm_cgiar_mexico.yaml#L2 and ingestion/ls8_espa_mexico.yaml#L2 is needed, because ODC-explorer failed for both ls8_espa_mexico and srtm_cgiar_mexico with the same error:

CRS expected in form of 'EPSG:1234'. Got: 'PROJCS["unnamed",GEOGCS["WGS 84",DATUM["unknown",SPHEROID["WGS84",6378137,6556752.3141]],PRIMEM["Greenwich",0],UNIT["degree",0.0174532925199433]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["standard_parallel_1",17.5],PARAMETER["standard_parallel_2",29.5],PARAMETER["latitude_of_origin",12],PARAMETER["central_meridian",-102],PARAMETER["false_easting",2500000],PARAMETER["false_northing",0]]'

Regarding ls8_espa_scene, the following was successfully displayed when clicking on datasets on the homepage and selecting this product:

[Screenshot 2020-09-15 11:31:30 a.m.]

and selecting one of the metadata_mex_l8.yaml#part=7 files:

[Screenshot 2020-09-15 11:31:32 a.m.]

So, maybe the first screenshot showing no datasets for ls8_espa_scene is related to the product carrying the INEGI CRS definition instead of EPSG:4326. A quick way to check is sketched below.
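
A small sketch (assuming a configured datacube 1.7 environment and the product name above) to print the CRS stored for a few ls8_espa_scene datasets in the index:

from itertools import islice

from datacube import Datacube

dc = Datacube(app="check-ls8-crs")
# Print the CRS recorded for the first few indexed datasets of the product;
# a full INEGI LCC WKT here (instead of an EPSG code) would explain why
# Explorer cannot build the extent/footprint for ls8_espa_scene.
for dataset in islice(dc.index.datasets.search(product="ls8_espa_scene"), 5):
    print(dataset.id, dataset.crs)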
