BINARY_FLOAT will not offload to BigQuery:
```sql
create table goe_test.tab_bf (id number(2), bf binary_float);
insert into goe_test.tab_bf values (1,1);
commit;
```

```
bin/offload -t goe_test.tab_bf
...
Exception caught at top-level
OffloadDataTypeControlsException: 4 byte binary floating point data cannot be offloaded to this backend system: BF
```
I believe the same change also needs to happen for a Snowflake backend.
We did this because the 64-bit value ends up being slightly different from the 32-bit source, and we used to mandate value parity; the short Python check below illustrates the discrepancy.
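For context, here is a minimal standalone sketch (plain Python, not GOE code) of why parity fails: a value stored as a 4-byte IEEE 754 float reads back with extra decimal digits once widened to a double.

```python
import struct

# Store 0.1 as a 4-byte IEEE 754 float (what Oracle BINARY_FLOAT holds),
# then read it back as a Python float, which is a 64-bit double.
f32_bytes = struct.pack("<f", 0.1)
(widened,) = struct.unpack("<f", f32_bytes)

print(widened)         # 0.10000000149011612 -- not 0.1
print(widened == 0.1)  # False: the widened value differs from the source literal
```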
There are two options:

1. Allow use of `--double-columns` as an override to force the offload.
2. Just go ahead and offload to float64, perhaps including an advisory notice.
I think we should do option two: there's no real alternative on these backends, so why stop and make the user intervene manually?
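As a rough sketch of option two (the names below are hypothetical, not the actual GOE mapping code), the change amounts to widening to the backend's 8-byte float and logging an advisory instead of raising:

```python
import logging

log = logging.getLogger("offload")

def backend_type_for_binary_float(backend_supports_float32: bool) -> str:
    """Hypothetical helper: choose a backend type for Oracle BINARY_FLOAT."""
    if backend_supports_float32:
        return "FLOAT32"
    # Option two: instead of raising OffloadDataTypeControlsException,
    # widen to a 64-bit float and tell the user values may not match exactly.
    log.warning(
        "BINARY_FLOAT column offloaded as FLOAT64: this backend has no "
        "4-byte float type, so values will not be bit-identical to the source"
    )
    return "FLOAT64"
```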
Once this is supported, a column will need to be added to any type mapping test tables where appropriate (at least integration/offload/test_data_type_controls.py, probably other places too).