
Run tests for different connectors in different builds #10440

Conversation

Member

@arhimondr arhimondr commented Apr 18, 2018

We were observing a lot of memory-related intermittent product test failures on master. To test the statistics API we have to run ANALYZE on Hive, which takes some additional memory.

The idea is to move the mysql, cassandra, and postgresql tests to a separate build, so there is no need to provision both the memory-demanding Cassandra and Hive for the same build.

This allows us to avoid spinning up all backends (hadoop, mysql, cassandra, postgresql)
altogether, to save some memory.

This is just a temporary solution to fix the intermittent failures.

Eventually we would like to have something like this: #10439
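The mechanics of the split are just test-group exclusion. A minimal, self-contained sketch of how a comma-separated `-x` exclusion list carves connector groups out of a build (a hypothetical helper for illustration, not the actual run_on_docker.sh code):

```shell
#!/usr/bin/env bash
# Hypothetical illustration: filter a build's test groups through an
# exclusion list, the way `run_on_docker.sh multinode -x group1,group2` does.
ALL_GROUPS="smoke big_query cassandra mysql_connector postgresql_connector mysql"

groups_to_run() {
    local excluded=",$1,"
    for group in $ALL_GROUPS; do
        # keep the group only if it is not named in the exclusion list
        if [[ "$excluded" != *",$group,"* ]]; then
            echo "$group"
        fi
    done
}

# The basic build excludes every connector-specific group:
groups_to_run "cassandra,mysql_connector,postgresql_connector,mysql"
# prints: smoke, big_query (one per line)
```

With this split, the basic build never needs the connector backends provisioned at all, which is where the memory saving comes from.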

Contributor

@electrum electrum left a comment


Looks good to me, but I'll leave it to someone more familiar with product tests to accept

Contributor

@kokosing kokosing left a comment


Can you also make Presto use INFO for logging?

@@ -73,7 +73,7 @@ script:
   - |
     if [[ -v PRODUCT_TESTS_BASIC_ENVIRONMENT ]]; then
       presto-product-tests/bin/run_on_docker.sh \
-        multinode -x quarantine,big_query,storage_formats,profile_specific_tests,tpcds
+        multinode -x quarantine,big_query,storage_formats,profile_specific_tests,tpcds,cassandra,mysql_connector,postgresql_connector,mysql
Contributor


Why do we have both mysql and mysql_connector? It would be nice to merge them. Maybe something for a follow-up PR.

Member Author


Yeah, I noticed that. We should definitely do that as part of the follow-up.

Contributor


Please file a ticket for that, so we don't forget about it.

Member Author


EXTERNAL_SERVICES="hadoop-master sqlserver"
elif [[ "$ENVIRONMENT" == "singlenode-ldap" ]]; then
EXTERNAL_SERVICES="hadoop-master ldapserver"
elif [[ "$ENVIRONMENT" == "singlenode-mysql" ]]; then
EXTERNAL_SERVICES="hadoop-master mysql"
Contributor


do we need hadoop when testing mysql?

Member Author


We don't. But due to a limitation in Tempto related to convention-based tests, no matter which test suite you select it will always try to provision the TPCH dataset in Hive. We should fix that as part of the follow-up.

Contributor


Can you file a ticket for tempto for that? I will try to fix that.

Member Author


elif [[ "$ENVIRONMENT" == "singlenode-mysql" ]]; then
EXTERNAL_SERVICES="hadoop-master mysql"
elif [[ "$ENVIRONMENT" == "singlenode-postgresql" ]]; then
EXTERNAL_SERVICES="hadoop-master postgresql"
Contributor


do we really need hadoop when testing psql?

Member Author


ditto

elif [[ "$ENVIRONMENT" == "singlenode-postgresql" ]]; then
EXTERNAL_SERVICES="hadoop-master postgresql"
elif [[ "$ENVIRONMENT" == "singlenode-cassandra" ]]; then
EXTERNAL_SERVICES="hadoop-master cassandra"
Contributor


And here, do we need hadoop when testing cassandra? I would prefer to use tpch if any connector other than cassandra has to be used.

Member Author


same problem

@@ -7,6 +7,5 @@ source "${BASH_SOURCE%/*}/../common/compose-commons.sh"
 docker-compose \
   -f ${BASH_SOURCE%/*}/../common/standard.yml \
   -f ${BASH_SOURCE%/*}/../common/kerberos.yml \
-  -f ${BASH_SOURCE%/*}/../common/jdbc_db.yml \
Contributor


You haven't changed the test suites for this environment. Are you sure it was not using the jdbc connectors?

Member Author


We used to start the jdbc services only for the multinode and singlenode environments. This include was simply an extra copy-paste artifact.

https://github.com/prestodb/presto/blob/master/presto-product-tests/bin/run_on_docker.sh#L145
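For reference, the environment-to-services dispatch being discussed looks roughly like this (a paraphrased sketch assuming the structure of run_on_docker.sh; the function name and the exact service lists are illustrative):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the EXTERNAL_SERVICES selection: the jdbc backing
# databases are started only for environments whose suites actually use them.
external_services_for() {
    case "$1" in
        multinode|singlenode)  echo "hadoop-master mysql postgresql" ;;
        singlenode-mysql)      echo "hadoop-master mysql" ;;
        singlenode-postgresql) echo "hadoop-master postgresql" ;;
        singlenode-cassandra)  echo "hadoop-master cassandra" ;;
        *)                     echo "hadoop-master" ;;
    esac
}

external_services_for "singlenode-kerberos-hdfs-impersonation"  # prints: hadoop-master
```

Under this structure, the kerberized environments never match a jdbc branch, so including jdbc_db.yml in their compose files did nothing useful.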

@@ -6,6 +6,5 @@ source "${BASH_SOURCE%/*}/../common/compose-commons.sh"

 docker-compose \
   -f ${BASH_SOURCE%/*}/../common/standard.yml \
-  -f ${BASH_SOURCE%/*}/../common/jdbc_db.yml \
Contributor


You haven't changed the test suites for this environment. Are you sure it was not using the jdbc connectors?

Member Author


@@ -7,6 +7,5 @@ source "${BASH_SOURCE%/*}/../common/compose-commons.sh"
 docker-compose \
   -f ${BASH_SOURCE%/*}/../common/standard.yml \
   -f ${BASH_SOURCE%/*}/../common/kerberos.yml \
-  -f ${BASH_SOURCE%/*}/../common/jdbc_db.yml \
Contributor


You haven't changed the test suites for this environment. Are you sure it was not using the jdbc connectors?

Member Author


ditto

@@ -6,6 +6,5 @@ source "${BASH_SOURCE%/*}/../common/compose-commons.sh"

 docker-compose \
   -f ${BASH_SOURCE%/*}/../common/standard.yml \
-  -f ${BASH_SOURCE%/*}/../common/jdbc_db.yml \
Contributor


You haven't changed the test suites for this environment. Are you sure it was not using the jdbc connectors?

Member Author


ditto

version: '2'
services:

postgresql:
Contributor


automation fails with: Caused by: java.net.UnknownHostException: postgres

Member Author


It works fine locally. On Travis, for some reason, the hostname comes after the service name and the hostname field is ignored. This is weird. For now I'm just going to revert back to the original service name.
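The underlying docker-compose behavior: containers on a compose network resolve each other by the service name, while the `hostname:` key only sets the name inside the container itself and is not registered in DNS for the other containers. A generic sketch of the distinction (not the actual file from this PR):

```
version: '2'
services:
  postgresql:            # other containers resolve this name via DNS
    image: postgres
    hostname: postgres   # visible only inside the container itself
```

So if the tests connect to "postgres" but the service key is "postgresql", the lookup fails with UnknownHostException, which matches the error above.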

@arhimondr arhimondr force-pushed the segregate-product-tests-by-connector-tested branch from f93d6c8 to e5c5a79 Compare April 18, 2018 19:14
@arhimondr
Member Author

Can you also make Presto use INFO for logging?

You mean setting everything here to at least INFO?

https://github.com/prestodb/presto/blob/master/presto-product-tests/conf/presto/etc/log.properties#L10
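That file uses the standard Airlift log.properties format, one `logger.name=LEVEL` entry per line; bumping the levels to INFO would look roughly like this (the logger names shown are illustrative examples, not the exact contents of the linked file):

```
com.facebook.presto=INFO
org.apache.hadoop=INFO
```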

@findepi findepi removed their request for review April 18, 2018 19:41
@findepi
Contributor

findepi commented Apr 18, 2018

@arhimondr you asked me for review, but I think @electrum's and @kokosing's reviews are more than enough. Let me know if I'm still needed here.


@nezihyigitbasi
Contributor

Thanks for fixing this @arhimondr, merged.

@arhimondr arhimondr deleted the segregate-product-tests-by-connector-tested branch April 18, 2018 21:35