
Stop using custom logging setup for workflows #459

Open
oyvindeide opened this issue Nov 22, 2021 · 1 comment
@oyvindeide (Contributor)

Using:

def getLogger(module_name="subscript"):
results in flooding the stdout of ert when used in workflows, where subscript is essentially used as a library. Using logging in those cases is fine, but ert should be responsible for setting up the logger, so the standard logging module should be used directly:

import logging
...
logger = logging.getLogger(__name__)
logger.info("msg")

For forward models, using the subscript logger is fine, because they have their own entry points and are the modules in charge of the execution.
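A minimal sketch of the library-style pattern suggested above. The module name `subscript.csv_merge` and the `merge()` helper are hypothetical, for illustration only; the point is that the module itself only creates a named logger and leaves handler setup to the host application (e.g. ert):

```python
import logging

# Library-style logging: the module creates a named logger and attaches a
# NullHandler, so nothing reaches stdout/stderr unless the host application
# configures handlers itself. In a real module the name would be __name__.
logger = logging.getLogger("subscript.csv_merge")
logger.addHandler(logging.NullHandler())


def merge(files):
    """Hypothetical helper, standing in for the module's real work."""
    logger.info("Merging %d files", len(files))
    return list(files)


# With no logging configuration, the info message is silently dropped:
merge(["a.csv", "b.csv"])

# The application opts in by configuring logging itself:
logging.basicConfig(level=logging.INFO)
merge(["a.csv", "b.csv"])  # now the message is emitted via the root handler
```

This way, running the same code as an ert workflow produces no extra stdout noise unless ert has configured logging, while forward models (with their own entry points) can still call `logging.basicConfig` or equivalent themselves.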

@berland (Collaborator) commented Nov 23, 2021

For e.g. csv_merge, is this a matter of only resetting the logger object (and its level) around

    def run(self, *args):
        # pylint: disable=no-self-use
        """Parse with a simplified command line parser, for ERT only,
        call csv_merge_main()"""
        parser = get_ertwf_parser()
        args = parser.parse_args(args)
        logger.setLevel(logging.INFO)
        globbedfiles = glob_patterns(args.csvfiles)
        csv_merge_main(csvfiles=globbedfiles, output=args.output)

?
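If resetting around `run()` is the route taken, one way to sketch it is a context manager that restores the previous level on exit, so the workflow call does not permanently change the logger the host application owns. The logger name is again a hypothetical stand-in:

```python
import contextlib
import logging

# Assumed module-level logger, as in the subscript modules.
logger = logging.getLogger("subscript.csv_merge")


@contextlib.contextmanager
def log_level(target_logger, level):
    """Temporarily set a logger's level, restoring the old one afterwards."""
    old_level = target_logger.level
    target_logger.setLevel(level)
    try:
        yield
    finally:
        target_logger.setLevel(old_level)


# Hypothetical use inside an ERT workflow's run():
with log_level(logger, logging.INFO):
    logger.info("running csv_merge as an ERT workflow")
# Outside the block, whatever level ert configured is back in effect.
```

Compared to the bare `logger.setLevel(logging.INFO)` in `run()` above, this avoids leaking the level change into the rest of the ert process after the workflow finishes.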
