Cardinality too high due to `filename` label in some use-cases #439
Comments
@sed-i we should probably treat this as a high/critical priority item for the coming pulse.
We should allow the user to override the default scrape job.
FTR, and given @Abuelodelanada's comment, we need these logs, so not scraping them is not an option.
Overriding the default scrape job doesn't mean "not scraping those logs": you can override the default with the same targets but define custom labels. This is how you would get the same logs, but without the automatic injection of the `filename` label. If I understand correctly, that's the result you want to achieve, or am I missing something?
Hey, sorry for the delayed reply. Yes, your understanding is correct. I was just trying to make it explicit to eliminate all doubt (as "or the entire scrape job" could be interpreted as providing a toggle to either enable or disable the scraping).
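The override discussed above could be sketched roughly as a Promtail-style scrape config. This is a hypothetical sketch, not this charm's actual interface: the job name, target, and glob path are assumptions, and the `labeldrop` pipeline stage is from Promtail's stage set.

```yaml
# Hypothetical Promtail scrape config: tail the same libvirt log files,
# but drop the automatically injected per-file `filename` label before
# the entries are pushed to Loki.
scrape_configs:
  - job_name: libvirt          # assumed job name
    static_configs:
      - targets: [localhost]
        labels:
          job: libvirt
          __path__: /var/log/libvirt/qemu/*.log
    pipeline_stages:
      - labeldrop:
          - filename           # remove the high-cardinality label
```

With a config along these lines, the logs are still scraped, but each VM's log file no longer creates a distinct stream in Loki.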
Enhancement Proposal

When using Loki to ingest logs from an OpenStack cloud, the inclusion of the `filename` label on each log line can potentially lead to huge cardinality. OpenStack uses libvirt to spawn VMs, and libvirt creates one log file for each VM launched (`/var/log/libvirt/qemu/$DOMAIN_NAME.log`). Some clouds, used as build farms, can spawn thousands of VMs in the course of a few days, each with its own value for the `filename` label.

We need a way to customize Loki so as to avoid including the `filename` label for some files (or directories?). To avoid losing querying capabilities (as these files don't contain the domain name on each line), maybe we could consider using structured metadata?
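If the file path were stored as structured metadata rather than as an index label, it would not contribute to stream cardinality but could still be filtered at query time with a LogQL label-filter expression. The stream selector and path pattern below are hypothetical illustrations.

```logql
# Hypothetical LogQL query: structured metadata is filtered after the
# pipe; the value is attached per entry rather than creating a stream.
{job="libvirt"} | filename=~"/var/log/libvirt/qemu/.*"
```

This would preserve the ability to narrow queries to a given VM's log file without indexing thousands of short-lived `filename` values.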