
lokiexporter: Support dynamic attributes for labels #2290

Closed
gramidt opened this issue Feb 7, 2021 · 12 comments
Labels
comp:grafanalabs Grafana Labs components - Loki, Grafana exporter/loki Loki Exporter Stale

Comments

@gramidt
Member

gramidt commented Feb 7, 2021

Is your feature request related to a problem? Please describe.
The Loki exporter by design forces a user to explicitly configure the attributes they want to allow as labels within Loki. This is a safety net to help prevent accidentally adding dynamic labels that may significantly increase cardinality, thus having a performance impact on their Loki instances. However, users may trust their downstream configurations, such as Fluentd or Stanza, to keep labels to a minimum.

Describe the solution you'd like
As a Loki user, I want to be able to filter attributes and have the exporter dynamically transform them into valid Loki labels, so that downstream systems do not have to sync their attributes within the exporter config.

Describe alternatives you've considered

  • Keep the exporter as is.
  • Just use expressions to match attribute names and not dynamically convert them. This would require the 'attributeprocessor' to be used in the pipeline for transforming attributes into valid Loki labels. This option still requires users to sync all attributes.
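For context, the exporter's existing explicit-allowlist behavior corresponds to a configuration roughly like the following (a hedged sketch based on the lokiexporter README of that era; the specific attribute and label names are illustrative):

```yaml
exporters:
  loki:
    endpoint: "http://localhost:3100/loki/api/v1/push"
    labels:
      # Only attributes listed here are promoted to Loki labels;
      # any attribute not in this map is excluded from the label set.
      attributes:
        container.name: "container_name"
        severity: "severity"
      resource:
        host.name: "hostname"
```

The feature request is essentially to replace (or supplement) this static map with pattern-based matching plus automatic sanitization of attribute names into valid Loki label names.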
@gramidt
Member Author

gramidt commented Feb 7, 2021

@andrewhsu - Could you assign this to me?

@gramidt
Member Author

gramidt commented Feb 20, 2021

@bogdandrutu - Could you assign this to me?

@gillg
Contributor

gillg commented Mar 3, 2021

Hello, I agree with you that this could be a non-default option.
But we should at least define some whitelisted patterns to prevent misconfigurations.
How do you imagine the config structure?

It might also be a workaround for another bug I faced with the lokiexporter, where attributes containing uppercase characters are impossible to whitelist (more details here: open-telemetry/opentelemetry-collector#2594; the root cause seems to be split across two other issues).

kisieland referenced this issue in kisieland/opentelemetry-collector-contrib Mar 16, 2021
@alolita alolita added the comp:grafanalabs Grafana Labs components - Loki, Grafana label Sep 30, 2021
ljmsc referenced this issue in ljmsc/opentelemetry-collector-contrib Feb 21, 2022
@github-actions
Contributor

github-actions bot commented Nov 4, 2022

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

@github-actions github-actions bot added the Stale label Nov 4, 2022
@jpkrohling
Member

cc @mar4uk

@jpkrohling jpkrohling added exporter/loki Loki Exporter and removed Stale labels Nov 21, 2022

@jpkrohling
Member

@mar4uk, @kovrus, could this be accomplished with the transform processor? For example, by adding the new Loki exporter hints as attributes based on other data?

@mar4uk
Contributor

mar4uk commented Feb 1, 2023

This configuration should probably work:

processors:
  transform:
    log_statements:
      - context: log
        statements:
          - keep_keys(resource.attributes, ["service.name", "service.namespace"])
  resource:
    attributes:
      - action: insert
        key: service_name
        from_attribute: service.name

      - action: insert
        key: service_namespace
        from_attribute: service.namespace

      - action: insert
        key: loki.resource.labels
        value: service_namespace, service_name
exporters:
  loki:
    endpoint: "http://localhost:3100/loki/api/v1/push"
service:
  pipelines:
    logs:
      receivers: [otlp]
      processors: [transform, resource]
      exporters: [loki]

The transform processor can be used to filter attributes, and the resource processor can be used to promote attributes to Loki labels.
Do you find this sufficient, @gramidt? Please let me know if my example config doesn't solve the issue and whether this issue is still relevant :)
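With a pipeline like the one above, the promoted attributes would surface as Loki labels, so the ingested logs could then be queried in LogQL along these lines (the label values here are, of course, illustrative):

```logql
{service_namespace="shop", service_name="checkout"} |= "error"
```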

@github-actions
Contributor

github-actions bot commented Apr 3, 2023

(60-day inactivity notice repeated.)

@mar4uk
Contributor

mar4uk commented Apr 3, 2023

@gramidt, will fixing issues/19215 solve your case?

@github-actions github-actions bot removed the Stale label Apr 4, 2023
@github-actions
Contributor

github-actions bot commented Jun 5, 2023

(60-day inactivity notice repeated.)

@github-actions github-actions bot added the Stale label Jun 5, 2023
@jpkrohling
Member

I'm closing this, as it doesn't seem to be relevant anymore. If the issue @mar4uk linked isn't enough to satisfy this use case and there's still interest, please leave a comment and I'll reopen.
