Merge branch 'main' into static-windows-installer-option

erikbaranowski authored Jan 4, 2024
2 parents b39c365 + 50d1620 commit 11c826c
Showing 217 changed files with 6,122 additions and 2,638 deletions.
49 changes: 46 additions & 3 deletions CHANGELOG.md
@@ -10,6 +10,22 @@ internal API changes are not present.
Main (unreleased)
-----------------

### Breaking changes

- `otelcol.receiver.prometheus` will drop all `otel_scope_info` metrics when converting them to OTLP. (@wildum)
  - If the `otel_scope_info` metric has labels `otel_scope_name` and `otel_scope_version`,
    their values are used to set the OTLP Instrumentation Scope name and version, respectively.
  - Labels of `otel_scope_info` metrics other than `otel_scope_name` and `otel_scope_version`
    are added as scope attributes with the matching name and value.

- The `target` block in `prometheus.exporter.blackbox` requires a mandatory `name`
argument instead of a block label. (@hainenber)

- In the Azure exporter, dimension options will no longer be validated by the Azure API. (@kgeckhart)
  - This change will not break any existing configurations, and you can opt in to validation via the `validate_dimensions` configuration option.
  - Before this change, pulling metrics for Azure resources with variable dimensions required one configuration per metric + dimension combination to avoid an error.
  - After this change, you can include all metrics and dimensions in a single configuration, and the Azure APIs will only return dimensions which are valid for the various metrics.
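As a rough illustration of the opt-in (the component and attribute names below are assumptions based on the agent's flow syntax, not taken from this diff; only `validate_dimensions` is named above):

```river
// Hypothetical flow configuration. `validate_dimensions` re-enables the
// Azure API validation that is now off by default; the other attributes
// are illustrative placeholders.
prometheus.exporter.azure "example" {
  subscriptions       = ["my-subscription-id"]
  resource_type       = "Microsoft.Storage/storageAccounts"
  metrics             = ["Availability"]
  validate_dimensions = true
}
```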

### Enhancements

- Flow Windows service: Support environment variables. (@jkroepke)
@@ -27,42 +43,67 @@ Main (unreleased)
Previously, only `remote.*` and `local.*` components could be referenced
without a circular dependency. (@rfratto)

- Add support for Basic Auth-secured connection with Elasticsearch cluster using `prometheus.exporter.elasticsearch`. (@hainenber)

- Add a `resource_to_telemetry_conversion` argument to `otelcol.exporter.prometheus`
for converting resource attributes to Prometheus labels. (@hainenber)

- `pyroscope.ebpf` now supports Python on arm64 platforms. (@korniltsev)

- `otelcol.receiver.prometheus` no longer drops histograms without buckets. (@wildum)

- Added exemplars support to `otelcol.receiver.prometheus`. (@wildum)

- `mimir.rules.kubernetes` may now retry its startup on failure. (@hainenber)

- Added links between compatible components in the documentation to make it
easier to discover them. (@thampiotr)

- Allow defining `HTTPClientConfig` for `discovery.ec2`. (@cmbrad)

- The `remote.http` component can optionally define a request body. (@tpaschalis)

- Added support for `loki.write` to flush WAL on agent shutdown. (@thepalbi)

- Add support for `integrations-next` static to flow config conversion. (@erikbaranowski)

- Add support for passing extra arguments to the static converter such as `-config.expand-env`. (@erikbaranowski)

- Added the 'country' mmdb-type to the geoip log pipeline stage. (@superstes)

- Azure exporter enhancements for flow and static mode: (@kgeckhart)
  - Allows for pulling metrics at the Azure subscription level instead of resource by resource.
  - Disables dimension validation by default to reduce the number of exporter instances needed for full dimension coverage.

- Add `max_cache_size` to `prometheus.relabel` to allow configurability instead of the hard-coded 100,000. (@mattdurham)

- Add support for `http_sd_config` within a `scrape_config` for prometheus to flow config conversion. (@erikbaranowski)

- Add an option to the windows static mode installer for expanding environment vars in the yaml config. (@erikbaranowski)

### Bugfixes

- Update `pyroscope.ebpf` to fix a logical bug causing it to profile too many kthreads instead of regular processes. See https://github.com/grafana/pyroscope/pull/2778. (@korniltsev)

- Update `pyroscope.ebpf` to produce more optimal pprof profiles for Python processes. See https://github.com/grafana/pyroscope/pull/2788. (@korniltsev)

- In Static mode's `traces` subsystem, `spanmetrics` used to be generated prior to load balancing.
  This could lead to inaccurate metrics. This issue only affects Agents using both `spanmetrics` and
  `load_balancing`, when running in a load balanced cluster with more than one Agent instance. (@ptodev)

- Fixes a bug in `loki.source.docker` where an incomplete list of targets was synced to the tailer manager. (@FerdinandvHagen)

- Fixes the default value of the `otelcol.connector.servicegraph` store TTL, changing it from 2ms to 2s. (@rlankfo)

- Add staleness tracking to labelstore to reduce memory usage. (@mattdurham)

- Fix issue where `prometheus.exporter.kafka` would crash when configuring `sasl_password`. (@rfratto)

### Other changes

- Bump github.com/IBM/sarama from v1.41.2 to v1.42.1.

- Attach a unique Agent ID header to remote-write requests. (@captncraig)

v0.38.1 (2023-11-30)
--------------------

@@ -137,6 +178,8 @@ v0.38.0 (2023-11-21)

- Added support for python profiling to `pyroscope.ebpf` component. (@korniltsev)

- Added support for native Prometheus histograms to `otelcol.exporter.prometheus`. (@wildum)

- Windows Flow Installer: Add /CONFIG, /DISABLEPROFILING, and /DISABLEREPORTING flags. (@jkroepke)

- Add queueing logs remote write client for `loki.write` when WAL is enabled. (@thepalbi)
2 changes: 2 additions & 0 deletions cmd/grafana-agent/entrypoint.go
@@ -17,6 +17,7 @@ import (
"github.com/go-kit/log"
"github.com/go-kit/log/level"
"github.com/gorilla/mux"
"github.com/grafana/agent/internal/agentseed"
"github.com/grafana/agent/pkg/config"
"github.com/grafana/agent/pkg/logs"
"github.com/grafana/agent/pkg/metrics"
@@ -98,6 +99,7 @@ func NewEntrypoint(logger *server.Logger, cfg *config.Config, reloader Reloader)
return nil, err
}

agentseed.Init("", logger)
ep.reporter, err = usagestats.NewReporter(logger)
if err != nil {
return nil, err
79 changes: 77 additions & 2 deletions cmd/internal/flowmode/cmd_convert.go
@@ -9,6 +9,7 @@ import (
"strings"

"github.com/spf13/cobra"
"github.com/spf13/pflag"

"github.com/grafana/agent/converter"
convert_diag "github.com/grafana/agent/converter/diag"
@@ -20,6 +21,7 @@ func convertCommand() *cobra.Command {
output: "",
sourceFormat: "",
bypassErrors: false,
extraArgs: "",
}

cmd := &cobra.Command{
@@ -41,7 +43,11 @@ The -f flag can be used to specify the format we are converting from.
The -b flag can be used to bypass errors. Errors are defined as
non-critical issues identified during the conversion where an
output can still be generated.`,
output can still be generated.
The -e flag can be used to pass extra arguments to the converter
which were used by the original format. Multiple arguments can be passed
by separating them with a space.`,
Args: cobra.RangeArgs(0, 1),
SilenceUsage: true,

@@ -71,6 +77,7 @@ output can still be generated.`,
cmd.Flags().StringVarP(&f.report, "report", "r", f.report, "The filepath and filename where the report is written.")
cmd.Flags().StringVarP(&f.sourceFormat, "source-format", "f", f.sourceFormat, fmt.Sprintf("The format of the source file. Supported formats: %s.", supportedFormatsList()))
cmd.Flags().BoolVarP(&f.bypassErrors, "bypass-errors", "b", f.bypassErrors, "Enable bypassing errors when converting")
cmd.Flags().StringVarP(&f.extraArgs, "extra-args", "e", f.extraArgs, "Extra arguments from the original format used by the converter. Multiple arguments can be passed by separating them with a space.")
return cmd
}

@@ -79,6 +86,7 @@ type flowConvert struct {
report string
sourceFormat string
bypassErrors bool
extraArgs string
}

func (fc *flowConvert) Run(configFile string) error {
@@ -112,7 +120,12 @@ func convert(r io.Reader, fc *flowConvert) error {
return err
}

riverBytes, diags := converter.Convert(inputBytes, converter.Input(fc.sourceFormat), []string{})
ea, err := parseExtraArgs(fc.extraArgs)
if err != nil {
return err
}

riverBytes, diags := converter.Convert(inputBytes, converter.Input(fc.sourceFormat), ea)
err = generateConvertReport(diags, fc)
if err != nil {
return err
@@ -174,3 +187,65 @@ func supportedFormatsList() string {
}
return strings.Join(ret, ", ")
}

func parseExtraArgs(extraArgs string) ([]string, error) {
var result []string
if extraArgs == "" {
return result, nil
}

arguments := strings.Fields(extraArgs)
for i, arg := range arguments {
fs := pflag.NewFlagSet("extra-args", pflag.ExitOnError)
fs.ParseErrorsWhitelist.UnknownFlags = true
keyStartIndex := 0
doParseFlagValue := false

// Split the argument into key and value.
splitArgs := strings.SplitN(arg, "=", 2)

// Append the key to the result.
result = append(result, splitArgs[0])

// If the flag has a value, add it to the FlagSet for parsing.
if len(splitArgs) == 2 && splitArgs[1] != "" {
doParseFlagValue = true
if arg[1] == '-' { // longhand flag, ie. --flag=value
keyStartIndex = 2
} else if arg[0] == '-' { // shorthand flag, ie. -f=value
keyStartIndex = 1
} else { // invalid flag that was split on '=' but has no dashes in the key
return nil, fmt.Errorf("invalid flag found: %s", arg)
}
}

if doParseFlagValue {
result = append(result, "")
lastIndex := len(result) - 1
key := splitArgs[0][keyStartIndex:]
if keyStartIndex == 2 {
fs.StringVar(&result[lastIndex], key, result[lastIndex], "")
} else {
// static mode uses keys with a single dash. We need to sanitize them here.
if len(key) != 1 {
arguments[i] = "-" + arguments[i]
fs.StringVar(&result[lastIndex], key, result[lastIndex], "")
} else {
fs.StringVarP(&result[lastIndex], "", key, result[lastIndex], "")
}
}

// We must parse the flag here because the pointer to the array element
// &result[lastIndex] is overridden by the next iteration of the loop.
// This can be improved if we preallocate the array, however we won't
// know the final length without analyzing the arguments so there
// is some complexity in doing so.
err := fs.Parse(arguments)
if err != nil {
return nil, err
}
}
}

return result, nil
}
86 changes: 86 additions & 0 deletions cmd/internal/flowmode/cmd_convert_test.go
@@ -0,0 +1,86 @@
package flowmode

import (
"testing"

"github.com/stretchr/testify/require"
)

func TestParseExtraArgs(t *testing.T) {
type testCase struct {
name string
extraArgs string
expected []string
expectedError string
}

var testCases = []testCase{
{
name: "integrations next with env vars",
extraArgs: "-enable-features=integrations-next -config.expand-env",
expected: []string{"-enable-features", "integrations-next", "-config.expand-env"},
},
{
name: "longhand",
extraArgs: "--key=value",
expected: []string{"--key", "value"},
},
{
name: "shorthand",
extraArgs: "-k=value",
expected: []string{"-k", "value"},
},
{
name: "bool longhand",
extraArgs: "--boolVariable",
expected: []string{"--boolVariable"},
},
{
name: "bool shorthand",
extraArgs: "-b",
expected: []string{"-b"},
},
{
name: "combo",
extraArgs: "--key=value -k=value --boolVariable -b",
expected: []string{"--key", "value", "-k", "value", "--boolVariable", "-b"},
},
{
name: "spaced",
extraArgs: "--key value",
expected: []string{"--key", "value"},
},
{
name: "value with equals",
extraArgs: `--key="foo=bar"`,
expected: []string{"--key", `"foo=bar"`},
},
{
name: "no value",
extraArgs: "--key=",
expected: []string{"--key"},
},
{
name: "no dashes",
extraArgs: "key",
expected: []string{"key"},
},
{
name: "no dashes with value",
extraArgs: "key=value",
expectedError: "invalid flag found: key=value",
},
}

for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
res, err := parseExtraArgs(tc.extraArgs)
if tc.expectedError != "" {
require.EqualError(t, err, tc.expectedError)
return
}
require.NoError(t, err)
require.Equal(t, tc.expected, res)
})
}
}
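Putting the new flag together, a hypothetical invocation might look like the following. The `--source-format` and `--extra-args` flag names appear in the diff above; the binary name, the `--output` flag, and all file paths are assumptions for illustration:

```shell
# Hypothetical: convert a static-mode config to flow, forwarding the
# command-line arguments the static agent was originally run with.
grafana-agent convert --source-format=static \
  --extra-args="-enable-features=integrations-next -config.expand-env" \
  --output=converted.river agent-config.yaml
```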
17 changes: 13 additions & 4 deletions cmd/internal/flowmode/cmd_run.go
@@ -18,6 +18,7 @@ import (
"github.com/grafana/agent/component"
"github.com/grafana/agent/converter"
convert_diag "github.com/grafana/agent/converter/diag"
"github.com/grafana/agent/internal/agentseed"
"github.com/grafana/agent/pkg/boringcrypto"
"github.com/grafana/agent/pkg/config/instrumentation"
"github.com/grafana/agent/pkg/flow"
@@ -123,6 +124,7 @@ depending on the nature of the reload error.
BoolVar(&r.disableReporting, "disable-reporting", r.disableReporting, "Disable reporting of enabled components to Grafana.")
cmd.Flags().StringVar(&r.configFormat, "config.format", r.configFormat, fmt.Sprintf("The format of the source file. Supported formats: %s.", supportedFormatsList()))
cmd.Flags().BoolVar(&r.configBypassConversionErrors, "config.bypass-conversion-errors", r.configBypassConversionErrors, "Enable bypassing errors when converting")
cmd.Flags().StringVar(&r.configExtraArgs, "config.extra-args", r.configExtraArgs, "Extra arguments from the original format used by the converter. Multiple arguments can be passed by separating them with a space.")
return cmd
}

@@ -144,6 +146,7 @@ type flowRun struct {
clusterName string
configFormat string
configBypassConversionErrors bool
configExtraArgs string
}

func (fr *flowRun) Run(configPath string) error {
@@ -245,7 +248,8 @@ func (fr *flowRun) Run(configPath string) error {
return fmt.Errorf("failed to create otel service")
}

labelService := labelstore.New(l)
labelService := labelstore.New(l, reg)
agentseed.Init(fr.storagePath, l)

f := flow.New(flow.Options{
Logger: l,
@@ -263,7 +267,7 @@ func (fr *flowRun) Run(configPath string) error {

ready = f.Ready
reload = func() (*flow.Source, error) {
flowSource, err := loadFlowSource(configPath, fr.configFormat, fr.configBypassConversionErrors)
flowSource, err := loadFlowSource(configPath, fr.configFormat, fr.configBypassConversionErrors, fr.configExtraArgs)
defer instrumentation.InstrumentSHA256(flowSource.SHA256())
defer instrumentation.InstrumentLoad(err == nil)

@@ -362,7 +366,7 @@ func getEnabledComponentsFunc(f *flow.Flow) func() map[string]interface{} {
}
}

func loadFlowSource(path string, converterSourceFormat string, converterBypassErrors bool) (*flow.Source, error) {
func loadFlowSource(path string, converterSourceFormat string, converterBypassErrors bool, configExtraArgs string) (*flow.Source, error) {
fi, err := os.Stat(path)
if err != nil {
return nil, err
@@ -403,7 +407,12 @@ func loadFlowSource(path string, converterSourceFormat string, converterBypassEr
}
if converterSourceFormat != "flow" {
var diags convert_diag.Diagnostics
bb, diags = converter.Convert(bb, converter.Input(converterSourceFormat), []string{})
ea, err := parseExtraArgs(configExtraArgs)
if err != nil {
return nil, err
}

bb, diags = converter.Convert(bb, converter.Input(converterSourceFormat), ea)
hasError := hasErrorLevel(diags, convert_diag.SeverityLevelError)
hasCritical := hasErrorLevel(diags, convert_diag.SeverityLevelCritical)
if hasCritical || (!converterBypassErrors && hasError) {