Description
In one of our internal applications, I noticed a data race in opentelemetry-go logging code. We're using otelzap, but I think the issue is in opentelemetry-go itself.
Environment
- OS: Ubuntu 24.04
- Architecture: x86_64
- Go Version: 1.24.6
- opentelemetry-go version: v1.37.0
Steps To Reproduce
Run the program below with the race detector. This is a minimised reproducer for a race I found in opentelemetry-collector.
```go
package main

import (
	"sync"

	"go.opentelemetry.io/otel/attribute"
	"go.opentelemetry.io/otel/log"
	sdklog "go.opentelemetry.io/otel/sdk/log"
)

func main() {
	provider := sdklog.NewLoggerProvider()
	opt := log.WithInstrumentationAttributes(
		attribute.String("key1", "value1"),
		attribute.String("key2", "value2"),
	)

	var g sync.WaitGroup
	for i := 0; i < 2; i++ {
		g.Add(1)
		go func() {
			defer g.Done()
			_ = provider.Logger("", opt)
		}()
	}
	g.Wait()
}
```
With v1.37.0/v0.13.0, you will observe a data race:
```
==================
WARNING: DATA RACE
Read at 0x00c00017c180 by goroutine 8:
  slices.insertionSortCmpFunc[go.shape.struct { Key go.opentelemetry.io/otel/attribute.Key; Value go.opentelemetry.io/otel/attribute.Value }]()
      /home/andrew/go/toolchain/1.24.6/src/slices/zsortanyfunc.go:12 +0x43a
  slices.stableCmpFunc[go.shape.struct { Key go.opentelemetry.io/otel/attribute.Key; Value go.opentelemetry.io/otel/attribute.Value }]()
      /home/andrew/go/toolchain/1.24.6/src/slices/zsortanyfunc.go:343 +0x67
  slices.SortStableFunc[go.shape.[]go.opentelemetry.io/otel/attribute.KeyValue,go.shape.struct { Key go.opentelemetry.io/otel/attribute.Key; Value go.opentelemetry.io/otel/attribute.Value }]()
      /home/andrew/go/toolchain/1.24.6/src/slices/sort.go:38 +0x92
  go.opentelemetry.io/otel/attribute.NewSetWithFiltered()
      /home/andrew/go/pkg/mod/go.opentelemetry.io/[email protected]/attribute/set.go:212 +0x57
  go.opentelemetry.io/otel/attribute.NewSet()
      /home/andrew/go/pkg/mod/go.opentelemetry.io/[email protected]/attribute/set.go:184 +0x3c
  main.main.WithInstrumentationAttributes.func2()
      /home/andrew/go/pkg/mod/go.opentelemetry.io/otel/[email protected]/logger.go:123 +0x97
  go.opentelemetry.io/otel/log.loggerOptionFunc.applyLogger()
      /home/andrew/go/pkg/mod/go.opentelemetry.io/otel/[email protected]/logger.go:105 +0xea
  go.opentelemetry.io/otel/log.NewLoggerConfig()
      /home/andrew/go/pkg/mod/go.opentelemetry.io/otel/[email protected]/logger.go:80 +0x2a1
  go.opentelemetry.io/otel/sdk/log.(*LoggerProvider).Logger()
      /home/andrew/go/pkg/mod/go.opentelemetry.io/otel/sdk/[email protected]/provider.go:113 +0x31f
  main.main.func1()
      /tmp/foo/main.go:22 +0xec

Previous write at 0x00c00017c180 by goroutine 9:
  go.opentelemetry.io/otel/attribute.NewSetWithFiltered()
      /home/andrew/go/pkg/mod/go.opentelemetry.io/[email protected]/attribute/set.go:230 +0x384
  go.opentelemetry.io/otel/attribute.NewSet()
      /home/andrew/go/pkg/mod/go.opentelemetry.io/[email protected]/attribute/set.go:184 +0x3c
  main.main.WithInstrumentationAttributes.func2()
      /home/andrew/go/pkg/mod/go.opentelemetry.io/otel/[email protected]/logger.go:123 +0x97
  go.opentelemetry.io/otel/log.loggerOptionFunc.applyLogger()
      /home/andrew/go/pkg/mod/go.opentelemetry.io/otel/[email protected]/logger.go:105 +0xea
  go.opentelemetry.io/otel/log.NewLoggerConfig()
      /home/andrew/go/pkg/mod/go.opentelemetry.io/otel/[email protected]/logger.go:80 +0x2a1
  go.opentelemetry.io/otel/sdk/log.(*LoggerProvider).Logger()
      /home/andrew/go/pkg/mod/go.opentelemetry.io/otel/sdk/[email protected]/provider.go:113 +0x31f
  main.main.func1()
      /tmp/foo/main.go:22 +0xec

Goroutine 8 (running) created at:
  main.main()
      /tmp/foo/main.go:20 +0x36c

Goroutine 9 (finished) created at:
  main.main()
      /tmp/foo/main.go:20 +0x36c
==================
Found 1 data race(s)
exit status 66
```
The problem is that `WithInstrumentationAttributes` passes its variadic args straight to `attribute.NewSet`, so two loggers created from the same option value share the same backing slice, and `attribute.NewSet` mutates that slice (in-place sorting and de-duplication). I don't know if it's the best fix, but one way to fix this is with this:
```diff
diff --git a/log/logger.go b/log/logger.go
index 8441ca88..f6a259d1 100644
--- a/log/logger.go
+++ b/log/logger.go
@@ -5,6 +5,7 @@ package log // import "go.opentelemetry.io/otel/log"

 import (
 	"context"
+	"slices"

 	"go.opentelemetry.io/otel/attribute"
 	"go.opentelemetry.io/otel/log/embedded"
@@ -120,6 +121,7 @@ func WithInstrumentationVersion(version string) LoggerOption {
 // The passed attributes will be de-duplicated.
 func WithInstrumentationAttributes(attr ...attribute.KeyValue) LoggerOption {
 	return loggerOptionFunc(func(config LoggerConfig) LoggerConfig {
+		attr := slices.Clone(attr)
 		config.attrs = attribute.NewSet(attr...)
 		return config
 	})
```
Expected behavior
No data race.