[Bug]: OTLP logs only output when batch processor fails #1962

Open
landonxjames opened this issue Jul 24, 2024 · 3 comments
Labels: M-exporter-otlp, triage:todo (Needs to be triaged)

@landonxjames

What happened?

I have the following minimal example that uses the otel crates to observe aws-sdk-rust's existing tracing spans: https://github.com/landonxjames/aws-rust-sdk-otel-test

The setup for my tracing subscriber can be found here and the full list of dependencies here.
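
Roughly, the setup has the following shape (a sketch only, not the exact code from the repository; it assumes opentelemetry 0.24, opentelemetry-otlp 0.17, and tracing-opentelemetry 0.25, where install_batch returns a TracerProvider, and the function and tracer names are illustrative):

use opentelemetry::trace::TracerProvider as _;
use opentelemetry_otlp::WithExportConfig;
use tracing_subscriber::prelude::*;

fn init_tracing() -> Result<(), Box<dyn std::error::Error>> {
    // OTLP/gRPC exporter pointed at the local Jaeger collector, installed
    // behind a batch span processor on the Tokio runtime.
    let tracer_provider = opentelemetry_otlp::new_pipeline()
        .tracing()
        .with_exporter(
            opentelemetry_otlp::new_exporter()
                .tonic()
                .with_endpoint("http://localhost:4317"),
        )
        .install_batch(opentelemetry_sdk::runtime::Tokio)?;

    // Bridge `tracing` spans into OpenTelemetry via tracing-opentelemetry.
    let tracer = tracer_provider.tracer("aws-rust-sdk-otel-test");
    tracing_subscriber::registry()
        .with(tracing_opentelemetry::layer().with_tracer(tracer))
        .init();

    Ok(())
}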

I am running a Jaeger instance to collect the traces with the following command:

docker run -d --name jaeger -e COLLECTOR_OTLP_ENABLED=true -p 16686:16686 -p 4317:4317 -p 4318:4318 jaegertracing/all-in-one:latest

I am then running the linked program with cargo run --release.

The output of my program looks like:

DynamoDB client version: 1.38.0
Region:                  us-west-2

Tables:
  MyTable1
  MyTable2
Found 2 tables
OpenTelemetry trace error occurred. cannot send message to batch processor as the channel is closed
OpenTelemetry trace error occurred. cannot send message to batch processor as the channel is closed

Note that the seemingly erroneous lines at the end are not always present. Strangely, traces only appear in my Jaeger instance when those error logs are present; when the error logs are not present, nothing shows up in Jaeger at all. The error appears (and thus the traces show up) in roughly 25% of my runs.

API Version

0.24.0

SDK Version

0.24.1

What Exporter(s) are you seeing the problem on?

OTLP

Relevant log output

OpenTelemetry trace error occurred. cannot send message to batch processor as the channel is closed
OpenTelemetry trace error occurred. cannot send message to batch processor as the channel is closed
landonxjames added the bug (Something isn't working) and triage:todo (Needs to be triaged) labels on Jul 24, 2024
landonxjames changed the title from "[Bug]: Intermittent failures of batch processor" to "[Bug]: OTLP logs only output when batch processor fails" on Jul 24, 2024
@cijothomas (Member)

Can you try and do an explicit shutdown of the provider?
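
With the opentelemetry 0.24 API, an explicit shutdown at the end of main looks roughly like the sketch below. It assumes the tracer provider was registered globally via opentelemetry::global::set_tracer_provider; the actual setup lives in the linked repository.

use opentelemetry::global;

// ... build the tracer provider, register it with
// global::set_tracer_provider(...), install the subscriber,
// and run the program ...

// Shut down the globally registered provider so the batch span processor
// gets a chance to flush any buffered spans before the process exits.
global::shutdown_tracer_provider();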


@landonxjames (Author)

> Can you try and do an explicit shutdown of the provider?

Unfortunately that did not help, but it did push me a bit further in debugging. I pulled your linked example locally (and commented out all the metrics/logs bits, since Jaeger doesn't seem to support those), and I was able to consistently get the OTel-created spans showing up in Jaeger (the ones inside tracer.in_span()).

I then added the following lines to the main function of that example, with the intention of also exporting spans from the tracing crate to Jaeger:

let tracer = tracer_provider.tracer("MY_TEST_TRACER");
let telemetry_layer = tracing_opentelemetry::layer().with_tracer(tracer);

...

// `.with()` and `.init()` come from `use tracing_subscriber::prelude::*;`
tracing_subscriber::registry()
    .with(filter)
    .with(telemetry_layer) // add the new layer to the registry
    .init();

Adding the telemetry_layer to the tracing_subscriber::registry() seems to be the culprit, as that stops any traces from being exported to Jaeger. It seems likely the issue is in the tracing_opentelemetry crate.
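
One way to surface otherwise-silent export failures while debugging is to install a custom global error handler before initializing the subscriber. This is a sketch assuming the opentelemetry 0.24 global::set_error_handler API; the default handler is what prints the "OpenTelemetry trace error occurred" lines shown above.

// Route OpenTelemetry's internal errors through eprintln! so export
// failures are visible even when nothing reaches Jaeger.
opentelemetry::global::set_error_handler(|err| {
    eprintln!("OpenTelemetry error: {err}");
})
.expect("failed to install OpenTelemetry error handler");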

@cijothomas (Member)

Thanks for the additional details. Most likely it's a version conflict; we see such issues reported often in this repo, though we don't own tracing-opentelemetry, so we can't directly help there. #1571 (comment) will help with this, as this repo will natively support tracing soon.

Please let us know if you are able to fix the issue with the latest versions of all the crates.
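
A quick way to check for the version conflict described above is to look for duplicate crates in the dependency graph:

cargo tree -d

If two different opentelemetry versions show up (for example, one pulled in by tracing-opentelemetry and another by the exporter crates), the layer is most likely talking to a different copy of the API than the one the OTLP exporter was installed into. Aligning the crates so only one opentelemetry version remains (the compatible pairings are listed in the tracing-opentelemetry release notes) typically resolves it.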
