
Jaeger no flush interval #609

Conversation

@dyladan (Member) commented Dec 11, 2019

Which problem is this PR solving?

Short description of the changes

  • Remove the interval that periodically flushes the Jaeger exporter. The exporter already flushes on every export and should depend on the span processor for batching (see the sketch below).
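
A minimal TypeScript sketch of the intended shape, purely illustrative: the exporter owns no timer and flushes once per export call, leaving batching cadence to the span processor. FakeSender and the method names below are assumptions, not the real @opentelemetry/exporter-jaeger internals.

// Illustrative sketch only; FakeSender and these method names are assumptions,
// not the real exporter internals.
interface FakeSender {
  append(span: unknown): Promise<void>;
  flush(): Promise<void>; // sends buffered spans; no-op if the buffer is empty
}

class ExporterSketch {
  // Note: no setInterval here; the span processor decides when export() runs.
  constructor(private readonly sender: FakeSender) {}

  async export(spans: unknown[]): Promise<void> {
    for (const span of spans) {
      await this.sender.append(span);
    }
    // Flush once per export call instead of on a periodic timer.
    await this.sender.flush();
  }
}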

@codecov-io commented Dec 11, 2019

Codecov Report

Merging #609 into master will decrease coverage by 1.69%.
The diff coverage is 86.84%.

@@            Coverage Diff            @@
##           master     #609     +/-   ##
=========================================
- Coverage   92.13%   90.44%   -1.7%     
=========================================
  Files         182      180      -2     
  Lines        9041     9135     +94     
  Branches      792      814     +22     
=========================================
- Hits         8330     8262     -68     
- Misses        711      873    +162
Impacted Files Coverage Δ
...ackages/opentelemetry-exporter-jaeger/src/types.ts 100% <ø> (ø) ⬆️
...telemetry-tracing/src/export/BatchSpanProcessor.ts 100% <100%> (ø) ⬆️
.../opentelemetry-exporter-jaeger/test/jaeger.test.ts 100% <100%> (ø) ⬆️
...ckages/opentelemetry-exporter-jaeger/src/jaeger.ts 90% <82.14%> (-2.5%) ⬇️
...pentelemetry-core/test/internal/validators.test.ts 50% <0%> (-50%) ⬇️
...elemetry-core/test/trace/spancontext-utils.test.ts 55.55% <0%> (-44.45%) ⬇️
...lemetry-core/test/trace/ProbabilitySampler.test.ts 56.52% <0%> (-43.48%) ⬇️
...s/opentelemetry-core/test/trace/NoopTracer.test.ts 60% <0%> (-40%) ⬇️
...s/opentelemetry-core/test/context/B3Format.test.ts 63.39% <0%> (-36.61%) ⬇️
...ges/opentelemetry-core/test/trace/NoopSpan.test.ts 63.63% <0%> (-36.37%) ⬇️
... and 33 more

@dyladan (Member Author) commented Dec 11, 2019

Explanation of the issue

  • BatchSpanProcessor has a 20 second default timeout
  • JaegerExporter has a 5 second default timeout

When the JaegerExporter exports, it calls append on the _sender. In the default configuration, the sender flushes on every append.

After 20 seconds, the BatchSpanProcessor exports a batch of spans, which are appended to the sender and then flushed. During this export, the fourth 5-second tick of the Jaeger exporter's timer fires and it also attempts to flush. The two concurrent flushes cause the crash.
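
To make the race concrete, here is a hypothetical sketch of the pre-fix shape; the class, names, and intervals are illustrative assumptions, not the actual source.

// Hypothetical pre-fix shape (illustrative only): two code paths can call
// flush on the same sender at the same time.
class PreFixExporterSketch {
  private readonly timer: ReturnType<typeof setInterval>;

  constructor(private readonly flushSender: () => Promise<void>) {
    // Path 1: the exporter's own 5-second flush timer.
    this.timer = setInterval(() => void this.flushSender(), 5000);
  }

  async export(_spans: unknown[]): Promise<void> {
    // ...append spans to the sender here...
    // Path 2: the flush triggered by the BatchSpanProcessor's 20-second export.
    // If the 5-second tick fires while this flush is still in flight, the
    // sender sees two concurrent flushes.
    await this.flushSender();
  }

  shutdown(): void {
    clearInterval(this.timer);
  }
}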

Fix: remove the 5 second timer from the Jaeger exporter

The Jaeger exporter already flushes on every export, so there is no need for it to ever flush on its own timer.

Brought over from the issue thread:

this._logger.debug('successful append for : %s', thriftSpan.length);

// Flush all spans on each export. No-op if span buffer is empty
await this._flush();
Member:

Perhaps the flush should only happen when thriftSpan.length > 0, and the debug statement above should move into the same if block. WDYT?

Member Author (dyladan):

The first thing the sender does is check the length and return if it is zero, and this isn't a critical path or a tight loop. I can change it if you feel strongly, though.

Member:

Ok. I was a little annoyed by the debug log above: even though nothing was exported, it just printed "successful append for : 0" every few seconds.

Member Author (dyladan):

Oh, I understand. Yeah, that can be fixed.
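
A small sketch of the guard being discussed, written as a hypothetical standalone helper (appendAndMaybeFlush, logger, and flush are assumed names); it is not necessarily the code that was merged.

// Hypothetical helper, not the merged code: only log and flush when something
// was actually appended, so the "successful append for : 0" line no longer
// appears every few seconds.
async function appendAndMaybeFlush(
  thriftSpans: unknown[],
  logger: { debug: (msg: string, ...args: unknown[]) => void },
  flush: () => Promise<void>
): Promise<void> {
  if (thriftSpans.length > 0) {
    logger.debug('successful append for : %s', thriftSpans.length);
    await flush();
  }
}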

@mayurkale22 (Member) left a comment

overall lgtm

@dyladan (Member Author) commented Dec 13, 2019

I also added e7d0695, which makes the batch span processor's periodic timer skip the export when there is nothing to export.
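
A hedged sketch of that behavior, assuming a simplified buffer and timer; the real BatchSpanProcessor differs in structure and naming.

// Simplified illustration of "skip the export when the buffer is empty";
// names and structure are assumptions, not the real BatchSpanProcessor.
class BatchProcessorSketch<T> {
  private buffer: T[] = [];
  private readonly timer: ReturnType<typeof setInterval>;

  constructor(
    private readonly exportBatch: (items: T[]) => Promise<void>,
    intervalMs = 20000
  ) {
    this.timer = setInterval(() => void this.maybeExport(), intervalMs);
  }

  onEnd(item: T): void {
    this.buffer.push(item);
  }

  shutdown(): void {
    clearInterval(this.timer);
  }

  private async maybeExport(): Promise<void> {
    // Do nothing when there is nothing buffered, so the exporter is not
    // invoked with an empty batch on every tick.
    if (this.buffer.length === 0) return;
    const batch = this.buffer;
    this.buffer = [];
    await this.exportBatch(batch);
  }
}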

@dyladan added the "Awaiting reviewer feedback" and "bug" labels Dec 13, 2019
@mayurkale22 merged commit b4ab8a5 into open-telemetry:master Dec 13, 2019
@dyladan deleted the jaeger-no-flush-interval branch December 16, 2019
pichlermarc pushed a commit to dynatrace-oss-contrib/opentelemetry-js that referenced this pull request Dec 15, 2023