Lambda exiting before sending telemetry data #17

Closed

dhenard opened this issue Jan 13, 2021 · 4 comments

Labels: bug, lambda

Comments


dhenard commented Jan 13, 2021

A snippet of my main handler, with a delay added to give the function time to send the data, is below. Without the delay at the end, it fails to send the data more often than not. Everything works with the delay, but I would rather not incur the extra cost of the 400 ms per function call.

I have a separate file in the project that creates the tracer. Any suggestions on how to get this functioning without the delay would be appreciated.

```js
const { tracing } = require('./tracing');

exports.handler = async (event, context) => {
    const delay = 400;
    const tracer = tracing;
    const span = tracer.startSpan('POS Request');
    context.callbackWaitsForEmptyEventLoop = true;
    let rslt = '';

    span.setAttributes({
        'service.version': process.env.SERVICE_VERSION,
        'faas.execution': context.awsRequestId,
        'faas.coldstart': coldStartFlag,
        'faas.logStream': context.logStreamName
    });

    coldStartFlag = false;

    if (event.key_id != -1) {
        span.addEvent('Start processing');
        rslt = await processEvent(event, tracer, span);
    } else {
        rslt = JSON.stringify({ Path: 'No project selected.' });
        span.addEvent('Project Missing', { result: rslt });
    }

    if ((typeof rslt === 'string' && rslt.toUpperCase().includes('ERROR')) || rslt[0] === undefined) {
        span.end();
        await tracer.getActiveSpanProcessor().forceFlush();
        // await tracer.getActiveSpanProcessor().shutdown();
        await sleep(delay);
        throw rslt;
    } else {
        span.addEvent('Finished processing');
        span.end();
        await tracer.getActiveSpanProcessor().forceFlush();
        // await tracer.getActiveSpanProcessor().shutdown();
        await sleep(delay);
        return rslt;
    }
};
```

@anuraaga

Hi @dhenard - can you confirm that the span processor you are using is the BatchSpanProcessor, which should be the default? Your code already has the forceFlush, which should take care of blocking until the span has been sent, as implemented here:

https://github.com/open-telemetry/opentelemetry-js/blob/392c43f2f6b10c608e8882cd97925a2fedd58b08/packages/opentelemetry-tracing/src/export/BatchSpanProcessor.ts#L55

If you happen to be using the SimpleSpanProcessor, then indeed there seems to be a bug there, as pending requests would not actually be waited on:

https://github.com/open-telemetry/opentelemetry-js/blob/392c43f2f6b10c608e8882cd97925a2fedd58b08/packages/opentelemetry-tracing/src/export/SimpleSpanProcessor.ts#L38
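
To illustrate the difference in general terms (this is a sketch only, not the library's actual code, and the function names are purely illustrative): a fire-and-forget export can be cut off when the Lambda environment freezes, while an awaited flush keeps the handler alive until the exporter finishes.

```js
// Sketch only, not the library's implementation.
// Fire-and-forget: the export is started but nothing awaits it, so the
// handler can return (and Lambda can freeze the sandbox) mid-request.
function endSpanFireAndForget(span, exporter) {
  exporter.export([span], () => {}); // result callback ignored
}

// Awaited flush: the caller gets a promise that resolves only after the
// exporter reports completion, so awaiting it delays the handler's return
// until the telemetry has actually been sent.
function flushAndWait(pendingSpans, exporter) {
  return new Promise((resolve) => exporter.export(pendingSpans, resolve));
}
```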

@anuraaga

Sorry, I noticed our docs are incorrectly using the SimpleSpanProcessor, so I guess you might be using it:

https://aws-otel.github.io/docs/getting-started/js-sdk/trace-manual-instr#export-trace-data-to-otlp-exporter

If you can try the BatchSpanProcessor (I think you just need to replace the import / constructor call), that would be great.
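
Roughly, the change in your tracing file would look something like this (a minimal sketch only, assuming the 0.x JS SDK packages from our docs; the file name, tracer name, and exports are assumptions, not taken from your project):

```js
// tracing.js (illustrative sketch)
const { NodeTracerProvider } = require('@opentelemetry/node');
// was: const { SimpleSpanProcessor } = require('@opentelemetry/tracing');
const { BatchSpanProcessor } = require('@opentelemetry/tracing');
const { CollectorTraceExporter } = require('@opentelemetry/exporter-collector-grpc');

const provider = new NodeTracerProvider();

// BatchSpanProcessor buffers finished spans and exports them in batches, and
// its forceFlush() resolves only once pending exports complete, so awaiting it
// in the handler keeps the Lambda alive until the data is out.
provider.addSpanProcessor(new BatchSpanProcessor(new CollectorTraceExporter()));
provider.register();

module.exports = { tracing: provider.getTracer('pos-handler') };
```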


dhenard commented Jan 18, 2021

Hey @anuraaga - thanks for looking into this. I was using the SimpleSpanProcessor but will give the BatchSpanProcessor a go.

BatchSpanProcessor is working without the need for a delay.
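
For anyone finding this later, the end of the handler then reduces to roughly the following (a sketch based on the snippet above, with the sleep removed):

```js
// With the BatchSpanProcessor, awaiting forceFlush() is enough to keep the
// Lambda alive until the spans are exported; no artificial delay is needed.
span.addEvent('Finished processing');
span.end();
await tracer.getActiveSpanProcessor().forceFlush();
return rslt;
```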

@anuraaga

@dhenard Excellent, glad to hear that! Let me close this issue and feel free to post again if anything comes up.

@alolita added the lambda and bug labels on Dec 19, 2021