Implementing Json log exporter #3744

Merged: 5 commits, Oct 14, 2021
2 changes: 2 additions & 0 deletions exporters/logging-otlp/build.gradle.kts
@@ -11,12 +11,14 @@ otelJava.moduleName.set("io.opentelemetry.exporter.logging.otlp")
dependencies {
compileOnly(project(":sdk:trace"))
compileOnly(project(":sdk:metrics"))
compileOnly(project(":sdk:logs"))

implementation(project(":exporters:otlp:common"))

implementation("com.fasterxml.jackson.core:jackson-core")

testImplementation(project(":sdk:testing"))
testImplementation(project(":sdk:logs"))

testImplementation("org.skyscreamer:jsonassert")
}
New file: OtlpJsonLoggingLogExporter.java (56 additions)
@@ -0,0 +1,56 @@
/*
* Copyright The OpenTelemetry Authors
* SPDX-License-Identifier: Apache-2.0
*/

package io.opentelemetry.exporter.logging.otlp;

import static io.opentelemetry.exporter.logging.otlp.JsonUtil.JSON_FACTORY;

import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.io.SegmentedStringWriter;
import io.opentelemetry.exporter.otlp.internal.logs.ResourceLogsMarshaler;
import io.opentelemetry.sdk.common.CompletableResultCode;
import io.opentelemetry.sdk.logs.data.LogData;
import io.opentelemetry.sdk.logs.export.LogExporter;
import java.io.IOException;
import java.util.Collection;
import java.util.logging.Level;
import java.util.logging.Logger;

/**
* A {@link LogExporter} which writes {@linkplain LogData logs} to a {@link Logger} in OTLP JSON
* format. Each log line will include a single {@code ResourceLogs}.
*/
public final class OtlpJsonLoggingLogExporter implements LogExporter {

private static final Logger logger = Logger.getLogger(OtlpJsonLoggingLogExporter.class.getName());

/** Returns a new {@link OtlpJsonLoggingLogExporter}. */
public static LogExporter create() {
return new OtlpJsonLoggingLogExporter();
}

private OtlpJsonLoggingLogExporter() {}

@Override
public CompletableResultCode export(Collection<LogData> logs) {
ResourceLogsMarshaler[] allResourceLogs = ResourceLogsMarshaler.create(logs);
for (ResourceLogsMarshaler resourceLogs : allResourceLogs) {
SegmentedStringWriter sw = new SegmentedStringWriter(JSON_FACTORY._getBufferRecycler());
try (JsonGenerator gen = JsonUtil.create(sw)) {
resourceLogs.writeJsonTo(gen);
} catch (IOException e) {
// Shouldn't happen in practice, just skip it.
continue;
}
logger.log(Level.INFO, sw.getAndClear());

Contributor:
This will end up putting a whole lot of log records on a single log output line. I don't think that's what I'd expect to happen. I think this should only output one log per line, rather than cramming the whole JSON body into a single line.

Member:
I remember reading in a PR from a while back that part of the intent of these loggers is that another process could parse each log line as a Resource{type} message. If that is indeed part of the intended use case of the log exporters, we shouldn't break that convention.
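
A minimal sketch of that parsing use case, assuming the opentelemetry-proto Java bindings (`io.opentelemetry.proto.logs.v1.ResourceLogs`) and protobuf-java-util are on the classpath (neither is added by this PR): another process could turn each emitted line back into a `ResourceLogs` message.

```java
import com.google.protobuf.util.JsonFormat;
import io.opentelemetry.proto.logs.v1.ResourceLogs;

class LogLineParser {
  // Each line emitted by OtlpJsonLoggingLogExporter holds exactly one ResourceLogs message.
  static ResourceLogs parseLine(String jsonLine) throws Exception {
    ResourceLogs.Builder builder = ResourceLogs.newBuilder();
    JsonFormat.parser().ignoringUnknownFields().merge(jsonLine, builder);
    return builder.build();
  }
}
```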

Member (@jack-berg, Oct 14, 2021):
The idea in general of logging exported LogRecords is cyclical when you think about it from the perspective of application logs: an application logs a message -> an appender feeds it to the otel log sink -> the log exporter logs the message in JSON format -> an appender feeds it to the otel log sink again.

I think it's only really practical to use one of these log exporters if you're collecting events as LogRecords rather than application logs.

Contributor:

Hmm. I see the point. We should have the equivalent of the LoggingSpanExporter that will just mirror the logs back to JUL, one per line, which can be very useful for debugging. But, of course, problematic if used with a JUL appender! Perhaps that one (which I know is not this one) should just use stdout.
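
A hypothetical sketch of that suggestion, not part of this PR: a line-per-record exporter that writes to stdout instead of JUL, so its output cannot be fed back through a JUL appender. The LogData accessors used here are assumptions that mirror the builder setters shown in the test file below.

```java
package io.opentelemetry.exporter.logging.otlp;

import io.opentelemetry.sdk.common.CompletableResultCode;
import io.opentelemetry.sdk.logs.data.LogData;
import io.opentelemetry.sdk.logs.export.LogExporter;
import java.util.Collection;

// Hypothetical; not part of this PR. Writes one human-readable line per record to stdout.
public final class SystemOutLogExporter implements LogExporter {

  @Override
  public CompletableResultCode export(Collection<LogData> logs) {
    for (LogData log : logs) {
      // getSeverity()/getBody()/getAttributes() are assumed accessors mirroring the
      // builder setters (setSeverity/setBody/setAttributes) used in the test below.
      System.out.println(
          log.getSeverity() + " '" + log.getBody().asString() + "' " + log.getAttributes());
    }
    return CompletableResultCode.ofSuccess();
  }

  @Override
  public CompletableResultCode shutdown() {
    return CompletableResultCode.ofSuccess();
  }
}
```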

}
return CompletableResultCode.ofSuccess();
}

@Override
public CompletableResultCode shutdown() {
return CompletableResultCode.ofSuccess();
}
}
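
For reference, a minimal usage sketch of the new exporter, built with the same `LogRecord.builder` API the test below exercises; the resource, instrumentation library, and field values are illustrative only.

```java
import io.opentelemetry.api.common.Attributes;
import io.opentelemetry.exporter.logging.otlp.OtlpJsonLoggingLogExporter;
import io.opentelemetry.sdk.common.InstrumentationLibraryInfo;
import io.opentelemetry.sdk.logs.data.LogData;
import io.opentelemetry.sdk.logs.data.LogRecord;
import io.opentelemetry.sdk.logs.data.Severity;
import io.opentelemetry.sdk.logs.export.LogExporter;
import io.opentelemetry.sdk.resources.Resource;
import java.util.Collections;

class OtlpJsonLoggingExample {
  public static void main(String[] args) {
    LogExporter exporter = OtlpJsonLoggingLogExporter.create();

    // Illustrative record; field values mirror the ones used in the test below.
    LogData log =
        LogRecord.builder(
                Resource.create(Attributes.builder().put("service.name", "demo").build()),
                InstrumentationLibraryInfo.create("manual-instrumentation", "1.0"))
            .setName("exampleEvent")
            .setBody("hello world")
            .setFlags(0)
            .setSeverity(Severity.INFO)
            .setSeverityText("INFO")
            .setSpanId("8765432112345876")
            .setTraceId("12345678876543211234567887654321")
            .setEpochMillis(System.currentTimeMillis())
            .setAttributes(Attributes.empty())
            .build();

    // Each ResourceLogs batch is emitted to the JUL logger as a single JSON line.
    exporter.export(Collections.singletonList(log));
    exporter.shutdown();
  }
}
```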
New file: OtlpJsonLoggingLogExporterTest.java (160 additions)
@@ -0,0 +1,160 @@
/*
* Copyright The OpenTelemetry Authors
* SPDX-License-Identifier: Apache-2.0
*/

package io.opentelemetry.exporter.logging.otlp;

import static io.opentelemetry.api.common.AttributeKey.booleanKey;
import static io.opentelemetry.api.common.AttributeKey.longKey;
import static io.opentelemetry.api.common.AttributeKey.stringKey;
import static org.assertj.core.api.Assertions.assertThat;

import io.github.netmikey.logunit.api.LogCapturer;
import io.opentelemetry.api.common.Attributes;
import io.opentelemetry.sdk.common.InstrumentationLibraryInfo;
import io.opentelemetry.sdk.logs.data.LogData;
import io.opentelemetry.sdk.logs.data.LogRecord;
import io.opentelemetry.sdk.logs.data.Severity;
import io.opentelemetry.sdk.logs.export.LogExporter;
import io.opentelemetry.sdk.resources.Resource;
import java.util.Arrays;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.RegisterExtension;
import org.skyscreamer.jsonassert.JSONAssert;
import org.slf4j.event.Level;

class OtlpJsonLoggingLogExporterTest {

private static final Resource RESOURCE =
Resource.create(Attributes.builder().put("key", "value").build());

private static final LogData LOG1 =
LogRecord.builder(RESOURCE, InstrumentationLibraryInfo.create("instrumentation", "1"))
.setName("testLog1")
.setBody("body1")
.setFlags(0)
.setSeverity(Severity.INFO)
.setSeverityText("INFO")
.setSpanId("8765432112345876")
.setTraceId("12345678876543211234567887654322")
.setEpochMillis(1631533710L)
.setAttributes(Attributes.of(stringKey("animal"), "cat", longKey("lives"), 9L))
.build();

private static final LogData LOG2 =
LogRecord.builder(RESOURCE, InstrumentationLibraryInfo.create("instrumentation2", "2"))
.setName("testLog2")
.setBody("body2")
.setFlags(0)
.setSeverity(Severity.INFO)
.setSeverityText("INFO")
.setSpanId("8765432112345875")
.setTraceId("12345678876543211234567887654322")
.setEpochMillis(1631533710L)
.setAttributes(Attributes.of(booleanKey("important"), true))
.build();

@RegisterExtension
LogCapturer logs = LogCapturer.create().captureForType(OtlpJsonLoggingLogExporter.class);

LogExporter exporter;

@BeforeEach
void setUp() {
exporter = OtlpJsonLoggingLogExporter.create();
}

@Test
void log() throws Exception {
exporter.export(Arrays.asList(LOG1, LOG2));

assertThat(logs.getEvents())
.hasSize(1)
.allSatisfy(log -> assertThat(log.getLevel()).isEqualTo(Level.INFO));
JSONAssert.assertEquals(
"{\n"
+ " \"resource\":{\n"
+ " \"attributes\":[\n"
+ " {\n"
+ " \"key\":\"key\",\n"
+ " \"value\":{\n"
+ " \"stringValue\":\"value\"\n"
+ " }\n"
+ " }\n"
+ " ]\n"
+ " },\n"
+ " \"instrumentationLibraryLogs\":[\n"
+ " {\n"
+ " \"instrumentationLibrary\":{\n"
+ " \"name\":\"instrumentation2\",\n"
+ " \"version\":\"2\"\n"
+ " },\n"
+ " \"logs\":[\n"
+ " {\n"
+ " \"timeUnixNano\":\"1631533710000000\",\n"
+ " \"severityNumber\":\"SEVERITY_NUMBER_INFO\",\n"
+ " \"severityText\":\"INFO\",\n"
+ " \"name\":\"testLog2\",\n"
+ " \"body\":{\n"
+ " \"stringValue\":\"body2\"\n"
+ " },\n"
+ " \"attributes\":[\n"
+ " {\n"
+ " \"key\":\"important\",\n"
+ " \"value\":{\n"
+ " \"boolValue\":true\n"
+ " }\n"
+ " }\n"
+ " ],\n"
+ " \"traceId\":\"12345678876543211234567887654322\",\n"
+ " \"spanId\":\"8765432112345875\"\n"
+ " }\n"
+ " ]\n"
+ " },\n"
+ " {\n"
+ " \"instrumentationLibrary\":{\n"
+ " \"name\":\"instrumentation\",\n"
+ " \"version\":\"1\"\n"
+ " },\n"
+ " \"logs\":[\n"
+ " {\n"
+ " \"timeUnixNano\":\"1631533710000000\",\n"
+ " \"severityNumber\":\"SEVERITY_NUMBER_INFO\",\n"
+ " \"severityText\":\"INFO\",\n"
+ " \"name\":\"testLog1\",\n"
+ " \"body\":{\n"
+ " \"stringValue\":\"body1\"\n"
+ " },\n"
+ " \"attributes\":[\n"
+ " {\n"
+ " \"key\":\"animal\",\n"
+ " \"value\":{\n"
+ " \"stringValue\":\"cat\"\n"
+ " }\n"
+ " },\n"
+ " {\n"
+ " \"key\":\"lives\",\n"
+ " \"value\":{\n"
+ " \"intValue\":\"9\"\n"
+ " }\n"
+ " }\n"
+ " ],\n"
+ " \"traceId\":\"12345678876543211234567887654322\",\n"
+ " \"spanId\":\"8765432112345876\"\n"
+ " }\n"
+ " ]\n"
+ " }\n"
+ " ]\n"
+ "}",
logs.getEvents().get(0).getMessage(),
/* strict= */ false);
assertThat(logs.getEvents().get(0).getMessage()).doesNotContain("\n");
}

@Test
void shutdown() {
assertThat(exporter.shutdown().isSuccess()).isTrue();
}
}