
Constantly memory leaking on x-pack.reporting #90274

Closed

drTr0jan opened this issue Feb 4, 2021 · 13 comments

Labels: bug, (Deprecated) Feature:Reporting, impact:high, loe:medium, regression

Comments


drTr0jan commented Feb 4, 2021

  • Kibana 7.10.1
  • Elasticsearch 7.10.1
  • FreeBSD 12.2
  • node.js 10.23.1 (run with --max_old_space_size=1024)

I have two installations of Kibana with the same config, and each of them leaks memory constantly at about ~100 MB/day.
After Node reaches about 1163 MB of resident set size (as reported by ps(1)), it crashes with a fatal error: JavaScript heap out of memory.

07:29:16.041746 {"type":"log","@timestamp":"2021-02-03T07:29:16Z","tags":["warning","plugins","apm"],"pid":51396,"message":"Failed executing APM telemetry task integrations"}
07:29:16.048662 {"type":"log","@timestamp":"2021-02-03T07:29:16Z","tags":["warning","plugins","apm"],"pid":51396,"message":"{ Error: [security_exception] action [indices:data/read/get] is unauthorized for user [kibana_system]\n    at respond (/usr/local/www/kibana7/node_modules/elasticsearch/src/lib/transport.js:349:15)\n    at checkRespForFailure (/usr/local/www/kibana7/node_modules/elasticsearch/src/lib/transport.js:306:7)\n    at HttpConnector.<anonymous> (/usr/local/www/kibana7/node_modules/elasticsearch/src/lib/connectors/http.js:173:7)\n    at IncomingMessage.wrapper (/usr/local/www/kibana7/node_modules/lodash/lodash.js:4949:19)\n    at IncomingMessage.emit (events.js:203:15)\n    at endReadableNT (_stream_readable.js:1145:12)\n    at process._tickCallback (internal/process/next_tick.js:63:19)\n  status: 403,\n  displayName: 'AuthorizationException',\n  message:\n   '[security_exception] action [indices:data/read/get] is unauthorized for user [kibana_system]',\n  path: '/_ml/anomaly_detectors/apm-*,*-high_mean_response_time',\n  query: undefined,\n  body:\n   { error:\n      { root_cause: [Array],\n        type: 'security_exception',\n        reason:\n         'action [indices:data/read/get] is unauthorized for user [kibana_system]' },\n     status: 403 },\n  statusCode: 403,\n  response:\n   '{\"error\":{\"root_cause\":[{\"type\":\"security_exception\",\"reason\":\"action [indices:data/read/get] is unauthorized for user [kibana_system]\"}],\"type\":\"security_exception\",\"reason\":\"action [indices:data/read/get] is unauthorized for user [kibana_system]\"},\"status\":403}',\n  toString: [Function],\n  toJSON: [Function] }"}
14:11:23.595931 
14:11:23.596298 <--- Last few GCs --->
14:11:23.596314 
14:11:23.596327 [51396:0x803629dc0] 710196405 ms: Mark-sweep 1005.2 (1043.2) -> 1005.2 (1043.2) MB, 634.4 / 0.0 ms  (average mu = 0.673, current mu = 0.154) last resort GC in old space requested
14:11:23.596345 [51396:0x803629dc0] 710197159 ms: Mark-sweep 1005.2 (1043.2) -> 1005.2 (1043.2) MB, 753.3 / 0.0 ms  (average mu = 0.485, current mu = 0.000) last resort GC in old space requested
14:11:23.596366 
14:11:23.596381 
14:11:23.596393 <--- JS stacktrace --->
14:11:23.596405 
14:11:23.596417 ==== JS stack trace =========================================
14:11:23.596429 
14:11:23.596441 Security context: 0x35eb32b1e6c1 <JSObject>
14:11:23.596453     0: builtin exit frame: parse(this=0x35eb32b119f9 <Object map = 0x17a6738842a9>,0x223cc4a49271 <String[228]: {"took":11,"timed_out":false,"total":0,"updated":0,"deleted":0,"batches":0,"version_conflicts":0,"noops":0,"retries":{"bulk":0,"search":0},"throttled_millis":0,"requests_per_second":-1.0,"throttled_until_millis":0,"failures":[]}>,0x35eb32b119f9 <Object map = 0x17a6738842a9>)
14:11:23.596475 
14:11:23.596487     1: de...
14:11:23.596499 
14:11:23.596510 FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
14:11:23.684304  1: 0x138fc70 node::Abort() [/usr/local/bin/node]
14:11:23.684421  2: 0x138fe8d node::FatalError(char const*, char const*) [/usr/local/bin/node]
14:11:23.684595  3: 0x151f133 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/usr/local/bin/node]
14:11:23.684841  4: 0x151f0d5 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/usr/local/bin/node]
14:11:23.685252  5: 0x18dc322 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/usr/local/bin/node]
14:11:23.685441  6: 0x18e4c56 v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/usr/local/bin/node]
14:11:23.685667  7: 0x18c0650 v8::internal::Handle<v8::internal::TransitionArray> v8::internal::Factory::NewWeakFixedArrayWithMap<v8::internal::TransitionArray>(v8::internal::Heap::RootListIndex, int, v8::internal::PretenureFlag) [/usr/local/bin/node]
14:11:23.685851  8: 0x18c055a v8::internal::Factory::NewTransitionArray(int, int) [/usr/local/bin/node]
14:11:23.685993  9: 0x1ba78bb v8::internal::TransitionsAccessor::Insert(v8::internal::Handle<v8::internal::Name>, v8::internal::Handle<v8::internal::Map>, v8::internal::SimpleTransitionFlag) [/usr/local/bin/node]
14:11:23.686161 10: 0x19f4a59 v8::internal::Map::ConnectTransition(v8::internal::Handle<v8::internal::Map>, v8::internal::Handle<v8::internal::Map>, v8::internal::Handle<v8::internal::Name>, v8::internal::SimpleTransitionFlag) [/usr/local/bin/node]
14:11:23.686356 11: 0x19e3236 v8::internal::Map::CopyReplaceDescriptors(v8::internal::Handle<v8::internal::Map>, v8::internal::Handle<v8::internal::DescriptorArray>, v8::internal::Handle<v8::internal::LayoutDescriptor>, v8::internal::TransitionFlag, v8::internal::MaybeHandle<v8::internal::Name>, char const*, v8::internal::SimpleTransitionFlag) [/usr/local/bin/node]
14:11:23.686526 12: 0x19dfedc v8::internal::Map::CopyAddDescriptor(v8::internal::Handle<v8::internal::Map>, v8::internal::Descriptor*, v8::internal::TransitionFlag) [/usr/local/bin/node]
14:11:23.686734 13: 0x19dfc92 v8::internal::Map::CopyWithField(v8::internal::Handle<v8::internal::Map>, v8::internal::Handle<v8::internal::Name>, v8::internal::Handle<v8::internal::FieldType>, v8::internal::PropertyAttributes, v8::internal::PropertyConstness, v8::internal::Representation, v8::internal::TransitionFlag) [/usr/local/bin/node]
14:11:23.686914 14: 0x19f5fdf v8::internal::Map::TransitionToDataProperty(v8::internal::Handle<v8::internal::Map>, v8::internal::Handle<v8::internal::Name>, v8::internal::Handle<v8::internal::Object>, v8::internal::PropertyAttributes, v8::internal::PropertyConstness, v8::internal::Object::StoreFromKeyed) [/usr/local/bin/node]
14:11:23.687090 15: 0x19bb2b7 v8::internal::LookupIterator::PrepareTransitionToDataProperty(v8::internal::Handle<v8::internal::JSReceiver>, v8::internal::Handle<v8::internal::Object>, v8::internal::PropertyAttributes, v8::internal::Object::StoreFromKeyed) [/usr/local/bin/node]
14:11:23.687270 16: 0x19e5c70 v8::internal::Object::AddDataProperty(v8::internal::LookupIterator*, v8::internal::Handle<v8::internal::Object>, v8::internal::PropertyAttributes, v8::internal::ShouldThrow, v8::internal::Object::StoreFromKeyed) [/usr/local/bin/node]
14:11:23.687445 17: 0x19ea412 v8::internal::JSObject::DefineOwnPropertyIgnoreAttributes(v8::internal::LookupIterator*, v8::internal::Handle<v8::internal::Object>, v8::internal::PropertyAttributes, v8::internal::ShouldThrow, v8::internal::JSObject::AccessorInfoHandling) [/usr/local/bin/node]
14:11:23.687625 18: 0x19ea855 v8::internal::JSObject::DefinePropertyOrElementIgnoreAttributes(v8::internal::Handle<v8::internal::JSObject>, v8::internal::Handle<v8::internal::Name>, v8::internal::Handle<v8::internal::Object>, v8::internal::PropertyAttributes) [/usr/local/bin/node]
14:11:23.687789 19: 0x199ca98 v8::internal::JsonParser<true>::ParseJsonObject() [/usr/local/bin/node]
14:11:23.687957 20: 0x199b662 v8::internal::JsonParser<true>::ParseJsonValue() [/usr/local/bin/node]
14:11:23.688119 21: 0x199af16 v8::internal::JsonParser<true>::ParseJson() [/usr/local/bin/node]
14:11:23.688287 22: 0x152ca36 v8::internal::JsonParser<true>::Parse(v8::internal::Isolate*, v8::internal::Handle<v8::internal::String>, v8::internal::Handle<v8::internal::Object>) [/usr/local/bin/node]

[Screenshot_2021-02-04: Kibana memory usage graph from Grafana]
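For anyone trying to correlate the growth with the graph above, here is a minimal Node sketch (not part of Kibana; the interval and log format are arbitrary) that logs the process RSS and V8 heap so they can be compared with the ps(1) numbers:

```ts
// Hypothetical monitoring helper (not Kibana code): print RSS and V8 heap
// once a minute so heap growth can be compared with the ps(1) output above.
const MB = 1024 * 1024;

setInterval(() => {
  const { rss, heapUsed, heapTotal } = process.memoryUsage();
  console.log(
    `${new Date().toISOString()} ` +
      `rss=${(rss / MB).toFixed(1)}MB ` +
      `heapUsed=${(heapUsed / MB).toFixed(1)}MB ` +
      `heapTotal=${(heapTotal / MB).toFixed(1)}MB`
  );
}, 60 * 1000);
```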

drTr0jan added the bug label Feb 4, 2021

rudolf commented Feb 8, 2021

@drTr0jan To help us narrow down potential causes, what are you using Kibana for? Your memory graph doesn't show the kind of within-day variation I would expect if you had a lot of users on Kibana during work hours and usage dropping off again overnight.


drTr0jan commented Feb 8, 2021

@rudolf, generally I use Kibana only for viewing syslogs (Elasticsearch indices). But the Kibana in the screenshot graph is an instance that users are not working with at all.
I think this is caused by FreeBSD-specific issues with the xpack.ml and xpack.reporting features. I disabled them two days ago and the memory leak has not reappeared.


rudolf commented Feb 8, 2021

We have not had any other reports of Kibana 7.10.2 leaking memory, so I suspect this is FreeBSD-related. As far as I know, there's nothing in the x-pack.ml plugin that should be platform-specific. However, the Reporting plugin uses Chrome to render reports, so I think it's the most likely culprit. Can you try with only x-pack.reporting disabled?

(Since FreeBSD isn't a supported platform, the support we're able to provide is limited; see https://www.elastic.co/support/matrix.)
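For reference, disabling only the Reporting plugin is a one-line kibana.yml change (the standard xpack.reporting.enabled setting):

```yaml
# kibana.yml — disable only Reporting, leaving ML and everything else enabled
xpack.reporting.enabled: false
```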

rudolf added the Team:Core label Feb 8, 2021
@elasticmachine

Pinging @elastic/kibana-core (Team:Core)

@drTr0jan

@rudolf, I re-enabled the x-pack.ml plugin two days ago. Kibana is working correctly.

@elasticmachine

Pinging @elastic/kibana-reporting-services (Team:Reporting Services)

drTr0jan changed the title from "Constantly memory leaking" to "Constantly memory leaking on x-pack.reporting" Feb 15, 2021
@elasticmachine

Pinging @elastic/kibana-app-services (Team:AppServices)

tsullivan added the loe:needs-research, impact:high, regression, triaged, loe:large, and loe:medium labels and removed the loe:needs-research and loe:large labels Feb 25, 2021

Dosant commented Mar 1, 2021

@elastic/kibana-app-services: If it is Reporting, and Reporting is not really being used, then the code in question (task manager) will be removed in the next release (after 7.12).

@tsullivan

> @elastic/kibana-app-services: If it is Reporting, and Reporting is not really being used, then the code in question (task manager) will be removed in the next release (after 7.12).

The code in question here is ESQueue, which uses a loop to poll for Reporting jobs to claim. If the Reporting feature is enabled but reports aren't generated, the loop still runs. After 7.12, we are removing ESQueue and replacing it with a Kibana-wide service called Task Manager.
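To make the polling behaviour concrete, here is a minimal sketch of that kind of claim loop. It is not the actual ESQueue source; the index name, interval, and client calls are illustrative only:

```ts
// Illustrative sketch only — not the actual ESQueue implementation.
import { Client } from '@elastic/elasticsearch';

const client = new Client({ node: 'http://localhost:9200' });
const POLL_INTERVAL_MS = 3000; // hypothetical polling interval

async function claimPendingJob(): Promise<void> {
  // Even if no report is ever requested, this search runs on every tick,
  // allocating request/response objects each time the timer fires.
  const { body } = await client.search({
    index: '.reporting-*',
    body: { query: { term: { status: 'pending' } }, size: 1 },
  });
  const hit = body.hits.hits[0];
  if (hit) {
    // ...claim the job (update its status) and execute it here.
  }
}

setInterval(() => {
  // Errors are logged and the loop keeps running on the next tick.
  claimPendingJob().catch((err) => console.error('reporting poll failed', err));
}, POLL_INTERVAL_MS);
```

If the Reporting plugin is disabled, a timer like this is never started, which is consistent with the behaviour reported above.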

pgayvallet removed the Team:Core label Mar 31, 2021
streamich added the (Deprecated) Feature:Reporting label May 4, 2021
tsullivan removed their assignment May 5, 2021
@tsullivan

> I think this is caused by FreeBSD-specific issues with the xpack.ml and xpack.reporting features. I disabled them two days ago and the memory leak has not reappeared.

I missed this the first time, but it should be noted that FreeBSD is not listed as a supported OS in the Kibana support matrix: https://www.elastic.co/support/matrix

Also, if you are not using ML or Reporting, I don't see how they could be responsible for a memory leak. My guess is that disabling those plugins meant less code was loaded; Node then seemed to have more success running under a 1 GB memory restriction.

Closing as not reproducible / unsupported OS.


L4rS6 commented May 25, 2021

@drTr0jan Have you found a solution for this yet? I'm running into the same problem on FreeBSD...

@drTr0jan

@L4rS6, try disabling x-pack.reporting. See Bug 253314 ("textproc/kibana7: fix memleak and some port improvements") for more info.


L4rS6 commented May 28, 2021

@drTr0jan Thank you, that solved the problem.
