Performance issues in elasticsearch 1.4.3 #9701

Closed
zhufeizzz opened this issue Feb 15, 2015 · 7 comments

@zhufeizzz

My servers:
server1 (client): logstash 1.5.beta1
server2 (server): redis 2.8.17 + logstash 1.5.beta1 + elasticsearch 1.4.3 (with Shield) + kibana 4 RC1
Data flow (10,000 events per minute):
logstash agent >> redis >> logstash indexer >> elasticsearch 1.4.3 (with Shield)

Since I upgraded elasticsearch from 1.4.2 to 1.4.3, a lot of data has been piling up in redis and reaching elasticsearch very slowly. When I checked the timestamps of the data, they were delayed by a few minutes.
Everything worked well on 1.4.2, and I am using the same configuration with elasticsearch 1.4.3.

@kimchy
Member

kimchy commented Feb 15, 2015

During the indexing operations, can you issue a few hot threads calls and gist them here?

@zhufeizzz
Author

The log file for the logstash indexer is empty. Do I need to change the log level or do something else?

@kimchy
Member

kimchy commented Feb 15, 2015

Apologies, I meant issuing this API call on ES: http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/cluster-nodes-hot-threads.html while indexing is in progress.
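For example, here is a quick sketch in Python of capturing a few samples while the indexer is running. It assumes a single node reachable on localhost:9200 and uses placeholder Shield basic-auth credentials; adjust both for your setup.

    # Hypothetical sketch: poll the hot threads API a few times during indexing.
    # BASE_URL and AUTH are placeholders (Shield basic auth assumed).
    import base64
    import time
    import urllib.request

    BASE_URL = "http://localhost:9200"
    AUTH = "es_admin:password"

    def hot_threads():
        # threads and interval are optional; 500ms is the default sampling interval.
        req = urllib.request.Request(BASE_URL + "/_nodes/hot_threads?threads=3&interval=500ms")
        req.add_header("Authorization", "Basic " + base64.b64encode(AUTH.encode()).decode())
        with urllib.request.urlopen(req) as resp:
            return resp.read().decode()

    # Take three samples a few seconds apart and gist the combined output.
    for _ in range(3):
        print(hot_threads())
        time.sleep(5)

An equivalent curl loop against /_nodes/hot_threads works just as well; what matters is getting several samples while the slowdown is happening.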

@zhufeizzz
Author

::: [Tethlam][trzj-nfiTFG_CCf5mnBgeQ][ip-172-2-1-50][inet[/172.2.1.50:9300]]

12.2% (61.2ms out of 500ms) cpu usage by thread 'elasticsearch[Tethlam][refresh][T#1]'
10/10 snapshots sharing following 9 elements
sun.misc.Unsafe.park(Native Method)
java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:735)
java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:644)
java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1137)
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1068)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
java.lang.Thread.run(Thread.java:745)

4.3% (21.6ms out of 500ms) cpu usage by thread 'elasticsearch[Tethlam][http_server_worker][T#1]{New I/O worker #7}'
 10/10 snapshots sharing following 15 elements
   sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
   sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269)
   sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:79)
   sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
   sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
   org.elasticsearch.common.netty.channel.socket.nio.SelectorUtil.select(SelectorUtil.java:68)
   org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.select(AbstractNioSelector.java:415)
   org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:212)
   org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
   org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
   org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
   org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
   java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
   java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
   java.lang.Thread.run(Thread.java:745)

0.2% (877.7micros out of 500ms) cpu usage by thread 'elasticsearch[Tethlam][http_server_boss][T#1]{New I/O server boss #9}'
 10/10 snapshots sharing following 14 elements
   sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
   sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269)
   sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:79)
   sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
   sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
   sun.nio.ch.SelectorImpl.select(SelectorImpl.java:102)
   org.elasticsearch.common.netty.channel.socket.nio.NioServerBoss.select(NioServerBoss.java:163)
   org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:212)
   org.elasticsearch.common.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.java:42)
   org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
   org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
   java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
   java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
   java.lang.Thread.run(Thread.java:745)

@zhufeizzz
Author

::: [Tethlam][trzj-nfiTFG_CCf5mnBgeQ][ip-172-2-1-50][inet[/172.2.1.50:9300]]

36.7% (183.6ms out of 500ms) cpu usage by thread 'elasticsearch[Tethlam][[logstash-2015.02.15][1]: Lucene Merge Thread #24]'
3/10 snapshots sharing following 7 elements
org.apache.lucene.index.SegmentMerger.mergeFields(SegmentMerger.java:332)
org.apache.lucene.index.SegmentMerger.merge(SegmentMerger.java:100)
org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:4180)
org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:3775)
org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:409)
org.apache.lucene.index.TrackingConcurrentMergeScheduler.doMerge(TrackingConcurrentMergeScheduler.java:107)
org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:486)
7/10 snapshots sharing following 8 elements
org.apache.lucene.codecs.FieldsConsumer.merge(FieldsConsumer.java:72)
org.apache.lucene.index.SegmentMerger.mergeTerms(SegmentMerger.java:399)
org.apache.lucene.index.SegmentMerger.merge(SegmentMerger.java:112)
org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:4180)
org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:3775)
org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:409)
org.apache.lucene.index.TrackingConcurrentMergeScheduler.doMerge(TrackingConcurrentMergeScheduler.java:107)
org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:486)

17.6% (87.8ms out of 500ms) cpu usage by thread 'elasticsearch[Tethlam][http_server_worker][T#1]{New I/O worker #7}'
10/10 snapshots sharing following 7 elements
org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
java.lang.Thread.run(Thread.java:745)

2.7% (13.6ms out of 500ms) cpu usage by thread 'elasticsearch[Tethlam][http_server_worker][T#2]{New I/O worker #8}'
 10/10 snapshots sharing following 15 elements
   sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
   sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269)
   sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:79)
   sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87)
   sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98)
   org.elasticsearch.common.netty.channel.socket.nio.SelectorUtil.select(SelectorUtil.java:68)
   org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.select(AbstractNioSelector.java:415)
   org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:212)
   org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
   org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
   org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
   org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
   java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
   java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
   java.lang.Thread.run(Thread.java:745)

@kimchy
Member

kimchy commented Feb 15, 2015

@zhufeizzz it doesn't look like anything is happening on the ES side; are you sure indexing is actually happening?
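One rough way to check is to sample the indexing counters a minute apart. This is only a sketch: the host, credentials, and the logstash-* index pattern below are assumptions about this setup.

    # Hypothetical check: is index_total growing over time?
    import base64
    import json
    import time
    import urllib.request

    BASE_URL = "http://localhost:9200"   # placeholder ES endpoint
    AUTH = "es_admin:password"           # placeholder Shield credentials

    def index_total():
        req = urllib.request.Request(BASE_URL + "/logstash-*/_stats/indexing")
        req.add_header("Authorization", "Basic " + base64.b64encode(AUTH.encode()).decode())
        with urllib.request.urlopen(req) as resp:
            stats = json.load(resp)
        # Total number of index operations across all matching indices.
        return stats["_all"]["total"]["indexing"]["index_total"]

    before = index_total()
    time.sleep(60)
    after = index_total()
    print("documents indexed in the last minute: %d" % (after - before))

If the counter barely moves while redis keeps growing, the backlog is upstream of ES (in the logstash indexer or its output); if it is increasing steadily, the hot threads samples should show where ES is spending its time.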

@clintongormley

No further info. Closing
