[ZBX-24974] History write cache depleted after Zabbix upgrade Created: 2024 Aug 06 Updated: 2024 Dec 12 Resolved: 2024 Dec 12 |
|
Status: | Closed |
Project: | ZABBIX BUGS AND ISSUES |
Component/s: | Server (S) |
Affects Version/s: | 6.0.31 |
Fix Version/s: | None |
Type: | Problem report | Priority: | Trivial |
Reporter: | t.oshima | Assignee: | Aigars Kadikis |
Resolution: | Fixed | Votes: | 0 |
Labels: | None | ||
Remaining Estimate: | Not Specified | ||
Time Spent: | Not Specified | ||
Original Estimate: | Not Specified | ||
Environment: |
|
Attachments: |
(5 attachments) |
Description |
Steps to reproduce:
We have already upgraded the development environment Zabbix to version 6.0.31, and the same issue has not occurred. Expected: |
Comments |
Comment by Vladislavs Sokurenko [ 2024 Aug 06 ] |
Please provide zabbix_server -R diaginfo when issue occurs |
Comment by t.oshima [ 2024 Aug 07 ] |
Is it difficult to determine the cause without running "zabbix_server -R diaginfo" while the problem is occurring? |
Comment by Vladislavs Sokurenko [ 2024 Aug 07 ] |
It's currently unknown why this has happened. In the past it was possible that some items spammed the Zabbix server, and in that case they could be seen in diaginfo. |
Comment by t.oshima [ 2024 Aug 08 ] |
Get "zabbix_server -R diaginfo" in development environment Zabbix, upgraded from "6.0.25" to "6.0.31". |
Comment by t.oshima [ 2024 Aug 09 ] |
Please find attached the diaginfo obtained with Zabbix 6.0.31 in the development environment. |
Comment by Aigars Kadikis [ 2024 Aug 09 ] |
It will be required to have "zabbix_server -R diaginfo" at the peak level: |
Comment by t.oshima [ 2024 Aug 15 ] |
I am looking for a way to determine the cause of the problem from information other than diaginfo. You mentioned that some items spammed the Zabbix server in the past; if possible, please let us know which items were the cause. Is it difficult to investigate the cause from the Zabbix server logs or the DB? |
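As a hedged illustration of the DB route asked about here, the sketch below counts recent values per item directly in the database to spot items flooding the history write cache. It assumes a MySQL/MariaDB backend with the standard `history_log` table; the connection credentials are placeholders.

```python
#!/usr/bin/env python3
"""Hedged sketch: count recent values per item in the Zabbix DB to find
items producing unusually many history values. Assumes a MySQL backend."""
import time

import pymysql  # assumption: MySQL/MariaDB backend and pymysql installed

conn = pymysql.connect(host="localhost", user="zabbix",
                       password="secret", database="zabbix")  # assumptions
since = int(time.time()) - 3600  # look at the last hour

with conn.cursor() as cur:
    # history_log holds log-type item values; other value types live in
    # history, history_uint, history_str and history_text
    cur.execute(
        "SELECT itemid, COUNT(*) AS cnt FROM history_log "
        "WHERE clock >= %s GROUP BY itemid ORDER BY cnt DESC LIMIT 20",
        (since,),
    )
    for itemid, cnt in cur.fetchall():
        print(f"itemid {itemid}: {cnt} values in the last hour")
conn.close()
```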
Comment by t.oshima [ 2024 Sep 25 ] |
We have prepared a verification environment and were able to reproduce the issue. Could you please investigate the cause based on the attached diaginfo from the peak time? |
Comment by Vladislavs Sokurenko [ 2024 Sep 30 ] |
This part might explain it: as seen in the diaginfo output below, the item with itemid 126643 alone accounts for 1025153 values.

    Values:1386357 done:1202367 queued:199 processing:15 pending:183776 time:1.026531
    Top.values:
      itemid:126643 values:1025153 steps:0
      itemid:56472 values:11404 steps:2
      itemid:56494 values:11404 steps:2
      itemid:56488 values:11400 steps:2
      itemid:56483 values:11400 steps:2
      itemid:56470 values:11398 steps:2
      itemid:132621 values:11395 steps:2
      itemid:132622 values:11392 steps:2
      itemid:56491 values:11391 steps:2
      itemid:56473 values:11390 steps:2
      itemid:132619 values:11388 steps:2
      itemid:56471 values:11388 steps:2
      itemid:56476 values:11388 steps:2
      itemid:132620 values:11387 steps:2
      itemid:56501 values:11387 steps:2
      itemid:56489 values:11384 steps:2
      itemid:56490 values:11381 steps:2
      itemid:126772 values:378 steps:0
      itemid:126677 values:377 steps:0
      itemid:41239 values:377 steps:0
      itemid:69793 values:377 steps:0
      itemid:88884 values:377 steps:0
      itemid:92699 values:377 steps:0
      itemid:91374 values:377 steps:0
      itemid:68404 values:377 steps:0
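For readers following along, a small hedged sketch of how an itemid reported by diaginfo (126643 here) could be resolved to its host and item key via the Zabbix API `item.get` method; the API URL and token are placeholders.

```python
#!/usr/bin/env python3
"""Hedged sketch: resolve an itemid from the diaginfo top list to its
host and item key via the Zabbix API. URL and token are placeholders."""
import requests  # assumption: requests is available

ZABBIX_API = "http://zabbix.example.com/api_jsonrpc.php"  # assumption
API_TOKEN = "replace-with-api-token"                      # assumption

payload = {
    "jsonrpc": "2.0",
    "method": "item.get",
    "params": {
        "itemids": ["126643"],          # itemid from the diaginfo top list
        "output": ["itemid", "name", "key_", "delay"],
        "selectHosts": ["host"],
    },
    "auth": API_TOKEN,
    "id": 1,
}
resp = requests.post(ZABBIX_API, json=payload, timeout=10)
resp.raise_for_status()
for item in resp.json()["result"]:
    host = item["hosts"][0]["host"] if item.get("hosts") else "?"
    print(f'{host}: {item["name"]} ({item["key_"]}), delay={item["delay"]}')
```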
Comment by t.oshima [ 2024 Oct 04 ] |
Thank you for your comment.

**Test conditions:** Created a log monitoring item on Linux, writing 30 lines per second to the monitored log file for 2 hours (see the log-writer sketch below).

**Results:**

**Additional information:** This phenomenon was confirmed in Linux log monitoring, but no increase in the history write cache (% used) was observed in Windows log monitoring with version 6.0.31.

**Consideration:** Based on these results, we suspect a performance degradation specific to version 6.0.31. Could you please share your thoughts on this matter? |
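A minimal Python sketch of a log writer matching the test conditions above (30 lines appended per second, for 2 hours) is shown here; the log file path is an assumption and not taken from this report.

```python
#!/usr/bin/env python3
"""Hedged sketch of a log writer for the test above: append roughly
30 lines per second to a monitored log file for 2 hours."""
import time
from datetime import datetime

LOG_FILE = "/var/log/zabbix-test/app.log"  # assumption: monitored log path
LINES_PER_SECOND = 30
DURATION = 2 * 60 * 60  # 2 hours in seconds

start = time.time()
counter = 0
with open(LOG_FILE, "a", buffering=1) as fh:  # line-buffered append
    while time.time() - start < DURATION:
        for _ in range(LINES_PER_SECOND):
            counter += 1
            fh.write(f"{datetime.now().isoformat()} test line {counter}\n")
        time.sleep(1)  # coarse pacing: roughly 30 lines per second
```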
Comment by t.oshima [ 2024 Dec 12 ] |
We implemented the following two measures:
We will close this case.
|