- Incident report
- Resolution: Cannot Reproduce
- Major
- None
- 4.4.1
- None
- Linux, MySQL 5.7
I found a rather strange issue that occurs when the MySQL user used by the Zabbix proxy has no DROP privilege.
From the proxy log:
24530:20191031:161009.287 Starting Zabbix Proxy (active) [zbx-prx-i4d]. Zabbix 4.4.1 (revision 8870606e6a).
24530:20191031:161009.287 **** Enabled features ****
24530:20191031:161009.287 SNMP monitoring:           YES
24530:20191031:161009.287 IPMI monitoring:           YES
24530:20191031:161009.287 Web monitoring:            YES
24530:20191031:161009.287 VMware monitoring:         YES
24530:20191031:161009.287 ODBC:                      YES
24530:20191031:161009.287 SSH2 support:              YES
24530:20191031:161009.287 IPv6 support:              YES
24530:20191031:161009.287 TLS support:               YES
24530:20191031:161009.287 **************************
24530:20191031:161009.287 using configuration file: /etc/zabbix/zabbix_proxy.conf
24530:20191031:161009.383 current database version (mandatory/optional): 04020000/04020001
24530:20191031:161009.383 required mandatory version: 04040000
24530:20191031:161009.383 starting automatic database upgrade
24530:20191031:161009.816 completed 1% of database upgrade
24530:20191031:161010.046 completed 3% of database upgrade
24530:20191031:161010.246 completed 5% of database upgrade
24530:20191031:161010.307 completed 7% of database upgrade
24530:20191031:161010.456 completed 8% of database upgrade
24530:20191031:161010.923 completed 10% of database upgrade
24530:20191031:161012.066 completed 12% of database upgrade
24530:20191031:161013.123 completed 14% of database upgrade
24530:20191031:161014.086 completed 16% of database upgrade
24530:20191031:161015.102 completed 17% of database upgrade
24530:20191031:161015.110 completed 19% of database upgrade
24530:20191031:161015.220 completed 21% of database upgrade
24530:20191031:161015.336 completed 23% of database upgrade
24530:20191031:161015.419 completed 25% of database upgrade
24530:20191031:161015.429 completed 26% of database upgrade
24530:20191031:161015.437 completed 28% of database upgrade
24530:20191031:161015.484 completed 30% of database upgrade
24530:20191031:161015.485 completed 32% of database upgrade
24530:20191031:161015.493 completed 33% of database upgrade
24530:20191031:161015.509 completed 35% of database upgrade
24530:20191031:161015.704 completed 37% of database upgrade
24530:20191031:161015.839 completed 39% of database upgrade
24530:20191031:161016.039 completed 41% of database upgrade
24530:20191031:161016.106 completed 42% of database upgrade
24530:20191031:161016.156 completed 44% of database upgrade
24530:20191031:161016.164 completed 46% of database upgrade
24530:20191031:161016.172 completed 48% of database upgrade
24530:20191031:161016.181 completed 50% of database upgrade
24530:20191031:161016.214 completed 51% of database upgrade
24530:20191031:161016.222 completed 53% of database upgrade
24530:20191031:161016.230 completed 55% of database upgrade
24530:20191031:161016.239 completed 57% of database upgrade
24530:20191031:161016.247 completed 58% of database upgrade
24530:20191031:161016.255 completed 60% of database upgrade
24530:20191031:161016.264 completed 62% of database upgrade
24530:20191031:161016.280 completed 64% of database upgrade
24530:20191031:161016.280 [Z3005] query failed: [1142] DROP command denied to user 'zabbix-proxy'@'localhost' for table 'host_inventory' [alter table `host_inventory` rename to `host_inventory_tmp`]
24530:20191031:161016.281 database upgrade failed
24555:20191031:161026.546 Starting Zabbix Proxy (active) [zbx-prx-i4d]. Zabbix 4.4.1 (revision 8870606e6a).
After that, the proxy was still able to work and pass data between the agents and the server.
The only issue was that all simple check items using net.tcp.service[] returned 0 instead of 1.
Before that, the proxy had only been complaining about:
15956:20191031:125018.679 received configuration data from server at "zbx-srv.mustard", datalen 6404777
15956:20191031:125019.203 failed to update local proxy configuration copy: invalid table name "item_rtdata"
15956:20191031:125050.361 received configuration data from server at "zbx-srv.mustard", datalen 6404777
15956:20191031:125050.902 failed to update local proxy configuration copy: invalid table name "item_rtdata"
15956:20191031:125122.032 received configuration data from server at "zbx-srv.mustard", datalen 6404777
15956:20191031:125122.565 failed to update local proxy configuration copy: invalid table name "item_rtdata"
15956:20191031:125153.724 received configuration data from server at "zbx-srv.mustard", datalen 6404777
The issue was sorted out after adding the DROP privilege and restarting the proxy.
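For reference, this is roughly the grant that resolved it for me (a sketch; the proxy database name `zabbix_proxy` is an assumption here, take it from the DBName setting in /etc/zabbix/zabbix_proxy.conf):

```sql
-- Give the proxy DB user the DROP privilege that the schema upgrade needs
-- for statements like "alter table `host_inventory` rename to `host_inventory_tmp`".
-- Database name `zabbix_proxy` is assumed; adjust to your DBName.
GRANT DROP ON `zabbix_proxy`.* TO 'zabbix-proxy'@'localhost';
FLUSH PRIVILEGES;
```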
24644:20191031:161239.803 using configuration file: /etc/zabbix/zabbix_proxy.conf
24644:20191031:161239.809 current database version (mandatory/optional): 04030036/04030036
24644:20191031:161239.809 required mandatory version: 04040000
24644:20191031:161239.809 starting automatic database upgrade
24644:20191031:161239.809 [Z3005] query failed: [1142] DROP command denied to user 'zabbix-proxy'@'localhost' for table 'host_inventory' [alter table `host_inventory` rename to `host_inventory_tmp`]
24644:20191031:161239.810 database upgrade failed
24663:20191031:161250.053 Starting Zabbix Proxy (active) [zbx-prx-i4d]. Zabbix 4.4.1 (revision 8870606e6a).
24663:20191031:161250.053 **** Enabled features ****
24663:20191031:161250.053 SNMP monitoring:           YES
24663:20191031:161250.053 IPMI monitoring:           YES
24663:20191031:161250.054 Web monitoring:            YES
24663:20191031:161250.054 VMware monitoring:         YES
24663:20191031:161250.054 ODBC:                      YES
24663:20191031:161250.054 SSH2 support:              YES
24663:20191031:161250.054 IPv6 support:              YES
24663:20191031:161250.054 TLS support:               YES
24663:20191031:161250.054 **************************
24663:20191031:161250.054 using configuration file: /etc/zabbix/zabbix_proxy.conf
24663:20191031:161250.059 current database version (mandatory/optional): 04030036/04030036
24663:20191031:161250.059 required mandatory version: 04040000
24663:20191031:161250.059 starting automatic database upgrade
24663:20191031:161250.078 completed 5% of database upgrade
24663:20191031:161250.154 completed 10% of database upgrade
24663:20191031:161250.186 completed 15% of database upgrade
24663:20191031:161250.236 completed 20% of database upgrade
24663:20191031:161250.369 completed 25% of database upgrade
24663:20191031:161250.462 completed 30% of database upgrade
24663:20191031:161250.554 completed 35% of database upgrade
24663:20191031:161250.653 completed 40% of database upgrade
24663:20191031:161250.752 completed 45% of database upgrade
24663:20191031:161250.881 completed 50% of database upgrade
24663:20191031:161250.974 completed 55% of database upgrade
24663:20191031:161251.074 completed 60% of database upgrade
24663:20191031:161251.140 completed 65% of database upgrade
24663:20191031:161251.190 completed 70% of database upgrade
24663:20191031:161251.342 completed 75% of database upgrade
24663:20191031:161251.496 completed 80% of database upgrade
24663:20191031:161251.505 completed 85% of database upgrade
24663:20191031:161251.647 completed 90% of database upgrade
24663:20191031:161251.742 completed 95% of database upgrade
24663:20191031:161251.742 completed 100% of database upgrade
24663:20191031:161251.742 database upgrade fully completed
24663:20191031:161252.713 proxy #0 started [main process]
24664:20191031:161252.714 proxy #1 started [configuration syncer #1]
24665:20191031:161252.714 proxy #2 started [heartbeat sender #1]
24666:20191031:161252.715 proxy #3 started [data sender #1]
24667:20191031:161252.715 proxy #4 started [housekeeper #1]
24668:20191031:161252.716 proxy #5 started [history syncer #1]
24669:20191031:161252.716 proxy #6 started [history syncer #2]
24670:20191031:161252.717 proxy #7 started [history syncer #3]
24671:20191031:161252.717 proxy #8 started [history syncer #4]
24672:20191031:161252.718 proxy #9 started [self-monitoring #1]
24673:20191031:161252.719 proxy #10 started [task manager #1]
24674:20191031:161252.719 proxy #11 started [poller #1]
24675:20191031:161252.720 proxy #12 started [poller #2]
24676:20191031:161252.720 proxy #13 started [poller #3]
24677:20191031:161252.721 proxy #14 started [poller #4]
24678:20191031:161252.721 proxy #15 started [poller #5]
24679:20191031:161252.722 proxy #16 started [poller #6]
24680:20191031:161252.722 proxy #17 started [poller #7]
24681:20191031:161252.723 proxy #18 started [poller #8]
24682:20191031:161252.723 proxy #19 started [poller #9]
24683:20191031:161252.724 proxy #20 started [poller #10]
24684:20191031:161252.724 proxy #21 started [unreachable poller #1]
24685:20191031:161252.725 proxy #22 started [trapper #1]
24686:20191031:161252.725 proxy #23 started [trapper #2]
24687:20191031:161252.726 proxy #24 started [trapper #3]
24688:20191031:161252.727 proxy #25 started [trapper #4]
24689:20191031:161252.727 proxy #26 started [trapper #5]
24690:20191031:161252.727 proxy #27 started [icmp pinger #1]
24691:20191031:161252.728 proxy #28 started [icmp pinger #2]
24692:20191031:161252.729 proxy #29 started [icmp pinger #3]
24693:20191031:161252.729 proxy #30 started [icmp pinger #4]
24694:20191031:161252.730 proxy #31 started [icmp pinger #5]
24695:20191031:161252.730 proxy #32 started [icmp pinger #6]
24696:20191031:161252.731 proxy #33 started [icmp pinger #7]
24697:20191031:161252.731 proxy #34 started [icmp pinger #8]
24698:20191031:161252.732 proxy #35 started [icmp pinger #9]
24699:20191031:161252.732 proxy #36 started [icmp pinger #10]
24700:20191031:161252.733 proxy #37 started [icmp pinger #11]
24701:20191031:161252.733 proxy #38 started [icmp pinger #12]
24702:20191031:161252.734 proxy #39 started [icmp pinger #13]
24703:20191031:161252.735 proxy #40 started [icmp pinger #14]
24704:20191031:161252.735 proxy #41 started [icmp pinger #15]
24705:20191031:161252.736 proxy #42 started [icmp pinger #16]
24706:20191031:161252.736 proxy #43 started [icmp pinger #17]
24707:20191031:161252.737 proxy #44 started [icmp pinger #18]
24708:20191031:161252.737 proxy #45 started [icmp pinger #19]
24709:20191031:161252.738 proxy #46 started [icmp pinger #20]
24710:20191031:161252.738 proxy #47 started [preprocessing manager #1]
- I think that when the DB version is not yet the required one, it would be good to check whether the DB user has the privileges necessary to successfully upgrade the database schema before attempting the upgrade.
- It is suspicious that the failed upgrade did not stop the proxy process.
- I'm not sure, but there may be some issue in the simple item handling logic. However, in that case it may simply be a consequence of running the proxy with a not fully upgraded DB backend.
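Regarding the first suggestion, such a pre-flight check could be as simple as inspecting the grants of the proxy DB user before starting the upgrade (a sketch, not a proposal for the exact implementation):

```sql
-- Inspect the effective privileges of the proxy DB user before the upgrade;
-- the upgrade renames and drops tables, so DROP (alongside ALTER and CREATE)
-- must appear in the output.
SHOW GRANTS FOR 'zabbix-proxy'@'localhost';
```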
The issue is repeatable: I observed it on all three proxies that were upgraded without the DROP privilege.