ZABBIX BUGS AND ISSUES
  ZBX-22171

After upgrading the frontend to a version > 6.0.10 there are problems with the PostgreSQL DB



    • Type: Problem report
    • Status: Open
    • Priority: Trivial
    • Resolution: Unresolved
    • Frontend OS: Red Hat 8.6
      DB OS: Red Hat 8.6
      PostgreSQL 14 (8 CPU, 32 GB RAM, data: 146 GB)
      TimescaleDB: 2.7.2
      Zabbix Server: 6.0.11
      Zabbix Proxy: 6.0.12

      Tested frontend versions: 6.0.9, 6.0.10, 6.0.11, 6.0.12


      I had a Zabbix environment on version 6.0.9 (hosts: 6k, items: 340k, triggers: 148k, vps: 1900). After upgrading to version 6.0.12, CPU usage on the PostgreSQL server increased to extreme values. After investigation I found that a large number of locks had been created on the database and could not be processed (I don't know why, but the number of busy LLD workers also rises after the frontend upgrade). After downgrading only the frontend to 6.0.9, everything went back to normal.
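As a sketch (not part of the original report), the lock pile-up described above can be inspected directly from the standard PostgreSQL system catalogs; everything below uses only the stock pg_locks and pg_stat_activity views, nothing Zabbix-specific:

```sql
-- Ungranted lock requests grouped by relation and lock mode
SELECT c.relname, l.mode, count(*) AS waiting
FROM pg_locks l
JOIN pg_class c ON c.oid = l.relation
WHERE NOT l.granted
GROUP BY c.relname, l.mode
ORDER BY waiting DESC;

-- Sessions currently blocked waiting on a lock, with the stuck query
SELECT pid, wait_event_type, wait_event, state, left(query, 80) AS query
FROM pg_stat_activity
WHERE wait_event_type = 'Lock';
```

Run as a superuser (or the zabbix DB owner) via psql while the affected frontend version is active; the attached pg_locks.txt presumably contains output of this kind.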

      I tried every newer version, but only with frontend 6.0.9 does everything return to normal (checked several times).

      Steps to reproduce:

      1. Install Zabbix 6.0.9 with PostgreSQL 14 + TimescaleDB 2.7.2
      2. Upgrade to a version newer than 6.0.9
      3. Watch CPU load on Database server
      4. Downgrade only frontend to 6.0.9
      5. Watch CPU load on Database server

      See screenshots.

      The number of busy LLD workers rises.

      CPU load on the PostgreSQL server rises to very high values.
      A lot of slow queries are visible in the Zabbix server log file.

      A lot of locks are created on the database.

      The configuration syncer sync time is often very long (more than 20 seconds).
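As a hedged sketch (again not from the original report, and using only the standard pg_stat_activity view), the slow queries and blocking chains mentioned above can be listed like this; the 5-second threshold is an arbitrary example value:

```sql
-- Statements running longer than 5 seconds, worst first,
-- with the PIDs of any sessions blocking them (PostgreSQL 9.6+)
SELECT pid,
       now() - query_start AS runtime,
       state,
       pg_blocking_pids(pid) AS blocked_by,
       left(query, 100) AS query
FROM pg_stat_activity
WHERE state <> 'idle'
  AND now() - query_start > interval '5 seconds'
ORDER BY runtime DESC;
```

A non-empty blocked_by array points at the session holding the conflicting lock, which helps correlate the slow-query log entries with the lock screenshots attached below.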


      The database should work fine without the new additional locks (as in version 6.0.9).


        1. Configuration syncer time logs.png (76 kB)
        2. Configuration syncer utility.png (271 kB)
        3. CPU load.png (34 kB)
        4. Database locks.png (30 kB)
        5. DB CPU idle 1m.png (26 kB)
        6. DB CPU idle time.png (32 kB)
        7. DB load avarage when 6.0.12 is installed.png (220 kB)
        8. pg_locks.txt (167 kB)
        9. PostgreSQL connected backends.png (28 kB)
        10. PostgreSQL disk blocks read per second.png (30 kB)
        11. query failed empty result.png (4 kB)
        12. strange bahaviour of lld workers 6.0.12.png (257 kB)
        13. Strange rising of LLD workers.png (29 kB)
        14. upgrade to 6.0.12.png (37 kB)
        15. virtual_transactions.txt (25 kB)
        16. when NGINX is stopped.png (225 kB)
        17. working process.png (41 kB)



            Assignee: aigars.kadikis (Aigars Kadikis)
            Reporter: Godfather (Mateusz Dampc)
            Votes: 4
            Watchers: 6