ZABBIX BUGS AND ISSUES / ZBX-19604

Error cloning template "Template DB Oracle by ODBC"


    • Type: Problem report
    • Resolution: Fixed
    • Priority: Minor
    • Fix versions: 6.0.0alpha4, 6.0 (plan)
    • Affects version: 5.0.11
    • Team: Team B
    • Sprints: Sprint 79 (Aug 2021), Sprint 80 (Sep 2021), Sprint 81 (Oct 2021), Sprint 82 (Nov 2021), Sprint 83 (Dec 2021), Sprint 84 (Jan 2022), Sprint 85 (Feb 2022), Sprint 86 (Mar 2022)
    • Story points: 1

      Steps to reproduce:

      1. Select "Template DB Oracle by ODBC" from Configuration -> Templates
      2. Click "Full Clone" and rename
      3. Click on "Add"

      Result:
      An error is raised (see screenshot error.JPG):

       Value "SELECT
           df.tablespace_name AS TABLESPACE,
           df.type AS TYPE,
           NVL(SUM(df.BYTES), 0) AS FILE_BYTES,
           NVL(SUM(df.MAX_BYTES), 0) AS MAX_BYTES,
           NVL(SUM(f.FREE), 0) AS FREE_BYTES,
           SUM(df.BYTES)-SUM(f.FREE) AS USED_BYTES,
           ROUND(DECODE(SUM(df.MAX_BYTES), 0, 0, (SUM(df.BYTES) / SUM(df.MAX_BYTES) * 100)), 2) AS USED_PCT_MAX,
           ROUND(DECODE(SUM(df.BYTES), 0, 0,(SUM(df.BYTES)-SUM(f.FREE))/ SUM(df.BYTES)* 100), 2) AS USED_FILE_PCT,
           DECODE(df.STATUS, 'ONLINE', 1, 'OFFLINE', 2, 'READ ONLY', 3, 0) AS STATUS
       FROM ( SELECT
               ddf.file_id,
               dt.contents AS TYPE,
               dt.STATUS,
               ddf.file_name,
               ddf.tablespace_name,
               TRUNC(ddf.bytes) AS bytes,
               TRUNC(GREATEST(ddf.bytes, ddf.maxbytes)) AS max_bytes
           FROM
               dba_data_files ddf,
               dba_tablespaces dt
           WHERE
               ddf.tablespace_name = dt.tablespace_name ) df,
           ( SELECT TRUNC(SUM(bytes)) AS FREE, file_id FROM dba_free_space GROUP BY file_id ) f
       WHERE df.file_id = f.file_id (+)
       GROUP BY df.tablespace_name, df.TYPE, df.status
       UNION ALL
       SELECT
           Y.name AS TABLESPACE,
           Y.type AS TYPE,
           NVL(SUM(Y.BYTES), 0) AS FILE_BYTES,
           NVL(SUM(Y.MAX_BYTES), 0) AS MAX_BYTES,
           NVL(MAX(NVL(Y.FREE_BYTES, 0)), 0) AS FREE,
           SUM(Y.BYTES)-SUM(Y.FREE_BYTES) AS USED_BYTES,
           ROUND(DECODE(SUM(Y.MAX_BYTES), 0, 0, (SUM(Y.BYTES) / SUM(Y.MAX_BYTES) * 100)), 2) AS USED_PCT_MAX,
           ROUND(DECODE(SUM(Y.BYTES), 0, 0,(SUM(Y.BYTES)-SUM(Y.FREE_BYTES))/ SUM(Y.BYTES)* 100), 2) AS USED_FILE_PCT,
           DECODE(Y.TBS_STATUS, 'ONLINE', 1, 'OFFLINE', 2, 'READ ONLY', 3, 0) AS STATUS
       FROM ( SELECT
               dtf.tablespace_name AS name,
               dt.contents AS TYPE,
               dt.STATUS AS tbs_status,
               dtf.status AS status,
               dtf.bytes AS bytes,
               (SELECT
                   ((f.total_blocks - s.tot_used_blocks)* vp.value)
               FROM ( SELECT tablespace_name, SUM(used_blocks) tot_used_blocks FROM gv$sort_segment
                   WHERE tablespace_name != 'DUMMY'
                   GROUP BY tablespace_name) s,
                   ( SELECT tablespace_name, SUM(blocks) total_blocks FROM dba_temp_files
                   WHERE tablespace_name != 'DUMMY'
                   GROUP BY tablespace_name) f,
                   ( SELECT value FROM v$parameter WHERE name = 'db_block_size') vp
               WHERE
                   f.tablespace_name = s.tablespace_name
                   AND f.tablespace_name = dtf.tablespace_name ) AS free_bytes,
               CASE WHEN dtf.maxbytes = 0 THEN dtf.bytes
               ELSE dtf.maxbytes END AS max_bytes
           FROM
               sys.dba_temp_files dtf,
               sys.dba_tablespaces dt
           WHERE
               dtf.tablespace_name = dt.tablespace_name ) Y
       GROUP BY Y.name, Y.TYPE, Y.tbs_status;" is too long for field "params" - 2578 characters. Allowed length is 2048 characters.
      

      Expected:
      No error; the template is cloned.

       

      During installation, the provided database scripts failed due to column sizes. We had to edit the "schema.sql" script to increase the size of the "params" and "description" columns of the "items" table from 2048 to 4000. This change fixed most insert errors, except one that tried to insert a value of 8000 characters (as we did not need that item, we accepted that failure).
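The workaround described above could look roughly like this (an illustrative sketch, not the exact edit; the ALTER TABLE form shown is the equivalent change for an already-created database, and the NVARCHAR2 type is an assumption based on Zabbix's Oracle schema):

```sql
-- Widen the two "items" columns mentioned above from 2048 to 4000
-- characters on an existing Oracle database (in schema.sql the same
-- numbers appear in the CREATE TABLE column definitions instead):
ALTER TABLE items MODIFY (params      NVARCHAR2(4000));
ALTER TABLE items MODIFY (description NVARCHAR2(4000));
```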

      We also needed to change some item queries, because inserts with NCLOB values were not reaching the database. We compared how the new Zabbix version stored those entries in its "data.sql" script and applied the same changes to the 5.0.10 "data.sql" script.
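For illustration only (the actual statements and column names differ; the itemid and text here are made up), one common way such long NCLOB inserts are rewritten for Oracle is to split the literal and convert each piece explicitly, since plain string literals are limited in length:

```sql
-- Hypothetical shape of a rewritten insert: split the long text and
-- convert each fragment with TO_NCLOB, concatenating the results.
INSERT INTO items (itemid, params)
VALUES (12345,
        TO_NCLOB('SELECT df.tablespace_name AS TABLESPACE, ...')
        || TO_NCLOB('... GROUP BY Y.name, Y.TYPE, Y.tbs_status;'));
```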

      The only error remaining after executing the database scripts was the one about the 8000-character value (impossible to correct).

       

      So our database has a "params" column size of 4000 in the "items" table, and no such error should be raised by Zabbix. Our suspicion is that the "clone" function (in the web frontend?) hard-codes the 2048-character limit somewhere, and since some items exceed it (in this case, one item of "Template DB Oracle by ODBC"), the clone fails.
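The actual column size in the database can be confirmed from Oracle's standard data dictionary (the `ZABBIX` owner name below is an example; use the schema that holds the Zabbix tables):

```sql
-- Check the effective size of the widened columns in the data dictionary.
SELECT column_name, data_type, char_length
  FROM all_tab_columns
 WHERE owner = 'ZABBIX'
   AND table_name = 'ITEMS'
   AND column_name IN ('PARAMS', 'DESCRIPTION');
```

If this reports 4000 while the frontend still rejects values over 2048, that supports the suspicion that the limit is enforced in the application layer rather than by the database.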

       

        1. error_import_template.JPG (246 kB)
        2. error.JPG (108 kB)
        3. gendb.sh (2 kB)
        4. image-2021-06-29-10-45-14-467.png (21 kB)
        5. image-2021-06-29-17-16-53-575.png (237 kB)
        6. image-2021-06-29-17-19-24-892.png (249 kB)
        7. oracle_5.0_old.xml (146 kB)
        8. rmdb.sh (0.2 kB)
        9. template_db_oracle_odbc.xml (146 kB)

            MVekslers Michael Veksler
            lgplaza Laura García
            Team B
            Votes: 1
            Watchers: 13

              Created:
              Updated:
              Resolved: