In some setups Zabbix has to work with large JSON data used as a master item for thousands of dependent items. In most cases those items are generated by LLD from the same data and have similarly formatted jsonpaths for metric extraction.
The input data is an array of objects, each with a set of monitored properties. The jsonpaths can therefore be split into two parts - the object location and the metric location. In most cases the object location ends with a jsonpath expression (filter) component. For example:
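A plausible input document, consistent with the paths referenced later in this note (the values are invented for illustration):

```json
[
    {"mount": "/", "total": 100, "used": 40},
    {"mount": "/run", "total": 50, "used": 5}
]
```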
then with those paths defined in item prototypes:
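For illustration, the prototype jsonpaths could look like this (assuming a {#MOUNT} LLD macro; the macro name is hypothetical):

```
$[?(@.mount=="{#MOUNT}")].total
$[?(@.mount=="{#MOUNT}")].used
```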
Zabbix would discover following items:
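With the example data above, the discovered items would use jsonpaths along these lines:

```
$[?(@.mount=="/")].total
$[?(@.mount=="/")].used
$[?(@.mount=="/run")].total
$[?(@.mount=="/run")].used
```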
So the objects are located with $[?(@.mount=="/run")] and $[?(@.mount=="/")], while the metrics are extracted with .total and .used.
If multiple metrics per object are monitored (the more the better), then the jsonpath processing could be optimized by caching the objects (json fragments located by the path ending with the expression component). The jsonpath processing logic would be:
1. find the expression component, splitting the path into the object location and the metric location
   a. if the object location is not cached:
      1. extract the json fragment at the object location
      2. cache the extracted fragment
   b. extract the metric from the cached fragment
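The steps above can be sketched in Python. This is a minimal illustration, not Zabbix code: the names (split_path, locate_objects, CachingExtractor) are hypothetical, and the path evaluator only handles simple, non-nested $[?(@.key=="value")].property paths.

```python
import re

def split_path(path):
    """Split a jsonpath into (object location, metric location).

    The object location is everything up to and including the last
    [?(...)] filter component; works only for non-nested filters.
    """
    m = re.search(r'\[\?\([^)]*\)\](?!.*\[\?\()', path)
    if not m:
        return None, path
    return path[:m.end()], path[m.end():]

def locate_objects(data, object_path):
    """Very simplified evaluator for paths like $[?(@.key=="value")]."""
    m = re.match(r'\$\[\?\(@\.(\w+)=="([^"]*)"\)\]$', object_path)
    if not m:
        raise ValueError("unsupported object path: " + object_path)
    key, value = m.groups()
    return [obj for obj in data if obj.get(key) == value]

class CachingExtractor:
    def __init__(self, data):
        self.data = data
        self.cache = {}  # object location -> extracted json fragments

    def query(self, path):
        obj_path, metric_path = split_path(path)
        if obj_path is None:
            raise ValueError("no filter expression component in: " + path)
        # step a: extract and cache the fragments on first use only
        if obj_path not in self.cache:
            self.cache[obj_path] = locate_objects(self.data, obj_path)
        # step b: extract the metric from the cached fragments
        prop = metric_path.lstrip('.')
        return [frag[prop] for frag in self.cache[obj_path]]

data = [{"mount": "/", "total": 100, "used": 40},
        {"mount": "/run", "total": 50, "used": 5}]
ex = CachingExtractor(data)
ex.query('$[?(@.mount=="/")].total')  # [100]; fragment extracted and cached
ex.query('$[?(@.mount=="/")].used')   # [40]; object lookup served from cache
```

The second query for the same object skips the expensive filter evaluation entirely, which is where the savings come from when many metrics share one object location.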
One concern is jsonpaths with expressions in metric locations using an absolute path, something like $.data[?(@.mount=="/")][?($.active="IUOSADGY")].total (this is just an example path, not related to the data used above). However, as extracted json fragments in Zabbix are just pointers within the original json, it should be possible to implement.
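To illustrate why the absolute-path case stays implementable, a hypothetical fragment representation that keeps a pointer to the root document alongside the located node (names and structure are illustrative, not Zabbix internals; only single-property paths are handled):

```python
class Fragment:
    """A cached json fragment that remembers the document it came from."""

    def __init__(self, root, node):
        self.root = root  # whole parsed document, for absolute ($) paths
        self.node = node  # the located sub-object, for relative (@) paths

    def resolve(self, path):
        # absolute paths start from the root, relative paths from the node
        start = self.root if path.startswith('$') else self.node
        key = path.lstrip('$@').lstrip('.')
        return start[key] if key else start
```

Because the fragment never loses sight of the root, a filter such as [?($.active=="...")] inside the metric location can still be evaluated against the full json even when the object lookup itself was served from cache.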
Another, and probably the main, concern is that a similar optimization can already be achieved manually by nesting dependent items - first discover objects containing the json fragments, then extract metrics from the discovered objects instead of from the original input data.