Some landing pages can be huge (listing websites, etc.), and downloading the whole page can be costly (500 KB or more).
If web monitoring is set to one check per minute, this adds up to roughly 20 GB of data transfer per month (500 KB × 60 × 24 × 30 ≈ 21.6 GB), which is a lot for simple monitoring.
In some cases, the only check needed to validate the web monitoring is the HTTP return code.
Could we improve web monitoring by adding the following:
- GUI – a check box specifying that only the headers of the page should be downloaded
- backend – set the CURLOPT_NOBODY option so that curl retrieves only the headers
- backend – validate the expected HTTP return code against the one just received
This would drastically cut network usage and would be useful to many people.
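For illustration, the header-only check described above amounts to issuing an HTTP HEAD request (which is what curl does when CURLOPT_NOBODY is set) and comparing the status code. A minimal sketch in Python's standard library follows; the helper name `check_http_status` is hypothetical, not part of any existing backend:

```python
import urllib.request
import urllib.error

def check_http_status(url: str, expected_code: int, timeout: float = 10.0) -> bool:
    """Fetch only the headers of `url` (HEAD request, analogous to
    curl's CURLOPT_NOBODY) and compare the HTTP status code against
    `expected_code`. No response body is downloaded."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status == expected_code
    except urllib.error.HTTPError as e:
        # Non-2xx codes raise HTTPError; the code may still be the expected one.
        return e.code == expected_code
```

Since only the status line and headers cross the wire, each check costs a few hundred bytes instead of the full page size, which is where the bandwidth saving comes from.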