The Nimsoft On Demand team is looking for your feedback related to QoS "Top X" dashboards/reports.
Over the last several years, we have all leveraged Remko's great "qos_server" probe. It was developed so that we could create custom "Top 10/X" views within SDP/UMP by updating the S_QOS_DATA table with the last sample values. The probe requires Perl to be installed on the NMS server, and it writes directly to the database.
With the advent of UMP and the List Designer/List Viewer, we now have a way to create similar lists natively in those portlets, without installing the qos_server probe.
However, in order to leverage Top X tables within custom dashboards, Remko's qos_server probe is still required.
Lastly, there is the new usage_metering probe. For it to run, a change must be made to the data_engine probe via Raw Configure: the probe can function only when update_last_value = yes is set in the data_engine configuration. What is interesting about this (and, finally, my point) is that I believe this data_engine change achieves the same objective as Remko's qos_server probe, meaning you do not need both running at the same time. Note that we have historically not enabled this flag in the data_engine probe because of potential performance issues.
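For reference, the Raw Configure change described above amounts to adding a single key to the data_engine configuration. A minimal sketch of what the resulting entry might look like in the probe's .cfg file follows; the exact section it belongs under is an assumption based on the standard Nimsoft probe configuration layout, so verify the placement in your own data_engine.cfg:

```
<setup>
   update_last_value = yes
</setup>
```

After making the change via Raw Configure, restart the data_engine probe for it to take effect, and keep the historical performance caveat above in mind before enabling it on a busy hub.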
So, the questions for you all:
1. Are you still using Remko's qos_server probe today?
2. Are you using the List Designer/List Viewer instead? Or are you using these AND the qos_server probe?
3. Are you using the data_engine update_last_value flag instead of the qos_server probe?
4. Can our experts please comment on your perception of how to best handle this moving forward?
We want to give you the best Nimsoft On Demand image possible so that the systems are ready for you out of the box. Your answers to these questions will be of great assistance.