
Taking Some Radical Actions to Improve Inventory Quality on the Symantec CMDB

  • Introduction
  • Resource Update Summary hashes history
  • Manifesto against datahash verification
  • Creating a SQL Task to clean up the hashes
  • Conclusion
  • References

Introduction

Inventory quality is very important for a lot of customers, and it is very important for Symantec as well, because invalid data found in the CMDB is often attributed to product quality (sometimes correctly, other times not).

Today we'll review a specific element of the inventory gathering chain: data insertion. We'll then take some (arguably) drastic steps to prevent data-insertion issues.

Resource Update Summary hashes history

As previously documented [1] (and the process has not changed since I wrote that article in November 2009), inventory data received from a resource is inserted into the database only if the inventory hash does not match the hash already stored in the ResourceUpdateSummary table (for this resource and InventoryClassGuid).
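If you want to check what is currently stored for a given computer, a query along these lines lists its hash per data class. This is only a sketch: it assumes the ResourceGuid, InventoryClassGuid and DataHash columns of ResourceUpdateSummary and the DataClass table used later in this article, and the GUID below is a placeholder to replace with the resource you are investigating:

select dc.Name, dc.DataTableName, rus.DataHash
  from ResourceUpdateSummary rus
  join DataClass dc
    on dc.Guid = rus.InventoryClassGuid
 where rus.ResourceGuid = 'REPLACE-WITH-A-COMPUTER-GUID'
 order by dc.Name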

This process has proved to be not so reliable [2][3][4] (and even though the documentation refers to 6.x and forwarded inventory, the problem still applies to 7.x and agent inventory). This is because the inventory data found in the various data class tables can differ from the data that generated the hash recorded in the Resource Update Summary table. As a result, agents send valid data to the server that is processed but not inserted into the DB when it should be.
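One way to spot such a mismatch is to look for computers that have a non-null hash for a data class while the corresponding data class table holds no row for them. The query below is a sketch only: Inv_OS_Operating_System is used as an example data class table (substitute any DataTableName listed in your DataClass table) and the standard _ResourceGuid column of data class tables is assumed:

select rus.ResourceGuid, dc.Name, rus.DataHash
  from ResourceUpdateSummary rus
  join DataClass dc
    on dc.Guid = rus.InventoryClassGuid
 where dc.DataTableName = 'Inv_OS_Operating_System' -- example table, substitute your own
   and rus.DataHash is not null
   and not exists (
         select 1
           from Inv_OS_Operating_System i
          where i._ResourceGuid = rus.ResourceGuid
       )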

Manifesto against datahash verification

Warning! This view reflects the opinion of the author and the author only.

With inventory data being sent only when modified (in the case of a standard inventory) or in full (in the case of a full inventory), one can ask why a datahash verification process is used at all.

After all, if the agent finds that its latest inventory differs from the previous one and sends it to the server, it would make sense to commit the received data (even if the server thinks - many a time wrongly - that the data is already stored in the DB).

So I argue here that it is better to insert the data, and take the risk of re-inserting data that is already correctly stored in the DB, than to take the risk of not inserting valid data. And if the SQL Server takes a hit because of this, we can always re-work the inventory task schedules to compensate.

Creating a SQL task to clean up the hashes

It is very simple to clean up the Inventory Solution data classes so that the datahashes (in ResourceUpdateSummary) are cleared before you start a new day. We start with this SQL query (note that we do not include the basic inventory data classes, as those events are sent very often):

update ResourceUpdateSummary
   set DataHash = null
 where InventoryClassGuid in (
			 select d.Guid --, d.name, d.DataTableName 
			   from DataClass d
			  where DataTableName like 'Inv_SW%'
				 or DataTableName like 'Inv_HW%'
				 or DataTableName like 'Inv_UG%'
				 or DataTableName like 'Inv_OS%'
		)
   and datahash is not null
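If you want to know how many hashes the update will clear before running it, the same filter can be used in a simple count (same tables and conditions as above, just as a read-only preview):

select count(*) as HashesToClear
  from ResourceUpdateSummary
 where InventoryClassGuid in (
         select d.Guid
           from DataClass d
          where DataTableName like 'Inv_SW%'
             or DataTableName like 'Inv_HW%'
             or DataTableName like 'Inv_UG%'
             or DataTableName like 'Inv_OS%'
       )
   and DataHash is not null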

Then we create a new SQL task to run against the CMDB:

[Screenshot: Connect_Cleanup_RUS_Hashs.png]

And we schedule this task to run every day (around 0500 - not on the hour).

[Screenshot: RUS_CleanUp_Sched.png]

Because the SQL task records the number of affected rows, we can even track how many inventory data class entries were updated each day - a nice side effect of this clean-up process:

[Screenshot: Cleanup_Exec_Result.png]
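If you prefer to keep this history in the CMDB rather than in the task run results, the task body can store the affected row count in a small custom table. This is a sketch only: CleanupHistory is a hypothetical table name, created once before the insert is appended to the task:

-- hypothetical history table, created once
create table dbo.CleanupHistory (
    RunTime       datetime not null default (getdate()),
    HashesCleared int      not null
)

-- appended to the clean-up task, immediately after the update statement,
-- so that @@ROWCOUNT still holds the number of hashes cleared
insert into dbo.CleanupHistory (HashesCleared)
values (@@ROWCOUNT)

You can then report on CleanupHistory over time instead of going back through each task execution result.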

Conclusion

Inventory data may fail to be inserted into the database for the wrong reasons. With the steps above in place, you can now ensure that inventory data received by the server is inserted unless it has already been received since your last cleanup (so duplicated data will not impact database use at the insertion point).

You can also see how many data classes are updated within the scheduled interval, to track how much data is really coming back (and whether it has a real impact on your database usage).

References

[1] Spotlight on the Notification Server ResourceUpdateSummary Table

[2] Data is not synchronized on the reporting server even after a full refresh cycle from the forwarding server

[3] KNOWN ISSUE: Forwarded data missing for some data classes

[4] Hash checker command line options

