Google-ify ITM with Splunk and SOAP


Splunk is a popular product designed to add a simple search engine interface to the vast amounts of data generated in enterprise environments. Splunk can pull data in from scripts, files, directories, network ports, WMI and many other sources.

For a quick overview, take the product tour.

In this article, we are interested in the script input interface as a means of pulling ITM SOAP data into Splunk. This will allow us to search the vast amount of ITM information available in a simple and meaningful manner.

Splunk is available under both a free and an enterprise license. The free version is more than sufficient for this article; however, the free license for version 4 (which we are using) will not be available until Q3 2009. The product is a single download that installs with a 60-day enterprise license. If you find Splunk useful, keep an eye on their website and apply the free license when it is released.


splunk home

  • Download the scripts for this article: splunk_soap.tar
  • Untar into <SPLUNK_INSTALL>/bin/scripts.
  • Edit splunk_soap.bat (or splunk_soap.sh on Unix) and change the token <CHANGE_TO_SPLUNK_INSTALL> to your Splunk install directory, e.g. C:\Splunk.
  • Edit the script and change the following values for your environment, ensuring the enclosing quotes remain:
    ITM_USER = "sysadmin"
    ITM_PWD = "orbdata"
    ITM_SOAP_PORT = "1920"
    ITM_SOAP_HOST = ""
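
As a quick sanity check of these settings, here is a sketch (not the shipped code; the host name is a placeholder you would replace) of how they combine into the ITM SOAP endpoint URL used later in this article:

```python
# Illustrative only: mirrors the config names above and builds the
# SOAP endpoint URL from them.
ITM_USER = "sysadmin"
ITM_PWD = "orbdata"
ITM_SOAP_PORT = "1920"
ITM_SOAP_HOST = "hubtems.example.com"  # assumption: your hub TEMS host

def soap_url(host, port):
    # ITM exposes its SOAP service at http://<host>:<port>///cms/soap
    # (the triple slash is correct for ITM).
    return "http://%s:%s///cms/soap" % (host, port)

print(soap_url(ITM_SOAP_HOST, ITM_SOAP_PORT))
# http://hubtems.example.com:1920///cms/soap
```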


On the Splunk home page:

  • Select Manager
  • Select ‘System Configurations -> Data inputs’
  • Select ‘Scripts’ and press <New>
  • In the ‘Command’ field, enter the full path to <SPLUNK_INSTALL>/bin/scripts/splunk_soap.bat (Windows) or splunk_soap.sh (Unix).
    Note this is also where the poll interval for data collection is specified.
  • Press <Save>
    Your screen should look like this (note that the script has two entries):

splunk data input


Now we are ready to search our SOAP data.

Splunk ships with (and uses) Python so we don’t need to install it separately to use and test our script manually.

To test:

You will need to set the Python path (change C:\Splunk to your environment, and the win32 directory if running on Unix):

C:\Splunk\bin\scripts>set PYTHONPATH=C:\Splunk\bin;C:\Splunk\Python-2.6\Lib\site-packages

Then run the script:

C:\Splunk\bin\scripts>C:\Splunk\bin\python.exe splunk_soap.py

message_text=new log file created.
message_text=threshold xml override document object name not defined


The <SPLUNK_INSTALL>/bin/scripts/payloads directory contains a selection of XML files containing SOAP queries used by our script.

Splunk will call our script on a regular basis (the interval is configurable in the ‘Data Inputs’ section we just used above), which will in turn run each SOAP query against ITM and print the results for Splunk to pick up.
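
The collection loop just described can be sketched as follows. This is a hypothetical reconstruction, not the shipped splunk_soap script: the host, directory layout and wrapping format are assumptions based on the behaviour described in this article.

```python
# Sketch of the collection loop: wrap each payload file in a CT_Get
# envelope with credentials, POST it to the ITM SOAP endpoint, and
# print the results in lowercase for Splunk to index.
import glob
import os

try:
    from urllib.request import Request, urlopen  # Python 3
except ImportError:
    from urllib2 import Request, urlopen  # Python 2.6 (bundled with Splunk 4)

ITM_USER, ITM_PWD = "sysadmin", "orbdata"
SOAP_URL = "http://localhost:1920///cms/soap"  # assumption: hub TEMS host
PAYLOAD_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "payloads")

def wrap(payload):
    # The payload files contain only an <object> or <table> tag; the
    # script supplies the CT_Get envelope and the credentials.
    return ("<CT_Get><userid>%s</userid><password>%s</password>%s</CT_Get>"
            % (ITM_USER, ITM_PWD, payload))

def collect():
    for name in glob.glob(os.path.join(PAYLOAD_DIR, "*.xml")):
        body = wrap(open(name).read())
        req = Request(SOAP_URL, body.encode("utf-8"),
                      {"Content-Type": "text/xml"})
        response = urlopen(req).read()
        # Lowercase everything so case-sensitive searches don't miss data.
        print(response.decode("utf-8").lower())

# collect()  # invoked once per poll interval when run by Splunk
```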

Currently our SOAP queries are:

  • Agent operations log
  • Agent details
  • NT Process information
  • Current situation status

Open up the search screen in the Splunk GUI and try some search criteria. Note that in order to limit the search to our script source, end your query with source=script.

For example to search for ‘offline’:

offline source=script

splunk search offline

Here are some more samples to get you started:

  • Offline Linux agents: lz *offline source=script
  • Online Windows agents: nt *online source=script
  • Where a system is used: <hostname> source=script
  • Where a process is used: *<process_name>* source=script
  • Agent connecting to TEMS: connecting originnode source=script
  • Warehouse agent info: product=hd source=script

splunk search java

Points to note:

  • Splunk searches are case sensitive; our script outputs everything in lowercase to avoid missing data.
  • You can use wildcards and multiple search keywords, e.g. *host* <situation_name>.
  • The longer Splunk runs the script, the more data will be available.
  • Consider increasing the polling interval in the Data Inputs screen once you are happy that data is being collected, particularly if there are many SOAP queries or a lot of data.
  • The script creates a log file, splunk_soap.log, in the same directory.

Getting more data

The payloads directory contains sample SOAP XML files. A standard SOAP query looks like this:
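
(Reconstructed here from the tag names described below; ManagedSystem is just an illustrative object name.)

```xml
<CT_Get>
  <userid>sysadmin</userid>
  <password>orbdata</password>
  <object>ManagedSystem</object>
</CT_Get>
```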


For convenience, our Python script adds the <CT_Get>, <userid> and <password> tags, which is why all the examples in the payloads directory contain only an <object> or <table> tag. Ensure you follow this convention with your own payloads.

More SOAP examples can be found here.

Example: adding windows systems disk data

  • Find the attributes file for the agent type required under <ITM_HOME>\CMS\ATTRLIB (Windows) or <ITM_HOME>/tables/<TEMS_NAME>/ATTRLIB (Linux/Unix). Windows agent data types are defined in knt.atr, i.e. k<product code>.atr.
  • Look for lines starting with name XXX.ZZZ, where XXX is an object name that can be queried via SOAP.
    The first Windows disk data entry is name NT_Logical_Disk.Server_Name, so the object name is NT_Logical_Disk.
  • Test the data by using the ITM SOAP web page:  http://<HUB_TEMS>:1920///cms/soap
    The payload section will look like this:

splunk soap test

  • Recall that our Python script will add the CT_Get sections for us (so we don’t have to add them every time or keep the connection credentials in every file). Therefore, create an XML file in the payloads directory containing only the <object>XX</object> contents.
  • Our Windows payload file contains only the <object>NT_Logical_Disk</object> tag.
  • That’s it. The next time Splunk runs our script, the new XML file will be processed and the data collected. Check the splunk_soap.log file for errors, then try some searches on the new data; e.g. free source=script:
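
The attribute-file scan in the first step above lends itself to a short script. This sketch (illustrative, not part of the download) lists the SOAP-queryable object names found in a .atr file:

```python
# Scan an agent attribute (.atr) file for "name XXX.ZZZ" lines and
# collect the distinct object names (the XXX part), as described above.
import re

def object_names(atr_path):
    names = set()
    for line in open(atr_path):
        m = re.match(r"name\s+(\w+)\.\w+", line.strip())
        if m:
            names.add(m.group(1))
    return sorted(names)

# e.g. object_names(r"C:\IBM\ITM\CMS\ATTRLIB\knt.atr")
# would include 'NT_Logical_Disk' among the results
```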

splunk disk

Wrap up

In this article we have explored leveraging Splunk’s search capabilities with ITM SOAP data.

The more you play with Splunk and SOAP the more useful the application of these two technologies will become.

Hopefully you will learn more about ITM by poking around under the covers and via Splunk be able to dig around your Enterprise data in a manner not normally accessible.

Splunk is a large product and we have only scratched the surface. Be sure to refer to the online documentation for more examples and a deeper understanding.
