I’ve just read an excellent book by Robert Harris called The Fear Index. In the book a computer scientist called Dr Alex Hoffmann has developed a revolutionary form of artificial intelligence that tracks human emotions via internet news feeds, enabling it to predict movements in the financial markets with uncanny accuracy. In the story his hedge fund, based in Geneva, makes billions. The book is listed as a work of fiction, but the technology in it is not: predictive analytics software that analyses how one metric affects and relates to another is available today and is being used in many large organisations.
New Year Predictions
As one year rolls into the next it is customary for analysts and journalists alike to make predictions for the year ahead, and with this in mind I will forecast that this will be the year this technology comes of age. Indeed, the website of Netuitive, the current leading supplier of predictive analytics software, states that 8 of the world’s 10 largest banks already use its software. However, I expect the big difference this year to come when IBM enters the fray in Q1.
So what does it do, and why would you want it? Essentially, predictive analytics uses existing tools, such as Tivoli Monitoring or HP BAC, to collect raw numeric performance data for each key performance indicator (KPI). The software then learns the behaviour patterns of each individual metric over a given period of time, and also self-learns the interdependencies between each “vital sign” so that it understands how each KPI influences the others. This means it knows what normal performance looks like, and it knows that when one correlated metric starts diverging from its normal behaviour (and from the other metrics), a problem could be about to occur.
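The “learn normal, then flag the abnormal” idea can be sketched very simply. This is not Netuitive’s actual algorithm, just a minimal illustration of the principle using one KPI: learn a baseline from a training window, then flag any reading that falls outside the learned band instead of relying on a hand-set threshold.

```python
import statistics

def learn_baseline(samples):
    """Learn what 'normal' looks like for one KPI: its mean and spread."""
    return statistics.fmean(samples), statistics.pstdev(samples)

def is_anomalous(value, baseline, tolerance=3.0):
    """Flag a reading outside the learned band (mean +/- tolerance * stdev)."""
    mean, stdev = baseline
    return abs(value - mean) > tolerance * stdev

# Hypothetical training window: CPU utilisation (%) during normal operation.
cpu_history = [41, 44, 39, 46, 42, 45, 40, 43]
baseline = learn_baseline(cpu_history)

is_anomalous(42, baseline)   # within the learned band -> False
is_anomalous(95, baseline)   # far outside it -> True
```

No threshold was ever configured here; the band comes entirely from the observed data, which is the point the paragraph above makes about getting more value from your existing collection tools.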
This doesn’t mean that you should throw out the other tools you have invested in; indeed, because predictive analytics software uses mathematical formulae for its analysis, it works best when the data comes from as many different sources as possible. It does mean, however, that you will get better value from your existing tools: you will no longer need to set thresholds for performance alerts, and the events you do get will be fewer and more accurate.
So how does it work?
In this example, performance data such as CPU utilisation, SQL statement duration, memory and transaction speed is being collected to analyse a system’s performance. You can see in the graph that these statistics have a direct relationship to each other, and it is these relationships that predictive analytics uses to define “standard behaviour”. You can also see that, at the end of the graph, when the SQL statement duration metric starts diverging from the other KPIs, an alert is raised.
Additionally, it learns the normal behaviour of each individual metric. This is useful because, throughout a week (or even a month), there will be times that are busier than others. Take, for example, a credit card company’s payment authorisation system: it would be expected to be busier on a Saturday than on a Monday. Predictive analytics learns this cycle and will only raise an alert if performance is outside the expected range. It is also important to note that it is not just high spikes it looks for: low usage of a trading floor system at a busy time is just as much of a concern as an over-used and slow system.
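That weekly cycle can be sketched as well. In this illustration (again with made-up numbers, and a deliberately crude per-day model) the software learns a separate “normal” band for each day of the week, so a busy Saturday raises no alert while the same load on a Monday does, and an unusually quiet Saturday is flagged just as an overloaded one would be:

```python
import statistics
from collections import defaultdict

def learn_weekly_profile(history):
    """history: (day_of_week, load) samples. Learns a normal band per day,
    so Saturday gets its own baseline rather than one global threshold."""
    by_day = defaultdict(list)
    for day, load in history:
        by_day[day].append(load)
    return {day: (statistics.fmean(vals), statistics.pstdev(vals))
            for day, vals in by_day.items()}

def check(day, load, profile, tolerance=3.0):
    """Alert in both directions: a quiet system at a busy time is as
    suspicious as an over-used and slow one."""
    mean, stdev = profile[day]
    if load > mean + tolerance * stdev:
        return "alert: unusually high"
    if load < mean - tolerance * stdev:
        return "alert: unusually low"
    return "ok"

# Hypothetical authorisations per minute from past weeks.
history = [("Sat", 950), ("Sat", 980), ("Sat", 1010), ("Sat", 960),
           ("Mon", 300), ("Mon", 320), ("Mon", 310), ("Mon", 290)]
profile = learn_weekly_profile(history)

check("Sat", 970, profile)   # busy Saturday is normal -> "ok"
check("Mon", 970, profile)   # the same load on a Monday -> high alert
check("Sat", 300, profile)   # quiet Saturday -> low alert
```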
Time will tell if I’m right, but while we wait to see, why don’t you find out for yourself?