This is an excerpt from Mike Biere's "Business Intelligence for the Enterprise."
"If it weren't for these end users…!"
There are two distinct ways of defining BI users: by skill and by processing requirements. In Chapter 5, we discussed some of the user segmentation criteria to ponder. Now we will spend more time defining users by specific BI tools and processing requirements.
Have you ever had an irrational urge to buy one of those weird gadgets you see on late night television? Sometimes, users will react that way when they see a BI tool. "We're not quite sure what we'll do with it, but we sure want one!" Part of a vendor's modus operandi is to convince everyone in the audience that they not only need the tool being shown, but they can use it to cure all their processing ills.
All business people have to deal with data in some form and with computing technologies to get their jobs done. Some can work with tools such as spreadsheets, and some are quite "dangerous." The variation in skills and knowledge among users can be staggering. People tend to learn from others, and they learn bad practices along with the good ones.
There are many shortcuts and tricks involved with tools and systems. In BI, you'll find people who can do things with the very tool you are using that you never knew were possible. We can all be "out-geeked." The BI tools of today offer more and more functionality, while printed manuals appear to be going the way of the dinosaurs. How are you supposed to find and use all this advanced functionality? It simply takes time and a need to go deep.
Those of us who do not have the time or technical background to play with a tool, read all the help, and check out the web site are often locked into performing a few simple operations with it, or we rely on others. If this is not going to keep you awake at night or diminish your self-worth, so what? The more important things to consider are whether your BI tools can be used to make your job easier and more efficient or change the way the business operates.
Will an executive user become proficient with a BI tool? Will a mid-tier decision maker spend hours learning how to tweak a report to get a different result? Will our company use one-tenth of the functions we saw in that neat new BI tool demonstrated last week? Will I spend an additional five hours trying to get that bar chart to look the way my users want it to? The only one of these questions that deserves a "yes" is the last one, and that is debatable. If the interpretation of the chart is incorrect due to some limitation in its formatting capability out of the box, then extra time may be warranted. Otherwise, we have taken our eyes off the "BI prize" and are tied up in window dressing exercises.
Here is another real-world example. A customer had a BI query and reporting tool installed that had been spreading into several business areas, where the users were pleased with its capabilities. Most of the analysis had been developed and delivered via IT, since the users' needs were far beyond what most non-technical individuals could produce on their own. This particular tool vendor had fallen on tough times, and the customer was getting nervous about its long-term viability.
Major industry pundits were recommending that customers switch from the old tool to one of several new ones. A new vendor was brought in to perform a proof of concept and replace the old tool if they could. There was a series of charts that were very difficult to produce in the new tool and had been abominable to produce with the old one as well.
It turned out that the user's output made no logical business sense—in any tool! The IT support people were constantly railing about how silly this entire exercise was, but they had been forced to produce this rather useless output. Hundreds of hours had been invested getting the output together the first time, despite the fact it had no real value anyway. No one would step forward and tell the users, "Look, if you want this output to look like this, do it yourself. It makes no business sense, and it's way out of the design specifications for this product anyway!"
Had our example application delivered significant and difference-making results, it would have made sense either to stick with the original tool and beat the truth out of the data, or to perform the same work with a new tool because of the business value. In light of the rather senseless exercises involved, the original tool delivered no business benefit, and a very poor first application was selected for the second.
Were there other applications that may have provided a higher ROI? There was no way to know because we couldn't get past creating these worthless exercises in output. The final kicker was that the budget was cut in mid-project, and the original tool was kept. New projects were curtailed due to the concern over the viability of the existing vendor, but the same meaningless application is probably spewing out worthless output to this day.