Reframing Business Intelligence: Embracing change in our own discipline

I tend to see Business Intelligence as a field that not only delivers analytics but also generates change. A successful BI solution has the power to influence a company’s culture: the more fact-based the decision making, the less room there is for personal bias. This is good, as decisions based on actual data are more likely to help a business increase profits or reduce costs.


On the other hand, many practitioners of Business Intelligence have known for a long time that the success rate of BI/DW projects is not quite what it should be. To be sure we are on the same page, let me define what I consider success: a pervasive implementation, one in which the delivered tools are widely used by the intended target audience.

How many times have you seen technically perfect implementations fall into disuse? The reasons are varied, but in a good number of cases we can trace them to the human side of the equation. Taking the shape of corporate politics, misunderstandings, or quite simply resistance to change, people at every level of the organization can jeopardize a beautiful ETL/cube/report deployment.

To top this off, BI projects have traditionally been kicked off at a level of the organization that is neither senior management nor front-line data analysts. Sometimes this approach can correctly align us with the people who manage budgets, but it may not necessarily translate into accurate implementations. Not surprisingly, when deciding what technology to use, purchasers rely heavily on tools that have the potential to woo the highest levels of the organization with fancy visualizations.

There is nothing wrong with eye candy, except when it doesn’t serve the purpose of generating real business insight.

It appears to me we must also embrace change ourselves, so we can better align with real business needs.

What we need is a solution that can help us stay close to the actual subject matter experts from the very first stages of BI development. With this, not only do we ensure accuracy and relevance, but we also gain an important asset: internal sponsorship. Gartner has said that this factor is one of the most important elements contributing to the success of a Business Intelligence project.

Clearly, the end-to-end framework of the xVelocity engine and DAX comes in handy when bridging the gap between business experts and IT folks. Business Intelligence success can occur more naturally when it happens through organic growth based on concrete, small deliverables that come to the aid of information workers. These are people whose main purpose is to produce business insight, yet without proper guidance they can spend many hours per day copying and pasting instead of doing what they are supposed to do: analyze data. With tabular technology, we now have a way to directly assist them and grow with them. It is not unlikely that an engagement that starts as a one-off, report-specific project can turn into a more involved data mart/ETL development initiative.

As you know, I am fascinated with PowerPivot (well, ok, and Tabular technology). I tend to think that using PowerPivot for active/passive prototyping can be a fantastic way to achieve this precise goal. By including subject matter experts’ PowerPivot data models in the corporate BI infrastructure (and giving credit where credit is due), we ensure accuracy along with a deliverable that is well understood and has been championed in collaboration with key users from all of the business areas involved (this generally translates into sponsorship).

Notice there is a change of focus here: instead of a “let’s fix the world” approach to data warehousing, which can take months or years to implement (by which time business needs have probably changed), we are focusing on smaller, targeted successes. These in turn can naturally grow into a more comprehensive solution, all under the scrutiny of subject matter experts who naturally strive for the highest level of data quality and measure accuracy they can get.

We must directly target the data analysts from the very first moment of the Business Intelligence engagement. And what better tool to do that than, yes, Microsoft Excel (with PowerPivot, of course). This is a big contrast with the way things have traditionally been done, in which it is common to engage only at higher levels, defining an overarching structure, and after months of ETL work deliver SSRS reports that are frequently received with comments like, “Hmmm, that’s not exactly what I was looking for.” A round of expensive refactoring must start at that point.
So how can we engage data analysts early on, yet give decision makers a chance to try out the solution without incurring a big financial risk? Yes, you heard it before: we must prototype. PowerPivot can be an excellent prototyping tool.

Prototyping, as part of a pilot project, can be a great way to sample a solution prior to a full commitment to the endeavor. Even in the case of failure, much will be learned regarding the actual viability of the real project. In addition, when using PowerPivot to prototype, it is possible to generate a tabular model that is reusable once the actual project has been approved (PowerPivot data models are extensible).
This is not to say traditional data warehousing should be shelved. What I advocate is a mixed approach: a top-down strategy for data warehousing married to a bottom-up data exploration process can be a powerful combination.

At my company, Mariner, we have embraced this philosophy and have formal programs for pilot projects and prototyping that can help an organization increase its chances of success in Business Intelligence. Please follow this link for more detailed info.