Enabling Tableau to read PowerPivot model updates
July 19, 2012
One of the interesting opportunities that opens up when using PowerPivot in conjunction with SharePoint 2010 is leveraging the cube-like interface of its underlying data model to extend an organization's analytical arsenal.
Tableau is a great tool that can extend the reach of PowerPivot deployments due to its advanced visualization capabilities.
It's typical for data analysts to go through many iterations of a PowerPivot report (model) before they consider it good enough to share with others. At this point, though, Tableau 7.0 does not by default grab the latest and greatest version of the PowerPivot analytical database, the one with the most current data model and DAX calculations.
Let's walk through a typical scenario. A data analyst creates a PowerPivot report and publishes it to SharePoint. He then wants to create additional visual analysis in Tableau against the data model that was loaded into PowerPivot.
A typical PowerPivot report URL location would look like this:
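For example (the site and workbook names here are hypothetical placeholders, not an actual deployment):

```
http://sharepoint/PowerPivot Gallery/SalesReport.xlsx
```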
In Tableau, he can then use this URL to connect to the cube-like interface of the underlying PowerPivot data model. To do this, he uses the Microsoft Analysis Server option and, instead of typing a server name, he simply uses the URL location of the PowerPivot workbook:
To be clear, you are not connecting to the Excel pivot table, but to the underlying PowerPivot model. This normally contains more data and metadata than the actual Excel report rendered in the PowerPivot gallery. At this point, he is able to create interesting Tableau analytics using PowerPivot as the source:
Now he wants to add additional DAX measures to extend the analysis. He adds the following expression to PowerPivot:
% of Total Sales:=[Sum of SalesAmount] /
CALCULATE([Sum of SalesAmount], ALL(DimSalesTerritory) )
He then republishes to SharePoint. When someone interacts with the newly published PowerPivot report, whether by selecting a new filter, opening the workbook through a client via the SharePoint URL, or simply by scheduling a data refresh, an updated version of the underlying PowerPivot model is stored in the dedicated instance of Analysis Services that SharePoint uses.
At this point, if you imagine being the data analyst in question, you may attempt to re-open the Tableau workbook, but you will get an error message:
And the report is wiped out. Hmm, not good! Interestingly, other clients such as SSRS, Report Builder, and Excel can get the latest version of the PowerPivot model without an issue. So how can we enable Tableau to pick up PowerPivot model changes seamlessly?
The solution is fairly simple, though not very intuitive. Prior to making any changes to the PowerPivot model, we must save the Tableau connection:
This process creates an actual file that we can then edit in Notepad. We must remove the database name from the connection file (highlighted below) and re-save it:
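A saved Tableau connection (.tds) file is plain XML. The fragment below is an illustrative sketch of its general shape rather than a copy of any specific file; the server URL and database name shown are placeholders. The database name to remove appears as an attribute on the connection element (shown here as dbname):

```xml
<!-- illustrative .tds sketch; server and dbname values are placeholders -->
<datasource>
  <connection class='msolap'
              server='http://sharepoint/PowerPivot Gallery/SalesReport.xlsx'
              dbname='SalesReport_GUID' /> <!-- delete this dbname attribute -->
</datasource>
```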
This is necessary because Tableau creates a connection to a specific version of the PowerPivot model, one that is no longer valid after the update. By removing the database name, SharePoint can point Tableau to the newest PowerPivot model available, which enables the report to work without errors and with all the changes to the underlying model. After implementing the solution, the data analyst can use this saved connection to create Tableau workbooks whose metadata automatically updates to match that of the newest PowerPivot model.
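If you end up repeating this edit after every model change, the attribute removal can be scripted. The sketch below is a hypothetical helper, not part of any Tableau tooling; it assumes the saved connection is a .tds file whose connection element carries the stale database name in a dbname attribute, which may differ in your environment.

```python
# strip_dbname.py - hypothetical helper that clears the stale database name
# from a saved Tableau connection (.tds) file so SharePoint can route the
# connection to the newest PowerPivot model.
import xml.etree.ElementTree as ET


def strip_database_name(tds_path: str) -> bool:
    """Remove the dbname attribute from every connection element.

    Returns True if the file was modified, False if nothing needed changing.
    """
    tree = ET.parse(tds_path)
    changed = False
    # .tds files are XML; a connection element may appear at any depth.
    for conn in tree.iter("connection"):
        if "dbname" in conn.attrib:
            del conn.attrib["dbname"]
            changed = True
    if changed:
        tree.write(tds_path, encoding="utf-8", xml_declaration=True)
    return changed


if __name__ == "__main__":
    import sys

    for path in sys.argv[1:]:
        print(path, "updated" if strip_database_name(path) else "unchanged")
```

Running it over a folder of saved connections before reopening the workbooks would apply the same fix described above in one step.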
In the scenario discussed, he would see the new DAX measure in the Tableau measures pane and use it as part of the visualization:
It is also possible to save these connection files to Tableau Server and share them with others, so newly created views will benefit from automatic syncing with the most up-to-date hosted PowerPivot models.
Keep in mind that one of the great benefits of connecting to PowerPivot workbooks hosted in SharePoint is that you get additional features like automatic data refresh and security, along with the ability to centralize one model that can serve many reports across multiple platforms and tools.
In my opinion, this technique allows data analysts to benefit from the best of both worlds when conducting rapid-iteration self-service BI.