Dangers in Dashboard Design


I recently had the great pleasure of working on a dashboard project using PerformancePoint in SharePoint 2013 with a great team of individuals.  We had to overcome several obstacles during the process and I thought it would be a good idea to document how we accomplished some of our goals with the hope that I can spare someone else (or my future self) from having to go through the same pain.  Our dashboard is currently in closed beta and we will be releasing it to the wild in the next couple of weeks.

Dynamic Active Directory Security Without Kerberos 

The biggest challenge we faced was how to implement dynamic active directory security without using Kerberos.  I had applied dynamic security to several Analysis Services databases in the past; but, had never followed that implementation through to PerformancePoint dashboards.  This challenge is more difficult in that respect because within a dashboard you will typically have more sources of data than just an analysis services database.  We had SSRS reports that ran against our ODS and our relational data warehouse.  You can read more about how we achieved this in my recent blog post, PerformancePoint 2013 Dashboards: Applying Dynamic Security using CustomData and EffectiveUserName.

Our solution required three roles.  Two were for the managers who would be restricted to seeing only their own employees: one for the PerformancePoint objects, which uses the CustomData function in its security, and another for the SSRS MDX-based reports and Excel Services reports, which uses the UserName function.  Finally, we needed a separate unrestricted role for the managers and district leaders who could see all of the data.  This allowed us to keep our bridge table relatively small; otherwise, every person who needed to see everyone would have required about 2,500 rows in the bridge table.

During development, I mistakenly thought we could create the entire dashboard for the restricted users, save a copy renamed to unrestricted, change its data source to use the unrestricted role, and be on our way.  I was woefully mistaken.  PerformancePoint apparently tracks data sources by an internal GUID rather than by name.  Worse yet, you can't change the data source of existing analytic charts, grids, and scorecards, so we couldn't copy the objects, rename them, and point them at a different data source.  We actually had to use two separate site collections for the restricted and unrestricted dashboards!

I Know Who You Are and What You Did Last Quarter

Once we had the dynamic security in place, we wanted the dashboard to automatically know who you are.  We accomplished this by adding an attribute to our person dimension to hold the Active Directory account name.  I use the word person in its most generic sense: a person can be a vendor, customer, employee, manager, or salesperson.  I created an MDX filter in PerformancePoint and used the following MDX code to achieve this:

STRTOSET("[People].[Active Directory Login].&[" + CUSTOMDATA() + "]")
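Since the CustomData value is concatenated straight into the member key, a login containing a right bracket would need MDX bracket escaping (a "]" inside a bracketed key is doubled).  Here is a minimal Python sketch of how the expression is assembled; the helper name is hypothetical and not part of the dashboard itself:

```python
def build_login_filter(login: str) -> str:
    """Assemble the STRTOSET filter above for an AD login.

    MDX escapes a literal ']' inside a bracketed key by doubling it,
    so we apply that before concatenating.
    """
    escaped = login.replace("]", "]]")
    return 'STRTOSET("[People].[Active Directory Login].&[' + escaped + ']")'

print(build_login_filter("CONTOSO\\jdoe"))
# STRTOSET("[People].[Active Directory Login].&[CONTOSO\jdoe]")
```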

I then added this attribute to any analytic charts, grids, or scorecards in the background quadrant of the design canvas so that the report could be filtered by it.  I connected the filter to the report as one typically would and later hid the filter using the edit web part interface within the SharePoint 2013 site.

Save the Trees

My good buddy Michael Simon used some fancy MDX to set the trailing six quarters for an analytic bar chart, only to discover that we lost the ability to right-click and get the decomposition tree and all the other neat PerformancePoint features.  He discovered a clever workaround using dynamic named sets, and we had our right-clicks back.

Yesterday… Not Such an Easy Game to Play

Another challenge we faced is a very common one: how to get the dashboard to automatically present data as of yesterday.  Typically this wouldn't be a big deal, but in our case we use a reporting calendar that doesn't include Saturdays, Sundays, or holidays.  Any sales that occur on those days roll forward to the next business day.  If you don't have such a dilemma, you can use either of these two snippets to accomplish the feat:

1. STRTOSET("[Date].[Calendar].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") + "]")

2. STRTOSET("[Date].[Calendar].[Date].&[" + VBAMDX.Format(VBAMDX.Now(), "yyyyMMdd") + "].Lag(1)")
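For reference, the date arithmetic behind both snippets can be checked outside the cube.  This Python sketch (standard library only, not part of the solution) shows the member key that VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") resolves to:

```python
from datetime import date, timedelta

def yesterday_member_key(today: date) -> str:
    """Build the [Date].[Calendar].[Date] member key for yesterday,
    mirroring VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd")."""
    yesterday = today - timedelta(days=1)
    return "[Date].[Calendar].[Date].&[{:%Y%m%d}]".format(yesterday)

print(yesterday_member_key(date(2013, 7, 1)))
# [Date].[Calendar].[Date].&[20130630]
```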

In our case, this wouldn’t work.  If the user wanted to look at the dashboard on a weekend or a holiday they would get an error. 

I initially tried to set the default member of the date dimension to yesterday using MDX.  The problem was that once I set the default member for the date dimension in the cube, I could no longer put the date dimension in the report filter area of an Excel pivot table; the filter would no longer work and would just show yesterday's data.  See this link for more information about the issue:  Default Member Trap (be careful of what you filter)

My next strategy was to add an attribute called Last Business Date to our Gregorian date dimension and update the corresponding relational table with this information.  I then attempted to piece that information together to get the correct reporting date.  We have two date dimensions in our cube: one is a regular calendar date dimension, and the other is a fiscal reporting date dimension that doesn't include weekends or holidays.  Our fact table has keys for both.  You can see my post asking for help from the BIDN community here:

MDX Help Please – The + function expects a tuple set expression for the 1 argument. A string or numeric expression was used

Before I received the kind reply from mreedsqlbi, I found another way to accomplish what I needed.  This post helped me find another solution:  Member Does Not Exist in MDX Calculations

I created four dynamic named sets in the cube, named OneDayAgo, TwoDaysAgo, ThreeDaysAgo, and FourDaysAgo.

Here is the code for OneDayAgo:

iif(iserror(STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") + "]")), {}, STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") + "]"))

For each of the other named sets, I just changed the -1 to -2, -3, and -4 respectively.

The code for my MDX filter used the dynamic named sets to figure out if the member exists:

IIF([OneDayAgo].Count > 0
    ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") + "]")
    ,IIF([TwoDaysAgo].Count > 0
        ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-2, "yyyyMMdd") + "]")
        ,IIF([ThreeDaysAgo].Count > 0
            ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-3, "yyyyMMdd") + "]")
            ,IIF([FourDaysAgo].Count > 0
                ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-4, "yyyyMMdd") + "]")
                ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-5, "yyyyMMdd") + "]")))))
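The nested IIFs amount to a simple fallback: walk back from yesterday one day at a time, take the first member that actually exists in the reporting calendar, and give up at five days back.  A Python sketch of that logic, with hypothetical names, just to make the control flow explicit:

```python
from datetime import date, timedelta

def latest_existing_key(existing_keys: set, today: date) -> str:
    """Mirror the nested IIFs: check Now()-1 through Now()-4 against the
    reporting date dimension, falling through to Now()-5 unconditionally."""
    for lag in range(1, 5):
        key = "{:%Y%m%d}".format(today - timedelta(days=lag))
        if key in existing_keys:
            return key
    return "{:%Y%m%d}".format(today - timedelta(days=5))

# Monday, July 8th 2013: the weekend is absent from the reporting calendar,
# so the filter falls back to Friday the 5th.
print(latest_existing_key({"20130703", "20130704", "20130705"}, date(2013, 7, 8)))
# 20130705
```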


This works better because it also works in Dev, which doesn't always have data for yesterday.  Eventually I settled on the following TOPCOUNT formula filter, which simply picks the most recent date that actually has data:

TOPCOUNT(
    ORDER(
        NONEMPTY(
            {[Date - Reporting].[Date].Members - [Date - Reporting].[Date].[All]}
            ,[Measures].[Sales Amt])
        ,[Date - Reporting].[Date].CurrentMember.MEMBER_KEY
        ,DESC)
    ,1)

At one point during beta testing, I was asked to present the user with a date dropdown that defaults to yesterday.  This was later scrapped because it required a server-wide change within the Central Administration application settings for PerformancePoint, in which we had to set user-selected filters to expire every day.

For anyone who is interested, I got that to work by removing the TOPCOUNT from the MDX formula filter in Dashboard Designer:

ORDER(
    NONEMPTY(
        {[Date - Reporting].[Date].Members - [Date - Reporting].[Date].[All]}
        ,[Measures].[Sales Amt])
    ,[Date - Reporting].[Date].CurrentMember.MEMBER_KEY
    ,DESC)

This only shows dates that have data, excludes the all member, and sorts it in descending order.  The result is that it defaults to yesterday but allows you to change it. 


I think I'm crazy, but I could have sworn the dashboard was showing Friday's data on Monday.  Looking at the code, I don't see how, and I'm amazed no one pointed it out.  I had to modify the code to eliminate today's date from the data set:

ORDER(
    NONEMPTY(
        {[Date - Reporting].[Date].Members
            - [Date - Reporting].[Date].[All]
            - STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now(), "yyyyMMdd") + "]")}
        ,[Measures].[Sales Amt])
    ,[Date - Reporting].[Date].CurrentMember.Member_Key
    ,DESC)

Another challenge we faced was how to have MDX filters control relational reports.  This was pretty easy: use the Display Value attribute of the filter instead of Member Unique Name and let the report figure out how to use that information.  We were, however, unable to use multi-select MDX member selection filters; when we passed these as Display Value to our relational SSRS reports, only the first selection came through.  We could not overcome this limitation the way we would have liked, so we ended up not using multi-select filters and instead built filters that contained only the parent level of the hierarchy.  In other words, our filter contained managers, and the report displayed the manager's employees.  We lost the ability to let the user select some employees of a manager but not others.  To my knowledge, there is no way to pass MDX multi-value filters to relational SSRS reports using the Display Value property.

Staying On Top of Reporting Services

One aggravation we experienced was that the column headers on the SSRS reports would not stay at the top of the web part when the user scrolled through the report.  There are many blog posts out there on how to achieve this, but within SharePoint 2013 it would work in Firefox and Chrome and not in Internet Exploder (9 or 10).  I had to work around the issue by sizing the report so it wouldn't scroll, displaying the toolbar instead, and having the user page through the data rather than scroll.  Which, in my opinion, looked nicer anyway.

I've got to mention one gotcha we experienced during our dashboard design endeavor that I had also run into many years ago.  I thought I had blogged about it in my What's New BI-wise section, but I couldn't find it, so I must have dreamed that.  Anyway, here are some words of wisdom:

When you are creating an SSRS MDX-based report and need to customize the underlying MDX that the drag-and-drop interface generates, make a backup, or better yet, check it into source control and label it 'Pre-MDX customizations' or something equally clear.

Once you have made those customizations to the MDX (to display a member property, for example), you can't go back!  You can no longer simply delete your parameters and add different ones; everything must be done through hand-coded MDX from that point forward.  It is far easier to design the report as completely as possible, make a backup or a labeled source-control check-in, and then make your MDX customizations.  If you need to add or remove fields or parameters later, it is easier to go back to that pre-customization version, make the change through the interface, and re-apply the MDX customizations than to work with it outside the interface.

Scorecards Don’t Always Score 

Scorecards are really cool gadgets to use in PerformancePoint dashboards because you can use them to control other objects.  Filters and scorecards are the only way to control objects in the PerformancePoint dashboard.  When you see a scorecard functioning well inside a PerformancePoint dashboard it looks really cool.  As you click on the various members within the scorecard you can have charts and data change according to the context.

We could not use scorecards the way we wanted because we simply had too many members in our dimension for them to render quickly enough.  They don't work as you would expect: doing the same thing in an analytic grid or Excel Services report is a piece of cake and renders very fast.  For whatever reason, even though I filtered the scorecard on the manager, it insisted on rendering every employee in the company and then hiding the ones that weren't children of the selected manager; in our case that was a little under 5,000 members.  I reduced the set to about 2,800 by deleting employees that weren't used as keys in any of our fact tables, but it was too late.  The powers that be had already decided that scorecards stink, due not only to the rendering speed but also to the hoops we had to jump through to develop one with that many members.  Even in the developer's IDE, Dashboard Designer, it would pull back every single representative regardless of being filtered on a single parent member.  It displayed those in a red font to indicate they would be hidden, but the time it took the designer to render made you think it had crashed.  We actually had to build a cube containing a very small subset of the data just to design the scorecard the way we wanted.

From a pure developer’s standpoint it seems that the scorecard design process in PerformancePoint needs some maturing.  Some configurations that are set within the wizard for creating an Analysis Services based scorecard are simply not editable outside of the wizard.  One would need to take screenshots of how the scorecard was developed to document the process in the event that a change was ever needed to one of those initial configurations.

While I'm on the subject of scorecards, another gripe we had is that any time you have more than one set of KPI actuals and targets on a scorecard, it displays all of the actuals on the left and the trends or targets on the right.  In other words, if I have sales actual and goal KPIs on the same scorecard as gross profit actual and goal KPIs, they can only be shown as actual sales, actual gross profit, target sales, target gross profit.  I would like some control to show each target next to its corresponding actual.

Easy Migrations 

One new feature of SharePoint 2013 that I really like is the new way it handles exporting and importing a dashboard project.  In previous versions of SharePoint, I had to keep long lists of customizations to apply to the web parts after Dashboard Designer deployed the dashboard.  Things such as hiding filters, resizing objects, and changing the chrome settings had to be redone every time a change to the dashboard required a redeployment.  Moving the project from the development environment to QA and finally to production was a tedious, manual process, and these customizations were not stored in source control.  I was pleased as punch to discover that the export/import feature of SharePoint 2013 solves both problems: I can extract the resulting .CMP file with a CAB extractor, put the entire project into source control, and no longer repeat those customizations over and over.

Accepting the Unacceptable 

I've been creating dashboards since PerformancePoint 2007.  Make no mistake, I do not consider myself an expert at dashboard design; I would, however, consider myself pretty advanced at using the Microsoft BI stack to produce great dashboards quickly.  I really like using PerformancePoint.  If you are willing to accept some of its quirks, it is a great tool for very quickly producing meaningful dashboards with a lot of functionality right out of the box.  When you start getting very particular about the way things look, it gets more complicated.  For example, I can only show the legend at the top or to the right of my analytic charts, never on the left or the bottom, and I can't control the color of the bars in bar charts.  I sometimes joke that designing PerformancePoint dashboards is a great opportunity for someone who likes to say no.  Sure, we can give you a bar chart with the colors you want or with the legend on the left, but I have to do that in Reporting Services, and we lose all of the cool right-click functionality included in the analytic chart, such as the decomposition tree.


Fifty Shades Of Gray Is Not Enough 

I’ll never forget one project I worked on as a consultant with a very lovely lady who had designed her dashboard on paper before she had any idea of what PerformancePoint could and could not do easily.  A simple thing like showing the total dollar amount at the top of the bar chart involved creating a separate reporting services web part that only showed that total and placing it above the chart. The end result was a really good looking dashboard that looked remarkably similar to what had been envisioned.  It just took a lot of extra time to make it look exactly how it was imagined.

One other piece of advice I can give is to apply a lot of rigor in your testing.  Make sure each page of the dashboard uses consistent nomenclature, and make sure the numbers on all pages agree with each other.  If one page says gross profit is this and another says gross profit was that, you're going to get a lot of questions from your user community.  Be consistent in your color and number formatting as well.  Are you going to display zeros or leave the cells blank?  I personally prefer to leave the cells blank, but whatever you decide, stay consistent.  Decide beforehand what screen resolution you are designing for.  If everyone at your company has widescreen 17-inch monitors, then it might be best to design your dashboard to look best at 1920×1080.  If you need to show your dashboard on an iPad or a Surface, then you had better have one lying around.  Good luck with that, by the way.  Consider which browser is your company's standard.  HTML5 or not, things look different in Chrome, Firefox, Safari, and Internet Exploder versions 8, 9, 10, and 11.


As I wrap this up, I just want to give a big warm thank you to the other members of my team, Sherri McDonald and Michael Simon.  A lot of hard work and long hours went into this project and it feels great to have professionals like them to ease the burden.  It’s a much better experience than working solo.  Three heads are much better than one and it feels great to have others throw rocks at your stuff and see where it breaks before the real critics get their hands on it.

Do you have a good dashboard design story?  Did I leave a danger out?  Please let us know in the comments. 


Other Dashboarding Resources

One person I would consider an expert on dashboard design is Stephen Few.  I recently read on his blog that a second edition of his book, Information Dashboard Design: The Effective Visual Communication of Data, will be released shortly.  He mentioned that one criticism of the first edition, addressed in the second, was its lack of real examples of effective dashboards produced by real products; his explanation is that at the time, there simply weren't any products capable of producing what he considered really good dashboard designs.  Be sure to read his article Common Pitfalls in Dashboard Design for a great list of dashboard design gotchas.


PerformancePoint 2013 Dashboards: Applying Dynamic Security using CustomData and EffectiveUserName

What are Web Parts?

PerformancePoint – Effectively Disable Analytic Grid Right Click Select Measures Functionality

Time Intelligence Filters in PerformancePoint 2010

Cascading Filters in Performance Point Services dashboard using SharePoint 2013

Performance Point Relative Date Time Intelligence with current date time

How to make SSRS reports in a PerformancePoint dashboard ‘pop out’ or open in a new window.

Clutter, data overload put dashboard designs on path to failure

Data Visualization and Dashboard Design

Formatting your PerformancePoint Analytic Grid!

Using SSAS MDX Calculation Color Expressions

Time Intelligence Post Formula Filters

Set Default Value of PerformancePoint Filter – Note:  Caveat to this tip- Once a user changes the default value SharePoint will remember the changed value and not reset the default value.

Add conditional formatting to your PerformancePoint Services Analytic Grid, by defining a custom Scope calculation from within Analysis Services (SSAS)

Filter Connection Quick Reference


All my PerformancePoint links – Updated as I update them!


PerformancePoint Gotcha!  – “PerformancePoint Services could not connect to the specified data source.”  When creating a new dashboard, you MUST do a save all right after you create your data sources otherwise you won’t be able to create anything that uses that data source.


Gotcha! PerformancePoint Time Intelligence and SSRS

I was trying to use a PerformancePoint time intelligence filter to pass values to an SSRS report and discovered that the MonthToDate, YearToDate, and QuarterToDate syntax does not work with SSRS.  Instead, use Month.FirstDay:Day, Year.FirstDay:Day, and Quarter.FirstDay:Day.


Zero to Dashboard- Intro to PerformancePoint    
Come check out the PerformancePoint hotness! Mike will demonstrate the functionality in PerformancePoint services in SharePoint 2010 and show you how to quickly build some dynamic dashboards and reporting functionality your end users will crave.



PerformancePoint Tip – Right clicking an analytic grid in the leftmost columns allows the user to select measures.  Some people may not want that, so how do you disable that ability?  PerformancePoint Services Application Settings – Select Measures Control – set maximum value to 0.  (default is 1000).



PerformancePoint 2013 Dashboards: Applying Dynamic Security using CustomData and EffectiveUserName

I recently had the great pleasure of solving a PerformancePoint 2013 puzzle and I thought I’d share how I did it to help others and as a reference for my fellow colleagues.  Our team was trying to implement dynamic active directory security for PerformancePoint 2013 dashboards.  A variety of PerformancePoint web parts were tested as part of this endeavor including Analysis Services scorecards, analytic charts and grids, SSRS reports (native mode), and Excel Services.

Devin Knight does an excellent job of explaining how to get this to work in his blogpost:  Analysis Services Dynamic Security.

For this method to work with PerformancePoint and Excel Services, we need two roles: ReadUserName and ReadCustomData.  The first, ReadUserName, is set up exactly as Devin describes; this is the role that Excel and SSRS will use.  The second, ReadCustomData, is the same except that you replace the UserName() function with CustomData().  ReadCustomData is the role referenced in the data source editor of PerformancePoint Dashboard Designer.

To test this, I had to make sure that I wasn't in any other AD groups or roles.  I only had to add myself to the ReadUserName role.  I did not have to be in the ReadCustomData role; only the service account that runs the SharePoint 2013 PerformancePoint content (a.k.a. the proxy account) needed that.  We also have a ReadUnrestricted role, and I had to make sure I wasn't in that role either.

Here is another link, that describes how to get this working with PerformancePoint 2010:  How to configure dynamic security in Analysis Services and Performance Point 2010 without Kerberos.

Here is a link that describes various methods for testing if the security is working correctly:

Unit Testing Role Security in Analysis Services.

The series of posts below is wonderful!  I won't repeat what is said there, but I will say I was relieved to see that this person also came to the conclusion that two roles would be needed.




Here is where things got interesting… No, really.

This post, Using EffectiveUserName To Impersonate in SSAS, mentions that you have to set up Kerberos on your SSAS server to get this to work; I can tell you we got it working without that.  I did set up the data source in SSRS as that author describes: credentials stored securely in the report server, "Use as Windows credentials," and "Impersonate the authenticated user after a connection has been made to the data source."

I got a little lucky (ok, maybe a lot lucky) in that my account had certain permissions that no one else in my active directory group had.  If it weren’t for that, I don’t know if I would have ever gotten any of this working.

Our initial goal was to set these permissions on either the SharePoint proxy account or the domain service account that processes the cube.  That worked for the SSRS MDX reports, but the Excel Services dashboard pages quit working.  Currently we are faced with the possibility that these permissions will need to be set for all dashboard users, and we are investigating what that actually means.  (http://windowsitpro.com/active-directory/hiding-data-active-directory-part-3-enabling-list-object-mode-forest)


We think the List contents permission is the one that made the difference.  My Active Directory administrator sent me these links, which he felt describe what is going on pretty well.  Here is what he said:

'The link below explains the end result as the account authenticating for the SSRS reports needing Read access to the TokenGroupsGlobalAndUniversal attribute.  This is done by adding the required account to the Windows Authorization Access group in Active Directory.

Originally it was thought the SharePointProxy account was the account being used. I did add it to the group last Saturday and it did not resolve the issue. But today it was identified that the service account that connects to the data warehouse for cube processing was the account needing to verify the users group memberships to present the reports based on this information to the clients.

I also confirmed this to be the attribute by allowing Read access to all the User Object attributes and removing them until this single attribute was remaining. Removing this last attribute denied the reports with the Unknown username or password error.’



The other piece that seemed to make a difference was that the SSAS service is running under the NT AUTHORITY\NetworkService account.

Post in the comments if you have questions.

Time Intelligence Filters in PerformancePoint 2010

The time intelligence filters provided with PerformancePoint 2010 give developers an easy way to let users specify time periods using common English terms such as "Last 6 months", "Same Period Last Year", "Rolling 3 months", and so on.  These filters can be linked inside your dashboard to control Excel Services reports, SSRS reports, scorecards, and analytic grids and charts.  Behind the filters are formulas based on the Simple Time Period Specification (STPS).

In the text that follows, I hope to demonstrate the use of these concepts:

  1. Setting up and using Time Intelligence with both tabular and multi-dimensional data sources
  2. Using Time Intelligence with KPIs and Scorecards
  3. Using Time Intelligence with Analytic Grids, Charts, Excel Services reports, and SSRS reports
  4. Using both types of Time Intelligence Filters, the standard time intelligence filter and the connection formula
  5. Using the TI Connection formula to provide users with a From Date To Date range functionality

I have modified the AdventureWorksDW2008 relational database by adding three views: one to extend our date dimension, one to extrapolate data to the current date, and one to use as a tabular data source.  I also made some fairly extensive changes to the Adventure Works cube to simplify the demonstration.

You can download those views and the XMLA for the altered SSAS database, here.

Once the cube was processed I had accomplished two things:

  1. My date dimension starts at the beginning of the year (a best-practice recommendation for working with time intelligence in PerformancePoint, but not required).
  2. My fact table has data through the current date.

In order for the time intelligence formulas to work properly, certain things must be set up on the data source connection.  We will create two data connections for these examples: a multidimensional data source pointing to our cube, and a tabular data source pointing to a view that combines the data I need from the relational database.  The process for adding time intelligence to these two data sources is similar but not identical.

Open PerformancePoint and create a new data connection by right clicking the data connection folder in the workspace browser and selecting “New Data Source.” Select tabular list and SQL Server table. In the table field select the view included in the project files above named “vw_InternetSalesTabularExample.” Select the time tab and select the options checked below in the screenshot.


Next, create the data connection to the OLAP cube.  Right-click the data connections folder and select "New Data Source", but this time select the "Multidimensional" tab and "Analysis Services."  Below is a screenshot demonstrating what the Time tab should look like once complete.


Now that we have both data connections set up, I will first demonstrate using the time intelligence with a tabular source. The tabular data source can only be used with filters, KPIs and scorecards. Right click your PerformancePoint content list in the workspace browser and select “New”, “KPI.” Name the KPI “Internet Sales.”

Within the KPI, rename Actual to MTD and rename Target to MTDLY. Click the data mappings for MTD, change the source, select “SalesAmount” in the measure dropdown and click the “New Time Intelligence Filter” button. Enter the following formula into the dialog:


Now, change the data mappings for MTDLY, change the source, again select “SalesAmount” in the measure drop down, click the “New Time Intelligence Filter” button and enter the following formula into the dialog:


Now, right-click your PerformancePoint content list in the workspace browser and select "New", "Scorecard."  In the scorecard template dialog, select the "Tabular" tab and "SQL Server Table."  Click OK and the wizard will walk you through the next steps: select your tabular data source and click Next, click the Select KPI button and choose the Internet Sales KPI we just created, click Next a couple of times, and then click Finish.

Delete the MTD column by right clicking on it. Then right click the MTDLY and select Metric Settings. Rename it to MTDLY vs MTD and select Actual in the “Additional data value” drop down. Click Ok. Drag ProductSubCat above “MTDLY vs MTD.” Drag “Order Country” as the parent of the Internet Sales KPI. Click the Edit tab and then the update button. Your screen should look like this:


To create a scorecard using the multidimensional data source, the steps would be pretty much identical.

For the next demonstration we will create a dashboard with an Analytic Grid that uses a standard time intelligence filter. Start out by creating an analytic grid using the SSAS data source that looks like the one in the screenshot below.


Notice the “Date Calendar” is in the background of the grid. Now create a new time intelligence filter using the SSAS data source. We will enter the following Formula/Display Name combinations by clicking the Add Formula button for each one.



Clicking the preview button will show the MDX behind the formulas.


Note: The second row says no results because I ran this on Feb 29th, 2012. There was not a Feb 29th in 2011.

Notice the difference between the monthtodate, yeartodate, quartertodate, and fullmonth formulas and their SSRS-compatible counterparts.  SSRS cannot handle those formulas, so I used an alternate syntax to accomplish the same thing.

Now create a new dashboard and drag the Analytic Grid to the design surface. Then drag the TI standard filter to the dashboard and connect it to the grid by selecting Member Unique Name from the TI filter and dropping it onto the drop zone in the analytic grid space. When the connection dialog comes up, make sure you select “Date Calendar” in the “Connect to” drop down.


Deploy your dashboard and test out the filter.

Now we will create a TI Connection Formula Filter. Right click in your workspace browser and select New, Filter, but this time select the Time Intelligence Connection Formula filter. Add your OLAP data source, click Next, and then Finish. Next, create a new dashboard or a new page on your existing dashboard. We’ll use the same analytic grid, but this time hook the TI connection formula filter up to the dashboard. Everything is the same as in the screenshot above, except this time click the “Connection formula” button and enter the following into the dialog:


This says: start from the selected date, go back 6 quarters (1.5 years), and aggregate from there up through 3 quarters back. So if today is 1/1/2012, 6 quarters back is 7/1/2010 (1.5 years away). That is the start of our range.

3 quarters back from 1/1/2012 is 4/1/2011, which begins the last quarter in the range.

Deploy the new dashboard, open SQL Server Management Studio’s cube browser and verify the results.





Note: SELECT DATEADD(quarter,-6,'1/1/2012'), DATEADD(quarter,-3,'1/1/2012') returns 7/1/2010 and 4/1/2011. So why did I filter from 7/1/2010 to 6/30/2011 above? Because we are working in quarters: 4/1/2011 starts a new quarter, and we include that entire quarter in the results. If this doesn’t make sense to you, try an example using Month-6:Month-3 instead.
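The quarter arithmetic above can be sketched in plain code. This is a minimal sketch of the Quarter-6:Quarter-3 semantics using ordinary Python date math; the function names are my own, not anything PerformancePoint exposes:

```python
from datetime import date, timedelta

def quarter_start(d: date) -> date:
    """First day of the quarter containing d."""
    return date(d.year, 3 * ((d.month - 1) // 3) + 1, 1)

def add_quarters(d: date, n: int) -> date:
    """First day of the quarter n quarters away from d's quarter."""
    q = quarter_start(d)
    months = q.year * 12 + (q.month - 1) + 3 * n
    return date(months // 12, months % 12 + 1, 1)

def quarter_end(d: date) -> date:
    """Last day of the quarter containing d."""
    return add_quarters(d, 1) - timedelta(days=1)

selected = date(2012, 1, 1)
range_start = add_quarters(selected, -6)             # 2010-07-01
range_end = quarter_end(add_quarters(selected, -3))  # 2011-06-30: the whole quarter is included
```

This makes the 6/30/2011 endpoint explicit: the filter does not stop at 4/1/2011, it includes the entire quarter that 4/1/2011 starts.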

Next I will demonstrate how to provide your users with a From Date parameter and a To Date parameter to provide range based queries using the PerformancePoint Reports. You can provide similar functionality by using a single date parameter and using the Multi-Select tree display method; however, you and/or your users may prefer to use range based parameters.

The first step is to copy your existing Analytic Grid and give it a new name. We will work from this copy. Open the copy, click on the Query tab, locate this line of code, and position your cursor just before the [Date].[Calendar] part:

WHERE ( [Measures].[Internet Sales Amount], [Date].[Calendar].DEFAULTMEMBER )

Type FromDate in the parameter text box and click the Insert button. Create another parameter in the same fashion by typing ToDate in the text box and clicking Insert. Modify the WHERE clause so it looks like this:

WHERE ( [Measures].[Internet Sales Amount],HIERARCHIZE({<<FromDate>>:<<ToDate>>}) )
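Under the hood, PerformancePoint substitutes each connected filter’s member unique name for its <<…>> placeholder before running the query. A rough illustration of that substitution (the member unique names are hypothetical AdventureWorks-style keys, and the helper function is mine, not the PPS internals):

```python
# Template matching the parameterized WHERE clause above
template = ("WHERE ( [Measures].[Internet Sales Amount],"
            "HIERARCHIZE({<<FromDate>>:<<ToDate>>}) )")

def substitute(mdx: str, params: dict) -> str:
    """Replace each <<Name>> placeholder with its member unique name."""
    for name, member in params.items():
        mdx = mdx.replace("<<" + name + ">>", member)
    return mdx

query = substitute(template, {
    "FromDate": "[Date].[Calendar].[Date].&[20120101]",  # hypothetical member keys
    "ToDate":   "[Date].[Calendar].[Date].&[20120110]",
})
```

The resulting MDX range set ({member:member}) is exactly why Day:Day works as the connection formula later: each filter resolves to a single day member.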

Create two new filters, one called TI From Date, and another called TI To Date using the time intelligence connection formula filter.

Now create a new page in your dashboard, drag the copied analytic grid to the design surface, and then add the two TI connection formula filters. Connect both filters to the analytic grid, making sure to connect them to the proper parameters. Use Day:Day in both connection formulas. Publish your dashboard and test.

Next we will create a new report in PerformancePoint that connects to a SSRS report with a date calendar parameter. We can create two dashboard pages to demonstrate this functionality: one using the standard TI filter, and one using the TI connection formula filter. The only caveat you need to be aware of is that the members in the standard TI filter that use the …toDate or …Full(Month/Quarter) syntax will not work. You will have to use the alternate syntax described earlier to get the same functionality.




The YTD (Non SSRS) filter produces the error:

  • Default value or value provided for the report parameter ‘DateCalendar’ is not a valid value. (rsInvalidReportParameter)


Note: These numbers match the earlier example we did using the Analytic Grid.

Next we will hook the TI filters up to an Excel Services report. The only issue I had when preparing this demonstration was an error that occurred when previewing the dashboard: “Attempted request on an invalid state. Unable to perform the operation.” Google to the rescue!

Basically, I had to uncheck the ‘Refresh warning enabled’ box for my Excel Services application in SharePoint Central Administration.

I also received an error when I used the YTD (Non SSRS) filter with Excel Services.


‘An error occurred while attempting to set one or more parameters in this workbook. As a result, none of the parameters have been set. Click Ok to return to the workbook.’


TI Standard Filter with Excel Services report


TI connection formula filter with Excel Services report. Numbers match previous examples.

It should be noted that the ‘Show Details’ and ‘Additional Actions’ features are greyed out when using the TI filters with a date dimension in the background of the analytic grid and chart. One (not very good) workaround is to put the date dimension on the rows or columns to get this functionality back. The reason this workaround is not very good is that your report no longer looks the same.

Example using analytic grid:

Dashboard page looks like this:

Right clicking a cell has ‘Show Details’ and ‘Additional Actions’ grayed out.


I created a copy of the original and dragged the date calendar from the background to the rows underneath the geography to demonstrate what it would take to get the ‘Show Details’ functionality back.


Not too bad using the YTD (Non SSRS) filter. (Not too good either…)


But, change it to last 10 days, and it becomes very ugly, very fast.


Moving the date calendar above the Geo in the rows helps a bit in some cases.

So, in a nutshell, there are some caveats when working with the time intelligence features in PerformancePoint. Overall, though, they add a great deal of functionality to your dashboard with minimal effort.

Miscellaneous Facts

Suppose you have a data source that has multiple time dimensions and you want to use both of them. The solution is to create a new data source for each time dimension you want to use in your PPS solution. For example, if the cube you are using has both calendar year and fiscal year dimensions, you can create two data sources using the same server and cube information, with the only difference being the time dimension selected on the Time tab of each data source. When creating KPIs or filters, select the data source with the time dimension that makes sense for that object.

Colon (:): indicates a range of dates. For example, Day-1:Day-7 selects all the days between a week ago and yesterday, inclusive.

Comma (,): combines two members. For example, Day-1,Day-7 selects yesterday and a week ago as two distinct dates.
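The difference between the two operators can be sketched with ordinary date arithmetic (my own Python sketch of the semantics, not anything PPS ships):

```python
from datetime import date, timedelta

def day(offset: int, today: date) -> date:
    """'Day-1' is one day before today; 'Day' is today itself."""
    return today + timedelta(days=offset)

def colon(a: int, b: int, today: date) -> list:
    """Day-a : Day-b -> every day in the range, inclusive."""
    lo, hi = sorted((day(a, today), day(b, today)))
    return [lo + timedelta(days=i) for i in range((hi - lo).days + 1)]

def comma(a: int, b: int, today: date) -> list:
    """Day-a , Day-b -> just the two distinct dates."""
    return [day(a, today), day(b, today)]

today = date(2012, 2, 29)
assert len(colon(-1, -7, today)) == 7   # 2/22 through 2/28, inclusive
assert len(comma(-1, -7, today)) == 2   # only 2/28 and 2/22
```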

You can create two kinds of Time Intelligence dashboard filters:

1. Time Intelligence dashboard filters, which include a list of dynamic time periods that you specify.

2. Time Intelligence Connection Formula dashboard filters, which use a calendar control to specify information as of a particular date. When you create a Time Intelligence Connection Formula dashboard filter, you do not specify a formula until you connect that filter to a report or a scorecard.

Periods-To-Date: periods-to-date are a new type of TI formula added in Office 14. The result of a to-date period is an aggregation of all time periods up to the last completed full period. Incomplete time periods are automatically excluded. By default they are evaluated at the lowest degree of granularity in the data source. For example, if the most granular time period in the data source were days, then the month-to-date expression would aggregate all days from the beginning of the month to the last completed full day in the month. (The opposite is true for standard time periods: they automatically include incomplete periods.)
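The exclusion rule can be sketched as follows (assuming day granularity; the function is illustrative, not PPS code):

```python
from datetime import date, timedelta

def month_to_date_days(today: date) -> list:
    """Days a to-date period would aggregate: from the first of the
    month up to the last *completed* day. Today is still in progress,
    so it is excluded (a standard period would include it)."""
    first = today.replace(day=1)
    return [first + timedelta(days=i) for i in range((today - first).days)]

days = month_to_date_days(date(2012, 2, 29))  # Feb 1 .. Feb 28; the 29th is excluded
```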

Periods-to-date are not compatible with SSRS (personal experience).

Here are some links that helped me put this blog post together.

PerformancePoint Relative Date Time Intelligence with Current Date Time

How to use Time Intelligence Filters with Excel Services or How to Pass a Range Selection into your Excel Report

From Date To Date in PerformancePoint Analytical Chart

Time Intelligence Post Formula Filter Template in PerformancePoint Server

PerformancePoint Time Intelligence – BI for the Masses

Create a Time Intelligence Filter by Using Dashboard Designer

Time Intelligence Differences Between Grids and Scorecards

PerformancePoint – Effectively Disable Analytic Grid Right Click Select Measures Functionality

The Analytic Grid available in Microsoft PerformancePoint 2010 is a powerful tool for allowing users to dynamically explore data within a dashboard.  One of the features it provides is the ability to right click on the leftmost column, which contains the row labels, and select measures other than the ones the designer chose for the original dashboard element.  This is great unless you really don’t want to give your users such easy access to other data points within your SSAS OLAP cube.

Googling for answers I found that we were not the first to have the desire to disable this functionality.



So we started exploring other options, including cell-based security in SSAS, which has a known performance impact.  One side note here: I found a great idea on how to implement security on the measures without resorting to cell-based security.


Finally, I found a workaround.  Unfortunately, it disables this functionality for the entire SharePoint server.


In SharePoint Central Administration, navigate to the service application settings and select PerformancePoint Services.

Locate the Select Measure Control setting: “Set the maximum number of measures to retrieve and insert into a dashboard Select Measure control.”

This defaults to 1000.  Changing it to 0 eliminates the functionality.