Dangerous Dashboard Design

I recently had the great pleasure of working on a dashboard project using PerformancePoint in SharePoint 2013 with a great team of individuals.  We had to overcome several obstacles during the process and I thought it would be a good idea to document how we accomplished some of our goals with the hope that I can spare someone else (or my future self) from having to go through the same pain.  Our dashboard is currently in closed beta and we will be releasing it to the wild in the next couple of weeks.

Dynamic Active Directory Security Without Kerberos 

The biggest challenge we faced was how to implement dynamic Active Directory security without using Kerberos.  I had applied dynamic security to several Analysis Services databases in the past but had never followed that implementation through to PerformancePoint dashboards.  This challenge is more difficult in that respect because within a dashboard you will typically have more sources of data than just an Analysis Services database.  We had SSRS reports that ran against our ODS and our relational data warehouse.  You can read more about how we achieved this in my recent blog post, PerformancePoint 2013 Dashboards: Applying Dynamic Security using CustomData and EffectiveUserName.

Our solution required three roles.  Two of them were for managers restricted to seeing only their own employees: one for the PerformancePoint objects, which uses the CustomData function in its security expressions, and another for the SSRS MDX-based reports and Excel Services reports, which uses the UserName function.  Finally, we needed a separate unrestricted role for the managers and district leaders who could see all of the data.  This allowed us to keep our bridge table relatively small; otherwise, every user who needed to see everyone would have required about 2,500 rows in the bridge table.
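
For context, here is a minimal sketch of what the dimension security in the two restricted roles could look like.  The [People].[Active Directory Login] attribute is the one described later in this post; the [Person] attribute and the "Security Bridge" measure group name are placeholders standing in for our actual bridge table setup.  The only difference between the two roles is CUSTOMDATA() versus USERNAME():

-- Allowed member set for the PerformancePoint role (CUSTOMDATA comes from the connection string)
Exists(
    [People].[Person].[Person].Members,
    StrToMember("[People].[Active Directory Login].&[" + CustomData() + "]"),
    "Security Bridge"   -- placeholder name for the bridge measure group
)

-- Allowed member set for the SSRS / Excel Services role (USERNAME is the effective Windows login)
Exists(
    [People].[Person].[Person].Members,
    StrToMember("[People].[Active Directory Login].&[" + UserName() + "]"),
    "Security Bridge"
)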

During development, I mistakenly thought we could create the entire dashboard for the restricted users, save a copy, rename it to unrestricted, change the data source to use the unrestricted role, and be on our way.  I was woefully mistaken.  Apparently PerformancePoint uses some sort of invisible GUID for the data source name.  Worse yet, you can’t change the data source for existing analytic charts, grids, and scorecards, so we couldn’t make a copy of the objects, rename them, and point them to a different data source.  We actually had to use two different site collections for the restricted and the unrestricted dashboards!

I Know Who You Are and What You Did Last Quarter

Once we had the dynamic security in place we wanted the dashboard to automatically know who you are.  We accomplished this by adding an attribute to our person dimension to hold the Active Directory account name.  I use the word person in its most generic sense: a person can be a vendor, customer, employee, manager, or salesperson.  I created an MDX filter in PerformancePoint and used the following MDX code to achieve this: 

STRTOSET("[People].[Active Directory Login].&[" + CUSTOMDATA() + "]")

I then added this attribute to any analytic charts, grids, or scorecards in the background quadrant of the design canvas so that the report could be filtered by it.  I connected the filter to the report as one typically would and later hid the filter using the edit web part interface within the SharePoint 2013 site.

Save the Trees

My good buddy Michael Simon used some fancy MDX to set the trailing six quarters for an analytic bar chart, only to discover that we lost the ability to right-click and get the decomposition tree and all the other neat PerformancePoint features.  He discovered a clever workaround using dynamic named sets, and we had our right-clicks back.
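
Michael’s exact MDX isn’t shown here, but the general idea is to define the trailing quarters as a dynamic named set in the cube’s calculation script and build the chart on that set rather than on a hand-typed set expression.  A rough sketch, assuming a [Quarter] level on the calendar hierarchy and the [Sales Amt] measure used elsewhere in this post:

CREATE DYNAMIC SET CURRENTCUBE.[Trailing Six Quarters] AS
    Tail(
        NonEmpty(
            [Date].[Calendar].[Quarter].Members,   -- level name is an assumption
            [Measures].[Sales Amt]
        ),
        6                                          -- keep the last six quarters that have data
    );

Because the analytic chart then references a plain named set, the right-click features such as the decomposition tree keep working.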

Yesterday… Not Such an Easy Game to Play

Another challenge we faced is a very common one: how to get the dashboard to automatically present data as of yesterday.  Typically this wouldn’t be a big deal, but in our case we use a reporting calendar that doesn’t include Saturdays, Sundays, or holidays.  Any sales that occur on those days roll forward to the next business day.  If you don’t have such a dilemma you can use either of these two snippets to accomplish this feat:

1. STRTOSET("[Date].[Calendar].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") + "]")

2. STRTOSET("[Date].[Calendar].[Date].&[" + VBAMDX.Format(VBAMDX.Now(), "yyyyMMdd") + "].Lag(1)")

In our case, this wouldn’t work.  If the user wanted to look at the dashboard on a weekend or a holiday they would get an error. 

I initially tried to set the default member of the dimension to yesterday using MDX.  I ran into a problem: once I set the default member for the date dimension in the cube, I could no longer put the date dimension in the report filter area of an Excel pivot table; the filter would just show yesterday’s data.  See this link for more information about that issue:  Default Member Trap (be careful of what you filter)
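
For reference, that kind of default-member assignment lives in the cube’s MDX script and looks roughly like the sketch below (reusing the calendar hierarchy from the snippets above); it is this cube-wide default that triggers the Excel filter behavior described in the linked post:

ALTER CUBE CURRENTCUBE
UPDATE DIMENSION [Date].[Calendar].[Date],
    DEFAULT_MEMBER = StrToMember(
        "[Date].[Calendar].[Date].&[" + VBAMDX.Format(VBAMDX.Now() - 1, "yyyyMMdd") + "]"
    );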

My next strategy was to add an attribute to our Gregorian date dimension called Last Business Date and update the corresponding relational table with this information.  I then attempted to parse together the information to get the correct reporting date.  We have two date dimensions in our cube.  One is a regular calendar date dimension, the other one is a fiscal reporting date dimension that doesn’t include weekends or holidays.  Our fact table has keys for both.  You can see my post asking for help from the BIDN community here: 

MDX Help Please – The + function expects a tuple set expression for the 1 argument. A string or numeric expression was used

Before I received the kind reply from mreedsqlbi, I found another way to accomplish what I needed.  This post helped me find another solution:  Member Does Not Exist in MDX Calculations

I created four dynamic named sets in the cube called OneDayAgo, TwoDaysAgo, ThreeDaysAgo, and FourDaysAgo.

Here is the code for OneDayAgo:

iif(iserror(STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") + "]")),{},STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") + "]"))

For each of the other named sets, I just changed the -1 to a -2, -3, and -4 respectively.
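
In the cube’s calculation script, each of these is declared as a dynamic named set so it is re-evaluated with every query.  A sketch of how OneDayAgo would be declared using the expression above (the other three just swap the -1):

CREATE DYNAMIC SET CURRENTCUBE.[OneDayAgo] AS
    IIF(
        IsError(StrToMember("[Date - Reporting].[Date].&["
            + VBAMDX.Format(VBAMDX.Now() - 1, "yyyyMMdd") + "]")),
        {},   -- member does not exist (weekend or holiday): return an empty set
        { StrToMember("[Date - Reporting].[Date].&["
            + VBAMDX.Format(VBAMDX.Now() - 1, "yyyyMMdd") + "]") }
    );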

The code for my MDX filter used the dynamic named sets to figure out if the member exists:


IIF([OneDayAgo].Count > 0
            ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") +"]")
            ,IIF([TwoDaysAgo].Count > 0
                  ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-2, "yyyyMMdd") +"]")
                  ,IIF([ThreeDaysAgo].Count > 0
                        ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-3, "yyyyMMdd") +"]")
                        ,IIF([FourDaysAgo].Count > 0
                            ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-4, "yyyyMMdd") +"]")
                            ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-5, "yyyyMMdd") +"]")
                        )
                    )
                )
            )

UPDATE:

This version works better because it also works in Dev, which doesn’t always have data for yesterday.

TopCount(
    ORDER(
        NONEMPTY(
            {[Date - Reporting].[Date].Members - [Date - Reporting].[Date].[All]},
            [Measures].[Sales Amt]
        ),
        [Date - Reporting].[Date].CurrentMember.MEMBER_KEY,
        DESC
    ),
    1
)

UPDATE:

At one point during beta testing, I was asked to present the user with a date dropdown that defaults to yesterday.  This was later scrapped because it required a server-wide change within the Central Administration Application Settings for PerformancePoint, in which we had to set the user-selected filters to expire every day.

For anyone who is interested, I got that to work by removing the TOPCOUNT from the MDX formula filter in Dashboard Designer:

ORDER(
    NONEMPTY(
        {[Date - Reporting].[Date].Members - [Date - Reporting].[Date].[All]},
        [Measures].[Sales Amt]
    ),
    [Date - Reporting].[Date].CurrentMember.MEMBER_KEY,
    DESC
)

This only shows dates that have data, excludes the all member, and sorts it in descending order.  The result is that it defaults to yesterday but allows you to change it. 

UPDATE

I think I’m crazy; but, I could have sworn the dashboard was showing Friday’s data on Monday.  Looking at the code, I don’t see how; but, I am amazed no one has pointed it out.

I had to modify the code to eliminate Today’s date from the data set.

TopCount
  (
    Order
    (
      NonEmpty
      (
        {
          [Date - Reporting].[Date].MEMBERS
          - [Date - Reporting].[Date].[All]
          - STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now(), "yyyyMMdd") + "]")
        }
       ,[Measures].[Sales Amt]
      )
     ,[Date – Reporting].[Date].CurrentMember.Member_Key
     ,DESC
    )
   ,1
  )

Filters

Another challenge we faced was how to have MDX filters control relational reports.  This was pretty easy: just use the DisplayValue attribute of the filter instead of MemberUnique and let the report figure out how to use that information.  We were unable to use multi-select MDX member selection filters; when we passed these as DisplayValue to our relational SSRS reports, only the first selection came through.  We were unable to overcome this limitation the way we would have liked, so we ended up not using multi-select filters and instead built filters that only contained the parent level of the hierarchy.  In other words, our filter contained managers and the report would display the manager’s employees.  We lost the ability to let the user select only some of a manager’s employees and not others.  To my knowledge, there is no way to pass MDX multi-value filters using the DisplayValue property to relational SSRS reports.
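
For illustration, one way to build that manager-only filter is an MDX formula filter that returns just the parent level of the employee hierarchy.  The hierarchy and level names here are placeholders rather than our actual cube names:

-- Return only manager-level members of a hypothetical [People].[Employees] hierarchy
Descendants(
    [People].[Employees].[All],
    [People].[Employees].[Manager],
    SELF
)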

Staying On Top of Reporting Services

One aggravation we experienced was that the column headers on the SSRS reports would not stay at the top of the web part when the user scrolled through the report part.  There are many blog posts out there on how to achieve this, but within SharePoint 2013 it would work in Firefox and Chrome and not in Internet Exploder (9 or 10).  I had to work around this issue by sizing the report so it wouldn’t scroll, and instead display the toolbar and have the user page through the data rather than scroll, which in my opinion looked nicer anyway.

I’ve got to mention one gotcha we experienced during our dashboard design endeavor that I had experienced many years ago.  I thought I had blogged about it in my What’s New BI-wise section, but I couldn’t find it, so I must have dreamed I had blogged about it.  Anyway, here are some words of wisdom:

When you are creating an SSRS MDX-based report and need to make a customization to the underlying MDX that is created by the easy-to-use drag-and-drop interface, make a backup or, better yet, check it into source control and label it ‘Pre-MDX customizations’ or something very clear like that.

Once you have made those customizations to the MDX (to allow that member property to be displayed for example) you can’t go back!  No longer can you simply delete all your parameters and add different ones easily.  Everything has to be done through hand coded MDX from that point forward.  It is far easier to design your report as much as possible, make a backup or labeled source control check-in, and then make your MDX customizations.  If you need to add/remove fields or parameters in the future it is easier to go back to that pre-MDX customized version and do it using the interface and then re-do the MDX customization than to work with it outside the interface.

Scorecards Don’t Always Score 

Scorecards are really cool gadgets to use in PerformancePoint dashboards because you can use them to control other objects.  Filters and scorecards are the only way to control objects in the PerformancePoint dashboard.  When you see a scorecard functioning well inside a PerformancePoint dashboard it looks really cool.  As you click on the various members within the scorecard you can have charts and data change according to the context.

We could not use scorecards the way we wanted because we simply had too many members in our dimension for them to render quickly enough.  They don’t work as you would expect.  Doing the same type of thing in an analytic grid or Excel Services report is a piece of cake and renders very fast.  For whatever reason, regardless of the fact that I filtered the scorecard on the manager, it would insist on rendering every employee in the company and then hiding the ones that weren’t children of the selected manager.  In our case that was a little less than 5,000 members.  I reduced this set to about 2,800 by deleting employees that weren’t used as keys in any of our fact tables, but it was too late.  The powers that be had already decided that scorecards stink, due not only to the rendering speed but also to the hoops we had to jump through to develop one using that many members.  Even in the developer’s IDE, Dashboard Designer, it would pull back every single representative regardless of being filtered on a single parent member.  It would display those in a red font indicating that they would be hidden, but the time it took the designer to render made you think it had crashed.  We actually had to build a cube containing a very small subset of the data in order to design the scorecard the way we wanted.

From a pure developer’s standpoint it seems that the scorecard design process in PerformancePoint needs some maturing.  Some configurations that are set within the wizard for creating an Analysis Services based scorecard are simply not editable outside of the wizard.  One would need to take screenshots of how the scorecard was developed to document the process in the event that a change was ever needed to one of those initial configurations.

While I’m on the subject of scorecards, another gripe we had about the way scorecards work is that any time you have more than one set of KPI actuals and targets within a scorecard, it displays them with all of the actuals on the left and the trends or targets on the right.  In other words, if I have a sales actual and goal KPI on the same scorecard as a gross profit actual and goal, I can only show them as actual sales, actual gross profit, target sales, target gross profit.  I would like some control to show the targets next to their corresponding actuals. 

Easy Migrations 

One new feature of SharePoint 2013 that I really like is the new way it handles exporting and importing a dashboard project.  In previous versions of SharePoint, I had to create long lists of customizations that had to be done to the web parts after Dashboard Designer had deployed the dashboard.  Things such as hiding filters, resizing objects, and changing the chrome settings had to be done over and over again every time we made a change to the dashboard that required a re-deployment.  Moving the project from the development environment to the QA environment and finally to the production environment was a manual, tedious process.  Furthermore, these customizations were not stored in source control.  I was pleased as punch to discover that the export/import feature of SharePoint 2013 solves both problems.  I can extract the resulting .CMP file using a cab extractor and put the entire project into source control, and I don’t have to repeat those customizations over and over again.

Accepting the Unacceptable 

I’ve been creating dashboards since PerformancePoint 2007.  Make no mistake, I do not consider myself an expert at dashboard design.  I would however consider myself pretty advanced in using the Microsoft BI stack to produce great dashboards quickly.  I really like using PerformancePoint.  If you are willing to accept some of the quirks w/ PerformancePoint, it is a great tool to allow a developer to very quickly produce meaningful dashboards with a lot of functionality right out of the box.  When you start getting very particular about the way things look it gets more complicated.  For example, I can only show the legend at the top or to the right of my analytic charts.  I can’t show them on the left and I can’t show them on the bottom.  I can’t control the color of the bars in the bar charts.  I sometimes joke that designing PerformancePoint dashboards is a great opportunity for someone who likes to say no.  Sure, we can give you a bar chart w/ the colors you want or with the legend on the left; but, I have to do that in Reporting Services and we lose all of the cool right click functionality that is included in the analytic chart such as the decomposition tree. 

 

Fifty Shades Of Gray Is Not Enough 

I’ll never forget one project I worked on as a consultant with a very lovely lady who had designed her dashboard on paper before she had any idea of what PerformancePoint could and could not do easily.  A simple thing like showing the total dollar amount at the top of the bar chart involved creating a separate reporting services web part that only showed that total and placing it above the chart. The end result was a really good looking dashboard that looked remarkably similar to what had been envisioned.  It just took a lot of extra time to make it look exactly how it was imagined.

One other piece of advice I can give is to apply a lot of rigor in your testing.  Make sure each page of the dashboard uses consistent nomenclature.  Make sure that the numbers on all pages agree with each other.  If one page says that gross profit is this and another says gross profit was that, you’re going to get a lot of questions from your user community.  Be consistent in your color and number formatting as well.  Are you going to display zeros or leave the cells blank?  I personally prefer to leave the cells blank, but it is more important that whatever you decide, you stay consistent.  Decide beforehand what screen resolution you are designing for.  If everyone at your company has widescreen 17-inch monitors, then it might be best to design your dashboard to look best at 1920×1080.  If you need to show your dashboard on an iPad or a Surface, then you had better have one lying around.  Good luck with that, by the way.  Consider which browser is your company’s standard.  HTML5 or not, things look different in Chrome, Firefox, Safari, and Internet Exploder versions 8, 9, 10, or 11.  

Awwww…

As I wrap this up, I just want to give a big warm thank you to the other members of my team, Sherri McDonald and Michael Simon.  A lot of hard work and long hours went into this project and it feels great to have professionals like them to ease the burden.  It’s a much better experience than working solo.  Three heads are much better than one and it feels great to have others throw rocks at your stuff and see where it breaks before the real critics get their hands on it.

Do you have a good dashboard design story?  Did I leave a danger out?  Please let us know in the comments. 

——————————————————————————- 

Other Dashboarding Resources

One person that I would consider an expert on dashboard design is Stephen Few.  I recently read on his blog that a second edition of his book, Information Dashboard Design: The Effective Visual Communication of Data, is being released shortly.  He mentioned in that post that one criticism of the first edition, addressed in the second, was the lack of real examples of effective dashboards produced by real products.  His explanation is that at the time there simply weren’t any products released that were capable of producing what he considered really good dashboard designs.  Be sure to read his article Common Pitfalls in Dashboard Design for a great list of dashboard design gotchas.

Links

PerformancePoint 2013 Dashboards: Applying Dynamic Security using CustomData and EffectiveUserName

What are Web Parts?

PerformancePoint – Effectively Disable Analytic Grid Right Click Select Measures Functionality

Time Intelligence Filters in PerformancePoint 2010

Cascading Filters in Performance Point Services dashboard using SharePoint 2013

Performance Point Relative Date Time Intelligence with current date time

How to make SSRS reports in a PerformancePoint dashboard ‘pop out’ or open in a new window.

Clutter, data overload put dashboard designs on path to failure

Data Visualization and Dashboard Design

Formatting your PerformancePoint Analytic Grid!

Using SSAS MDX Calculation Color Expressions

Time Intelligence Post Formula Filters

Set Default Value of PerformancePoint Filter – Note:  Caveat to this tip- Once a user changes the default value SharePoint will remember the changed value and not reset the default value.

Add conditional formatting to your PerformancePoint Services Analytic Grid, by defining a custom Scope calculation from within Analysis Services (SSAS)

Filter Connection Quick Reference

——————————————————————————-

All my PerformancePoint links – Updated as I update them!

——————————————————————————-

PerformancePoint Gotcha!  – “PerformancePoint Services could not connect to the specified data source.”  When creating a new dashboard, you MUST do a Save All right after you create your data sources; otherwise you won’t be able to create anything that uses that data source.

——————————————————————————-

Gotcha! PerformancePoint Time Intelligence and SSRS

I was trying to use a PerformancePoint time intelligence filter to pass values to an SSRS report.  I discovered that the MonthToDate, YearToDate, and QuarterToDate syntax does not work with SSRS.  Instead, use Month.FirstDay:Day, Year.FirstDay:Day, and Quarter.FirstDay:Day.

——————————————————————————-

Zero to Dashboard- Intro to PerformancePoint    
Come check out the PerformancePoint hotness! Mike will demonstrate the functionality in PerformancePoint services in SharePoint 2010 and show you how to quickly build some dynamic dashboards and reporting functionality your end users will crave.

http://pragmaticworks.com/Resources/webinars/WebinarSummary.aspx?ResourceID=327

——————————————————————————-

PerformancePoint Tip – Right clicking an analytic grid in the leftmost columns allows the user to select measures.  Some people may not want that, so how do you disable that ability?  PerformancePoint Services Application Settings – Select Measures Control – set maximum value to 0.  (default is 1000).

http://technet.microsoft.com/en-us/library/ee620542.aspx

——————————————————————————-


What are Web Parts?

The term “web parts” can mean different things to different people.  A simple web search for “Microsoft web parts” returns some interesting results.  My introduction to web parts began with SQL Server 2008 R2 Reporting Services and the introduction of the report viewer web part (not to be confused w/ the report part.)  This allowed you to embed reporting services .RDL files within a PerformancePoint 2007 dashboard.  I believe you could also embed a report in any SharePoint page using the add web part feature.

A Developer’s Introduction to Web Parts (pertaining to SharePoint 2003) states: “Developers can build Web Parts as ASP.NET custom controls. Administrators can install Web Parts on any site based on Windows SharePoint Services. Users can add Web Parts to pages by dragging and dropping in a browser, and they can personalize them by setting properties. Web Parts can connect to other Web Parts using standard interfaces.”

The Introduction to Customizing Pages by Using Web Parts (applies to SharePoint Server 2007) provides an excellent overview of Web Parts and Web Part Pages:

——————————————————————————-

A Web Part is a modular unit of information that forms the basic building block of a Web Part Page. You can add Web Parts to Web Part zones in a Web Part Page and then customize the individual Web Parts to create a unique page for your site users.

The following example uses the Image Web Part to describe the basic features of a Web Part.


Sample Image Web Part (image not shown)

1. The Web Part title bar contains the heading for the Web Part.

2. The Web Part menu contains functions that enable you to minimize or close the Web Part, edit the Web Part, or get Help for a specific Web Part. When the page is in edit mode, you can also use this menu to delete the Web Part or connect it to other Web Parts, depending on the type of Web Part that you are using.

3. The body of the Web Part contains the content that you specified for the type of Web Part that is being used. In this example, the Web Part is an Image Web Part, which displays an image.


A Web Part Page is a special type of Web page in which you can use Web Parts to consolidate data, such as lists and charts, and Web content, such as text and images, into a dynamic information portal that is built around a common task or special interest.

——————————————————————————-

Finally, Wikipedia defines a web part this way:

——————————————————————————-

A Web Part, also called a Web Widget, is an ASP.NET server control which is added to a Web Part Zone on Web Part Pages by users at run time. The controls enable end users to modify the content, appearance, and behavior of Web pages directly from a browser. It can be put into certain places in a web page by end users, after development by a programmer.

Web Parts can be used as an add-on ASP.NET technology to Windows SharePoint Services.

——————————————————————————-

Web Parts can be custom created for your needs or purchased through third party vendors.

Here are some Free Microsoft SharePoint Web Parts.

As a PerformancePoint dashboard designer, the web part customization is done after the dashboard has been deployed.  The frustration for me as a developer is that the customization tasks must be repeated every time the dashboard is re-deployed.  I’ve had to create long text cheat sheets of the various web part height, width, and chrome settings for a dashboard.  If anything in the dashboard changes in Dashboard Designer in a way that forces a re-deploy, these settings need to be reapplied individually by hand.  As far as I know, there is still no way to automate this task.  This always baffled me, since these configurations can’t really be stored in source control and can’t be automated as the solution is deployed from dev to QA to production.

UPDATE!  SharePoint 2013 allows for easy migrations!

The export/import feature described in the Easy Migrations section above solves both problems: I can extract the resulting .CMP file using a cab extractor, put the entire project into source control, and skip repeating those customizations with every re-deployment.

Setting up a Three-Tier SharePoint Farm Including PowerPivot

This post will detail some of the challenges I encountered while setting up a three tier SharePoint 2010 farm that included PowerPivot.  Hopefully these notes will help somebody else.

Our three-tier farm will include:

    1. One server for the SQL Server 2008 R2 relational database that SharePoint will use.  (SQL)
    2. One application server that will run the central administration web site, the web application web site which will house our site collections, and a PowerPivot installation, which includes an Analysis Services database running in VertiPaq mode.  (Best practice would be to have a separate server for the PowerPivot installation.)  (APP)
    3. One web front end server that will render the html web pages.  (WFE)

These instructions will be based on Microsoft’s documentation, Multiple Servers for a Three-tier Farm (SharePoint Server 2010).  I will point out some of the things that confused me.  I will also share some links to blog posts that helped me overcome my confusion.  The instructions are a little confusing, as certain hotfixes are needed if you are running SQL Server 2008 that you do not need if you are running SQL Server 2008 R2.

Pay special attention to the Before you begin section in the documentation.  Especially the hardware and software requirements.

——————————————————————————-

Gotcha #1

My first gotcha occurred with the permissions of the Active Directory account(s) we were using for the setup administrator and farm account.  When I ran the SharePoint 2010 Products Configuration Wizard I received the following error at step 3:

Failed to create the configuration database.
An exception of type System.Collections.Generic.KeyNotFoundException was thrown. Additional exception information: The given key was not present in the dictionary.
System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.
at System.ThrowHelper.ThrowKeyNotFoundException()
at System.Collections.Generic.Dictionary`2.get_Item(TKey key)

 

Here is an excerpt from the link that led me down the path to the solution:

 

“I installed 2010 once and got this problem fixed using this URL – http://ethan-deng.com/SharePoint2010InstallationIssues.aspx which at one time gave me the answer, but no longer does.

This is what fixed my challenges.  I was creating a new environment with new IDs, SPInstallTest, SPFarmTest.  They had all the correct permissions on the SharePoint boxes and the SQL boxes (Server 2008 R2 and SQL 2008 R2); however, every time I tried to configure the farm I got the error mentioned above.  The problem is that the AD attributes for these new IDs do not have the correct security in AD that they need.  (this will bite you with the people picker too with multiple domains).  Go into the AD accounts you created for your farm (Admin and Farm, except I used Install), make sure Advanced Features are selected in AD.  Click on the Security tab of these SharePoint accounts, highlight “Authenticated Users” and give them “read” access.  This is also important for Search to work properly!”

 

I was able to find the solution in this thread SharePoint 2010 – Error while running Configuration wizard for SP 2010 server farm.

You have to read until the end where it is suggested to go to Active Directory and to edit the properties of the following accounts:

  • Setup administrator (account used to install SharePoint)
  • Farm account (account used in the SharePoint Products Configuration wizard)
  • Look at administrative and service accounts for a description of accounts you need to install SharePoint

In the security tab you need to give the authenticated users group read permission:


After that, the install worked as expected.

http://andreasglaser.net/post/2010/11/30/Failed-to-create-the-configuration-database-An-exception-of-type-SystemCollectionsGenericKeyNotFoundException-was-thrown.aspx

——————————————————————————-

Gotcha #2

The first time I ran the SharePoint 2010 Products Preparation Tool on each server (APP and WFE) it failed.  The error message indicated that the .NET Framework 3.5 Service Pack 1 installation failed.  After wasting about an hour, I realized that 3.5.1 was already installed before I even began.  Running the Products Preparation Tool again on each server showed success.  So, in short, if the Products Preparation Tool fails the first time, just run it again.

——————————————————————————-

Gotcha #3

The documentation tells you to install several hotfixes that wouldn’t install.  From the documentation:

After you install the prerequisites on each farm server, you will need to install the hotfix described in “Error message when you use the ‘IMSAdminBase::CopyKey’ method as a part of the IIS 6.0 Compatibility components in IIS 7.0: ‘Exception from HRESULT: 0x80070003’” on the application server first, and then on the two Web servers.

This update seems to only apply to Vista.

also…

After you complete the Microsoft SharePoint Products Preparation Tool, you must install KB 949516 (http://go.microsoft.com/fwlink/p/?LinkId=148917) and KB 971831 (http://go.microsoft.com/fwlink/p/?LinkID=165750). You might also need to restart the server after installing this hotfix.

These also wouldn’t install.

 

My recommendation is to try to install these anyway, as your experience may vary.  The installers for these hotfixes are apparently smart enough to know when they don’t apply.

 

——————————————————————————-

 

Gotcha #4

 

Should you choose Windows Classic authentication or Windows Claims authentication for your first web application?  I initially chose Windows Claims authentication because it included Windows Classic plus two other authentication methods.  I figured this would give me the most flexibility: we could start w/ NTLM and, if we ever needed to, switch to Kerberos.  Big mistake when it came to the PowerPivot installation.  I spent many hours trying to figure out what was wrong when I noticed this quote from an MSDN article:

When creating new web applications, be sure to select the Classic mode authentication option in the Create New Web Application page.

For existing web applications, use the following instructions to verify the applications are configured to use Windows authentication.

1. In Central Administration, in Application Management, click Manage web applications.

2. Select the web application.

3. Click Authentication Providers.

4. Verify that you have one provider for each zone, and the Default zone is set to Windows.

Ref: Plan PowerPivot Authentication and Authorization

 

Another thing I messed up was kind of silly looking back on it now.

In the instructions for creating your web application using Classic Mode, item #10 (the Public URL section), I replaced the APP server with the WFE server.  Strangely enough, this worked fine for everything in PerformancePoint.  PowerPivot, not so much.

 

Here is the excerpt from #10:

 

In the Public URL section, type the URL for the domain name for all sites that users will access in this Web application. This URL will be used as the base URL in links shown on pages within the Web application. The default URL is the current server name and port, and is automatically updated to reflect the current SSL, host header, and port number settings on the page. If you are deploying SharePoint Server 2010 behind a load balancer or proxy server, then this URL may need to be different than the SSL, host header, and port settings on this page.

The Zone value is automatically set to Default for a new Web application.

Note:

You can change the zone when you extend a Web application. For more information, see Extend a Web application (SharePoint Server 2010).

 

You want to leave the default application server in the public URL section.  Later on, once you have PowerPivot installed and working, extend your web application in the first link directly above.

 

 

Create a Web application that uses Windows-classic authentication (SharePoint Server 2010)

——————————————————————————-

 

Gotcha #5

Don’t enable anonymous access when creating the web application!  Excel Services won’t work, and the documentation doesn’t tell you that!  From the doc:

 

If you want users to be able to access any site content anonymously, you must enable anonymous access for the entire Web application zone before you enable anonymous access at the SharePoint site level; later, site owners can configure how anonymous access is used within their sites. If you do not enable anonymous access at the Web application level, you cannot enable anonymous access later, at the site level. For more information, see Choose security groups (SharePoint Server 2010).

After reading that, I thought I should enable it in case we ever needed it.  I was sorely mistaken.  Don’t do it.

 

I was trying to create a new Excel Services data connection.  The Item name drop-down wouldn’t populate and the test connection button wouldn’t enable.  I came across this:

http://sharepoint.stackexchange.com/questions/28238/performancepoint-dashboard-designer-new-excel-services-data-source

 

On this same point… you don’t need to create an Excel Services data connection within Dashboard Designer to test Excel Services.  Create an Excel Services report instead.

——————————————————————————-

 

Gotcha #6

 

The PowerPivot installation documentation stinks!  (PowerPivot for SharePoint existing farm installation)  It obviously hasn’t been kept up to date.

For example, #12, Create the PowerPivot Service Application, states:

 

b. Click Application Management, Manage Service Applications, New, and SQL Server Analysis Services Mid-Tier Service.

Well, that’s great, but it doesn’t exist.  I used SQL Server PowerPivot Service Application instead.

 

That’s just one of many examples.  A better source for how to install PowerPivot for an existing farm can be found here:  How to: Install PowerPivot for SharePoint in a Three-tier SharePoint.

 

——————————————————————————-

 

Other tips…

 

When you are setting up Excel Services, go ahead and uncheck the warn-on-refresh option.  While you’re at it, increase the maximum workbook size to something more reasonable, like 2000 MB, and the maximum size for charts to 100 MB.  You’ll also want to configure the maximum file size for the SharePoint web application.  See this link:

Configure Maximum File Upload Size (PowerPivot for SharePoint)

 

——————————————————————————-

Related Links

SharePoint 2010: Missing Server Side Dependencies

Configure SharePoint Server 2010 in a Three-Tier Farm

Creating a PowerPivot Gallery

SQL Server 2008 R2 and SharePoint 2010 Products: Better Together (white paper) (SharePoint Server 2010)

SharePoint 2010 Feature Gallery

PerformancePoint Demo – Click Demonstration.  To get the right click functionality you have to click the compatibility icon in Internet Exploder.

PowerPivot Demo

Please let me know in the comments if you have anything to add or if I have stated anything that is incorrect.