Address Parse Transform – Task Factory vs. SSIS+

This is an old post from September 2012 that lost its images during the site's upgrade: http://bidn.com/Blogs/userid/470/address-parse-transform-ndash-task-factory-vs-ssis

Pragmatic Works Task Factory is a collection of twenty-six high-performance SSIS components, one of which is the Address Parse Transform. This component parses unformatted address data into USPS standardized address data. It can also transform addresses from other countries, and it is extremely fast and efficient, as I will demonstrate in this blog entry. It is so easy to use, even a caveman could do it. No offense to any cavemen.

Cozy Roc also has a collection of SSIS components, called SSIS+, one of which also does address parsing. It has an odd limitation: the address has to be stored in a single column. Nonetheless, I thought it would be interesting to test the two and compare the differences.

The US data set contains 350,000 records and comes from the following website: http://www.briandunning.com/sample-data/. I am going to test loading this data set twice: once using it the way it comes, and once again using a slightly modified version where I have concatenated the address, city, state, and zip columns into one column. This will allow me to compare the two components side by side.

Before we begin, it is important to define exactly what it means for an address to conform to United States Postal Service address standards. The following excerpts were taken from Postal Addressing Standards (Publication 28) and provide a sample of their requirements.

Jointly developed by the Postal Service and mailing industry, standardized address information enhances the processing and delivery of mail, reduces undeliverable-as-addressed mail, and provides mutual cost reduction opportunities through improved efficiency.

A standardized address is one that is fully spelled out, abbreviated by using the Postal Service standard abbreviations (shown in this publication or as shown in the current Postal Service ZIP+4 file), and uses the proper format for the address style (shown in this publication).

Format all lines of the address with a uniform left margin. Uppercase letters are preferred on all lines of the address block. Lowercase letters in various type styles are acceptable provided they meet postal guidelines for OCR readability.

Secondary address unit designators, such as APARTMENT or SUITE, are preferred to be printed on the mail piece for address locations containing secondary unit designators. The preferred location is at the end of the Delivery Address Line. The pound sign (#) should not be used as a secondary unit designator if the correct designation, such as APT or STE, is known or is shown in the ZIP+4 file. If the pound sign (#) is used, there must be a space between the pound sign and the secondary number.

Spell city names in their entirety. When abbreviations must be used due to labeling constraints, use only the approved 13-character abbreviations provided in the City State file.

Format the Last Line with at least one space between the city name, two-character state abbreviation, and ZIP+4 Code.

Test One – Transform 350,000 US Addresses into USPS Standardized Address Format

First, I created a new solution in BIDS and created a new package. I added a data flow task and renamed it to Test One. I dragged the flat file source to my data flow canvas and linked it to the CSV I downloaded containing 350K rows of sample US addresses. I set double quotes as my text qualifier and checked the box indicating that the first row contains the column names.

Now comes the fun part! I dragged the Address Parse Transform component onto my data flow designer and linked the flat file source to the component. Double-clicking the component shows the setup screen below. For this test, I am going to use the second option: Address data is spread across multiple columns. The Cozy Roc SSIS+ implementation doesn't seem to allow you to do this. I mapped the columns to each field as shown below.

[Image: Address Data Input tab showing the column mappings]

Next I clicked the Address Quality tab and checked the box to include only high quality parsed addresses. I decided to redirect errors to non parsable output so I selected that option from the drop down. Finally, I checked the box to replace ‘Null’ values with an empty string. According to the dialog, sometimes data from CSV or text files actually contain the text ‘Null’ where an actual NULL value was before. That can’t be good for anyone. I can imagine a letter carrier looking for someone named Null or a city named Null.

[Image: Address Quality tab settings]

Now, I clicked the Parsed Output tab. I decided to replace the original columns where applicable and I changed the Output Case Formatting. I selected Upper Case from the dropdown since this is what the USPS prefers. Other options include: no formatting or proper case.

[Image: Parsed Output tab settings]

When I clicked Ok, I got an error message!

[Image: error message]

I don’t have a column for AddressLine2. It doesn’t look like I can ignore this column either so I went back to the Address Data Input tab and mapped Address to that field also. I’ll just ignore the AddressLine2 column downstream.

[Image: Address Data Input tab with Address also mapped to AddressLine2]

This is a screenshot showing what my data flow looked like once I was finished adding my destination components for the parsed addresses and for the bad addresses.

Finally, for a proper test we need to record when the data flow starts and when it finishes. Sure, I could have manually modified the package to log that info into a table; but, I have BIxPress so I did it the easy way!

The first run failed.

[Image: truncation error]

The “output column “FirstName” (23)” failed because truncation occurred, and the truncation row disposition on “output column “FirstName” (23)” specifies failure on truncation.

Modifying my destination tables and explicitly setting the length of the string data types to 250, using the advanced editor for the source component on the Input and Output Properties tab, should fix that.

[Image: advanced editor showing the string column lengths set to 250]

I still received errors, so I decided to redirect error rows from the source to an error table rather than fight them. That's when I got a nice surprise from BIxPress!

[Image: BIxPress prompt]

Very cool! Of course I clicked Yes!

Next time it ran green. Let’s check the output in BIDS.

[Image: data flow results in BIDS]

Check out what I get in BIxPress!

[Image: BIxPress execution results]

Not only does it tell me the row counts; but, it also tells me the buffers used! Not to mention how long it took, which was almost 5 and a half minutes. Why can't BIDS at least tell you that?

Tests were performed on a Lenovo X201 Tablet running Win 7 w/ an Intel i7 quad core, 6 GB of RAM, and an SSD.

Looking at the good addresses I see that it made everything uppercase like I wanted and used the USPS approved and recommended abbreviations. There are two new columns that the component made available to me: Quality and CSZQuality.

[Image: parsed addresses showing the Quality and CSZQuality columns]

I was curious to see why one of the addresses was given a quality rating of medium so I went back to the source. That record had the following stored in the address field: 3202 W Charleston Blvd Bldg. So the ‘Bldg’ string was removed from the field and quality was set to medium. Neat!

Looking at examples where CSZQuality was medium, I discovered entries where the zip code was missing but all of the other elements were present, which would still make them deliverable. Records with both columns set to medium were missing addresses and zip codes. These should be discarded or investigated.

Looking at the records in the bad addresses table it looks like most of them have incomplete information in the address field.

[Image: records from the bad addresses table]

Test Two – Transform 350,000 US Addresses into USPS Standardized Address Format From a One Column Source

This time I modified the source to concatenate all of the address columns into one column. Let’s see how Address Parse Transform does now!

[Image: Address Parse Transform configured for a single address column]

Wow! BIxPress totally blows me away again!

[Image: BIxPress prompt]

Of course I do!

The final destination component now shows an error. I’ve got to correct the mappings.

[Image: destination component showing a mapping error]

I deleted the mappings to address, city, state, and zip. Then I mapped AddressLine1 to address, TF Address Parse Transform City to city, TF Address Parse Transform State to State, PostalCode to Zip, Country to Country, Quality to Quality, and CSZQuality to CSZQuality. AddressLine2 can just remain unmapped.

[Image: corrected destination column mappings]

Execute Package!

BIDS results:

[Image: BIDS results]

BIxPress Results:

[Image: BIxPress results]

It took just under 6 minutes. A couple thousand more rows went to the bad address pipeline, as would be expected under the circumstances.

Next, I want to test the same functionality using Cozy Roc's SSIS+ Address Parse component. I was unable to test this component in the same fashion as test one of the Task Factory component because it only parses addresses that are contained in a single column.

The following screenshots document how I set the component up for the test.

[Image: SSIS+ Address Parse component configuration, screen 1]

[Image: SSIS+ Address Parse component configuration, screen 2]

First, the result in BIDS:

[Image: BIDS results]

BIxPress results:

[Image: BIxPress results]

So it took about 30 seconds longer using Cozy Roc's component, and there were ~30K bad addresses compared to Pragmatic Works' ~15K.

Let's look at some of the good addresses. The first thing I notice is that the addresses are not uppercase, as is preferred by the USPS. Second, I see that for the third line it chopped the NW off of the street address and put it in front of the city.

[Image: parsed addresses from the SSIS+ component]

Now, let's look at those same addresses in the good address table from Task Factory test two. It also seems to have a hard time with the cities; but, the mail should still be deliverable. Notice the city RAPIDS, which should be GRAND RAPIDS, and PARK PITTSFORD, which should be just PITTSFORD. Nonetheless, I would still consider the mail deliverable, especially if we can assume the zip code is correct.

[Image: the same addresses parsed by the Task Factory component]

Nutshell: the Pragmatic Works Task Factory Address Parse component is slightly faster and produced half as many bad addresses as Cozy Roc's SSIS+ Address Parse component. The Task Factory component can also accept addresses that are already in separate columns, which the SSIS+ component cannot. The Address Parse Transform component from Pragmatic Works will be able to pay for itself in the long run by reducing the costs of returned mail.

[Image: lost letter carrier]

Analysis Services and Excel Tips and Tricks

I was speaking with a Business Analyst the other day and I couldn’t remember exactly how to do a cool trick I’d learned years ago working with Excel and Analysis Services.  It got me thinking about some other cool techniques that I suspect many do not know about.  If you’ve got a cool trick to share, please let us know in the comments.

Trick #1 – Use OLAP PivotTable Extensions to Add Calculated Measures

I want to demonstrate how to add a new calculated measure to a pivot table using MDX.

The Setup

  1. Download and install the add-in. http://olappivottableextend.codeplex.com/
  2. Open Excel and connect to the AdventureWorks SSAS database.  (Data, From Other Sources, Analysis Services, etc.)
  3. In the PivotTable Field List, use the "Show fields related to:" drop-down at the top and select "Internet Sales."
  4. Add the Internet Sales Amount to the values.
  5. Add the Product Category user hierarchy to the rows and expand.
  6. Add the Date.Fiscal Weeks User hierarchy to the Report Filter and select Week 4 FY 2008.
  7. Right click a cell within the pivot table and select OLAP PivotTable Extensions.
  8. Enter the following in the Calculation Name drop-down field:  [SPPY Internet Sales Amt]  (SPPY = Same Period Prior Year)
  9. Enter the following in the formula text box (a standalone query version is sketched after this list): (ParallelPeriod([Date].[Fiscal].[Fiscal Year],1,[Date].[Fiscal].currentmember),[Measures].[Internet Sales Amount])
  10. Click the “Add to PivotTable” button.
  11. Place a check mark in the PivotTable Field list under the newly created “Values” measure group next to SPPY Internet Sales Amt.
  12. Verify that those numbers match Week 4 FY 2007.
  13. Remove the Date.Fiscal Weeks user hierarchy from the report filter.
  14. Add the Date.Fiscal user hierarchy to the Report Filter and select July 2007.
  15. Verify that those numbers match July 2006.
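
If you want to prototype the calculation outside of Excel first, here is a minimal standalone query sketch you can run in SQL Server Management Studio against the Adventure Works sample cube. The cube, hierarchy, and member keys used here are the standard sample ones and may differ in your environment.

WITH MEMBER [Measures].[SPPY Internet Sales Amt] AS
    ( ParallelPeriod( [Date].[Fiscal].[Fiscal Year], 1, [Date].[Fiscal].CurrentMember ),
      [Measures].[Internet Sales Amount] )
SELECT
    { [Measures].[Internet Sales Amount], [Measures].[SPPY Internet Sales Amt] } ON COLUMNS,
    NON EMPTY [Product].[Product Categories].[Category].Members ON ROWS
FROM [Adventure Works]
WHERE ( [Date].[Fiscal].[Fiscal Year].&[2008] )  // slicer member key is an assumption

Because the calculated member is defined relative to [Date].[Fiscal].CurrentMember, the same formula works whether you slice by a fiscal week, month, or year, which is why the verifications in steps 12 and 15 both line up.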

Hungry for more?

There are lots of examples of useful MDX you can use in your spreadsheets!  A great graphical tool to help you build (and learn) complex MDX queries is Cube Player.

Trick #2 – Filtering, the Easy Way!

Have you ever needed to analyze a particular set of products or customers based on a long list of customer names or USNs?  Going through the filter drop-down and individually clicking each one can be a tedious, time-consuming, and frustrating process.  If you’ve ever had to do this, I am about to become your best friend.

The Setup

  1. Open Excel and connect to the AdventureWorks SSAS database.  (Data, From Other Sources, Analysis Services, etc.)
  2. In the PivotTable Field List, use the "Show fields related to:" drop-down at the top and select "Internet Sales."
  3. Add the Internet Sales Amount to the values.
  4. Add Customer to the row labels.
  5. Highlight all the customer names that start with the first name “Michael” and copy them to your clipboard.
  6. Right click a cell in the PivotTable and select OLAP PivotTable Extensions.
  7. Click the Filter list tab and select [Customer].[Customer] from the filter hierarchy drop-down.
  8. Paste in your Michaels (a rough MDX equivalent of what the add-in builds is sketched after this list).
  9. Click Filter PivotTable.
  10. Add a comment to this blog post about how much you appreciate the time I spent writing it. :)
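
Under the covers, the add-in is simply restricting the query to the explicit list of members you pasted. A rough MDX equivalent is sketched below; the customer names are made up for illustration and are not taken from the sample database.

SELECT
    [Measures].[Internet Sales Amount] ON COLUMNS,
    { [Customer].[Customer].[Michael Allen],    // hypothetical members standing in
      [Customer].[Customer].[Michael Baker],    // for the list you pasted
      [Customer].[Customer].[Michael Carter] } ON ROWS
FROM [Adventure Works]

Building that member list by hand is exactly the tedium the Filter List tab saves you from.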

Trick #3 – Using SSAS Time Intelligence in Excel

For this example, I added time intelligence to the AdventureWorks cube by following the wizard.  If you don’t know if the cube at your company has time intelligence, look in the date dimension for an attribute named something like “Fiscal Date Calculations” or just ask somebody.  If your cube doesn’t have time intelligence, maybe you can convince them to add it.

The Setup

  1. Open Excel and connect to the AdventureWorks SSAS database.  (Data, From Other Sources, Analysis Services, etc.)
  2. In the PivotTable Field List, use the "Show fields related to:" drop-down at the top and select "Internet Sales."
  3. Add the Internet Order Count measure from the Internet Orders measure group to the values.
  4. Add the Date.Fiscal user hierarchy to the report filter.
  5. Add Fiscal Date Calculations to the rows.
  6. Select November 2003 in the Date.Fiscal report filter in cell B1.

[Image: pivot table showing the Fiscal Date Calculations rows for November 2003]

Wow!
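
For reference, the members behind an attribute like Fiscal Date Calculations are typically driven by SCOPE assignments that the time intelligence wizard writes into the cube's calculation script. The snippet below is a simplified sketch of what a year-to-date assignment tends to look like; the attribute, hierarchy, and member names are assumptions based on the wizard's defaults rather than the exact Adventure Works script.

// Simplified sketch of a wizard-style year-to-date assignment
SCOPE ( [Date].[Fiscal Date Calculations].[Fiscal Year to Date] );
    THIS = AGGREGATE(
               { [Date].[Fiscal Date Calculations].[Current Date] }
               * PERIODSTODATE( [Date].[Fiscal].[Fiscal Year],
                                [Date].[Fiscal].CURRENTMEMBER ) );
END SCOPE;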

Trick #4 – Formatting, As You Like It!

For this example, let’s create a nicely formatted re-usable Balance Sheet for Adventure Works.

The Setup

  1. Open Excel and connect to the AdventureWorks SSAS database.  (Data, From Other Sources, Analysis Services, etc.)
  2. In the PivotTable Field List, use the "Show fields related to:" drop-down at the top and select "Financial Reporting."
  3. Check the box next to the Amount metric in the Financial Reporting measure group.
  4. Check the box next to Accounts in the Account dimension.
  5. Click the plus sign next to the Fiscal display folder in the Date dimension and check the box next to the “Date.Fiscal” user hierarchy.
  6. Click the drop down arrow in cell A2 and uncheck everything but Balance Sheet.
  7. Click the drop down arrow in cell B1 and uncheck everything except FY 2006, FY 2007, and FY 2008.
  8. Click the plus sign next to “Balance Sheet” in cell A3.  Click the plus sign on every plus you see until there are no longer any plus signs on the sheet.
  9. (Optional sub-tip)  The CFO wants to see the GL account next to the lowest leaf level rollups.  Right click “Cash” and select “Show Properties in Report” and choose “Account Number.”

[Image: the expanded balance sheet pivot table]

Your screen should look like this.

What’s Wrong With This Picture?

This looks okay; but, it could certainly be better.  If you look for balance sheet examples on the internet you might notice that the totals are normally put underneath the category.  They also typically wouldn’t have a total column for the combined years aggregation.

First let’s do something about those totals.

  1. Click on a cell in the pivot table.  This action adds the PivotTable Tools section on your Excel ribbon.
  2. Click on the Design tab and click the Report Layout button third from the left on the ribbon.  Select “Show in Tabular Form.”
  3. Click the Grand Totals button and select “Off for Rows and Columns.”

It is starting to look better; but, now if I want to insert a row underneath the receivables total I get a very big error message!

[Image: error message when inserting a row into the pivot table]

This is the trick.

  1. Click the Options tab of the PivotTable Tools ribbon and click the OLAP Tools button fourth from the right.  Select “Convert to Formulas.”
  2. You’ll get an error.

[Image: error message from Convert to Formulas]

To resolve the error:

  1. Click the filter icon in cell A2 and select all.
  2. Click the filter icon in cell G1 and select all
  3. Save a copy of your Balance Sheet with a file name such as “Dynamic Balance Sheet.XLSX” (I’ll explain why shortly.)
  4. Click the OLAP Tools button and select "Convert to Formulas" again.

Continuing our beautification…

  1. Insert a row underneath “Receivables Total.”
  2. Insert rows underneath every total.
  3. Insert a column between FY 2007 and FY 2008.
  4. Label the column “YOY Var %” in cell I2
  5. Add the formula: =(H6-G6)/G6 and copy it down the balance sheet
  6. Format the column I as a percent
  7. Hide the rows 49 – 51
  8. Make the totals bold
  9. Hide column A
  10. Hide Row 1, Bold and Center Row 2.
  11. Highlight cells B2-F2 and press delete
  12. Highlight columns B-F, select Format, AutoFit Column Width, and make them bold.
  13. Copy column I to (the new) Column M
  14. Modify the formula in M3 so that it reads: =(J3-H3)/H3
  15. Hide columns I and M.  I just wanted to show you how easy it was to add a formula.
  16. Add top and bottom borders for each of the sub category total amounts, top and thick bottom borders to Current Assets, Property, Plant, Equipment, Liabilities, and Owners equity totals, and finally top and double bottom borders for Assets and Liabilities and Owners Equity Totals.
  17. Select columns G-J, right click a cell, select Format cells and decrease the precision so that there are no cents displayed.
  18. From the view tab uncheck the Gridlines checkbox.
  19. Print.
  20. Get some coffee on your way to the printer.

Your spreadsheet should look like this:

[Image: the finished, formatted balance sheet]

There is one caveat you need to know. If accounting adds a new GL rollup, it will not automatically appear on the balance sheet. You can manually add it if you're careful, or open the "Dynamic Balance Sheet" version and re-apply the formatting.

——————————————————————————-

That’s all for today, folks!  Please add to the comments if you would like to share a great Analysis Services in Excel tip or trick.

Dangers in Dashboard Design

Dangerous Dashboard Design

I recently had the great pleasure of working on a dashboard project using PerformancePoint in SharePoint 2013 with a great team of individuals.  We had to overcome several obstacles during the process and I thought it would be a good idea to document how we accomplished some of our goals with the hope that I can spare someone else (or my future self) from having to go through the same pain.  Our dashboard is currently in closed beta and we will be releasing it to the wild in the next couple of weeks.

Dynamic Active Directory Security Without Kerberos 

The biggest challenge we faced was how to implement dynamic Active Directory security without using Kerberos. I had applied dynamic security to several Analysis Services databases in the past; but, had never followed that implementation through to PerformancePoint dashboards. This challenge is more difficult in that respect because within a dashboard you will typically have more sources of data than just an Analysis Services database. We had SSRS reports that ran against our ODS and our relational data warehouse. You can read more about how we achieved this in my recent blog post, PerformancePoint 2013 Dashboards: Applying Dynamic Security using CustomData and EffectiveUserName.

Our solution required three roles. Two roles were for the managers who would be restricted to seeing only their own employees: one for the PerformancePoint objects, which uses the CustomData function in its security, and another for the SSRS MDX-based reports and Excel Services reports, which uses the UserName function. Finally, we needed a separate unrestricted role for the managers and district leaders who could see all of the data. This allowed us to keep our bridge table relatively small; otherwise, every person who needed to see every person would have required about 2,500 rows in the bridge table.

During development, I mistakenly thought we could create the entire dashboard for the restricted users, save a copy, rename it to unrestricted, change the data source to use the unrestricted role, and be on our way. I was woefully mistaken. Apparently PerformancePoint uses some sort of invisible GUID or something for the data source name. Worse yet, you can't change the data source for existing analytic charts, grids, and scorecards. So we couldn't make a copy of the objects, rename them, and point them to a different data source. We actually had to use two different site collections for the restricted and the unrestricted dashboards!

I Know Who You Are and What You Did Last Quarter

Once we had the dynamic security in place, we wanted the dashboard to automatically know who you are. We accomplished this by adding an attribute to our person dimension to hold the Active Directory account name. I use the word person in its most generic sense: a person can be a vendor, customer, employee, manager, or sales person. I created an MDX filter in PerformancePoint and used the following MDX code to achieve this:

STRTOSET("[People].[Active Directory Login].&[" + CUSTOMDATA() + "]")

I then added this attribute to any analytic charts, grids, or scorecards in the background quadrant of the design canvas so that the report could be filtered by it.  I connected the filter to the report as one typically would and later hid the filter using the edit web part interface within the SharePoint 2013 site.

Save the Trees

My good buddy Michael Simon used some fancy MDX to set the trailing six quarters for an analytic bar chart, only to discover that we lost the ability to right click and get the decomposition tree and all the other neat PerformancePoint features. He discovered a clever workaround using dynamic named sets. We had our right clicks back.
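
I don't have Michael's exact code in front of me, but the shape of the workaround looks roughly like the sketch below: define a dynamic named set in the cube's calculation script and point the analytic chart at the set, so the trailing-quarter logic lives in the cube rather than in hand-edited report MDX. The hierarchy and measure names are placeholders.

CREATE DYNAMIC SET CURRENTCUBE.[Trailing Six Quarters] AS
    // the last six quarters that actually have sales; names are assumptions
    Tail(
        NonEmpty( [Date - Reporting].[Fiscal Quarter].[Fiscal Quarter].Members,
                  [Measures].[Sales Amt] ),
        6 );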

Yesterday… Not Such an Easy Game to Play

Another challenge we faced is a very common one: how to get the dashboard to automatically present data as of yesterday. Typically this wouldn't be a big deal; but, in our case we use a reporting calendar that doesn't include Saturdays, Sundays, or holidays. Any sales that occur on those days roll forward to the next business day. If you don't have such a dilemma, you can use either of these two snippets to accomplish this feat:

1. STRTOSET("[Date].[Calendar].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") + "]")

2. STRTOSET("[Date].[Calendar].[Date].&[" + VBAMDX.Format(VBAMDX.Now(), "yyyyMMdd") + "].Lag(1)")

In our case, this wouldn’t work.  If the user wanted to look at the dashboard on a weekend or a holiday they would get an error. 

I initially tried to set the default member of the dimension to yesterday using MDX.  I ran into a problem in that once I set the default member for the date dimension in the cube I could no longer put the date dimension in the report filter area of an Excel Pivot table.  It would no longer work and would just show yesterday’s data.  See this link for more information about that issue:  Default Member Trap (be careful of what you filter)

My next strategy was to add an attribute called Last Business Date to our Gregorian date dimension and update the corresponding relational table with this information. I then attempted to piece together the information to get the correct reporting date. We have two date dimensions in our cube: one is a regular calendar date dimension, the other is a fiscal reporting date dimension that doesn't include weekends or holidays. Our fact table has keys for both. You can see my post asking for help from the BIDN community here:

MDX Help Please – The + function expects a tuple set expression for the 1 argument. A string or numeric expression was used

Before I received the kind reply from mreedsqlbi, I found another way to accomplish what I needed.  This post helped me find another solution:  Member Does Not Exist in MDX Calculations

I created four dynamic named sets in the cube called OneDayAgo, TwoDaysAgo, ThreeDaysAgo, and FourDaysAgo.

Here is the code for OneDayAgo:

iif(iserror(STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") + "]")), {}, STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") + "]"))

For each of the other named sets, I just changed the -1 to a -2, -3, and -4 respectively.
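
For completeness, here is what the first one looks like wrapped up as a cube-script statement. This is only a sketch; the expression above is what actually lives in my cube, and I've added braces around the member so the set definition is unambiguous.

CREATE DYNAMIC SET CURRENTCUBE.[OneDayAgo] AS
    // empty set when yesterday's member doesn't exist in the reporting calendar
    IIF( ISERROR( STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") + "]") ),
         {},
         { STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") + "]") } );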

The code for my MDX filter used the dynamic named sets to figure out if the member exists:


IIF([OneDayAgo].Count > 0
            ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-1, "yyyyMMdd") +"]")
            ,IIF([TwoDaysAgo].Count > 0
                  ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-2, "yyyyMMdd") +"]")
                  ,IIF([ThreeDaysAgo].Count > 0
                        ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-3, "yyyyMMdd") +"]")
                        ,IIF([FourDaysAgo].Count > 0
                            ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-4, "yyyyMMdd") +"]")
                            ,STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now()-5, "yyyyMMdd") +"]")
                        )
                    )
                )
            )

UPDATE:

This works better because it works in Dev which doesn’t always have data for yesterday.

TopCount(
    Order(
        NonEmpty( { [Date - Reporting].[Date].Members - [Date - Reporting].[Date].[All] },
                  [Measures].[Sales Amt] ),
        [Date - Reporting].[Date].CurrentMember.MEMBER_KEY,
        DESC ),
    1 )

UPDATE:

At one point during beta testing, I was asked to present the user with a date dropdown that defaults to yesterday. This was later scrapped because it required a server-wide change within the Central Administration application settings for PerformancePoint, in which we had to set the user-selected filters to expire every day.

For anyone who is interested, I got that to work by removing the TOPCOUNT from the MDX formula filter in Dashboard Designer:

ORDER( NONEMPTY( { [Date - Reporting].[Date].Members - [Date - Reporting].[Date].[All] },
                 [Measures].[Sales Amt] ), [Date - Reporting].[Date].CurrentMember.MEMBER_KEY, DESC )

This only shows dates that have data, excludes the all member, and sorts it in descending order.  The result is that it defaults to yesterday but allows you to change it. 

UPDATE

I think I’m crazy; but, I could have sworn the dashboard was showing Friday’s data on Monday.  Looking at the code, I don’t see how; but, I am amazed no one has pointed it out.

I had to modify the code to eliminate Today’s date from the data set.

TopCount
  (
    Order
    (
      NonEmpty
      (
        {
          [Date - Reporting].[Date].MEMBERS
          - [Date - Reporting].[Date].[All]
          - STRTOMEMBER("[Date - Reporting].[Date].&[" + VBAMDX.Format(VBAMDX.Now(), "yyyyMMdd") + "]")
        }
       ,[Measures].[Sales Amt]
      )
     ,[Date – Reporting].[Date].CurrentMember.Member_Key
     ,DESC
    )
   ,1
  )

Filters

Another challenge we faced was how to have MDX filters control relational reports. This was pretty easy: just use the DisplayName attribute of the filter instead of MemberUnique and let the report figure out how to use that information. We were unable to use multi-select MDX member selection filters. When we passed these as DisplayValue to our relational SSRS reports, only the first selection was passed. We were unable to overcome this limitation the way we would have liked. We ended up not using multi-select filters and instead had filters that only contained the parent level of the hierarchy. In other words, our filter contained managers and the report would display the manager's employees. We lost the ability to have the user select only some employees of that manager, but not others. To my knowledge, there is not a way to pass MDX multi-value filters using the DisplayValue property to relational SSRS reports.

Staying On Top of Reporting Services

One aggravation we experienced was that the column headers on the SSRS reports would not stay at the top of the web part when the user scrolled through the report part. There are many blog posts out there on how to achieve this; but, within SharePoint 2013 it would work in Firefox and Chrome, and not in Internet Exploder (9 or 10). I had to work around this issue by sizing the report so it wouldn't scroll, and instead display the toolbar and have the user page through the data instead of scrolling, which, in my opinion, looked nicer anyway.

I've got to mention one gotcha we experienced during our dashboard design endeavor that I had experienced many years ago. I thought I had blogged about it in my What's New BI-wise section; but, I couldn't find it, so I must have dreamed I had blogged about it. Anyway, here are some words of wisdom:

When you are creating an SSRS MDX based report and need to make a customization to the underlying MDX that is created using the easy to use drag and drop interface, make a backup or better yet, check it into source control and label it ‘Pre-MDX customizations’ or something very clear like that.

Once you have made those customizations to the MDX (to allow that member property to be displayed for example) you can’t go back!  No longer can you simply delete all your parameters and add different ones easily.  Everything has to be done through hand coded MDX from that point forward.  It is far easier to design your report as much as possible, make a backup or labeled source control check-in, and then make your MDX customizations.  If you need to add/remove fields or parameters in the future it is easier to go back to that pre-MDX customized version and do it using the interface and then re-do the MDX customization than to work with it outside the interface.

Scorecards Don’t Always Score 

Scorecards are really cool gadgets to use in PerformancePoint dashboards because you can use them to control other objects.  Filters and scorecards are the only way to control objects in the PerformancePoint dashboard.  When you see a scorecard functioning well inside a PerformancePoint dashboard it looks really cool.  As you click on the various members within the scorecard you can have charts and data change according to the context.

We could not use scorecards the way we wanted because we simply had too many members in our dimension for them to render quickly enough.  They don’t work as you would expect.  Doing the same type of thing in an analytic grid or Excel services report is a piece of cake and renders very fast.  For whatever reason, regardless of the fact that I filtered the scorecard on the manager, it would insist on rendering every employee in the company and then hiding the ones that weren’t children of the selected manager.  In our case that was a little less than 5,000 members.  I reduced this set to about 2,800 by deleting employees that weren’t used as keys in any of our fact tables; but, it was too late.  The powers that be had already decided that scorecards stink due to not only the rendering speed; but, the hoops we had to jump through to develop one using that many members.  Even in the developer’s IDE, Dashboard Designer, it would pull back every single representative regardless of being filtered on a single parent member.  It would display those in a red font indicating that they would be hidden; but, the time it took the designer to render made you think it had crashed.  We actually had to build a cube containing a very small subset of the data in order to design the scorecard the way we wanted.

From a pure developer’s standpoint it seems that the scorecard design process in PerformancePoint needs some maturing.  Some configurations that are set within the wizard for creating an Analysis Services based scorecard are simply not editable outside of the wizard.  One would need to take screenshots of how the scorecard was developed to document the process in the event that a change was ever needed to one of those initial configurations.

While I'm on the subject of scorecards, another gripe we had about the way scorecards work is that any time you have more than one set of KPI actuals and targets within a scorecard, it displays them with all of the actuals on the left and trends or targets on the right. In other words, if I have a sales actual and goal KPI on the same scorecard as a gross profit actual and goal, I can only show them as actual sales, actual gross profit, target sales, target gross profit. I would like some control to show the targets next to their corresponding actuals.

Easy Migrations 

One new feature of SharePoint 2013 that I really like is the new way it handles exporting and importing a dashboard project.  In previous versions of SharePoint, I had to create long lists of customizations that had to be done to the web parts after Dashboard Designer had deployed the dashboard.  Things such as hiding filters, resizing objects, and changing the chrome settings had to be done over and over again every time we made a change to the dashboard that required a re-deployment.  Moving the project from the development environment to the QA environment and finally to the production environment was a manual tedious process.  Furthermore, these customizations were not stored in source control.  I was pleased as punch to discover the export/import feature of SharePoint 2013 solves both problems.  I can extract the resulting .CMP file using a cab extractor and put the entire project into source control and I don’t have to repeat those customizations over and over again.

Accepting the Unacceptable 

I’ve been creating dashboards since PerformancePoint 2007.  Make no mistake, I do not consider myself an expert at dashboard design.  I would however consider myself pretty advanced in using the Microsoft BI stack to produce great dashboards quickly.  I really like using PerformancePoint.  If you are willing to accept some of the quirks w/ PerformancePoint, it is a great tool to allow a developer to very quickly produce meaningful dashboards with a lot of functionality right out of the box.  When you start getting very particular about the way things look it gets more complicated.  For example, I can only show the legend at the top or to the right of my analytic charts.  I can’t show them on the left and I can’t show them on the bottom.  I can’t control the color of the bars in the bar charts.  I sometimes joke that designing PerformancePoint dashboards is a great opportunity for someone who likes to say no.  Sure, we can give you a bar chart w/ the colors you want or with the legend on the left; but, I have to do that in Reporting Services and we lose all of the cool right click functionality that is included in the analytic chart such as the decomposition tree. 

 

Fifty Shades Of Gray Is Not Enough 

I’ll never forget one project I worked on as a consultant with a very lovely lady who had designed her dashboard on paper before she had any idea of what PerformancePoint could and could not do easily.  A simple thing like showing the total dollar amount at the top of the bar chart involved creating a separate reporting services web part that only showed that total and placing it above the chart. The end result was a really good looking dashboard that looked remarkably similar to what had been envisioned.  It just took a lot of extra time to make it look exactly how it was imagined.

One other piece of advice I can give is to apply a lot of rigor in your testing. Make sure each page of the dashboard uses consistent nomenclature. Make sure that the numbers on all pages agree with each other. If one page says that gross profit is this and another says gross profit was that, you're going to get a lot of questions from your user community. Be consistent in your color and number formatting as well. Are you going to display zeros or leave the cells blank? I personally prefer to leave the cells blank; but, it is more important that whatever you decide, you stay consistent. Decide beforehand what screen resolution you are designing to. If everyone at your company has widescreen 17 inch monitors, then it might be best to design your dashboard to look best at 1920×1080. If you need to show your dashboard on an iPad or a Surface, then you had better have one lying around. Good luck with that, by the way. Consider which browser is your company's standard. HTML5 or not, things look different in Chrome, Firefox, Safari, and Internet Exploder versions 8, 9, 10, or 11.

Awwww…

As I wrap this up, I just want to give a big warm thank you to the other members of my team, Sherri McDonald and Michael Simon.  A lot of hard work and long hours went into this project and it feels great to have professionals like them to ease the burden.  It’s a much better experience than working solo.  Three heads are much better than one and it feels great to have others throw rocks at your stuff and see where it breaks before the real critics get their hands on it.

Do you have a good dashboard design story?  Did I leave a danger out?  Please let us know in the comments. 

——————————————————————————- 

Other Dashboarding Resources

One person I would consider an expert on dashboard design is Stephen Few. I recently read on his blog that a second edition of his book, Information Dashboard Design: The Effective Visual Communication of Data, is being released shortly. He mentioned in that post that one of the criticisms of the first edition, which is addressed in the second edition, was the lack of real examples of effective dashboards built with real products. His explanation is that at the time there simply weren't any products released that were capable of producing what he considered really good dashboard designs. Be sure to read his article Common Pitfalls in Dashboard Design for a great list of dashboard design gotchas.

Links

PerformancePoint 2013 Dashboards: Applying Dynamic Security using CustomData and EffectiveUserName

What are Web Parts?

PerformancePoint – Effectively Disable Analytic Grid Right Click Select Measures Functionality

Time Intelligence Filters in PerformancePoint 2010

Cascading Filters in Performance Point Services dashboard using SharePoint 2013

Performance Point Relative Date Time Intelligence with current date time

How to make SSRS reports in a PerformancePoint dashboard ‘pop out’ or open in a new window.

Clutter, data overload put dashboard designs on path to failure

Data Visualization and Dashboard Design

Formatting your PerformancePoint Analytic Grid!

Using SSAS MDX Calculation Color Expressions

Time Intelligence Post Formula Filters

Set Default Value of PerformancePoint Filter – Note:  Caveat to this tip- Once a user changes the default value SharePoint will remember the changed value and not reset the default value.

Add conditional formatting to your PerformancePoint Services Analytic Grid, by defining a custom Scope calculation from within Analysis Services (SSAS)

Filter Connection Quick Reference

——————————————————————————-

All my PerformancePoint links – Updated as I update them!

——————————————————————————-

PerformancePoint Gotcha! – "PerformancePoint Services could not connect to the specified data source." When creating a new dashboard, you MUST do a Save All right after you create your data sources; otherwise you won't be able to create anything that uses that data source.

——————————————————————————-

Gotcha! PerformancePoint Time Intelligence and SSRS

I was trying to use a PerformancePoint time intelligence filter to pass values to an SSRS report. I discovered that the MonthToDate, YearToDate, QuarterToDate syntax does not work with SSRS. Instead, use Month.FirstDay:Day, Year.FirstDay:Day, Quarter.FirstDay:Day.

——————————————————————————-

Zero to Dashboard- Intro to PerformancePoint    
Come check out the PerformancePoint hotness! Mike will demonstrate the functionality in PerformancePoint services in SharePoint 2010 and show you how to quickly build some dynamic dashboards and reporting functionality your end users will crave.

http://pragmaticworks.com/Resources/webinars/WebinarSummary.aspx?ResourceID=327

——————————————————————————-

PerformancePoint Tip – Right clicking an analytic grid in the leftmost columns allows the user to select measures.  Some people may not want that, so how do you disable that ability?  PerformancePoint Services Application Settings – Select Measures Control – set maximum value to 0.  (default is 1000).

http://technet.microsoft.com/en-us/library/ee620542.aspx

——————————————————————————-

SSAS MSMDSRV.INI Settings for Multi-User Environments

Here are excerpts from the SSAS Operations Guide for SQL Server 2008 R2 regarding the recommended settings for a multi-user environment.  If you want more information you must read the whitepaper.

——————————————————————————-

2.3.2.4    Heap Settings, DataStorePageSize, and DataStoreHashPageSize
During query execution, Analysis Services generally touches a lot of data and performs many memory allocations. Analysis Services has historically relied on its own heap implementation to achieve the best performance. However, since Windows Server 2003, advances in the Windows Server operating system mean that memory can now be managed more efficiently by the operating system. This turns out to be especially true for multi-user scenarios. Because of this, servers that have many users should generally apply the following changes to the msmdsrv.ini file.

Setting              | Default | Multi-user, faster heap
<MemoryHeapType>     | 1       | 2
<HeapTypeForObjects> | 1       | 0

It is also possible to increase the page size that is used for managing the hash tables Analysis Services uses to look up values. Especially on modern hardware, we have seen the following change yield a significant increase in throughput during query execution.

Setting                 | Default | Bigger pages
<DataStorePageSize>     | 8192    | 65536
<DataStoreHashPageSize> | 8192    | 65536

——————————————————————————-

2.3.2.5    Clean Up Idle Sessions
Client tools may not always clean up sessions opened on the server. The memory consumed by active sessions is non-shrinkable, and hence is not returned to the Analysis Services or operating system process for other purposes. After a session has been idle for some time, Analysis Services considers the session expired and moves the memory used to the shrinkable memory space. But the session still consumes memory until it is cleaned up.
Because idle sessions consume valuable memory, you may want to configure the server to be more aggressive about cleaning up idle sessions. There are some msmdsrv.ini settings that control how Analysis Services behaves with respect to idle sessions. Note that a value of zero for any of the following settings means that the sessions or connection is kept alive indefinitely.

Setting Description
<MinIdleSessionTimeOut> This is the minimum amount of time a session is allowed to be idle before Analysis Services considers it ready to destroy. Sessions are destroyed only if there is memory pressure.
<MaxIdleSessionTimeout> This is the time after which the server forcibly destroys the session, regardless of memory pressure.
<IdleOrphanSessionTimeout> This is the timeout that controls sessions that no longer have a connection associated with them. Examples of these are users running a query and then disconnecting from the server.
<IdleConnectionTimeout> This timeout controls how long a connection can be kept open and idle until Analysis Services destroys it. Note that if the connection has no active sessions, MaxIdleSessionTimeout eventually marks the session for cleaning and the connection is cleaned with it.

If your server is under memory pressure and has many users connected, but few executing queries, you should consider lowering MinIdleSessionTimeOut and IdleOrphanSessionTimeout to clean up idle sessions faster.

——————————————————————————-

2.4.3    Multi-User Process Pool Settings
In multi-user scenarios, long-running queries can starve other queries; specifically they can consume all available threads, blocking execution of other queries until the longer-running queries complete.
You can reduce the aggressiveness of how each coordinator job allocates threads per segment by modifying CoordinatorQueryBalancingFactor and CoordinatorQueryBoostPriorityLevel as follows.

Setting                            | Default | Multi-user nonblocking settings
CoordinatorQueryBalancingFactor    | -1      | 1
CoordinatorQueryBoostPriorityLevel | 3       | 0

——————————————————————————-

If your SSAS instance shares a server with anything else you may also want to read my good buddy Jorge’s blog post: Setting Memory Limits

——————————————————————————-

Did I miss any?  Do you have any additional tips to share?  If so, please sound off in the comments.

What are Web Parts?

The term “web parts” can mean different things to different people.  A simple web search for “Microsoft web parts” returns some interesting results.  My introduction to web parts began with SQL Server 2008 R2 Reporting Services and the introduction of the report viewer web part (not to be confused w/ the report part.)  This allowed you to embed reporting services .RDL files within a PerformancePoint 2007 dashboard.  I believe you could also embed a report in any SharePoint page using the add web part feature.

A Developer’s Introduction to Web Parts (pertaining to SharePoint 2003) states: “Developers can build Web Parts as ASP.NET custom controls. Administrators can install Web Parts on any site based on Windows SharePoint Services. Users can add Web Parts to pages by dragging and dropping in a browser, and they can personalize them by setting properties. Web Parts can connect to other Web Parts using standard interfaces.”

The Introduction to Customizing Pages by Using Web Parts (applies to SharePoint Server 2007) provides an excellent overview of Web Parts and Web Part Pages:

——————————————————————————-

A Web Part is a modular unit of information that forms the basic building block of a Web Part Page. You can add Web Parts to Web Part zones in a Web Part Page and then customize the individual Web Parts to create a unique page for your site users.

The following example uses the Image Web Part to describe the basic features of a Web Part.


[Image: Sample Image Web Part]

Callout 1 The Web Part title bar contains the heading for the Web Part.

Callout 2 The Web Part menu contains functions that enable you to minimize or close the Web Part, edit the Web Part, or get Help for a specific Web Part. When the page is in edit mode, you can also use this menu to delete the Web Part or connect it to other Web Parts, depending on the type of Web Part that you are using.

Callout 3 The body of the Web Part contains the content that you specified for the type of Web Part that is being used. In this example, the Web Part is an Image Web Part, which displays an image.


A Web Part Page is a special type of Web page in which you can use Web Parts to consolidate data, such as lists and charts, and Web content, such as text and images, into a dynamic information portal that is built around a common task or special interest.

——————————————————————————-

Finally, Wikipedia defines a web part this way:

——————————————————————————-

A Web Part, also called a Web Widget, is an ASP.NET server control which is added to a Web Part Zone on Web Part Pages by users at run time. The controls enable end users to modify the content, appearance, and behavior of Web pages directly from a browser. It can be put into certain places in a web page by end users, after development by a programmer.

Web Parts can be used as an add-on ASP.NET technology to Windows SharePoint Services.

——————————————————————————-

Web Parts can be custom created for your needs or purchased through third party vendors.

Here are some Free Microsoft SharePoint Web Parts.

As a PerformancePoint dashboard designer, I do the web part customization after the dashboard has been deployed. The frustration for me as a developer is that the customization tasks must be repeated every time the dashboard is re-deployed. I've had to create long text cheat sheets of the various web part height, width, and chrome settings for a dashboard. If anything in the dashboard needs to be changed using Dashboard Designer in a way that forces a re-deployment, these customizations need to be re-done individually by hand. As far as I know, there is still no way to automate this task. This always baffled me, since these configurations can't really be stored in source control and can't be automated as the solution is deployed from dev to QA to production.

UPDATE!  SharePoint 2013 allows for easy migrations!

One new feature of SharePoint 2013 that I really like is the new way it handles exporting and importing a dashboard project.  In previous versions of SharePoint, I had to create long lists of customizations that had to be done to the web parts after Dashboard Designer had deployed the dashboard.  Things such as hiding filters, resizing objects, and changing the chrome settings had to be done over and over again every time we made a change to the dashboard that required a re-deployment.  Moving the project from the development environment to the QA environment and finally to the production environment was a manual tedious process.  Furthermore, these customizations were not stored in source control.  I was pleased as punch to discover the export/import feature of SharePoint 2013 solves both problems.  I can extract the resulting .CMP file using a cab extractor and put the entire project into source control and I don’t have to repeat those customizations over and over again.

PerformancePoint 2013 Dashboards: Applying Dynamic Security using CustomData and EffectiveUserName

I recently had the great pleasure of solving a PerformancePoint 2013 puzzle and I thought I’d share how I did it to help others and as a reference for my fellow colleagues.  Our team was trying to implement dynamic active directory security for PerformancePoint 2013 dashboards.  A variety of PerformancePoint web parts were tested as part of this endeavor including Analysis Services scorecards, analytic charts and grids, SSRS reports (native mode), and Excel Services.

Devin Knight does an excellent job of explaining how to get this to work in his blog post: Analysis Services Dynamic Security.

For this method to work with PerformancePoint and Excel Services, we will need two roles: ReadUserName and ReadCustomData. The first one, ReadUserName, is set up exactly how Devin describes. This will be the role that Excel and SSRS use. The second one, ReadCustomData, is the same except you need to replace the function UserName() with CustomData(). This ReadCustomData role is the role that is referenced in the data source editor of PerformancePoint Dashboard Designer.
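
To make that concrete, the allowed member set on the secured dimension attribute ends up looking something like the sketch below. The dimension, attribute, and bridge measure names are assumptions rather than our actual schema; the pattern follows Devin's post, and the only difference between the two roles is the identity function.

// Allowed member set for the ReadUserName role (names are placeholders)
NonEmpty(
    [Employee].[Employee].[Employee].Members,
    ( StrToMember( "[User].[Login].&[" + UserName() + "]" ),
      [Measures].[Bridge User Employee Count] )
)

The ReadCustomData role uses the identical expression with CustomData() in place of UserName().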

To test this I had to make sure that I wasn’t in any other AD groups or roles.  I only had to add myself to the ReadUserName role.  I did not have to be in the ReadCustomData role, only the service account that runs the SharePoint 2013 PerformancePoint content (a.k.a. Proxy.)  We also have a ReadUnrestricted Role.  I had to make sure I wasn’t in that role.

Here is another link, that describes how to get this working with PerformancePoint 2010:  How to configure dynamic security in Analysis Services and Performance Point 2010 without Kerberos.

Here is a link that describes various methods for testing if the security is working correctly:

Unit Testing Role Security in Analysis Services.

The series of posts below are wonderful!  I won’t repeat what is said there; but, I will say I was relieved to see that this person also came to the conclusion that two roles would be needed.

http://dbaexperience.com/2013/01/24/sharepoint-2010-sql-server-2012-bi-dynamic-security-part-1/

http://dbaexperience.com/2013/01/31/sharepoint-2010-sql-server-2012-bi-dynamic-security-part-2/

http://dbaexperience.com/2013/03/12/sharepoint-2010-sql-server-2012-bi-dynamic-security-part-3/

Here is where things got interesting… No, really.

This post, Using EffectiveUserName To Impersonate in SSAS, mentions that you have to set up Kerberos on your SSAS server to get this to work. I can tell you we got it to work without that. I did set up the data source in SSRS as this person describes: credentials stored securely in the report server, use as Windows credentials, and impersonate the authenticated user after a connection has been made to the data source.

I got a little lucky (ok, maybe a lot lucky) in that my account had certain permissions that no one else in my active directory group had.  If it weren’t for that, I don’t know if I would have ever gotten any of this working.

Our initial goal was to set these permissions on either the SharePoint proxy account or the domain service account that processes the cube. That worked for the SSRS MDX reports; but, the Excel Services dashboard pages quit working. Currently we are faced with the possibility that these permissions will need to be set for all the dashboard users, and we are investigating what that actually means. (http://windowsitpro.com/active-directory/hiding-data-active-directory-part-3-enabling-list-object-mode-forest)

[Image: Active Directory permissions dialog]

We think the List contents permission is the one that made the difference. My Active Directory administrator sent me these links, which he felt describe what is going on pretty well. Here is what he said:

‘The link below explains the end result as the Account authenticating for the SSRS reports needs Read access to the TokenGroupGlobalAndUniversal attribute. This is done by adding the required account to the Windows Authorization Access  group in Active Directory.

Originally it was thought the SharePointProxy account was the account being used. I did add it to the group last Saturday and it did not resolve the issue. But today it was identified that the service account that connects to the data warehouse for cube processing was the account needing to verify the users group memberships to present the reports based on this information to the clients.

I also confirmed this to be the attribute by allowing Read access to all the User Object attributes and removing them until this single attribute was remaining. Removing this last attribute denied the reports with the Unknown username or password error.’

http://sqlinsight.net/blog/ssrs/ssrs-subscriptions-not-working/

http://blogs.msdn.com/b/richin/archive/2008/08/27/ssrs-subscription-tggau-rights-authzinitializecontextfromsid-exceptions.aspx

The other piece that seemed to make a difference was that the SSAS service is running under the NT AUTHORITY\NetworkService account.

Post in the comments if you have questions.

SSAS Gotcha: The Operation Has Been Cancelled Due To Memory Pressure

My team and I recently had an issue with our SSAS instance in which users were receiving the following message from Excel:

Data Could not be retrieved from the external data source.  Error message returned by the external data source:

Server: The operation has been cancelled due to memory pressure. 

Ok

[Image: memory pressure error message in Excel]

Another user reported a different error from Excel when trying to create a new connection to the Analysis Services database:

An error has occurred in the transport layer.

Errors in the OLE DB provider.  An error occurred while loading the connection dialog box components for prompting.

[Image: transport layer error]

[Image: OLE DB provider error]

The following blog posts helped us identify and kill the MDX query the user was executing:

Finding and killing SSAS Sessions

Kill a Session, SPID or Connection to Analysis Service Cube

The problem seemed to be caused by poorly constructed Excel pivot tables. Users can do some strange things, like dragging product numbers to the columns and product descriptions to the rows. Regardless, a single user shouldn't be able to bring down an entire server, making it useless to anyone else in the company, right?

For the short term, we had to find a way to kill queries like these automatically. Otherwise, our team could never go to lunch together. Someone would always have to stay behind to log the locked query and restart the SSAS service if the server spun out of control.

Luckily, we were able to reproduce the condition by simply executing one of the queries we had captured on our QA server. We tried an SSIS package that Chris Webb wrote to automatically kill locked sessions. Unfortunately, when the issue occurred, the SSIS package couldn't get a connection to check for locks, let alone kill them. I also played with the Server Timeout advanced property of the SSAS instance; but, that would only work reliably if we set it to a very small time period, such as 30 seconds. Otherwise, it would blow right past the timeout and never get cancelled. I also got the PowerShell version of Chris Webb's method to work; but, again it would only work reliably for short time periods.

*Make sure the user executing the Agent job is a SERVER administrator to get the PowerShell solution to work.  Only a WINDOWS SERVER admin can issue cancel XMLA requests.

Killing sessions automatically with SSIS

Dealing with Long Running SSAS Queries using Powershell

After more research and testing, I found this snippet:

Check the CancelCoordinatorCount property, you can decrease it a bit from its default value, which will force the engine to check the IsCancelled property more often. As a result, the queries will be stopped faster. We had big problems with that, but after some post-SP2 updates + with using that CancelCoordinatorCount things got much better.

http://social.msdn.microsoft.com/Forums/en-US/sqlanalysisservices/thread/72fb777a-1198-46d0-991b-df753ea164a6

I changed the CancelCoordinatorCount property from 1000 to 100 and the PowerShell solution would work with a slightly larger threshold.  Coupled w/ a larger failsafe threshold on the Server Timeout property (which also started working) I was getting closer to what I wanted.

Something about that post triggered a memory and a promise to myself I had made years ago.  The part about applying a service pack.  Many years ago, I worked as a BI Developer for a company that specialized in software for the Casino industry.  From time to time, I would be asked to help troubleshoot a problem with a client’s BI solution.  The first thing I would always do would be to reboot the server.  I can’t tell you how many times this would fix the immediate problem.  Second, I would make sure all of their windows updates and service packs were applied.  When they weren’t up to date, many times this fixed the problem.  As a consultant for Pragmatic Works, one of the first things I would always do when starting a new BI project or troubleshooting an issue would be to make sure all updates and service packs were applied.

To make a long story short… applying SQL Server 2008 R2 Service Pack 1 completely fixed our problem.  Two queries that would run for almost two hours on our QA server before finally bombing with a memory pressure error ran in less than five minutes and produced results.

Old lessons learned again.