SSRS Gotcha: Multi-value Parameter Design Patterns

Adding a multi-value parameter to a report is easy in SSRS, if you only want one.  If you want more than one, things get a whole lot more complicated.  For example, inline SQL is no longer even an option as far as I can tell; you must create a stored procedure to handle the request.

Recently, I had the pleasure of assisting a colleague with the following dilemma: a stored procedure would serve the correct results when run as inline SQL (i.e., commenting out the CREATE PROCEDURE code); but, incorrect results were retrieved when the stored procedure was executed through SSRS.

I ran a Profiler trace and, at first glance, it appeared as if the SSRS engine was sending the same parameters as the inline SQL.  Example: exec rpt_someSillyCode @variable=N'this,that,the other'

Notice the N?  That prefix marks the string literal as NVARCHAR (Unicode).

Turns out this situation was caused by the fact that his procedure declaration defined the variable length as 10 characters, while the test variables he used when running the code as inline SQL were defined as 20 characters.  SQL Server silently truncates a value assigned to a shorter variable, so the procedure only ever saw the first 10 characters of the list.
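
You can see the truncation in a quick repro (a minimal sketch, not his actual code):

DECLARE @variable NVARCHAR(10) = N'this,that,the other';
SELECT @variable; --returns 'this,that,': everything past 10 characters is silently dropped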

The whole ordeal reminded me that multi-value parameters can be a real pain, so I wanted to document what I know.

There are at least three different design patterns (that I know about) when dealing with multi-value parameters in SSRS.

——————————————————————————-

Design Pattern #1, Dynamic SQL:

Example:

--@variable is the multi-value parameter SSRS sends as one comma-separated string
--wrap each value in single quotes (CHAR(39)): this,that becomes 'this','that'
SET @variable = CHAR(39) + REPLACE(@variable, ',', CHAR(39) + ',' + CHAR(39)) + CHAR(39);

DECLARE @strSQL NVARCHAR(MAX);

SET @strSQL = 
	'SELECT SUM(stuff) AS StuffSum
     INTO #ReportTempTable
     FROM #ADifferentTempTable
     WHERE VariableColumn IN (' 
	+ @variable + ')';

--PRINT @strSQL  --uncomment to inspect the generated statement before running it
EXECUTE (@strSQL);
--Two caveats: #ReportTempTable lives only inside the EXECUTE scope and is gone
--when the dynamic batch ends, and concatenating raw parameter text like this
--is an open door for SQL injection.

——————————————————————————-

Design Pattern #2, Use a function to parse the multi-value:

CREATE FUNCTION dbo.SplitParameterValues (
	@InputString NVARCHAR(MAX)
	,@SplitChar CHAR(1) --the delimiter to split on
	)
RETURNS @ValuesList TABLE (param NVARCHAR(255))
AS
BEGIN
	DECLARE @ListValue NVARCHAR(MAX)

	--append a trailing delimiter so the final value is captured by the loop
	SET @InputString = @InputString + @SplitChar

	WHILE @InputString <> ''
	BEGIN
		--grab everything up to the next delimiter
		SELECT @ListValue = SUBSTRING(@InputString, 1, CHARINDEX(@SplitChar, @InputString) - 1)

		INSERT INTO @ValuesList
		SELECT @ListValue

		--chop the captured value (and its delimiter) off the front of the string
		SELECT @InputString = SUBSTRING(@InputString, CHARINDEX(@SplitChar, @InputString) + 1,
				LEN(@InputString) - CHARINDEX(@SplitChar, @InputString))
	END

	RETURN
END
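
A quick usage check, assuming the function above has been created:

SELECT param
FROM dbo.SplitParameterValues(N'this,that,the other', ',');
--returns three rows: 'this', 'that', 'the other'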

——————————————————————————-

Design Pattern #3, Parse the multi-value parameters into a #temp table and join to it

This is done in a similar fashion to the function, except the parsing happens right in the stored procedure.  This can be used in conjunction with an ‘All’ member, as in the sketch below.
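
A minimal sketch of the idea, reusing the split function from Pattern #2 (the procedure, table, and column names are illustrative):

CREATE PROCEDURE dbo.rpt_StuffByCostCentre
	@CostCentre NVARCHAR(MAX) --comma-separated list from SSRS, or the word 'All'
AS
BEGIN
	--parse the multi-value parameter into a #temp table
	SELECT param AS CostCentre
	INTO #SelectedCostCentres
	FROM dbo.SplitParameterValues(@CostCentre, ',');

	--filter against the parsed values; the 'All' member bypasses the filter entirely
	SELECT T.CostCentre, SUM(T.Stuff) AS StuffSum
	FROM dbo.SomeFactTable T
	WHERE @CostCentre = 'All'
		OR T.CostCentre IN (SELECT CostCentre FROM #SelectedCostCentres)
	GROUP BY T.CostCentre;
END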

——————————————————————————-

Click here to download an SSRS solution and related SQL that demonstrates all three design patterns.

——————————————————————————-

The whole multi-value parameter thing can be a huge problem once you dig into it.  Check the links below for the pages that really helped me see the light.  Many think Microsoft should have given developers the option to hide the ‘Select All’ option and let us provide our own ‘All’ solution (see this connect link.)  That would prevent SSRS from sending a huge string of values, essentially all of my values separated by commas, when it could just send the word ‘everything.’  These clever technicians have devised a multitude of ways around this.

Related Links:

How to Default to ALL in an SSRS Multi-select Parameter

http://dataqueen.unlimitedviz.com/2011/06/how-to-default-to-all-in-an-ssrs-multi-select-parameter/

Using Multi-Select in SSRS

http://jsimonbi.wordpress.com/2011/01/15/using-multi-select-with-stored-procedures/

Passing multi-value parameter in stored procedure (SSRS report)

http://munishbansal.wordpress.com/2008/12/29/passing-multi-value-parameter-in-stored-procedure-ssrs-report/

——————————————————————————-

Create a table-valued function as below.

Step 1

CREATE FUNCTION [dbo].[FnSplit] (
	@List NVARCHAR(2000)
	,@SplitOn NVARCHAR(5)
	)
RETURNS @RtnValue TABLE (
	Id INT IDENTITY(1, 1)
	,Value NVARCHAR(100)
	)
AS
BEGIN
	--peel values off the front of the list until no delimiter remains
	WHILE (CHARINDEX(@SplitOn, @List) > 0)
	BEGIN
		INSERT INTO @RtnValue (Value)
		SELECT Value = LTRIM(RTRIM(SUBSTRING(@List, 1, CHARINDEX(@SplitOn, @List) - 1)))

		SET @List = SUBSTRING(@List, CHARINDEX(@SplitOn, @List) + LEN(@SplitOn), LEN(@List))
	END

	--whatever is left after the last delimiter is the final value
	INSERT INTO @RtnValue (Value)
	SELECT Value = LTRIM(RTRIM(@List))

	RETURN
END

Step 2: In your stored procedure, change the WHERE condition on the parameter to something like the one below.

CREATE PROCEDURE [dbo].[SomeSP]
	-- Add the parameters for the stored procedure here 
	@CostCentre NVARCHAR(255)
AS
BEGIN
	SELECT [ProjectCode]
		,[ProjectName]
		,[ProjectManager]
		,SUM([Hours]) AS [Hours MTD]
	FROM dbo.Rpt_NRMA_CATS NC
	INNER JOIN PeriodID P
		ON NC.PeriodID = P.PeriodID
	--parse the comma-separated parameter and filter against the resulting rows
	WHERE ([CostCentre]) COLLATE database_default IN (
			SELECT Value
			FROM dbo.FnSplit(@CostCentre, ',')
			)
	--GROUP BY is required because of the SUM aggregate
	GROUP BY [ProjectCode]
		,[ProjectName]
		,[ProjectManager]
END
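
For reference, SSRS will call the procedure with the selected values joined into one comma-separated string, along these lines (the cost centre values are made up):

EXEC [dbo].[SomeSP] @CostCentre = N'CC100,CC200,CC300'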

Thanks Desari!

——————————————————————————-

Copied and pasted from: http://social.msdn.microsoft.com/Forums/en-US/sqlreportingservices/thread/0ead7ceb-3fdd-4625-aa82-1d4195f984b1/

——————————————————————————-

Know another way?  Please leave a comment.

SSRS Gotcha! Using SSAS Time Intelligence in SSRS Reports

Today I had the opportunity to revisit a problem I encountered previously; but, today was different. Today I found the solution.

Adding Time Intelligence to your SSAS cube using the Business Intelligence Wizard is an easy way to add a whole bunch of useful information to your solution. It is one of my favorite topics when teaching boot camps or quick starts because it adds so much functionality to your Analysis Services solution without a lot of work. One day, I decided to create an SSRS report based on one of my SSAS databases that utilized this feature and encountered a strange error when I tried to filter the result set in the SSRS query designer by adding the time intelligence attribute to the filter screen.

“A set has been encountered that cannot contain calculated members”

Allow me to demonstrate…

  1. Start a solution in BIDS by selecting File, New Project, Import Analysis Services Database, and selecting your existing “Adventure Works DW 2008R2” solution.
  2. Once complete, double click the Adventure Works cube in the solution exploder and select Cube, Add Business Intelligence from the menu bar up top.
  3. Define time intelligence is selected by default, so click next. Change the hierarchy “to analyze time calculations” from date/fiscal to date/calendar and check all of the available time calculations.
  4. Click the Select All button for the available measures and click Finish.
  5. Process your solution
  6. Close BIDS
  7. Open BIDS
  8. Click File, New Project, and select Report Server Project
  9. Create a data source to your SSAS database
  10. Start a new report and use the query designer
  11. Open up your Date Dimension, Calendar hierarchy and drag Date.Calendar Date Calculations to the bottom right pane in the query designer.
  12. Drag Date.Calendar hierarchy to the bottom right pane. Open up Measures, Internet Sales, and drag Internet Sales Amount to the bottom right pane in the query designer. Scroll down to analyze the values.
  13. Now, for the fun part… drag Date.Calendar Date Calculations to the top right section to filter on a specific time calculation. I used month over month growth %.

[Screenshot: the query designer showing the error]

“A set has been encountered that cannot contain calculated members.”

Well, I don’t want to bore you anymore than necessary… Here is the fix:

Edit your data source connection string by appending ";Subqueries=1" at the end, like so:

[Screenshot: the edited data source connection string]
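
The resulting connection string would look something like this (server and catalog names assumed for illustration):

Data Source=localhost;Initial Catalog=Adventure Works DW 2008R2;Subqueries=1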

That’s all there is to it! Go back to the query designer if you don’t believe me.

[Screenshot: the query designer returning results with the calculation filter applied]

The blog post that gave me the clue was from Chris Webb: Subselects and Calculated Members in R2

Thank you Mr. Webb!

One more thing… the built-in time intelligence wizard in SSAS is limited because you have to apply the time intelligence to each of the date dimensions in your cube, which can lead to quite a lot of query processing overhead.  There is a whitepaper with an alternate method that can apply these MDX formulas to all of your (role-playing) date dimensions in one sweep.  Also in this zip file is a Word document that describes how to apply Cognos-style relative date time intelligence.

You can download it here.

I hope you found this post useful. If you have found a better way, please let us know!

SSRS Gotcha! BC 42206 Maximum number of warnings has been exceeded

Recently, another colleague came to me with an issue where BIDS would not open a report in design mode.  Instead, the following block of text greeted the developer:

<html><head><META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=UTF-8"><title></title><style>.ErrorStyle  { font-family: tahoma; font-size: 11 pt; text-align: left}.DetailsStyle { font-family: tahoma; font-size: 10pt;  text-align: left;text-indent: 0; word-spacing: 0; line-height: 100%; float: left; margin-top: 0; margin-bottom:  0}.StackStyle { font-family: tahoma; font-size: 10pt; text-align: left; margin-left: 20; text-indent: 0} </style></head><body><DIV class="DetailsStyle" width="100%"><table border="0" width="100%" id="table1" height="100%"  cellspacing="0" cellpadding="0"><tr><td height="40" bgcolor="#FF9999" width="8%" style="letter-spacing: 0"  align="center"><img border="0" src="C:\Users\DarthVader\AppData\Local\Temp\tmpE34F.tmp" width="35" height="35"> </td><td height="40" bgcolor="#FF9999" width="91%" style="vertical-align: top; letter-spacing: 0" align="left"> <b><font size="2" face="Arial" fontsize="70%">
One or more errors encountered while loading the designer.  The errors are listed below.
</font><font size="2" face="Arial"></font></b></td></tr><tr><td colspan="2" align="left" valign="top"> <DIV id="div1" class="ErrorStyle" width="100%"><span style="font-weight: 600"><br>
Deserialization failed: The 'DataType' attribute is not declared. Line 31012, position 20. 
</span><span style="font-weight: 400"><p></p><div a href="" id="task0" style="cursor: hand; text-decoration:underline;  color=#0000FF">Edit code</a><p></p></div></span></DIV></td></tr></table></div></body></html>

When I loaded the same report, I was greeted with a slightly more friendly version of the above.

Deserialization failed: The 'DataType' attribute is not declared. Line 31012, position 20.

The solution is to open the report’s .rdl file as XML and do a search and replace for the text DataType = "Integer", replacing it with blank text.

Anecdotal evidence suggests that this happens when a report is downloaded from the server to get the latest version, or when Boolean parameters are used (if so, switch them to integers).

More Info:

http://connect.microsoft.com/VisualStudio/feedback/details/405739/vs-2008-rs-2008-error-deserialization-failed-the-datatype-attribute-is-not-declared      

SSRS – Gotcha! InteractiveHeight=0

This totally killed me for longer than I’ll ever admit.  I sat with DBAs running Profiler traces, looking at execution logs, and running various tests.  Turns out it’s Reporting Services 101 (someone else’s words, not mine).  As soon as my colleague said the words “Interactive Page Size” I slapped myself in the forehead.  Anyway, I wanted to share this so no one else goes through what I went through.

[Screenshot: the report’s Interactive Page Size height property set to 0]

Symptoms: the report takes too long to render or never renders.  Data retrieval through the stored procedure or inline SQL is quick (most of the time is spent in rendering).  BIDS throws a System.OutOfMemoryException error when running the report.  The report renders when it is filtered or a TOP 1000 is added to the query; but, too much data causes it to either run indefinitely or return no data, with a little yellow triangle with an exclamation point in the bottom left status bar of Internet Exploder and the words Error on Page.

I wish the System.OutOfMemoryException said something like, “check that your interactive page size height property is not set to 0.”  I wish there were more clues about what was going on when running on a server.  We were only talking about 200K rows of data with some drill-down capability.  Even if the server had to render hundreds of pages it should have eventually worked, or at least come back with some sort of error.  (Maybe we missed something here, I don’t know… I’m not a DBA; but, I played one in a webinar once.)

From: Troubleshooting Reports: Report Performance

No Page Breaks in the Report

As a user pages through a report, the report processor combines data and report layout information for each report page and passes the page to the report renderer. For a report that has no page breaks, the whole report must be processed before the user can view the first page.

A soft-page break renderer, such as the HTML viewer, automatically handles paging for you. You can override this automatic behavior and set the report to be one page by setting the Report property InteractiveHeight to 0. For hard-page break renderers, you must add page breaks manually. For more information about types of renderers, see Understanding Rendering Behaviors (Report Builder 3.0 and SSRS).

Verify that InteractiveHeight is not 0 and that it is set to some reasonable page size, for example, 8.5 in. Add page breaks to report items or Tablix groups to help organize the report into pages. This reduces the amount of data that must be processed for each page. For more information, see How to: Add a Page Break (Report Builder 3.0 and SSRS).

—————————————————————————

More info:

SSRS InteractiveHeight = 0 and IE Script Performance (Connect suggestion)

Report Design: Best Practices and Guidelines

SSRS – Relational Report Performance Troubleshooting

If I am tasked with troubleshooting relational report performance there are a number of processes I will follow to identify exactly where the bottlenecks exist so I can focus my performance tuning on the areas that will have the most impact.  I’d like to share these processes with the community and invite others to contribute their own tips in the comments section.  I’ll try to update this blog post with best suggestions as they come available.  This post will focus on relational reports as opposed to OLAP reports.

Before I begin, please review the following MSDN post: Troubleshooting Reports: Report Performance.  I am going to try and not repeat anything listed in that article because it is a great resource and should be your first stop.

The first suggestion I have is to use stored procedures instead of inline SQL.  I think it makes sense from a modular programming point of view; but, there are also some performance gains to be had from this practice.  A great discussion on the pros/cons of this approach can be found in this Stack Overflow post:  Are Stored Procedures more efficient, in general, than inline statements on modern RDBMS’s?

So if you have a report that uses a huge stored procedure, the next step is to get a baseline measurement of how long the stored procedure takes to execute with a given set of parameters.  Using the same parameters, run the report and measure how long it takes to render.  The difference between the stored procedure execution time and the total report execution time gives you a general idea of how long the Reporting Services engine takes to render the data.  This approach assumes you do not have DBA privileges to query the ExecutionLog3 view on the Reporting Services instance and get exact values yourself.
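
If you do happen to have access, a query along these lines against the ReportServer catalog database gives you the exact breakdown:

SELECT TOP (10)
     ItemPath
    ,TimeStart
    ,TimeDataRetrieval --milliseconds spent fetching data
    ,TimeProcessing    --milliseconds spent processing the report
    ,TimeRendering     --milliseconds spent rendering the output
FROM dbo.ExecutionLog3
ORDER BY TimeStart DESC;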

If a report is taking thirty minutes to run and the stored procedure is taking fifteen minutes of that time, then you need to look at what you can do to improve data retrieval and what you can do to improve report rendering.  Count yourself lucky if you can identify one or the other as being the bottleneck.  You’ll have half as much work to do!

Troubleshooting Stored Procedure Data Retrieval

I like to start by creating a table on the development server to measure the time it takes the SQL engine to perform each of the steps within the stored procedure.  For example:

CREATE TABLE WORK.PerfTuning (
     TestNumber         INT         NOT NULL
    ,TestSystem         VARCHAR(50) NULL
    ,StartProcedure     DATETIME    NULL
    ,EndProcedure       DATETIME    NULL
    ,ProcedureDuration  DATETIME    NULL
    ,Milestone1         DATETIME    NULL
    ,Milestone2         DATETIME    NULL
    ,Milestone3         DATETIME    NULL
    ,Milestone4         DATETIME    NULL
    ,Milestone5         DATETIME    NULL
    ,Milestone6         DATETIME    NULL
    ,Milestone7         DATETIME    NULL
    ,Milestone8         DATETIME    NULL
    ,Milestone9         DATETIME    NULL
    ,Milestone10        DATETIME    NULL
    ,Counter1           INT         NULL
    ,Counter2           INT         NULL
    ,Counter3           INT         NULL
    ,PRIMARY KEY CLUSTERED (TestNumber ASC) WITH (
         PAD_INDEX = OFF
        ,STATISTICS_NORECOMPUTE = OFF
        ,IGNORE_DUP_KEY = OFF
        ,ALLOW_ROW_LOCKS = ON
        ,ALLOW_PAGE_LOCKS = ON
        ) ON CIA_SecretDatabase
    ) ON CIA_SecretDatabase

Add the following code to the beginning of your stored procedure:

    DECLARE @TestNumber INT = 1;

    DELETE FROM WORK.PerfTuning
    WHERE TestNumber = @TestNumber;

    INSERT INTO WORK.PerfTuning (
         TestNumber
        ,TestSystem
        ,StartProcedure
        )
    VALUES (
         @TestNumber
        ,'QA'
        ,GETDATE()
        );

Add UPDATE statements after each major code block within your stored procedure, moving from Milestone1 to Milestone2, Milestone3, and so on.

UPDATE WORK.Perftuning SET Milestone1 = GETDATE() WHERE Testnumber = @Testnumber;
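
Also add one final UPDATE at the very end of the procedure so the overall duration can be calculated:

UPDATE WORK.PerfTuning SET EndProcedure = GETDATE() WHERE TestNumber = @TestNumber;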

After you execute the stored procedure the following query will tell you how many seconds it took for each milestone to complete.  The milestones with the greatest values are the ones you should focus on.

SELECT
     TestNumber
    ,Duration   = CAST(DATEDIFF(SECOND, StartProcedure, EndProcedure) AS VARCHAR(5))
    ,Milestone1 = CAST(DATEDIFF(SECOND, StartProcedure, Milestone1) AS VARCHAR(5))
    ,Milestone2 = CAST(DATEDIFF(SECOND, Milestone1, Milestone2) AS VARCHAR(5))
    ,Milestone3 = CAST(DATEDIFF(SECOND, Milestone2, Milestone3) AS VARCHAR(5))
    ,Milestone4 = CAST(DATEDIFF(SECOND, Milestone3, Milestone4) AS VARCHAR(5))
    ,Milestone5 = CAST(DATEDIFF(SECOND, Milestone4, Milestone5) AS VARCHAR(5))
    ,Milestone6 = CAST(DATEDIFF(SECOND, Milestone5, Milestone6) AS VARCHAR(5))
    ,Milestone7 = CAST(DATEDIFF(SECOND, Milestone6, Milestone7) AS VARCHAR(5))
    ,Milestone8 = CAST(DATEDIFF(SECOND, Milestone7, EndProcedure) AS VARCHAR(5))
FROM WORK.PerfTuning

Seems great, huh?  As you go through the code for the heavy-hitting milestones and make changes, you should be able to see the impact of those changes on the execution time, right?  Well, it depends.  Hopefully you have the fortune of working on a development server that no other developers are using.  Usually, this is not the case.  Second best would be to have the source databases on your local workstation.  You can then issue a DBCC DROPCLEANBUFFERS command to test queries with a cold buffer cache, without shutting down and restarting the server and without affecting other developers.  (More info: SQL SERVER – Stored Procedure – Clean Cache and Clean Buffer)
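
For example, on a private instance (CHECKPOINT first, so dirty pages are flushed and the buffer cache is truly cold):

CHECKPOINT;
DBCC DROPCLEANBUFFERS;
--optionally, also clear cached execution plans:
--DBCC FREEPROCCACHE;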

So what if you can’t do either?  My suggestion would be to run every test twice and take the average.  It’s really the best we can do in such a situation.  Unless you have DBA privileges you’ll never have any insight into what else was happening on the server during the time you performed your test.  Another developer could very well be running a resource intensive query at the same time.

Once you have identified the bottlenecks in your stored procedure, analyze the execution plan of each code block for your worst offenders (the ones that take the longest to complete).  The plan will suggest indexes that can be added to improve performance.  There are many resources out there for improving queries and adding indexes to support them.  To be brief: make sure you have indexes to support your joins, WHERE clause filters, CASE statements, and GROUP BYs.  Also make sure your joins are in the same order as your index and, wherever possible, include the clustered index.  Try to avoid queries within queries within queries; consider using common table expressions instead.  Test different variations of #temp tables vs. table variables.  The rule of thumb is to use #temp tables for lots of data and table variables for small amounts of data.  You can add indexes to your #temp tables where appropriate; you can’t with table variables.
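
For instance, a hypothetical #temp table indexed to support a later join, something a table variable cannot do:

CREATE TABLE #Sales (
     CustomerID INT      NOT NULL
    ,SaleDate   DATETIME NOT NULL
    ,Amount     MONEY    NOT NULL
    );

--index the join column before loading or joining large volumes
CREATE CLUSTERED INDEX IX_Sales_CustomerID ON #Sales (CustomerID);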

Other things that can improve data retrieval performance: data compression, introduced in SQL Server 2008, is an easy win for query performance.  Solid state storage is another great option.  Consider partitioning tables that are 50GB or larger.

Performance tuning is a time-consuming task, so I can’t emphasize this enough: work from a subset of the data.  If the stored procedure takes 15 minutes to run against a table with 5 years of data, it should run much faster against a table with only 1 year of data.  If it takes 5 minutes to run on 1 year of data and you improve it to run in only 1 minute, those performance gains will translate to the larger dataset as well; but, the development time will be much less.
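
One simple way to carve out that subset (table and column names are illustrative):

--materialize a one-year slice to develop against
SELECT *
INTO dbo.BigFactTable_1yr
FROM dbo.BigFactTable
WHERE TransactionDate >= DATEADD(YEAR, -1, GETDATE());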

Troubleshooting Report Rendering Bottlenecks

Once your stored procedure is running lean, mean, and fast, it’s probably time to look at what you can do to improve the rendering of the report.  The previously mentioned MSDN link, Troubleshooting Reports: Report Performance, has a lot of great suggestions.  Check my blog entry above on InteractiveHeight being set to 0; that one is a doozy.

Whenever possible, let the SQL engine do the work.  Grouping and ordering are usually more efficient when done in the stored procedure.  Is the report trying to do too much in one report?  This is a common problem.  Consider breaking the report into several reports and including links to the other reports within each one.  It may not be as fancy as a document map; but, each report will render much more quickly, which should improve the user experience.  Then again, it may not: if the user always wants to see the other sections, they may prefer to wait 30 minutes and have it all ready, as opposed to waiting 10 minutes, clicking a link, and waiting another 10 minutes.  Drill-down functionality can be very expensive during rendering.  You can alternatively use linked “drillthrough” reports to support similar, albeit not as fancy, functionality.

Finally, ask yourself: are you using the right tool for the job?  Relational reports are best for operational-type reporting, the types of reports middle managers need on a daily basis.  Highly aggregated, bird’s-eye view reports should use a data warehouse or, even better, an OLAP cube.  Reports that render over one hundred pages of data that are subsequently exported to Excel are inefficient.  These are not reports, they are data dumps.  Sure, scheduling those reports to run once a day can help alleviate some of this grief; but, why not just use an SSIS package to do that job?

Do you suspect that the report server itself is not configured correctly?  Prove it!  Temporarily modify your procedure to populate a physical table with the resultset.  Then modify the procedure again to select only the top 1000 rows from that table.  Make sure to use different parameters so you’re not hitting the cache, run the report in BIDS and measure the time, and then run the report using Internet Exploder and measure the time.  If your workstation renders the same report in a lot less time than the server, then you may have a case for the DBA to do some further investigation on the Reporting Services configuration.  This is also a great way to eliminate data retrieval bottlenecks while you troubleshoot rendering performance and functionality issues, as in the sketch below.
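
A sketch of that temporary change, with all names illustrative:

--one time: freeze a representative resultset into a physical table
SELECT *
INTO dbo.Rpt_Snapshot
FROM dbo.SomeReportSource; --stand-in for the procedure's real query

--then, inside the procedure, temporarily swap the body for:
SELECT TOP (1000) *
FROM dbo.Rpt_Snapshot;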

That’s all I have for now.  I hope to add more to this post as I think of more great tips or encounter them in the future.  Please share any tips, links, or suggestions in the comments.

More information

Troubleshooting Report Problems

Troubleshooting Concepts (Reporting Services)

Troubleshooting Reports: Report Rendering

What’s Up With the Slow SSRS R2 Rendering in SharePoint 2010 Integrated Mode?

SQLCAT: Report Server Catalog Best Practices