Wednesday, May 3, 2017

SSRS Tips: Creating 2012/2014 BI Templates Using VS 2015

Introduction

With the release of the VS 2015 shell, it's now possible to utilize the BI templates available in it for creating SSRS reports, SSIS packages and SSAS projects compatible with server versions 2012 through 2016, all from the same unified environment.
This quick post discusses two things that you need to consider while creating projects using the VS 2015 shell that are targeted for deployment to a lower-version server.

Scenario

I had recently used the VS 2015 shell for development of SSRS reports and SSIS packages for one of my projects. Our production server was still on the 2014 version. As per Microsoft's official documentation, the VS 2015 shell can be used for creating packages, reports, etc. which can be deployed onto 2012, 2014 or 2016 version servers. The link below has the details


Accordingly, I used SSDT 2015 for creating the SSRS reports. After creating the reports I tested them locally, and once I was happy I decided to deploy them to the server, which was on the 2014 version.
For this I went to Project Properties and tweaked the below properties


My expectation was that by setting this property I would be able to deploy the reports to the 2014 server. But to my surprise I got the below message when I tried to deploy the .rdl file


The definition of this report is not valid or supported by this version of Reporting Services. The report definition may have been created with a later version of Reporting Services, or contain content that is not well-formed or not valid based on Reporting Services schemas


Solution

I didn't understand why this was happening and did some analysis on the other settings to see if there was anything I could find. Fortunately I was able to spot the below property




This comes under Tools > Options and determines the server version to which the report is to be deployed. The default value was 13.0, which corresponds to SQL Server 2016. Once I changed it to 12.0 and tried deploying again, the report was successfully deployed to the server.

Something to note the next time you deploy an SSRS, SSIS or SSAS project from the VS 2015 shell, to make sure the deployment is successful.

Conclusion

Hope this quick tip saves you some time by helping you avoid the above issue while deploying BI projects using the VS 2015 shell templates.

Thursday, April 20, 2017

SQL 2017 CTP Released!

SQL Server 2017 CTP 2.0 is out for public preview now.
SQL Server 2017 has some exciting features on board like resumable online index rebuild, adaptive query processing, etc.
A great improvement can also be seen in the performance of data warehouse workloads in the Linux environment.

More details on the release here

https://blogs.technet.microsoft.com/dataplatforminsider/2017/04/19/sql-server-2017-community-technology-preview-2-0-now-available/

Thursday, April 13, 2017

SSIS Tips: Case Study - Extracting ZenDesk data using SSIS

Impetus

It's been a long time since I wrote a blog post.
The main reason was that I've been super busy with some exciting projects of late.
One of my recent projects had an interesting requirement which I thought would be worth sharing for the benefit of the wider audience.

Scenario

One of my recent BI engagements with an e-commerce customer had a use case which required customer satisfaction metrics to be captured and analyzed. The metrics were to be derived from a variety of data including support tickets, delivery and feedback details, notification emails, etc. The client was using ZenDesk as the ticketing application. All of the above information was tracked by means of tickets within ZenDesk, so the attempt was to integrate the ZenDesk data using SSIS as the ETL tool and use it for analysis along with the other available data.

Illustration

ZenDesk offers API access to its data. The attempt was to use SSIS to access the required ZenDesk data using API calls.
For the purpose of this blog, let's see a simple scenario of how SSIS can be used to call the ZenDesk API for extracting ticket-related data.
The details regarding the ZenDesk APIs can be found here


Refer to the Search section, which explains the various API endpoints for searching ZenDesk data. We will be utilizing this for searching and fetching the ticket information using our required filters. The search API is flexible because it allows a set of query string parameters which can be utilized to customize our search. Some example usages are given in the below link


So for our case, since we require ticket information, the full URL would look like below

https://{subdomain}.zendesk.com/api/v2/search.json?sort_by=created_at&sort_order=asc&query=created>{date} type:ticket

Here {subdomain} represents the subdomain used while registering with ZenDesk (usually the company name) and {date} is the date relative to which you want to do the search. The sort_by and sort_order parameters are used to return the search results in chronological order.
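As a hypothetical example (the subdomain and date below are made up), a search for tickets created after 1 January 2017 on a subdomain named mycompany would be requested as follows; note that in an actual request the > and the space would normally be percent-encoded by the client:

https://mycompany.zendesk.com/api/v2/search.json?sort_by=created_at&sort_order=asc&query=created>2017-01-01 type:ticket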
The response for the above API call will look like this

{
  "results": [
    {
      "url": "https://.zendesk.com/api/v2/tickets/36046.json",
      "id": 36046,
      "external_id": null,
      "via": {
        "channel": "email",
        "source": {
          "from": {
            "address": "xyz2@gmail.com",
            "name": "XYZ 2"
          },
          "to": {
            "name": "abc.com",
            "address": "customercare@abc.com"
          },
          "rel": null
        }
      },
      ...
    }
  ],
  "facets": null,
  "next_page": null,
  "previous_page": null,
  "count": 72
}

As you see, it gives a JSON response having the above structure.
One more thing to note here is that the API returns a maximum of 100 results per response. If the search result exceeds 100 items, the response gets auto-paged at every 100 results. The URL for the next page is sent along with the response (refer to the next_page/previous_page elements in the above JSON).
So in cases where we have more results, we need to capture the next_page value each time and send a new request to the server using that URL. This continues until the last page, where the next_page value will be null.
Considering all these points, a simple package for extracting ZenDesk data will look like below


Now we can analyse the various tasks inside the package to understand their purpose.
For Loop: The loop is included to ensure we iterate through requests until we reach the last page of our search result. It is a simple For Loop based on a counter which is initialized to 1 and reset inside the loop once the results are exhausted, in order to break out of the iteration.
The Expression Tasks are used to set the iterative values for the request URL (i.e. the URL to fetch the next page of results) as well as to reset the URL and counter values.
The Script Task downloads the search results in JSON format into a text file using the API call. It utilizes an HTTP connection which uses the ZenDesk API URL as the source, set through a variable expression so that dynamic values like the date can be passed.
The variable expression in our case looks like below

"https://{domain}.zendesk.com/api/v2/search.json?sort_by=created_at&sort_order=asc&query=created >" + (DT_WSTR, 30)  (DT_DBDATE) @[User::Date] + " type:ticket"

The script task code will look like this

Dim nativeObject As Object = Dts.Connections("HTTP Connection Manager").AcquireConnection(Nothing)
Dim connection As New HttpClientConnection(nativeObject)

' Download the file
' Save the file from the connection manager to the local path specified
Dim filename As String = ""
connection.DownloadFile(filename, True)

The Execute SQL Task uploads the JSON data from the file to a table in the SQL Server database. Once the data is transferred, it uses JSON parsing logic to get the individual elements out.
Parsing JSON can be done using native T-SQL functions like OPENJSON, JSON_VALUE, etc. if you're on SQL Server 2016. This is already explained in my earlier blogs below


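For illustration, here is a minimal sketch of how the staged response could be shredded with the native functions on SQL Server 2016 or later. It assumes the JSON is staged in the same JSONData table and JSON column used in the stored procedure further below (and that the column is NVARCHAR(MAX)); the ticket columns picked out here are only an example:

DECLARE @NextURL NVARCHAR(500);

-- Pick up the paging URL so the package knows whether another request is needed
SELECT @NextURL = JSON_VALUE(j.JSON, '$.next_page')
FROM JSONData j;

-- Shred the results array into rows; the columns below are illustrative
SELECT t.id, t.url, t.channel
FROM JSONData j
CROSS APPLY OPENJSON(j.JSON, '$.results')
     WITH (id      INT           '$.id',
           url     NVARCHAR(300) '$.url',
           channel NVARCHAR(50)  '$.via.channel') AS t;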
If you're on an earlier version of SQL Server, you need to use a UDF for parsing the JSON, like the one below


The stored procedure code will look like below in that case

INSERT JSONData(JSON)
SELECT *
FROM OPENROWSET(BULK '',SINGLE_BLOB) AS t


IF OBJECT_ID('tempdb..#Temp') IS NOT NULL DROP TABLE #Temp

SELECT f.* 
INTO #Temp 
FROM JSONData j
CROSS APPLY dbo.parseJSON(j.JSON) f


SELECT @NextURL = StringValue
FROM #Temp
WHERE NAME = 'next_page'

Once you parse it you will get the values of the various elements within the JSON. These can be used for extracting the required information.
The next_page value will be captured and returned through an OUTPUT parameter back to the package. Using this we can determine whether to iterate again for the next page of results or break out of the loop.
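To make the loop control concrete, here is a hedged sketch of the expressions involved; the variable names are hypothetical and not from the original package. The For Loop would use InitExpression @[User::Counter] = 1 and EvalExpression @[User::Counter] == 1, while two Expression Tasks inside the loop would evaluate something like:

@[User::Counter] = (@[User::NextPageUrl] == "" ? 0 : 1)
@[User::Url] = @[User::NextPageUrl]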
Once you parse the data out to a table, you can then use it for your further manipulations like aggregations, etc.

Conclusion

As seen from the above illustration, you can use a simple Script Task utilizing an HTTP connection in an SSIS package to call ZenDesk APIs and extract data through them. This data can then be used for manipulations within your applications and can be utilized in BI reports, dashboards, etc.

Friday, October 21, 2016

WinSCP Tips: Passing Credentials With Special Characters

Scenario

Recently we faced an issue in one of our projects with a file processing system that we had designed. The system was built to process third-party files which had to be downloaded from an FTP location.
We designed an SSIS package to do the FTP download and file processing, and used WinSCP for the purpose. The logic used was as per one of my earlier articles, given below


On executing the package, the Execute Process Task which was added to do the FTP transfer failed. As SSIS doesn't give much information on the error, I put a breakpoint on the PreExecute event of the Execute Process Task and captured the full statement that was to be executed by WinSCP.
I launched a command prompt window, tried executing the statement generated by SSIS, and got the below



As seen from the above screenshot, I got an access denied error. I checked the credentials again and tried them in the WinSCP client installed on my machine, and to my surprise it worked fine, allowing me to connect to the FTP location. From this I could infer that the problem was with the way SSIS or the WinSCP command line was handling the credentials.
I did some detailed analysis on the command used for the FTP operation by WinSCP, which was as below

open ftp://username:password@Servername:Port

And I managed to find the issue. The problem in this case was the password, which contained special characters like +. Such characters break the session URL in the case of WinSCP, which expects them to be URL-encoded.

Solution

The solution in the above case was to encode the special characters before they are passed in the URL. If you've done .NET programming before, you may know that there's a method available in .NET, Uri.EscapeDataString (on the Uri class in the System namespace), which returns the percent-encoded value when a special character is passed to it.
Based on this we can develop a solution such as the below for handling the above scenario.
Create a function which parses through the passed URL string and looks for special characters within it. Once one is found, it passes the character to the EscapeDataString method, which returns the corresponding encoded value.
The string parsing can be done using the Regex.Match method to find the special characters.
So the overall package would look like this

The Script Task will receive the FTP/SFTP URL as a parameter and then parse it, encoding the special characters, as below

public void Main()
{
    // Encode the URL passed in through the package variable and hand the result back
    var str = EncodeUrl(Dts.Variables["Url"].Value.ToString());
    Dts.Variables["EncodedUrl"].Value = str;
    Dts.TaskResult = (int)ScriptResults.Success;
}

private static string EncodeUrl(string URLInput)
{
    // Special characters that need to be percent-encoded for WinSCP
    const string pattern = "[$&+,/:;=?@]";
    var match = Regex.Match(URLInput, pattern);
    while (match.Success)
    {
        // Replace every occurrence of the matched character with its encoded value,
        // then look for the next special character
        URLInput = URLInput.Replace(match.Value, Uri.EscapeDataString(match.Value));
        match = Regex.Match(URLInput, pattern);
    }
    return URLInput;
}

Once the encoded URL is returned by the above function, you can store it in a variable and use it in the expression properties of your Execute Process Task so that the task uses the returned value.
In the above code you need to add a using directive for the System.Text.RegularExpressions namespace before using the Regex.Match method. Uri.EscapeDataString is a static method of the Uri class in the System namespace, which is referenced by default inside the Script Task.
In the above case the function will replace the character + with its encoded value %2B, which works well with WinSCP.
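As a hypothetical example, if the password were p+ssword, the generated command would become:

open ftp://username:p%2Bssword@Servername:Port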

Conclusion

As discussed above, special characters in credentials have to be encoded before generating the FTP URL for applications like WinSCP.
The above illustration gives an easy way of doing it within the SSIS package using a simple .NET function which can be invoked from a Script Task.
Hope this helps to solve similar issues caused by the presence of special characters in URLs and in query string parameter values.

Friday, August 26, 2016

SSRS Tips : Fixing Format Issues in MHTML Rendered SSRS Reports

Impetus

The impetus for writing this blog came from a recent issue I faced while setting up an email-based subscription for one of my MIS reports for enterprise-wide distribution. The report content had to be embedded within the email in HTML format, so I had set the rendering option to MHTML for the subscription. When the report is delivered using the subscription, it can be noticed that none of the formatting and cosmetic settings of the original report get applied.
This article explains how you can overcome this issue and preserve the native format settings you applied to the report while rendering it through automated subscriptions in MHTML format.


Illustration

Consider the below report created in SSRS


The requirement was to schedule an automated delivery of the above report through email to a list of users. The report has to be embedded within the email body rather than sent as a separate attachment.
The best way to implement the above requirement is to set up an email-based subscription for the report and configure the render format as MHTML. This ensures that the report is embedded within the body of the email.
As per the above approach, the subscription is configured for the report as below


When the subscription was triggered, the email obtained was as below

As you see from the above, the rendered report was stripped of its format settings when rendered in MHTML format.
This would cause difficulty in interpreting the report, especially when the amount of data shown in the report is large. The question was how to make sure the format remains intact while rendering through the subscription as above.
I was able to arrive at the solution after some analysis. The solution is to set the format properties for the report cells using expressions rather than static values.
As an example, see the below illustration
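For instance (a hypothetical illustration, since the exact properties depend on your report), instead of typing a static value such as LightGrey into the BackgroundColor property of a cell, open the property's expression editor and set it as an expression:

="LightGrey"

Similarly, properties like Color, FontWeight or Format can be switched from static values to the equivalent expressions (e.g. ="Bold" or ="#,##0.00").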


Once you set these as expressions and configure the subscription, you can see the rendered report as below


As you see from the above, the report gets rendered with all the format settings correctly applied. All the property values have been set using expressions in the report designer, and the MHTML format preserves them while rendering through the subscription.

Conclusion

As seen from the illustration above, the format properties of the report can be preserved intact by using expressions while rendering in MHTML format.