BI-NSIGHT – Power BI Designer Refresh – SQL Plugin for Excel – SQL Server 2016 In Memory Tables – Azure SQL Data Warehouse – Office 2016 Update – Power BI Content Pack (UserVoice)

Another week has gone by, so let’s get into what is currently out there and happening in the BI world.

Power BI Designer Refresh

I do know that this came out last week, but I still wanted to comment on this.

Once again, this is something that customers and I have been asking for on the forums and waiting to see implemented.

For me personally this is really fantastic, because it gives me the opportunity to get access to my data, shape it and then create my report interface using Power BI Designer. With every iteration it is getting better, easier and smarter. And now, with the ability to easily upload this into Power BI, if you are using one of the supported data sources it is as simple as a few clicks to get the refresh up and running.

This really makes it easy for the people creating the reports, and with the future releases on the way, I personally see this becoming a valid competitor to the other visualisation tools that are currently out there.

You can read all about it here: Announcing Refresh Support for Power BI Designer files in the Power BI service

SQL Plugin for Excel

I came across this from reading through my twitter feeds.

I have to say that it does look very interesting. It appears that this can be used to provide a SQL-like interface when using Excel.

It also appears that there is the potential to write data from your Excel workbook back into SQL Server, via temp tables. This could be very handy, because I have had situations in the past where I wanted to extract some data from a SQL database so that the users could make some changes, and then struggled to easily get the updated data back into my database. With temp tables I could have a process that imports the data from the temp tables into my database, and once it is in there, it would be easy to apply the required changes!
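As a rough sketch of that idea (all table and column names here are hypothetical, not part of the plugin), the process importing the users' changes could update the real table from the temp table like this:

```sql
-- Hypothetical example: #UserChanges holds the rows written back from Excel,
-- and dbo.Customer is the table the data was originally extracted from.
Update c
Set    c.PhoneNumber  = u.PhoneNumber
      ,c.EmailAddress = u.EmailAddress
From   dbo.Customer c
       Inner Join #UserChanges u
               On u.CustomerKey = c.CustomerKey
Where  c.PhoneNumber <> u.PhoneNumber
   Or  c.EmailAddress <> u.EmailAddress;
```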

More information here: thingiequery

SQL Server 2016 – In Memory Tables

Last week at the local (SSUG) SQL Server User Group, we were very fortunate to have some of the program managers from the SQL Server team present to us what is coming up in SQL Server 2016.

What I wanted to highlight is something that, as far as I can see, has not really been highlighted so far.

It has to do with In Memory tables in SQL Server 2016: Microsoft have resolved almost all of the gaps, so that almost all of the query syntax is now available against In Memory tables. I remember when I tried this on SQL Server 2014 for my data warehousing loads; apart from there not being a big performance improvement, one of the issues was that there was a whole host of query syntax that I could not use.

Now that this has almost been resolved, along with general improvements to In Memory tables, this is something that I am going to look at implementing where required in my data warehouse loads. Ideally I imagine using it where there is a section of work that needs to be processed really quickly; having all the tables in memory means this could potentially be a big performance improvement.
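To give a feel for what has opened up: constructs such as LEFT JOIN, OR and DISTINCT were not allowed in natively compiled stored procedures in SQL Server 2014, but are part of the supported surface area in SQL Server 2016. A sketch (the tables are hypothetical memory-optimised tables):

```sql
-- Hypothetical natively compiled procedure using syntax that was
-- not supported against In Memory tables in SQL Server 2014.
Create Procedure dbo.prc_GetStagedSales
With Native_Compilation, SchemaBinding
As
Begin Atomic With (Transaction Isolation Level = Snapshot, Language = N'us_english')
    Select Distinct s.DateKey, s.SalesAmount
    From   dbo.Staging_Sales s
           Left Join dbo.Staging_Customer c
                  On c.CustomerKey = s.CustomerKey
    Where  s.DateKey >= 20100101
       Or  s.SalesAmount > 1000;
End;
```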

You can read about what is supported in SQL Server 2016 In Memory Tables here: Query Surface Area in Natively Compiled Stored Procedures

Azure SQL Data Warehouse

I am super excited about Microsoft starting to roll out the SQL Data warehouse preview in Azure.

I do think that this is something that is going to be a key driver for businesses to look at Microsoft for their cloud strategy going forward.

In the past few months, or closer to a year now, I have really started to enjoy how Microsoft has changed and become a really forward-thinking company: they are listening to their customers and, where they can, taking the customers' requirements and putting them into their products.

Along with this, especially in the cloud area, they are able to release and update their software at a pace that I have never seen before. Compared to other cloud vendors, it is amazing how quickly they are able to do this. And for me as a consumer of these cloud services, and other services, it makes my work very interesting and keeps me on my toes!

I look forward to testing this out and seeing how it performs with my own data, when I can get access to it, and seeing if it lives up to all the hype!

You can read about it here: SQL Data Warehouse

Office 2016 Update

There is now an official blog update from Microsoft with regards to what they have done with Office 2016.

It is great to see that what they showcased at Ignite is already available for people to look at and test.

You can read more about it here: Office 2016 Preview—update 2

Power BI Content Pack – UserVoice

I am pretty certain that Microsoft is releasing a new content pack every week. When I log in and view what new blog posts are out there, almost every week without fail there is another new content pack, which is fantastic.

This week the content pack is for an application called UserVoice, which allows its customers to provide feedback on their products, as well as to share their experiences. It looks like a really easy and simple interface to use, which at the same time is very powerful.

You can find all the related information here: Monitor and Visualize your UserVoice Data with Power BI

BI-NSIGHT – SQL Server 2016 – Power BI Updates – Microsoft Azure Stack

Well I have to admit that it seems that the Microsoft machine has been working flat out to get out new products and updates.

If my memory serves me, this is the third week in a row that Microsoft has released new products and updates. I am really enjoying it! But hopefully this will slow down, so that we can catch our breath and actually play with some of the new products and features.

So let’s get into it. There is quite a lot to go over!!

SQL Server 2016

So with Microsoft Ignite happening this week, the wonderful guys from Microsoft have started to announce what we can expect to be in the next version of SQL Server.

I am going to focus mainly on the BI (Business Intelligence) features, and there are quite a few! Some of what I am detailing below I have only seen in pictures on Twitter, or have read about; and with the preview not even released yet, I am sure that there will be some changes down the line.

SQL Server Reporting Services

It finally appears that Microsoft has been listening to our cries and requests for updates in SSRS (SQL Server Reporting Services)!

Built-in R Analytics

I think that this is really amazing; even though I am not a data scientist, I think it is a smart move by Microsoft to include this. What this means, in my mind, is that you can now get the data scientists to interact and test their R scripts against the data as it sits in the transactional environment. And from there, this could be used to create amazing possibilities within SSRS.

Power Query included in SSRS

From what I have seen people tweeting about as well as what I have read, it would appear that Power Query will be included in SSRS. This is fantastic and will mean now that virtually any data source can be consumed into SSRS.

New Parameter Panel, chart types and design

I am sure we can all agree that SSRS has needed an overhaul for some time, and it seems that we are finally going to get it. It would appear that the parameter panel is going to be updated; I do hope that it will be more interactive and react in a way similar to slicers in Excel. We are also getting new chart types, and here I am going to assume they will be similar to what we have in Power BI!

And finally, there was also mention that reports will render quicker, as well as having a better design. I am also hoping that they will deploy this using HTML 5, so that it can be viewed natively on any device. It is going to be interesting to see what Microsoft will incorporate from their acquisition of Datazen.

SQL Server Engine

Built-in PolyBase

Once again, this is an amazing feature which I think has boundless potential. Putting this into the SQL Server engine makes it a whole lot simpler to query unstructured data. Being able to use T-SQL means that the many people who have invested time and effort into SQL Server can now leverage that investment through PolyBase. Along with this, you do not have to extract the data from Hadoop or another format into a table in order to query it: you can query it directly and then insert the rows into a table, which means development time is that much quicker.
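As a rough sketch of how that looks in T-SQL (the Hadoop location, file format and column list here are all made up for illustration):

```sql
-- 1. Point SQL Server at the Hadoop cluster (address is hypothetical).
Create External Data Source MyHadoop
With (Type = Hadoop, Location = 'hdfs://10.0.0.1:8020');

-- 2. Describe how the files are laid out.
Create External File Format CsvFormat
With (Format_Type = DelimitedText,
      Format_Options (Field_Terminator = ','));

-- 3. Define an external table over the files - no data is moved yet.
Create External Table dbo.WebClicks_Ext
(
     ClickDate date
    ,Url       varchar(500)
    ,UserId    int
)
With (Location = '/data/clicks/', Data_Source = MyHadoop, File_Format = CsvFormat);

-- 4. Query it directly with T-SQL, or land the rows in a local table.
Insert Into dbo.WebClicks
Select ClickDate, Url, UserId
From   dbo.WebClicks_Ext
Where  ClickDate >= '2015-01-01';
```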

Real-time Operational Analytics & In-Memory OLTP

Once again the guys at Microsoft have been able to leverage their existing work on In-Memory OLTP and the columnstore index. They have mentioned that in their testing this amounts to a 30x improvement with In-Memory OLTP, as well as up to 100x for the In-Memory columnstore. This is really amazing and makes everything run that much quicker.
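In SQL Server 2016 terms, the operational-analytics side of this boils down to being able to add an updatable nonclustered columnstore index to an OLTP table; a sketch against a hypothetical orders table:

```sql
-- Hypothetical OLTP table; the nonclustered columnstore index lets
-- analytical queries scan it without touching the row-store indexes.
Create Nonclustered Columnstore Index NCCI_Orders
On dbo.Orders (OrderDate, CustomerKey, Quantity, Amount);

-- Aggregations like this are then served from the columnstore.
Select OrderDate, Sum(Amount) As TotalAmount
From   dbo.Orders
Group By OrderDate;
```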

On a side note, I did read this article today with regards to SAP: 85% of SAP licensees uncommitted to new cloud-based S/4HANA

I do find this very interesting if you read the article. What it mentions is that, firstly, 85% of current SAP customers are not likely to deploy to the new S/4HANA cloud platform, which in itself does not bode well for SAP.

But what I found very interesting is that to make the change would require companies to almost start again for this implementation. In any business where time is money, this is a significant investment.

When I compare this to what Microsoft has done with In-Memory tables and columnstore indexes, which can be used interchangeably with only some additional work required, on the whole it is quick and easy to make the changes. Couple this with what Microsoft has been doing with Microsoft Azure, and it makes it so easy to make the smart choice!

SSAS (SQL Server Analysis Services)

I am happy to say that SSAS is getting some attention too! There were not a lot of details, but what I did read is that SSAS will be getting upgrades in performance, usability and scalability.

I am also hoping that there will be some additional functionality in both SSAS OLAP and Tabular.

SSIS (SQL Server Integration Services)

Within SSIS there are also some new features; namely, they are also going to be integrating Power Query into SSIS. This is once again wonderful news, as it means that SSIS will also be able to get data from virtually any source!

Power BI

Once again the guys within the Power BI team have been really busy, and below is what I have seen and read about in terms of what has been happening within Power BI.

Office 365 Content Pack

I would say that there are a lot of businesses using Office 365 in some form or other, so it makes perfect sense for Microsoft to release a content pack for Office 365 administration.

As you can see from the screenshot below, the dashboard gives a quick overview of what activity is happening, as well as details of other services. I am sure this will make it really easy to get a quick overview of your Office 365 systems, and any potential issues could also be highlighted!

Visual Studio Online Content Pack

Another content pack that is about to be released is for people who use Visual Studio Online. I personally do not use this currently, but it does look great for people to once again have an overview of what is going on.

You can read more about it here: Gain understanding and insights into projects in Visual Studio Online with Power BI

And as you can see below, this is what you can view once you have it set up within Power BI.

Power BI planned updates

Below are the updates that I had previously voted for in Power BI. It is great to see that the Microsoft team is actively listening to their customers and implementing some of the ideas. I have to say that I do not think there are many other software companies currently doing this, or able to roll things out as quickly as Microsoft is.

Set Colors and Conditional Formatting in visuals

  • This is great as it will allow the report authors to have more control in terms of how their reports look.

Undo / Redo button in browser & designer

  • This might seem like a small update, but I know personally from working with Power BI reports that sometimes you just want to see what a different report looks like, or what adding another element would do. With the Undo / Redo buttons, it just saves that little bit of time and makes the report authoring experience that much more enjoyable.

Power BI Announcements from Microsoft Ignite

Below are some announcements that I have read up about either on Twitter or one of the blogs that I follow. It is really great to see so many things in the pipeline.

This means that there is a lot to look forward to, as well as ensuring that we have new and wonderful things to show.

I got this picture via Twitter; someone must have taken it at the Microsoft Ignite conference. As you can see it is not very clear, but it does show the next update to the Power BI Designer.

You can also see the undo and redo buttons.

It also appears that in the Power BI service, there will be support for SSRS files, namely the .rdl files! Here is another picture taken from Microsoft Ignite.


Then there are the many-to-many relationships and bi-directional cross filtering that will be supported in SQL Server 2016 tabular models, which I am sure will also be included in the new Power BI backend, as this is where it stores all its data.


For hybrid BI, there will be support for live querying of SSAS; currently this is already in place for SSAS Tabular.


It also looks like there will be scheduled refresh with a SQL Server database as a source, which is great for people who do not have SSAS cubes, or who want to get some of their data into Power BI.

Microsoft Azure Stack

While this is not exactly BI, it is related to BI, in that with Azure Stack you can get Azure functionality on your own hardware, which is fantastic. I am sure a lot of businesses will welcome this.

You can read more about it here: Microsoft Brings the Next Generation of Hybrid Cloud – Azure to Your Datacenter


BI-NSIGHT – Azure Analytics Services – Power BI Dynamics AX Connector – Auto refreshing of Excel Files in Power BI – Excel 2016 Updates

Once again this past week Microsoft has released even more products. So let's get started.

Azure Analytics Services

On 29 April, at the annual Microsoft Build conference, Microsoft announced new services on the Azure platform. I really think that Microsoft is doing a great job of having an entire cloud platform available for customers.

Due to the digital age, a lot of data is coming from mobile devices, which in turn are using digital services. Looking forward, I can see that, as Microsoft says in their video, there will be a major requirement for businesses to be able to use digital platforms in order to gain insights into their data.

With the Azure Analytics Services, businesses can easily use these products without having to build the infrastructure and knowledge on the platform themselves.

I am going to highlight the two new services which relate to BI. The Azure SQL Data Warehouse and the Azure Data Lake.

Azure SQL Data Warehouse

I really think that this is a smart move from Microsoft. It is something that we have been looking for and asking for from Microsoft. From the video (Build 2015: Azure SQL Data Warehouse) it appears to me that not only will it be a data warehouse, but also enable you to move data in and out of the data warehouse using SSIS.

I do hope that my assumption is correct, as this would be a great feature to have SSIS on a cloud platform. And this has the potential to create a whole host of new possibilities.

Along with this, it also appears from what I have read that they are using APS (Microsoft Analytics Platform System) as the basis for Azure SQL Data Warehouse. If this is indeed true, I think it could be a game changer, because we already know how easy it is to scale out with APS, which gives amazing performance.

Another great advantage is that it looks like you can use Azure SQL Data Warehouse as a source in Power BI, which then enables all the features of Power BI. I think this combination would make a great competitor to other cloud BI products on the market.

Also, as per the video, they have made it really easy to change the scale of computing that you require; it can be changed in seconds!

Credit: Microsoft

Azure Data Lake

This is another clever move: allowing businesses to store all their data in a data lake gives the business the flexibility to use it with the various Azure services.

Here is a link to the video (Build 2015: Data Lake)

What I think will be great is that you can take data from your data lake and either put it into Azure SQL Data Warehouse, or run a Machine Learning API over it. After you have the required results, you could put them into your Azure SQL Data Warehouse and put Power BI on top to visualize the results and see what effect your specific Machine Learning API could potentially have!

You also have the potential to use existing Machine Learning APIs developed by other people, which means that you do not have to spend countless hours trying to get the correct Machine Learning algorithm!

Here is an overview of the Azure Analytics Services from Microsoft

Credit: Microsoft

Here is the blog post from Microsoft: Microsoft Announces Azure SQL Database elastic database, Azure SQL Data Warehouse, Azure Data Lake

Power BI Dynamics AX Connector

Just a quick note that I received an email that the Power BI guys have started planning for the integration of Dynamics AX as a connector in Power BI.

I am sure that this is welcome news for Microsoft Dynamics AX customers.

Power BI Dynamics AX – Planned

Automatic Refreshing of Excel Workbooks in Power BI

Just a quick review of my blog post this week, which was really well received.

It explains how, using Power Query and Power Pivot to get data into an Excel workbook, you can have a mechanism to get your Power BI report updated automatically.

This means that you can keep your data on premise, ensure it gets into Power BI, and have the potential for real-time updates.


Excel 2016 Updates

Just another quick note: if you want to read up about the updates for Excel 2016, head over to Chris Webb's blog post, which has all the relevant information.

Chris Webb – What’s New In The Excel 2016 Preview For BI?

That is it for this week, another busy week and a lot going on!


SQL Server 2014 Observations

Today's post is not going to be about explaining something, but more along the lines of my experience with SQL Server 2014.

When I heard about all the new features in SQL Server 2014, the In Memory Tables, new cardinality estimator, new changes to how TempDB works, SSIS and Delayed Durability, I was excited to see how I could use these new features for my data warehouses.

I did a full backup, restored it to a Virtual Machine (VM), and tested the upgrade process to SQL Server 2014 to ensure that when I completed the upgrade I would not have any unforeseen issues. (This also tested my backup files and my restore document, ensuring that I could restore my system from a backup.)

The first thing I tested was In Memory tables, and I was excited to use an In Memory table with no durability: this would let my staging tables work that much faster when getting data from staging into my disk-based fact tables. I tried a few variations in terms of which index to use and different types of tables. As we are all aware, this being the first release of In Memory tables, there are some limitations. I have now come to the conclusion that, for my current workloads, the new In Memory tables offer no real benefit for my staging tables in my data warehouse. I find that at times they are slower than the disk-based tables. (I am aware that I could well be using them incorrectly, but I did quite a lot of reading and testing to ensure that I was doing what was suggested.)
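For context, the kind of staging table I was testing looked roughly like this (the names and bucket count are illustrative, not my actual schema); DURABILITY = SCHEMA_ONLY is what gives the non-durable behaviour described above:

```sql
-- Illustrative SQL Server 2014 style memory-optimised staging table.
-- SCHEMA_ONLY durability: the data is never persisted to disk, which is
-- acceptable for staging data that can always be reloaded from source.
Create Table dbo.Staging_FactLoad
(
     RowID       int   not null
        Primary Key Nonclustered Hash With (Bucket_Count = 1000000)
    ,DateKey     int   not null
    ,SalesAmount money not null
)
With (Memory_Optimized = On, Durability = Schema_Only);
```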

Next, I was hoping that the combination of the new cardinality estimator, the changes to TempDB and delayed durability would increase the throughput of my queries, leading to faster load times from my staging tables into my fact tables. I have monitored the time my SSIS packages take to run, and unfortunately I cannot see any vast improvement.

What I can say is that working in the new SSDT with Visual Studio 2013 does seem a lot quicker for development, and in my experience debugging starts a lot quicker than in the past. Yes, it did take some getting used to all the new icons for the control flow items, data flow items etc., but that is a small price to pay. I do prefer the Dark Theme!

So for me, in my current environment, I would say that the new changes have not made any real significant impact on my data warehouse, and have not made it any quicker to load data and get it into my fact tables.

NOTE: I welcome anyone who has any suggestions or ideas to please let me know. I am more than willing to learn, and may well not have configured things correctly.

SSIS – Creating new Partitions in SQL Server Analysis Services (SSAS) with XMLA and Analysis Services DDL

What I wanted to do was create a process in SSIS that creates new SSAS partitions using XMLA, without using custom code.


NOTE: This is quite a long blog post, but it has all the details from start to finish to get this completed.



·         We will be using the AdventureWorks2012DW data warehouse and Cube.

·         We will be creating new Partitions starting from 01 Jan 2010 on our Internet Sales Measure Group.

o    When creating our Partitions we will be using query binding, because our underlying SQL Server Table will be partitioned by DateKey.

§  NOTE: Our SQL Server Table would also have a Clustered Index created on the DateKey.

o    For our query in Analysis Services Partition we will be using the first day and the last day of the month in order to get our required data into our Partition.

o    Along with this when we create our Partition we will also use the Slice Function to further help Analysis Services query the data as quickly as possible.

o    So our Start Datekey will be 20100101

o    And our End DateKey will be 20100131

·         Our Partition naming scheme will be the Measure Group name plus year and month

o    So for our example it will be Internet Sales 2010-01
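As a standalone illustration of the date logic above (the stored procedure later in this post derives the same values from DimDate), the start and end DateKeys and the partition name for a given month can be computed like this, using the SQL Server 2012+ DATEFROMPARTS and EOMONTH functions:

```sql
Declare @Year int = 2010, @Month int = 1;

Declare @FirstDay date = DateFromParts(@Year, @Month, 1);
Declare @LastDay  date = EOMonth(@FirstDay);

Select Convert(char(8), @FirstDay, 112)                        As StartDateKey   -- 20100101
      ,Convert(char(8), @LastDay, 112)                         As EndDateKey     -- 20100131
      ,'Internet Sales ' + Convert(varchar(7), @FirstDay, 126) As PartitionName; -- Internet Sales 2010-01
```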


You can get a copy of this here:

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012


Creating our Partitioned Table and starting point.

Below are the steps where we will create our table and our starting point for our example above.


1.       Below is the script that we will use on our AdventureWorksDW2012 SQL Server database which will be explained below:

-- =======================================================================
-- 1. This will create our Table
-- =======================================================================

Create Table [dbo].[Mart_TD_Max_SSAS_PartitionDetails]
(
     [YearOfLastPartition] int
    ,[PartitionName] varchar(100)
    ,[SSAS_DatabaseName] varchar(100)
)

-- =======================================================================
-- 2. Then we insert our starting point into our Partitioned Table
-- =======================================================================

Insert into [dbo].[Mart_TD_Max_SSAS_PartitionDetails]
Select 2009 as [YearOfLastPartition]
      ,'Internet Sales - 2009-12' as [PartitionName]
      ,'Adventure Works' as [SSAS_DatabaseName]

-- =======================================================================
-- 3. Then we do a select to ensure that the details inserted are correct
-- =======================================================================

Select *
from [dbo].[Mart_TD_Max_SSAS_PartitionDetails]

a.        As you can see above, Section 1 is where we are creating our table.

b.       Section 2, where we are inserting the data, is easier to explain:

                                                               i.      clip_image001

                                                              ii.      NOTE: We will be using the above columns to generate our requirements for when creating our next partition.

1.       Which you can view in a section later in the document.

                                                            iii.      As you can see above, because we want our next partition to be for Jan 2010, we need to start our details one month earlier, e.g. December 2009.


Getting details from a stored procedure to be used in SSIS to create our next partition

In the steps below we are going to show the stored procedure and explain how we get our details for our next partition.

We will then use these details and in the next step put this all into SSIS to create our partition.


1.       Below is the entire script and each section is explained in the comments section

a.        NOTE: A quick overview is that, based on the information in the Mart_TD_Max_SSAS_PartitionDetails table, we are getting the next month's values for the following:

                                                               i.      Our Start and End DateKeys, Year Month for the Partition Name, as well as the year and Month name which are used in the proc.

b.       clip_image002




/****** Object:  StoredProcedure [dbo].[prc_GetNextPartitionValues_ForSSAS_SQL]    Script Date: 2014-04-09 10:46:59 AM ******/







-- ===============================================================================================================
-- Disclaimer
-- The sample scripts are provided AS IS without warranty of any kind.
-- I further disclaim all implied warranties including, without limitation, any implied warranties of
-- merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance
-- of the sample scripts and documentation remains with you.
-- In no event shall I, the authors, or anyone else involved in the creation, production, or delivery of
-- the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business
-- profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use
-- of or inability to use the sample scripts or documentation, even if I have been advised of the
-- possibility of such damages.
-- ===============================================================================================================

-- ===============================================================================================================
-- Author:       Gilbertq
-- Create date:  09 Apr 2014
-- Description:  This is where we get the data from the Current or Last Partition and then get into variables
--               the next partition values to create the SSAS Partitions and SQL Partitions if required in SSIS
-- ===============================================================================================================








-- ==================================================================================================================
-- 1. We are getting the Year Value of the last partition that was created from Mart_TD_Max_SSAS_PartitionDetails
-- ==================================================================================================================

          from [dbo].[Mart_TD_Max_SSAS_PartitionDetails] with (nolock)



-- ====================================================================================================================
-- 2. We are getting the Month Name Value of the last partition that was created from Mart_TD_Max_SSAS_PartitionDetails
-- ====================================================================================================================

          from [dbo].[Mart_TD_Max_SSAS_PartitionDetails] with (nolock)



-- ====================================================================================================================
-- 3. Here we are getting the Actual Date from our previous values so that it can be used in the next steps.
--    Below we are always getting the last day of the month for our current values.
-- ====================================================================================================================

          Select top 1 [FullDateAlternateKey]
          from DimDate with (nolock)






-- ====================================================================================================================
-- 4. Now here we are getting the next date value, which will be the first day of the Next Month
-- ====================================================================================================================

          from DimDate with (nolock)




-- ====================================================================================================================
-- 5. Next we are getting the Year value for our Next Month's value
-- ====================================================================================================================

          from DimDate with (nolock)




-- ====================================================================================================================
-- 6. Next we are getting the Month Name value for our Next Month's value
-- ====================================================================================================================

          from DimDate with (nolock)




-- ====================================================================================================================
-- 7. Next we are getting the first day of Next Month's value
-- ====================================================================================================================

          Select top 1 DateKey
          from DimDate with (nolock)






-- ====================================================================================================================
-- 8. Next we are getting the last day of Next Month's value
-- ====================================================================================================================

          Select top 1 DateKey
          from DimDate with (nolock)






-- ====================================================================================================================
-- 9. Next we are getting the Year Month Value for Next Month's Partition
-- ====================================================================================================================

          Select top 1 convert(Varchar(4), CalendarYear) + '-' + RIGHT(REPLICATE('0', 5) + CONVERT(VARCHAR(2), MonthNumberOfYear), 2)
          from DimDate with (nolock)





-- ====================================================================================================================
-- 12. And finally we are getting our 3 values for Next month which we can then use in our SSIS package for multiple
--     loads
-- ====================================================================================================================















2.       Run this against your SQL Server AdventureWorksDW2012 database.


SSIS to create new Partitions for SSAS

In the steps below we will use SSIS to create our new Partitions for our SSAS Cube.


1.       The first thing to do is to create the following variables which will be used for our entire SSIS Package:

a.        clip_image003

b.       NOTE: The reason that I set them all to the String data type is so that when they are used later in expressions we do not have to cast the values.

                                                               i.      This is because all our expressions will be passed to either SQL Server or Analysis Services, which only accept the values in string format.

                                                              ii.      We also put in defaults so that when we Evaluate the Expressions later we can see values.
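The variable screenshot is not reproduced here, but from the later steps the String variables used in the package are:

·         StartDateKeyForNextPartition

·         EndDateKeyForNextPartition

·         YearMonthForNextPartition

·         YearForNextPartition

·         MonthNameForNextPartition

·         XMLAQuery_CreateSSASPartition

·         Query_Update_Mart_TD_Max_SSASPartitionDetails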

2.       Next we are going to get all our variables for our stored procedure above and put them into the above variables.

a.        Drag in your Execute SQL task and as with our example we gave it the following name:

Get Variables for Next Months SSAS Partition for Internet Sales

b.       Next right click and select Properties.

c.        You will have to create your OLE DB connection to your SQL server where you have your AdventureWorksDW2012 database.

                                                               i.      clip_image004

d.       Then configure the General window with the following as shown below:

                                                               i.      clip_image005

e.       Then click on the Result Set on the left hand side and configure it with the following to map our variables from our Stored Procedure to our Variables in SSIS

                                                               i.      clip_image006

                                                              ii.      NOTE: All of the above will be used in different sections throughout the SSIS Package.

f.        Click Ok.

g.        Then right click and select Execute Task, to test that it runs correctly.

                                                               i.      You should see the following below:

                                                              ii.      clip_image007

3.       Now in the following steps I am going to explain how to initially create our Partition manually, so that we can then use this in our SSIS in the next step.

a.        Log into your SSAS Server and go into the Measure Group and then Partition where you want to automate your partition creating.

b.       NOTE: As with our example we are going to create a new Partition on the Internet Sales Measure Group for December 2009

                                                               i.      clip_image008

c.        Right click on Partitions and select New Partition, this will open the Partition Wizard.

                                                               i.      We selected the Internet Sales Facts table as shown below and clicked Next

1.       clip_image009

                                                              ii.      On the Restrict Rows we selected Specify a query to restrict rows

1.       NOTE: We are doing this because we want to specify our own start and end DateKey values

2.       clip_image010

3.       We then scrolled to the end of the Query and put in the following as per our example

a.        clip_image011

                                                            iii.      We accepted the defaults until we got to the final window. And put in the details below.

1.       clip_image012

                                                            iv.      We then clicked Finish.

d.       You will now see the partition we created

                                                               i.      clip_image013

4.       Next is where we are going to script out our Partition so that we can then use this within SSIS

a.        Right click on the partition we created above and select Script Partition as, Create To, New Query Editor Window.

                                                               i.      clip_image014

                                                              ii.      You should see the following below which is not the complete script

1.       clip_image016

b.       Next you will first have to do a find and replace on all the double quotation marks so that they can be escaped in SSIS.

                                                               i.      Press Control+H and complete the following below:

1.       clip_image017

                                                              ii.      Then click Replace All.

c.        If you have any single quotes in your Query Definition you will also have to double them up, so that when this is parsed by TSQL it will work.

                                                               i.      So as with our above example we also had to change the following from:

1.  + 'Line '  +

2.       To: + ''Line ''  +

d.       Now open a TSQL Query and first put in the following:
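The statement itself was lost from the post; given that the next step says to insert the XMLA in between the single quotes, it was presumably just a Select of a string literal:

```sql
-- Presumably (the original statement was omitted): a Select whose single
-- quotes will wrap the escaped XMLA script, to confirm TSQL can parse it
Select ''
```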


                                                               i.      Then take your XMLA script from above and insert it in between the single quotes above.

                                                              ii.      Now run the TSQL Select to ensure that it will run successfully

1.       clip_image018

                                                            iii.      NOTE: The reason that we parse this in TSQL is so that in SSIS we can then put in our variables as required.

e.       Now take the entire TSQL statement from step 4c above and copy it.

f.         Now go back into SSIS and open your variables.

                                                               i.      Where it has the variable name of XMLAQuery_CreateSSASPartition click on the Ellipses under Expression

                                                              ii.      Then in the Expression Window put in a double quote at the start and insert your TSQL query from step 4d above:

1.       clip_image019

                                                            iii.      Then scroll right to the end and put in another double quote at the end

1.       clip_image020

                                                            iv.      Then click on Evaluate Expression to ensure that so far it is all correct.

g.        Now the final part in creating our script is to put in the required variables.

                                                               i.      The first section where we are going to add the variables is for the ID and name of our partition.

                                                              ii.      As with our example we put in the following:

1.       What it looked like before the change:

<ID>Fact Internet Sales 2009-12</ID>

<Name>Fact Internet Sales 2009-12</Name>

2.       And what it was with the variables inserted:

<ID>Fact Internet Sales "+ @[User::YearMonthForNextPartition]  +"</ID>

<Name>Fact Internet Sales "+ @[User::YearMonthForNextPartition]  +"</Name>

                                                            iii.      In the next section we are going to put in our Start and End DateKeys for our Query Definition

1.       What it looked like before the change:

WHERE orderdatekey between 20091201 and 20091231</QueryDefinition>

2.       And what it looked like after the change:

WHERE orderdatekey between "+ @[User::StartDateKeyForNextPartition]  +" and "+ @[User::EndDateKeyForNextPartition]  +"</QueryDefinition>

                                                            iv.      Now what we are going to add is our Slice Property to our Partition.

1.       So after the section below is where you will put in the Slice Property


2.       As you can see we have put in the Slice Property with the variables already inserted:


<Slice>{[Date].[Date].&amp;["+ @[User::StartDateKeyForNextPartition]  +"],[Date].[Date].&amp;["+ @[User::EndDateKeyForNextPartition]  +"]}</Slice>

                                                              v.      Now click on Evaluate Expression to ensure that everything is correct.

1.       If you scroll down you should see the following

a.        clip_image021

b.       NOTE: This was due to us creating the values for our variables in Step 1 above.
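To make the substitutions concrete, with example values in the variables the evaluated expression produces XMLA fragments along these lines (the dates shown are examples, not from the post):

```xml
<!-- Sketch of the evaluated fragments, with assumed example values -->
<ID>Fact Internet Sales 2010-01</ID>
<Name>Fact Internet Sales 2010-01</Name>
<!-- ... -->
WHERE orderdatekey between 20100101 and 20100131</QueryDefinition>
<!-- ... -->
<Slice>{[Date].[Date].&amp;[20100101],[Date].[Date].&amp;[20100131]}</Slice>
```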

5.       Next we are going to be taking our script which has been populated with the details above and put this into a variable which can then be passed to our Analysis Services Execute DDL Task.

a.        Drag in an Execute SQL Task.

b.       We then gave it the following name:

                                                               i.      Get XMLA for SSAS to create new Partition

c.        Next right click and go into the Properties and configure it with the following as shown below:

                                                               i.      clip_image022

                                                              ii.      NOTE: As you can see above we are using the variable which we populated with our XMLA script in step 4 above.

                                                            iii.      We are also setting the Result Set to a Single row. This is so that we can populate our XMLA Script which will then be passed to Analysis Services.

1.       Just to be clear, this Execute SQL Task will populate our XMLAQuery_CreateSSASPartition variable with the details.

2.       After which we want the output to be inserted into our XMLA Script which will then be used in Analysis Services.

d.       Now click on Result Set and configure it with the following below:

                                                               i.       Click On Add and select the following:

                                                              ii.      clip_image023

e.       Then Click Ok.

6.       Next we will need to configure our Analysis Services Execute DDL Task with the following steps below.

a.        Drag in the Analysis Services Execute DDL Task

b.       Right click and select Properties to go into the Properties.

c.        Under Name we gave it the following name as per our Example:

                                                               i.      Run XMLA Script to create new SSAS Partition

d.       Then click on DDL.

                                                               i.      In the right hand side where it says Connection you are going to have to create a connection to your SSAS, and Database

                                                              ii.      Click on New Connection

                                                            iii.      As with our example we created our connection

1.       clip_image024

                                                            iv.      Then Click Ok Twice to get back.

                                                              v.      Then next to Source click on the Drop down and select the Variable that we populated in the previous steps.

                                                            vi.      As with our Example we are going to select the variable that we populated in step 5 above.

                                                           vii.      So once complete it will look like the following:

1.       clip_image025

                                                         viii.      Then Click Ok.

7.       Next we need to create our Update TSQL Statement for our Mart_TD_Max_SSAS_PartitionDetails table so that we can dynamically update this after the partition above has been created.

a.        Go into your variables and click on the ellipses button next to Query_Update_Mart_TD_Max_SSASPartitionDetails

b.       Then we put in the following for our Update Statement in the Expression Window:

"Update [dbo].[Mart_TD_Max_SSAS_PartitionDetails]

 Set         [YearOfLastPartition] = "+ @[User::YearForNextPartition]  +"

                                ,[MonthNameOfLastPartition] = '"+ @[User::MonthNameForNextPartition]   +"'

                                ,[YearMonthOfLastPartition] = '"+ @[User::YearMonthForNextPartition]  +"'

                                ,[DateTimeInsertedIntoTable] = getdate()

                                ,PartitionName = 'Internet Sales "+ @[User::YearMonthForNextPartition]  +"'

                                ,SSAS_DatabaseName = 'Adventure Works'"

                                                               i.      Then click on Evaluate Expression to ensure that it is valid and you should see the following:

1.       clip_image026

c.        Then click Ok.
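For reference, with example values in the variables the expression above evaluates to an ordinary Update statement along these lines (the values shown are assumptions):

```sql
-- Example of the evaluated expression (variable values are assumed)
Update [dbo].[Mart_TD_Max_SSAS_PartitionDetails]
 Set  [YearOfLastPartition] = 2010
     ,[MonthNameOfLastPartition] = 'January'
     ,[YearMonthOfLastPartition] = '2010-01'
     ,[DateTimeInsertedIntoTable] = getdate()
     ,PartitionName = 'Internet Sales 2010-01'
     ,SSAS_DatabaseName = 'Adventure Works'
```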

8.       The next part is to update our Mart_TD_Max_SSAS_PartitionDetails with our last Partition that was created in the steps above. This is so that we have a starting point for the next time this runs.

a.        Drag in an Execute SQL Task and give it the following name:

                                                               i.      Update Mart_TD_Max_SSAS_PartitionDetails with Last Partitions created

b.       Then configure it with the following:

                                                               i.      clip_image027

                                                              ii.      NOTE: The variable that we selected is called: Query_Update_Mart_TD_Max_SSASPartitionDetails

c.        Then click Ok

9.       Now the final part is where if the create partition script fails to then send an email so that the person responsible is made aware of the error.

a.        Drag in your Send Mail Task and drag it under your Run XMLA Script to create new SSAS Partition

b.       Drag the precedence constraint to your Send Mail Task and change it from success to failure.

                                                               i.      clip_image028

c.        Then configure your Send Mail Task as you require for the person responsible.

10.    So once complete it looks like the following:

a.        clip_image029

11.    Now that it is complete you can run the package.

a.        So when we looked at our Partitions before we ran the package it looked like the following:

                                                               i.      clip_image030

                                                              ii.      NOTE: The reason for our Fact Internet Sales 2009-12 was so that we could create our script.

b.       Now after running the package it completes as shown below:

                                                               i.      clip_image031

c.        Now if we look at our Partitions again we will see the following after refreshing it:

                                                               i.      clip_image032

d.       And then finally if we look at our Mart_TD_Max_SSAS_PartitionDetails it should have the details of our Partition we created above:

                                                               i.      clip_image033



SSIS (SQL Server Integration Services) – Using the Lookup Transformation and cache and how to handle NULL Values

I had a situation where I was using the Lookup transformation and then loading this into the SSIS Cache, but I wanted all rows to be inserted using the SSIS Cache, even if there was a NULL Value. Below explains how I overcame this.



·         In our Source data we have Products, but they might not be in our Product Dimension table, due to the source systems not having the required data.

o    NOTE: While this is not ideal and we either should fail the component or redirect it, for this example this is fine.

·         We have created a row in our Products Table for Products that we do not find.

o    EG:

§  ProductsSK ‘-1’

§  ProductsName ‘Product not Found’

·         So when we insert data we want to ensure that even when products are not found, they will still exist in the Lookup Transformation and cache.


1.       I created a Lookup Transformation in SSIS which I then configured as explained below.

2.       On the General Tab I configured it with the following as shown below:

a.        clip_image002

b.       NOTE: The reason that I selected the Redirect rows to no match output in the Specify how to handle rows with no matching entries, is because due to the configuration of our Lookup transformation all the rows WILL find a match.

3.       Next click on the Connection on the left hand side.

a.        clip_image004

b.       Now you will select your OLE DB connection to your database where your Product Dimension is located.

c.        Then select Use results of an SQL Query.

d.       Then we put in the following SQL Query, which we will explain after the query itself


  FROM [dbo].[Mart_TD_Products] with (nolock)

  Union all

  Select null as [ProductsSK], -1 as ProductsSKToUse

                                                               i.      Now as you can see above we have created duplicate column names from our Dimension Table.

                                                              ii.      The reason for this is in the second part of the query with the Union All

                                                            iii.      Now you can see that we have actually put in a NULL value, and then given this NULL value a value of -1

1.       NOTE: This corresponds to the data that we already have in our Products Dimension table.

                                                            iv.      So now when you load your query into the SSIS Cache it will also have a value of the following:

1.       clip_image006

e.       Now when the lookup is running in SSIS, when it finds a NULL value it will assign it a value of ‘-1’ to the ProductsSKToUse

4.       Then click on Columns and create the lookup

                                                               i.      clip_image008

                                                              ii.      As you can see above we dragged the ProductsSK to our ProductsSK in our Available Lookup Columns

                                                            iii.      And then below you can see that our Output Alias is ProductsSKToUse

5.       This will then be used in our insert into our Fact Table.

6.       And in this way when there is a NULL Value it will still be matched.

a.        So as you can see for our 2 rows, we had 2 matches even when 1 row was NULL

b.       clip_image010
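Putting the fragments above together, the lookup query was along these lines; the first Select's column list did not survive in the post, so this is a sketch:

```sql
-- Sketch of the full lookup query (the first Select's column list is an
-- assumption; the Union All adds a row so NULL lookups resolve to -1)
Select [ProductsSK], [ProductsSK] as ProductsSKToUse
FROM [dbo].[Mart_TD_Products] with (nolock)
Union all
Select null as [ProductsSK], -1 as ProductsSKToUse
```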

SSAS (SQL Server Analysis Services) – Securing Measures in Measure Group for specific Roles

What I had to do today for the first time was to show only specific measures in a measure group for particular Roles (Users or AD Groups).

·         The reason that this was required is because we had sensitive information that was not for everyone’s eyes.


NOTE: From my recent review and read-through of the following book, the least restrictive permissions will apply for a particular User.

·         So if User (BOB) belonged to one SSAS Role which only allowed access to the count of rows, and also belonged to an SSAS Role which has access to the financials, he would be granted access to both the count and the financials.

·         Another thing to remember is that your calculated measures will be shown regardless of the Role permissions.


You can get a copy of this here:

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012



·         We are going to create a Sales Role, which will only be able to see the Sales Orders Measures group and associated Measures for the Adventure Works Cube.

·         We will also create a Financial Reporting Role, which will be able to only see the Financial Reporting Measures group and associated Measures for the Adventure Works Cube.


Creating the Roles

1.       The first thing that we are going to do is to create the 2 roles.

2.       Right click on Roles and New Role

a.        clip_image002

3.       It will default to Role.Role.

a.        Right click, select Rename and change it to Sales Orders.Role

b.       It will prompt you if you want to change the Object Name also, click Yes.

c.        clip_image004

4.       Next we will create our Financial Reporting Role.

a.        You can follow steps 2 and 3 above so that you will then see your Financial Reporting.Role.

b.       clip_image006


Adding Permissions and configuration to see particular measures

In this next section we are going to configure each of our Roles, and then configure which measures they can see.


1.       We will start with the Sales Orders.Role.

a.        In the General Tab, make sure you set the database permission for this role to:

                                                               i.      Read definition

                                                              ii.      clip_image008

b.       Click on the Membership tab and put in your domain users, or ideally your Domain Groups who are part of the Sales team.

c.        Then click on Cubes.

                                                               i.      Then next to the Cube Name of Adventure Works, make sure that you change the access from None to Read as shown below:

                                                              ii.      clip_image010

                                                            iii.      After allowing Read access click on the Save button, it will then prompt you saying that the following objects will also be saved:

1.       clip_image012

2.       Click Ok.

                                                            iv.      NOTE: You have to first save the access to the cube, if you do not do this in the next steps under Dimension Data you will NOT see the Measure Groups in order to change the permissions.

d.       Next click on Dimension Data.

                                                               i.      Where it says Dimension click on the drop down and select the following as shown below:

1.       clip_image014

2.       Then click Ok.

3.       NOTE: You will see all of the Measure Group values.

a.        So you will need to know which measures are associated to your Measure Group.

b.       In our example for Sales Orders it was the following:

c.        clip_image016      

                                                              ii.      Now click on the Deselect all Members, this is so that we only need to select the Members that we WANT to enable.

1.       clip_image018

2.       Now scroll through the list until you find Order Count.

3.       And as shown below put a tick next to Order Count.

a.        clip_image020

                                                            iii.      Now save your Sales Order.Role.

2.       Now open your Financial Reporting.Role

a.        In the General Tab, make sure you set the database permission for this role to:

                                                               i.      Read definition

                                                              ii.      clip_image008[1]

b.       Click on the Membership tab and put in your domain users, or ideally your Domain Groups who are part of the Financial Reporting team.

c.        Then click on Cubes.

                                                               i.      Then next to the Cube Name of Adventure Works, make sure that you change the access from None to Read as shown below:

                                                              ii.      clip_image021

                                                            iii.      After allowing Read access click on the Save button, it will then prompt you saying that the following objects will also be saved:

1.       clip_image022

2.       Click Ok.

                                                            iv.      NOTE: You have to first save the access to the cube, if you do not do this in the next steps under Dimension Data you will NOT see the Measure Groups in order to change the permissions.

d.       Next click on Dimension Data.

                                                               i.      Where it says Dimension click on the drop down and select the following as shown below:

1.       clip_image014[1]

2.       Then click Ok.

3.       NOTE: You will see all of the Measure Group values.

a.        So you will need to know which measures are associated to your Measure Group.

b.       In our example for Financial Reporting it was the following:

c.        clip_image024  

                                                              ii.      Now click on the Deselect all Members, this is so that we only need to select the Members that we WANT to enable.

1.       clip_image025

2.       Now scroll through the list until you find Amount

3.       And as shown below put a tick next to Amount.

a.        clip_image027

                                                            iii.      Now save your Financial Reporting.Role
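Behind the scenes, each Dimension Data selection is stored in the role as a dimension permission with an allowed set on the Measures dimension. If you script the roles out, the relevant parts end up looking roughly like this (a sketch; the exact XMLA elements may differ):

```xml
<!-- Sketch: allowed sets generated by the Dimension Data selections -->
<!-- Sales Orders role -->
<AllowedSet>{[Measures].[Order Count]}</AllowedSet>
<!-- Financial Reporting role -->
<AllowedSet>{[Measures].[Amount]}</AllowedSet>
```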


Updating the cube with the Roles and testing.

The final step is to update the cube with the new roles and to test.


1.       What I did was a Process Update on the Products dimension.

a.        NOTE: This just enables me to push the changes to the SSAS cube.

2.       Next I opened up the Adventure Works Cube and then clicked on the Browser Tab.

3.       Now where it says Change User click on the Icon.

a.        clip_image029

b.       This will then open the Security Context – Adventure Works window.

c.        Then click on Roles and click on the drop down and select Sales Orders

                                                               i.      clip_image031

d.       Then click Ok.

4.       Now when we browse the cube under Measures we only see the following:

a.        clip_image033

5.       And if you follow step 3 above and change it to Financial Reporting you would see the following in the browser.

a.        clip_image035