BI-NSIGHT – SQL Server 2016 – Power BI Updates – Microsoft Azure Stack

Well I have to admit that it seems that the Microsoft machine has been working flat out to get out new products and updates.

If my memory serves me, this is the third week in a row that Microsoft has released new products and updates. I am really enjoying it! But hopefully this will slow down, so that we can catch our breath and actually play with some of the new products and features.

So let’s get into it. There is quite a lot to go over!!

SQL Server 2016

So with Microsoft Ignite happening this week, the wonderful guys from Microsoft have started to announce what we can expect to be in the next version of SQL Server.

I am going to focus mainly on the BI (Business Intelligence) features, and there are quite a few! Some of what I am detailing below I have only seen pictures on Twitter, or I have read about it. As well as the preview not even being released yet, I am sure that there will be some changes down the line.

SQL Server Reporting Services

It finally appears that Microsoft has been listening to our cries and requests for updates in SSRS (SQL Server Reporting Services)!

Built-in R Analytics

I think that this is really amazing. Even though I am not a data scientist, I think it is a smart move by Microsoft to include this. What this means in my mind is that you can now get the data scientists to interact and test their R scripts against the data as it sits in the transactional environment. And from there, this could then be used to create amazing possibilities within SSRS.

Power Query included in SSRS

From what I have seen people tweeting about as well as what I have read, it would appear that Power Query will be included in SSRS. This is fantastic and will mean now that virtually any data source can be consumed into SSRS.

New Parameter Panel, chart types and design

I am sure we can all agree that SSRS has required an overhaul for some time. And it seems that finally we are going to get this. It would appear that the parameters panel is going to be updated. I do hope that it will be more interactive and react in a similar way to the slicers in Excel. We are also getting new chart types. Here again I am going to assume that they will be similar to what we have in Power BI!

And finally there was also mention that the reports will be rendered quicker, as well as having a better design. I also am hoping that they will deploy this using HTML 5, so that it can then be viewed natively on any device. It is going to be interesting to see what Microsoft will incorporate from their acquisition of Datazen.

SQL Server Engine

Built-in PolyBase

Once again, this is an amazing feature which I think has boundless potential. By putting this into the SQL Server Engine, it is a whole lot simpler to query unstructured data. Using T-SQL to query this data means that the many people who have invested time and effort into SQL Server can now leverage that investment using PolyBase. And along with this, you do not have to extract the data from Hadoop or another format into a table in order to query it. You can query it directly and then insert the rows into a table, which means development time is that much quicker.
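To make this concrete, here is a minimal sketch of what querying Hadoop data through PolyBase could look like. The external data source, file format, table and column names below are assumptions for illustration, not the actual syntax shipped in the preview.

```sql
-- Sketch only: assumes an external data source [MyHadoopCluster] and a
-- file format [TextFileFormat] have already been configured for PolyBase.
CREATE EXTERNAL TABLE dbo.WebClicks_Ext
(
    ClickDate   DATE,
    ProductKey  INT,
    Clicks      INT
)
WITH
(
    LOCATION    = '/data/webclicks/',
    DATA_SOURCE = MyHadoopCluster,
    FILE_FORMAT = TextFileFormat
);

-- Query the Hadoop data with plain T-SQL, and only then persist the
-- rows you need into a local table.
SELECT  ClickDate, ProductKey, Clicks
INTO    dbo.WebClicks
FROM    dbo.WebClicks_Ext
WHERE   ClickDate >= '20150101';
```

This is the point made above: no separate extract step, the external table is queried directly and the rows land in a normal table.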

Real-time Operational Analytics & In-Memory OLTP

Once again the guys at Microsoft have been able to leverage their existing work on In-Memory OLTP and the columnstore index. They have mentioned that in their testing this amounts to a 30x improvement with In-Memory OLTP, as well as up to 100x for the In-Memory columnstore. This is really amazing and makes everything run that much quicker.
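As a rough sketch, the combination could look something like the following: a memory-optimized table carrying a columnstore index on top of it. The table and column names are made up for illustration, and the exact syntax may well change before release.

```sql
-- Sketch: a memory-optimized table with a clustered columnstore index,
-- which is the combination behind real-time operational analytics.
-- Assumes the database already has a MEMORY_OPTIMIZED_DATA filegroup.
CREATE TABLE dbo.SalesOrder
(
    OrderID   INT IDENTITY NOT NULL PRIMARY KEY NONCLUSTERED,
    OrderDate DATETIME2    NOT NULL,
    Amount    MONEY        NOT NULL,
    INDEX ccsi_SalesOrder CLUSTERED COLUMNSTORE
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```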

On a side note, I did read this article today with regards to SAP: 85% of SAP licensees uncommitted to new cloud-based S/4HANA

I do find this very interesting if you read the article. What it mentions is that firstly, 85% of current SAP customers will not likely deploy to the new S/4HANA cloud platform, which in itself does not bode well for SAP.

But what I found very interesting is that to make the change would require companies to almost start again for this implementation. In any business where time is money, this is a significant investment.

When I compare this to what Microsoft has done with In-Memory tables and columnstore indexes, which can be used interchangeably (with some additional work required), on the whole it is quick and easy to make the changes. Then you couple this with what Microsoft has been doing with Microsoft Azure, and it makes it so easy to make the smart choice!

SSAS (SQL Server Analysis Services)

I am happy to say that at least SSAS is getting some attention too! There were not a lot of details, but what I did read is that SSAS will be getting an upgrade in performance, usability and scalability.

I am also hoping that there will be some additional functionality in both SSAS OLAP and Tabular.

SSIS (SQL Server Integration Services)

Within SSIS, there are also some new features, namely they are also going to be integrating Power Query into SSIS. This is once again wonderful news, as it means now that SSIS can also get data from virtually any source!

Power BI

Once again the guys within the Power BI team have been really busy, and below is what I have seen and read about in terms of what has been happening within Power BI.

Office 365 Content Pack

I would say that there are a lot of businesses that are using Office 365 in some form or other. So it makes perfect sense for Microsoft to release a content pack for Office 365 Administration.

As you can see from the screenshot below, it gives a quick overview on the dashboard to see what activity is happening. As well as details of other services. I am sure that this will make a quick overview of your Office 365 systems really easy to see. And also if there are any potential issues, this could also be highlighted!

Visual Studio Online Content Pack

Another content pack that is about to be released is for people who use Visual Studio Online. I personally do not currently use this, but it does look great for people to once again have a great overview of what is going on.

You can read more about it here: Gain understanding and insights into projects in Visual Studio Online with Power BI

And as you can see below, what you can view once you have got it setup within Power BI.

Power BI planned updates

Below are the updates that I had previously voted for in Power BI. It is great to see that the Microsoft team is actively listening to their customers and implementing some of the ideas. I have to say that I do not think that there are many other software companies that are doing this currently, or that are able to roll it out as quickly as Microsoft is.

Set Colors and Conditional Formatting in visuals

  • This is great as it will allow the report authors to have more control in terms of how their reports look.

Undo / Redo button in browser & designer

  • This might seem like a small update, but I know personally from working with the Power BI reports, that sometimes you just want to see what a different report looks like. Or adding another element. And with the Undo / Redo buttons, it just saves that little bit of time, as well as to make the report authoring experience that much more enjoyable.

Power BI Announcements from Microsoft Ignite

Below are some announcements that I have read up about either on Twitter or one of the blogs that I follow. It is really great to see so many things in the pipeline.

This means that there is a lot to look forward to, as well as ensuring that we have new and wonderful things to show.

I got this picture via Twitter, which someone must have taken at the Microsoft Ignite Conference. As you can see it is not very clear, but it does show the next update in the Power BI Designer.

You can also see the undo and redo buttons.

It also appears that in the Power BI service, there will be support for SSRS files, namely the .rdl files! Here is another picture taken from Microsoft Ignite.


Then there is the news that many-to-many relationships and bi-directional cross filtering will be supported in SQL Server 2016 tabular models, which I am sure will also be included in the new Power BI backend, as this is where it stores all the data.


For Hybrid BI, there will be support for live querying of SSAS; currently this is already in place for SSAS Tabular.


It also looks like there will be a scheduled refresh for SQL Server Databases as a source. Which is great for people who either do not have SSAS cubes, or who want to get some of their data into Power BI.

Microsoft Azure Stack

While this is not exactly BI, it is related to BI, in that with Azure Stack you can get the Azure functionality on your own hardware which is fantastic. And I am sure for a lot of businesses this will be welcomed.

You can read more about it here: Microsoft Brings the Next Generation of Hybrid Cloud – Azure to Your Datacenter


Automating refreshing of Power BI Reports and Dashboards with Excel Workbooks and Power Update

I recently tested to see if it was possible to upload an Excel Workbook to One Drive for Business, and then use this within Power BI. This was successful.

Then I saw that if you used an Excel Workbook from One Drive for Business, it would automatically refresh the file. I thought this might mean that if you updated your Excel Workbook in your One Drive for Business folder, it would then update your dashboards and reports in Power BI.

The outcome was that it does indeed do this. Which means that we now have a means to update Power BI dashboards and reports from our Excel Workbooks which can be stored On Premise.

This also gives us the ability to take any source information that we can get into the Power Pivot Model and get this to update in Power BI. So this also means that anything that we do with Power Query, which then goes into our Power Pivot Model (Data Model) can then be updated in Power BI!

And below I will explain how to do this using Power Update

NOTE: You could manually update your Excel Workbook and then save it to your One Drive for Business. But ideally, if we can automate the process, our Power BI Dashboards and reports will update automatically.

NOTE II: If you would like to know more about Power Update, please read the following Blog Post Power Pivot Pro (Rob Collie): Introducing Power Update

  • Within the version they do have a free option!

Example Data:

  • We are going to use an Excel workbook which I used previously, which has some Adventure Works data. The details of where to find the Excel Workbook are listed below.

Uploading Excel Workbook with Power Pivot and Power View to One Drive for Business

Below are the steps where I upload my Excel Workbook, which has a Power Pivot Data Model as well as a Power View Report to One Drive for Business.

  1. The first thing is that I created my Excel Workbook, with Power Pivot and Power View.
  2. NOTE: The reason that I created this is so that when it is uploaded to Power BI, the entire Power Pivot Model will be exposed in Power BI.
  3. In order to do this, please refer to my previous blog post to create it with your own data
    1. Power Pivot and DAX in Power BI
  4. Next I uploaded my Excel Workbook to One Drive for Business

Importing your Excel Workbook from One Drive for Business into Power BI and create a simple report and Dashboard

Next we will import our Excel Workbook using the file which we uploaded into One Drive for Business within Power BI

  1. Log into Power BI
  2. Then click on Get Data
  3. Once this opens, make sure you select Excel Workbook, then click on Connect
  4. Now on the next screen you will see the options for Computer, OneDrive – Personal, OneDrive – Business
    1. Click on OneDrive – Business
    2. NOTE: The first time it might prompt you to log in and authenticate you to your OneDrive for Business Folder
    3. Then go to the location where you saved your Excel Workbook in the previous steps.
    4. Then click
    5. NOTE: As you will see above currently my Modified Date is from 5 days ago.
  5. Now under Datasets click on your Excel Workbook you just uploaded.
    1. Click on Explore
    2. NOTE: You will also see that the Last refresh succeeded is Tue Apr 28 12:11:34 GMT +1000
  6. We are going to create a very simple report and then a dashboard
    1. We then saved this report as AW Auto Update
  7. We then and created a new Dashboard with the same name as above.
    1. NOTE: If you create a dashboard with the same name as your report, it will automatically put the pinned items into that dashboard.
  8. We then went back to our report from step 6 and pinned both items to our dashboard, as shown below:
  9. Now we are at the point to use Power Update and to test the auto updating of Excel Workbooks from One Drive for Business

Using Power Update to update our data as well as Upload to One Drive for Business

In this next section we are going to configure Power Update, to refresh our data in our Power Pivot Model. And then automatically upload it into our One Drive for Business Folder

NOTE: You can download and find out how to install Power Update from this blog post by Power Pivot Pro (Rob Collie): Introducing Power Update

  • Within the version they do have a free option!
  1. Due to the way I brought the data into my Power Pivot Model in my Excel Workbook, I did go and update my source query to the AdventureWorksDW2014 database.
    1. But you would normally have your query dynamic where it is getting its source data from.
  2. For our example to show the automatic refreshing of data from One Drive for Business into Power BI, I am going to ensure that we have Sales for 2010 – 2013
  3. In the next steps I am going to show you how to configure Power Update to refresh your Excel Workbook.
    1. Open Power Update
    2. Click on New
    3. Give your Task a Name
      1. In our example it will be called AW Auto Update.
      2. Click Next
    4. For our example we set it to Daily (or Multiple times a day)
      1. Click Next
    5. On the Setup a daily schedule we just accepted the defaults
      1. Click Next
    6. On the Set up multiple times per day schedule, we set ours to run every 1 hour
      1. Click Next
    7. Now on the step for select Destination Type, ensure to select SharePoint
      1. NOTE: Even though we are using One Drive for Business, the underlying data is stored within SharePoint. So it will work.
      2. Click Next
    8. For our example we only have a single workbook, so on the Select source workbooks, we selected Update a single workbook
      1. Click Next
    9. On the Select workbook to update, select the location of your Excel Workbook on your file system or shared folder.
      1. Click Next
    10. Now on the SharePoint Settings, we have to configure the following.
      1. First click on New under Credentials
        1. Make sure you put a tick next to Office 365 / Power
        2. Click
      2. Now on the SharePoint Site URL you will need to put in the following:
        1. Go to your One Drive for Business Folder in your browser
        2. NOTE: Copy and paste the entire URL, Power Update is smart and it knows which part of the URL it requires.
        3. Then click on Select
          1. Now browse to the folder location where you previously uploaded your Excel Workbook in the previous section: Importing your Excel Workbook from One Drive for Business into Power BI and create a simple report and Dashboard
        4. So that once completed it will look similar to the following below:
      3. Click Next
    11. Now the Wizard is completed and you should see the following in Power Update
  4. Next we are going to manually run this now, so that we can show how it will auto refresh the data in Power BI once the process is completed.

Automatic Refresh of Power BI Report confirmation

Below we will see if Power BI refreshed the Excel Workbook from our One Drive for Business

  1. The first place to see if our Excel Workbook has been updated via Power Update
    1. As we can see above it has been updated.
  2. Now we will have to wait for Power BI to go and check the Excel Workbook and update the reports and dashboard
    1. NOTE: From what I read, the Dashboard tiles auto update every 10 minutes, I am not sure what the schedule is for Excel Workbooks in One Drive for Business
    2. But there must be some mechanism in place where it looks and updates Excel Workbooks connected to One Drive for Business
  3. I then went back to my Datasets and clicked on the Ellipses button and after about 4 – 5 minutes I saw the following:
    1. As you can see above the Last refresh succeeded time has changed from our previous value to Tue Apr 28 12:58:13 GMT +1000
  4. Now I then went back to our Dashboard and saw the following below
    1. As you can see from above, we now have the year 2010 (albeit small) as well as the increase in Sales


So in conclusion you now have a process to fully automate getting data from your On Premise source Excel Workbook, and getting your reports to automatically update in Power BI.


BI-NSIGHT | Power Pivot & DAX in Power BI – SharePoint 2016 – Power BI AS Connector – Power BI Designer Preview Update – New Power BI Content Packs

In a week there have been a whole host of updates and changes within the Microsoft BI space. So let’s get to it.

Power Pivot & DAX in Power BI

This is related to my blog post that I created earlier this week (Power Pivot and DAX in Power BI). And it was amazing to see that later on in the week there was a Power BI Designer update which now also allows DAX calculations, as well as a whole host of other changes.

The thing that I want to highlight in terms of my blog post is that once you have created your Power Pivot Model and then created a Power View report in Excel, everything that you can do in Excel will now be available to you when you upload your Excel workbook into Power BI. So this will give you some additional functionality which is currently not available by default in Power BI.

Below is a screenshot of the dashboard that I created using Power Pivot and Excel.

SharePoint 2016

Microsoft last week also released some information with regards to SharePoint 2016 and what we can expect. The major take away that I got from this blog post from Microsoft was that I was looking for any new features with regards to BI. And there was a little glimmer of hope, where they said that they were looking to build a hybrid approach, so that companies could leverage the best of both. They did not go into too much detail, as it is still a long way off their release date. But to me it does look promising in terms of having some of their amazing work that they have done in the cloud space (Power BI), put back into the on premise space.

You can read their blog post here: SharePoint Server 2016 update

Power BI AS Connector Update

Last week Friday, Microsoft also released an update for the Power BI AS Connector. It was great to see that they added some additional features.

Starting with the Automated checks to ensure that your account can connect to the Power BI Service. This will ensure that once you have resolved all the errors, you will then be able to successfully connect to Power BI.

Support for Network Proxy, which is great as a lot of companies have network proxies in place for various reasons. And now you can still connect to Power BI via your network proxy.

Notification when a new Power BI AS Connector is available. Which I think is really great and a feature that I see in some of Microsoft’s other products (Power Query). I find it great, so that I do not have to go and make a reminder to go and look to see if there is a new version, it will just notify me that there is.

There is now support for all languages, which is a good thing, as not everyone has English as their native language.

They have also resolved a whole stack of bugs and diagnostics. The one that I really liked is that it will now let you know if your password has expired. Meaning that you cannot connect to the Power BI Service. This is great, because in the past I have had services running where the password for the domain account expired. And it took at times a lengthy amount of time to actually find the cause of the issue!

You can read the blog post from Microsoft here: What’s new in the April release of the Analysis Services Connector?

Power BI Designer Preview Update

I have to say that this is a major update in terms of the offering for the Power BI Designer.

I am not going to go into detail for all the updates, but just highlight the ones that I think are game changers.

Initial Support for DAX Measures

As per my blog post at the top, I could see that they had already provisioned Power BI to be able to seamlessly work with DAX measures. And now it is available as part of the Power BI Designer. This is really great and something that a lot of customers and users have been asking for.

This will enable people to create great reports and dashboards, because they can now create their custom DAX measures which will also improve the reporting experience.

As part of the support for DAX Measures they have also included the following:

  • Formula bar, so that you can now create your DAX Measures in the formula bar, with IntelliSense and using multiple lines
  • Comments: this allows you to put comments into your DAX Measures, which is great because if you come back at a later point, at least you can have some idea of what you were doing. As well, if other people are going to be working on your data, they can have some comments to guide them in terms of what you were doing with the particular DAX Measure.
  • Saving Incomplete Measures: this is another great feature, because there have been loads of times when I have either been chatting to someone, or have wanted to go and validate some data before adding it to my DAX Measure. In the past it would either error, or you would have to cut it out, remove the DAX Measure and then come back and add it in again. This is really useful.

Data Types, formatting, rename and delete in the Report View

I have bundled all these updates into one here, because they all relate to working within the Report View.

What this means that if you want to change a data type, or the formatting, renaming a column or measure, or even delete the column or measure, you can now do this directly in the report view.

I see this as a great feature. Once again, when working in Power Pivot you always have to open Power Pivot (if it is not already open), make the change to your column or measure, then go back into your sheet, which will reflect the update, and then continue working. As you can see just from my typing out what would need to be done, that takes a lot of time.

Instead now in Power BI Designer you can just right click, make your change and you are done! Such a time saver as well as great functionality.

You can read about all the other features and updates here: 16 New Updates to the Power BI Designer Preview

Power BI Content Packs

Today the Power BI Team also announced and released two additional content packs for Power BI.

I think that these content packs are really a smart move from Microsoft. It is a great starting point, because often there is so much new data, that to try and understand where to start or what to look for can be quite daunting. So Power BI gives you a launching pad to start from.

The two additional content packs that got released today are below:

Monitor and explore your Microsoft Dynamics Marketing data with Power BI

Stream sensor data to Power BI with Temboo

I want to quickly highlight that with Temboo, if you have any IoT (Internet of Things) device, this allows you to upload its data directly into Power BI. And the data can be uploaded in real-time, which then means that your reports and dashboards will also update in real-time. To me, with so many sensors out there, as well as people putting these types of devices into their own homes, this is a great feature.

Go and click on the video to see what they are doing. Really powerful stuff.

That’s it for this week! Looking forward to see what is going to be coming out next week! Stay tuned.

SSIS – Configuring SSIS Package to run job after data loading has completed

  • In our example below, we are going to be using a SharePoint 2013 Data Alert job which was created in SQL Server Agent, and then insert this into an existing SSIS Package.
  • The reason for doing this, is so that every time our SSIS Project runs, as part of the process, it can then fire off a SQL Server Job.
  • Essentially in the past the SQL Server Agent Jobs run individually and are not tied into when our data has loaded.
  • By doing it in this manner, it enables the related job to be run after every data load. Which then enables the business to get the relevant data at the relevant times.


NOTE: You can modify this for any SQL Server Job.

Find the related SQL Server Agent Job Name

Below details how to find the related SQL Server Agent Job name. Typically, when you create a SharePoint Data Alert, or a SQL Server Reporting Services (SSRS) Data Driven Subscription or Email Subscription, it creates the job name with a GUID.

This makes it rather difficult to find the correlating Job Name.

  1. The easiest way to find the SQL Server Agent Job Name, is once you have created your SharePoint Data Alert, or SSRS (Data Driven Subscription or Email Subscription) is to go directly into your SQL Server Agent Job Monitor where the Job was created.
  2. Then open the Job Activity Monitor
  3. Then scroll through the list until you find a job with the following properties:
    1. The name will be in a GUID format:

    2. The Last Run Outcome is set to Unknown
      1. And the Last Run is set to never

    3. NOTE: If there is more than one job with the above properties, then look to see when you scheduled the start time of your job.
      1. If that still does not help, you might have to go into the job properties and into the schedules to see how often it runs, to find the correct job.
  4. If this is in a Test environment, you can try and run the job to ensure that you do have the correct job.
  5. Make a Note of the Name of the Job
    1. In our example it was: 68CAE336-3D38-42A6-9526-F32F4D501AA4
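If scrolling through the Job Activity Monitor becomes tedious, a query like the one below can narrow the list down. It is only a sketch against the standard msdb catalog views, and it assumes a job that has never run has no rows in sysjobhistory.

```sql
-- Find candidate jobs: GUID-style name and no run history yet.
SELECT  j.name,
        j.date_created
FROM    msdb.dbo.sysjobs AS j
WHERE   TRY_CONVERT(uniqueidentifier, j.name) IS NOT NULL   -- name is a GUID
        AND NOT EXISTS
        (
            SELECT 1
            FROM   msdb.dbo.sysjobhistory AS h
            WHERE  h.job_id = j.job_id                      -- never run
        )
ORDER BY j.date_created DESC;
```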


Modifying the SQL Server Agent Job

Below are the steps to modify the SQL Server Agent job, so that it will not run on a schedule.

NOTE: The reason we are doing this is because we want the job to be run from within our SSIS Project after it has loaded our relevant data.

  1. Go into the properties of your SQL Server Agent Job and click on the Schedules

  2. Then in the Schedule list, click on the Schedule and select Remove.
  3. Once completed, there will no longer be any schedules for the job.

  4. Then click Ok.
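As an alternative to the UI steps above, the schedule can also be detached with T-SQL. This is just a sketch: the job name is the GUID from our example, and the schedule name below is an assumption, so check the actual schedule name in the job properties first.

```sql
-- Detach the schedule from the job so it no longer runs on its own.
-- @schedule_name must match the schedule shown in the job properties.
EXEC msdb.dbo.sp_detach_schedule
    @job_name               = N'68CAE336-3D38-42A6-9526-F32F4D501AA4',
    @schedule_name          = N'Schedule',
    @delete_unused_schedule = 1;
```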


Creating and configuring your SSIS Package to run the SQL Server Agent Job for you

Below are the steps to create and configure your SSIS package to run the SQL Server Agent Job as part of the SSIS Project.

  1. Create your new SSIS Package.
  2. Drag in an Execute SQL Task
  3. We renamed our Execute SQL Task to the following: Run SharePoint Data Alerts

  4. Then we went into the properties.
    1. We then put in our Connection to our SQL Server Database, which is the same server where the SQL Server Agent Job is located.
    2. Then under the SQL Statement we put in the following:

      EXEC msdb.dbo.sp_start_job N'68CAE336-3D38-42A6-9526-F32F4D501AA4';

      1. NOTE: The thing to note here is that we are using the msdb stored procedure to start our Job.
        1. And that we are using the Job name which we noted in previous steps.
  5. Now that you have completed your Execute SQL Task, it is time to test it.
  6. Right click and select Execute SQL Task.
  7. If successful it should come back looking like the following below:

  8. As well as actually getting the email in your Inbox, as is the case with our SharePoint Data Alert.
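If the email does not arrive, you can also confirm from T-SQL that the job actually ran. A quick sketch using the standard msdb procedure (the GUID is the job name we noted earlier):

```sql
-- Show the job's definition and, in the last result set, its last run outcome.
EXEC msdb.dbo.sp_help_job
    @job_name = N'68CAE336-3D38-42A6-9526-F32F4D501AA4';
```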

For SQL Server 2014 – Use the following link below to use an OData Source to get data from a SharePoint List into your table

Below are the steps if you want to use SQL Server 2014 to get data from a SharePoint list into your database.

The URL below explains how to download and install the OData Source for SQL Server

  1. Click on the following link below
  2. Then once you load the page, click on Download
    1. This will then bring up which components you want to download.
    2. Scroll down and select the following:
  3. Then download and install the component.
    1. You can just accept all the defaults in the Wizard.
  4. Then if you now open SSDT and drag in a new Data Flow Task, then click on the Common Section, you should see the following:
  5. NOTE: We would want to use the x86 version for our testing in SSDT, due to SSDT being an x86 application.
  6. NOTE II: You will also have to install the component on your SSIS Server and configure your SSIS package to run in 32-bit mode, so that when it is run as part of the job in SSIS it will function correctly.

Configuring your OData Source in SSIS

NOTE: You might have to create your SQL Table, to get the data from your SharePoint List into the table before you complete the steps below. Or you can create the table after step 2 below.


  1. Once you have followed the steps above, the one thing to note is how to get the correct URL so that we could create our connection to our Lists.
    1. With the example below our URL was:
    2. Now in order to get the URL for the lists, we had to put in the following:
      1. NOTE: In your Site, you just need to add the following:
      2. Also ensure the account that you use has access to the SharePoint List.
    3. And then you have your connection completed.
    4. After which you can then select your List from the drop down.
  2. And then we configured our OData Source with the following:
    1. Under Use collection or resource path, ensure that you select Collection.
    2. Under Collection, click on the drop down, and ensure that you select your SharePoint List that has been created in SharePoint.
      1. In our example it was called: CrimeReportingSubscriptions
    3. You can then click on Preview to see that you are getting the expected data.
    4. Then if you click on Columns you will see all the SharePoint related columns, as well as the columns from your List.
    5. NOTE: Ensure that you do have some data already in your list.
    6. Then click Ok to have your OData Source completed.
  3. Then once completed it will look like the following in your Data Flow:

Framework for Automating SQL Server Reporting Services (SSRS) data driven subscriptions – Part 2

Description: The goal was to find a way to automate how an end user could subscribe to an SSRS Data Driven Subscription without any intervention from the Business Intelligence Developer, to enable the end users to have their own selections, and finally, if they no longer wanted to receive the report, to delete their subscription.

In Part 2 we will be creating our SSRS Report, creating the Data Driven Subscription, and then finally linking this to our Power View Report.


So what we wanted to do is create a SQL Server Reporting Services (SSRS) report, then use the data driven subscription (DDS) to email the reports to the people that created the request in the SharePoint List, and then finally have a link to a Power View report for more user interaction with their data.

NOTE: This will all be completed on SQL Server 2012


·         We have our data from our SharePoint List stored in the following table:

o    [dbo].[Mart_TD_SharePoint_ReportSubscriptions]

·         We are going to create a report which will show the Internet Sales.

o    Within this report it will have the Date Range Selected by the User

o    It will also have the Product Line selected by the User.

·         Once the report is created we will then create our Data Driven Subscription.

·         We will also create a Power View report based on the same Cube where our report gets its data from.

o    Once the Power View Report is created we will then add a link to our SSRS report for the users to click on for more information.


Creating our SSRS Report

The first thing that we are going to need to do is to create our SSRS report with the required parameters so that this can be used in our Data Driven Subscription.


As with our Example we will require the following parameters in our report.

·         Start Date

·         End Date

·         Product Line


1.       If need be create a new SSRS Project.

2.       Then as with our example we created our report with the following name:

a.        clip_image001

3.       Next we created our Shared Data Source to our SQL Server Analysis Services (SSAS) cube.

a.        We then added this Shared Data Source to our report.

4.       We are now also going to create another Shared Data Source which will connect to our SQL Server table where we have stored our details for our DDS

a.        NOTE: This will be your SQL Server where you have got your [dbo].[Mart_TD_SharePoint_ReportSubscriptions]

b.       We created the following Shared Data Source as per our example:

                                                               i.      clip_image002

5.       Next we are going to create our 3 Parameters as detailed above:

a.        clip_image003

b.       clip_image004

c.        NOTE: The reason the above two parameters are set to Text is because we are going to be getting these values later via our DDS, which will then pass them dynamically to our query.

                                                               i.      Along with this, the end user will never use this report directly, due to it being emailed to them.

d.       Also we will be passing the DateKey values to our MDX query, as explained further on in the example.

e.       clip_image005

6.       Next we create our MDX Query

a.        NOTE: Initially we will hard code the values so that we can create our report and test that it works.

b.       So you would create your new Dataset as below:

c.        Here is the MDX Query:

Select non empty {[Measures].[Internet Sales Amount]} on 0
,non empty {(
                 [Date].[Date].&[20070701]:[Date].[Date].&[20080630]
                 ,[Product].[Product Line].&[S]
                 )} on 1
from [Adventure Works]


d.       It would look like the following:

                                                               i.      clip_image006

e.       Next click on Fields, and we changed it to the following so that it is more friendly:

                                                               i.      clip_image007

f.         Then click Ok.
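Since the report parameters will receive DateKey values (as mentioned in step 5 above), it can help to confirm what those keys look like. This quick check against the AdventureWorksDW2012 relational database is not part of the original steps, but the DimDate table and column names are standard to that sample database:

```sql
-- Quick sanity check (not part of the original steps): the DateKey values
-- passed to the MDX query are yyyymmdd integers from the date dimension.
SELECT TOP 3 [DateKey], [FullDateAlternateKey]
FROM [AdventureWorksDW2012].[dbo].[DimDate]
WHERE [FullDateAlternateKey] BETWEEN '2007-07-01' AND '2007-07-03'
-- e.g. DateKey 20070701 corresponds to 2007-07-01
```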

7.       We then created our Chart and completed all the formatting in order to get the report looking good.

a.        This is how we put in our values into the Chart:

b.       clip_image008

c.        So once completed the report will look like the following:

d.       clip_image009

8.       The next thing that we are going to do is add our Parameters to our MDX Query

a.        Go into your Data Set you created above: InternetSales_Dates_ProductLine

b.       On the right hand side next to query click on the Function button:

c.         clip_image010

d.       Remove the current MDX query and replace it with the following below:

="Select non empty {[Measures].[Internet Sales Amount]} on 0 ,non empty {([Date].[Date].&["+Parameters!StartDateKey.Value+"]:[Date].[Date].&["+Parameters!EndDateKey.Value+"],[Product].[Product Line].&["+Parameters!ProductLine.Value+"])} on 1 from [Adventure Works]"


e.       NOTE: The above must all be on one line otherwise it will not work.

f.         Then click Ok to go back to your report.
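To make the dynamic query above concrete, here is what the expression evaluates to at run time if the DDS passes StartDateKey = 20070701, EndDateKey = 20080630 and ProductLine = M (sample values, matching the Mountain / July 2007 – June 2008 subscription used later in this example):

```
Select non empty {[Measures].[Internet Sales Amount]} on 0
,non empty {([Date].[Date].&[20070701]:[Date].[Date].&[20080630]
            ,[Product].[Product Line].&[M])} on 1
from [Adventure Works]
```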

9.       Now Preview the report putting in values for the Parameters:

a.        clip_image011

b.       After clicking View Report you should see the following:

c.        clip_image012

10.    Now you can deploy the report to your Reporting Server.

a.        NOTE: Remember to deploy the report as well as the Shared Data Sources.


Configuring and creating your Data Driven Subscription (DDS)

1.       The first thing that you will need to do is to change the data source for your report to use stored credentials.

a.        As with our example we are going to complete the following for our 2 Shared Data Sources from our Report we deployed earlier

                                                               i.      clip_image013

b.       The reason for this is so that when the report runs from the job schedule later, it will be running from a SQL Server Agent job, which will then require a way to be authenticated for the data source.

c.        So this has to be completed before configuring your DDS.

d.       We configured ours to use a Domain account as shown below:

                                                               i.      clip_image014

e.       Then click Test Connection to ensure that it will connect and work:

                                                               i.      clip_image015

f.         Then click Apply to save the changes.

2.       You can now test your report on the report server just to ensure that if you pass the correct parameters it will work.

3.       Next we are going to configure our DDS as explained below:

a.        Click on the arrow next to your report and select Manage as shown below

b.       clip_image016

c.        Then click on Subscriptions

d.       Now click on new Data-driven Subscription

                                                               i.      clip_image017

e.       This will then start the Report Subscription Wizard

f.         On the first screen Step 1 – Create a data-driven subscription: Internet Sales – Report Subscription we configured it with the following:

                                                               i.      clip_image018

                                                              ii.      As you can see from above we gave it a name.

                                                            iii.      We then specified that it will be delivered via email

1.       NOTE: You will have to have configured your SSRS Server to relay email through a server for this option to appear.

                                                            iv.      And finally we specified to use a Shared Data Source.

1.       NOTE: This is what we configured in step 1 above.

                                                              v.      Click Next.

g.        On the Step 2 – Create a data-driven subscription: Internet Sales – Report Subscription we then selected our Shared Data Source:

                                                               i.      clip_image019

                                                              ii.      NOTE: You must select the SQL Server data source connection because in the following step we are going to use a TSQL Query to get our required information.

                                                            iii.      Click Next

h.       On the Step 3 – Create a data-driven subscription: Internet Sales – Report Subscription this is where we use a TSQL Query to pass the details which will be required in the later steps


SELECT [StartDateKey]
      ,[EndDateKey]
      ,[Product Line For Query]
      ,[Email Address]
      ,'This report Subscription is for Product Line '+[Product Line]+' and Date Range: '+[Date Range] as SubjectDetails
  FROM [dbo].[Mart_TD_SharePoint_ReportSubscriptions] with (nolock)


                                                              ii.      NOTE: You will see that we have put in an extra column so that we can use this as our Subject for our email.

                                                            iii.      Then click on Validate, this will ensure that it can make the connection to the Shared Data Source which you configured in the previous step.

                                                            iv.      clip_image020

                                                             v.      Click Next

i.         Now on the Step 4 – Create a data-driven subscription: Internet Sales – Report Subscription, we are going to configure the sending of the email with the following below:

                                                               i.      To:

1.       This is the To email address for the subscription

2.       You will change it to Get the value from the database, and select Email Address

a.        clip_image021

                                                              ii.      Subject

1.       This is the subject for your email that is being sent

2.       You will change it to Get the value from the database, and select Subject Details

a.        clip_image022

                                                            iii.      In our example we changed the Include Link to No Value

1.       NOTE: This is because we did not want them to click back to the report that was emailed.

2.       clip_image023

                                                            iv.      Click Next

j.         On the Step 5 – Create a data-driven subscription: Internet Sales – Report Subscription, this is where you configure what parameters will be passed from your query in Step 3 above

                                                               i.      As with our example we configured it with the following:

1.       clip_image024

                                                             ii.      Click Next

k.        On the Step 6 – Create a data-driven subscription: Internet Sales – Report Subscription, this is where you can configure if you want to create a schedule.

                                                               i.      We wanted to create a schedule so that the report would be emailed out

                                                              ii.      clip_image025

                                                            iii.      Click Next

l.         On the final screen Step 7 – Create a data-driven subscription: Internet Sales – Report Subscription, this is where you can configure your schedule which for our requirements we configured it with the following below:

                                                               i.      clip_image026

m.      Then click Finish to complete your Data Driven Report Subscription

                                                               i.      NOTE: If you have not configured your Shared Data Source to use stored credentials you will get the following when you click Finish

                                                             ii.      The current action cannot be completed. The user data source credentials do not meet the requirements to run this report or shared dataset. Either the user data source credentials are not stored in the report server database, or the user data source is configured not to require credentials but the unattended execution account is not specified. (rsInvalidDataSourceCredentialSetting) Get Online Help

                                                            iii.      You will then need to go back and modify the Shared Data Source and then re-create your DDS

n.       You will then see the following:

o.       clip_image027

4.       Now if you want to manually test this and you have access to the SQL Server, you can find the report subscription if you go into your SSRS Server and open the SQL Server Agent, then the Job Activity Monitor

a.        Then the way to find it is to look for the Job Names that start with a GUID

b.       As with our example it had the following name:

                                                               i.      clip_image028

c.        You can also confirm by looking at the Next Run

                                                               i.      clip_image029

                                                              ii.      NOTE: When you create your subscription it will always be for the following day to start.

d.       Right click and select Start Job as Step

                                                               i.      It should complete very quickly and be successful

                                                              ii.      clip_image030
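If you have many subscriptions, hunting for the right GUID can be tedious. As an alternative (not covered in the original steps), you can map the Agent job names back to their subscriptions by querying the SSRS catalog tables, assuming the default catalog database name of ReportServer:

```sql
-- Maps SQL Server Agent job names (GUIDs) back to their SSRS subscriptions.
-- Assumes the default catalog database name of ReportServer.
SELECT j.[name]        AS JobName
      ,c.[Name]        AS ReportName
      ,s.[Description] AS SubscriptionDescription
FROM msdb.dbo.sysjobs j
JOIN [ReportServer].dbo.ReportSchedule rs
      ON j.[name] = CONVERT(nvarchar(128), rs.ScheduleID)
JOIN [ReportServer].dbo.Subscriptions s
      ON rs.SubscriptionID = s.SubscriptionID
JOIN [ReportServer].dbo.[Catalog] c
      ON s.Report_OID = c.ItemID
```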

5.       Now if you go and check your email you should see the following below:

a.        NOTE: For our subscription we had selected the following

                                                               i.      Date Range: July 2007 – June 2008

                                                             ii.      Product Line: Mountain

b.       clip_image031

c.        Which as you can see from above we received.


Creating the Power View Report and putting the link into the SSRS Report

1.       We then went ahead and created our Power View report which connected to our SSAS Cube and looked like the following:

a.        clip_image033

b.       NOTE: This is before we put in any filters to only show our selection from our report.

2.       Next we went back into our SSRS report and put in the following below so that we can use it as a Hyperlink in our report

a.        Then in the bottom of the report right click and select Insert and then Text Box

b.       clip_image034

c.        Double click in the Textbox or within your table so that it is highlighted.

                                                               i.      Then right click and select Create Placeholder

                                                              ii.      clip_image035

                                                            iii.      Now this will open the Placeholder Properties

                                                            iv.      In the General Window you first need to put in your text or expression into the Value area.

1.       Click on the function button

2.       As with our example we put in the following below.

3.       You will see that we have already formatted it with our HTML Tags

="<a href=""http://powerviewServer/powerpivot/_layouts/15/ReportServer/AdHocReportDesigner.aspx?RelativeReportUrl=/powerpivot/Shared%20Documents/Adventure%20Works%20DW%202012/Report%20Subscription%20-%20Adventure%20Works.rdlx&ViewMode=Presentation&PreviewBar=False&rf=[Product].[Product%20Line]%20eq%20%27Mountain%27 "">Click HERE to view more information</a>"

                                                              v.      From the above example we have set the Preview Bar to false so that when people are viewing the report, they can only view the report and not make any changes.

1.       It is highlighted above in RED

2.       &PreviewBar=False

                                                            vi.      From the above example we have hardcoded our Filter above with Mountain

1.       It is highlighted in Purple

2.       &rf=[Product].[Product%20Line]%20eq%20%27Mountain%27

                                                           vii.      Now what we will do is to replace the hardcoded value with the value from our query, as shown below:

&rf=[Product].[Product%20Line]%20eq%20%27"+First(Fields!Product_Line.Value, "InternetSales_Dates_ProductLine")+"%27

                                                         viii.      So now the entire HTML Tag from above will look like the following:

="<a href=""http://powerview2013dev/powerpivot/_layouts/15/ReportServer/AdHocReportDesigner.aspx?RelativeReportUrl=/powerpivot/Shared%20Documents/Adventure%20Works%20DW%202012/Report%20Subscription%20-%20Adventure%20Works.rdlx&ViewMode=Presentation&PreviewBar=False&rf=[Product].[Product%20Line]%20eq%20%27"+First(Fields!Product_Line.Value, "InternetSales_Dates_ProductLine")+"%27 "">Click HERE to view more information</a>"

                                                            ix.      Once you have formatted your expression with your HTML tags then click Ok.

d.       The final part is to enable your Placeholder for HTML.

                                                               i.      Under Markup Type click on HTML – Interpret HTML tags as styles

                                                              ii.      clip_image036

                                                            iii.      Then click Ok.

e.       It should now look like the following:

                                                               i.      clip_image037

3.       Once the report is run or emailed it will then look like this at the bottom of the report:

a.        clip_image038

4.       Finally deploy your changes to your report server.

5.       Then to test to make sure it works correctly, run the job again as explained above.

a.        Once you get the email, as with our example we had selected Mountain.

b.       After clicking the above link to view more information we saw the following Power View Report.

c.        clip_image040




Framework for Automating SQL Server Reporting Services (SSRS) data driven subscriptions – Part 1

Description: The goal was to find a way to automate how an end user could subscribe to an SSRS Data Driven Subscription without any intervention from the Business Intelligence developer, to enable the end users to make their own selections, and finally, if they no longer wanted to receive the report, to let them delete their subscription.

In Part 1 we will be creating the SharePoint List, and using SQL Server Integration Services to get the data into our table.

·         What we will be doing below is to first create a SharePoint List

o    NOTE: This is created on SharePoint 2013

§  And for SSIS we will be using SQL Server 2012

·         Then we will use SSIS and a new Data Flow Task to get the data out of our SharePoint List and put this into a SQL server table.

·         After which we will then transform some of the data and then store it in a Dimension table.

o    This dimension table will then be used later in a SSRS Data Driven subscription so that we can email the report out.


Example for documentation:

·         We are going to create our List on SharePoint 2013

o    Within our List we are going to have the following fields:

§  Subscription Name

§  Date Range

·         We will specify custom Date Ranges

§  Product Line

·         We will pre-populate the Product Line list

§  Email Address

·         All the above information will be coming from the AdventureWorksDW2012 SQL database


You can get a copy of this here:

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012


Creating SharePoint 2013 List

1.       In your SharePoint 2013 website click on the Site Contents:

a.        clip_image001

2.       Then click on add an app

a.        clip_image002

3.       Then click on Custom List, this is so that we can create our own Custom List

a.        clip_image003

4.       This will bring up the Adding Custom List Window

a.        As with our example we created the following Custom List Name:

                                                               i.      clip_image004

b.       Click Create

5.       Now click on your Custom List you created to go into your Custom List

a.        clip_image005

6.       Now at the ribbon on the top click on List and then List Settings on the right hand side

a.        clip_image006

b.       This will then take you to the List Settings

7.       As per our example explained above, we are going to create the columns detailed below.

8.       Under Columns click on Create column:

a.        clip_image007

b.       First we created our Subscription Name by clicking on Title and changing it to the following:

                                                               i.      clip_image008

                                                              ii.      Click Ok to create the column

c.        Click on Create Column again to create the Date Range.

                                                               i.      NOTE: We are going to create names for our Date Range which we will convert to actual dates when we insert the data in SSIS in the later steps.

1.       The choices we are going to create are the following, due to our data only having actual data from July 2007 to June 2008

2.       Ranges:

a.        July 2007

b.       July 2007 – December 2007

c.        July 2007 – June 2008

3.       clip_image010

4.       Click Ok to create

d.       Click Create column again for the Product Line.

                                                               i.      Once again it will be a choice of the following:

1.       Accessory

2.       Mountain

3.       Road

4.       Touring

                                                              ii.      clip_image012

                                                            iii.      Click Ok to Create

e.       The final column to be created is the Email Address column as shown below:

                                                               i.      clip_image013

                                                              ii.      Click Ok to Create.

9.       So once completed you will see the following:

a.        clip_image014

10.    Next we will use SSIS to bring the data down into a SQL Table.


Downloading and installing the SSIS component to query data from SharePoint Lists

1.       Click on this link below and download the following:


2.       clip_image015

3.       Once it has been downloaded, you can run the MSI and complete the steps.

a.        NOTE: You will have to close down and open up your BIDS (Business Intelligence Development Studio) or SSDT (SQL Server Data Tools) in order to see the SharePoint List data.             


Creating the SSIS Package to get the data from the SharePoint list into a SQL Server Table.

1.       Create a new SSIS Package and give it an appropriate name.

a.        As with our example we named it the following:

b.       clip_image016

2.       The next thing that you need to do is to create a connection to your SharePoint Server by doing the following:

a.        Right click on the Connection Managers and select New Connection.

b.       Scroll down and select the following:

                                                               i.      clip_image017

c.        Then click Add which will bring up the SharePoint Credential Connection Manager Editor.

d.       As with our example we put in the following:

                                                               i.      clip_image018

                                                              ii.      NOTE: Make sure that the SQL Server Agent Job service account has the permissions to SharePoint.

1.       Alternatively use a Custom Credential which will require access to the SharePoint List.

2.       In our environment we have got a domain account, whose password does not change, that we use.

3.       Next drag in a Data Flow Task and as with our example we gave it the following name:

a.        clip_image019

4.       Now go into your Data Flow Task and configure it with the following:

a.        Click in the SSIS Toolbox and under Common you will see your SharePoint List Source

                                                               i.      clip_image020

b.       Drag this into your Data Flow Task and configure it with the following:

                                                               i.      Double click to go into the Properties

                                                              ii.      On the connection Managers Window you have to select your connection Manager:

1.       clip_image021

                                                            iii.      Then click on the Component Properties and put in the following:

1.       clip_image022

a.        This is so that it will look for all the SharePoint Lists

2.       clip_image023

a.        This will be the actual Root of your SharePoint location

b.       NOTE: It must NOT be the actual URL to your list; if you use that it will not find the List.

3.       Then scroll up to the Site List Name and in here you will put your List that you created above.

a.        As shown below is our example:

b.       clip_image024

                                                            iv.      Now click on Column Mappings and if the above settings are correct you should see all the columns as shown below:

                                                              v.      clip_image025

c.        Then click Ok to complete your SharePoint List Source

5.       Next is to create the destination and for this you need to create your destination table first.

6.       Below is the scripted out table that we used:

CREATE TABLE [dbo].[Staging_tb_SharePointList](
       [ID] [int] NOT NULL,
       [Date Range] [nvarchar](255) NOT NULL,
       [Product Line] [nvarchar](255) NOT NULL,
       [Email Address] [nvarchar](255) NOT NULL,
       [Subscription Name (LinkTitle)] [nvarchar](255) NOT NULL,
       [Subscription Name (Title)] [nvarchar](255) NOT NULL,
       [Subscription Name (LinkTitleNoMenu)] [nvarchar](255) NOT NULL,
       -- plus the remaining SharePoint system columns from the source, for example:
       [App Modified By] [nvarchar](255) NOT NULL
       -- ...
)
a.        NOTE: Some of the columns from the source might be set to nvarchar(4000) but I know that we would not have anything this long so we left them all at 255 to save space.

7.       Next we dragged in our OLE DB Destination:

a.        NOTE: You would have to have created a Connection to your Destination Database

b.       clip_image026

c.        As you can see we configured it with the following:

                                                               i.      clip_image027

d.       And mapped the columns as required.

                                                               i.      NOTE: Ensure that all the mappings are correct

8.       Once done it should look like the following:

a.        clip_image028

b.       As noted above the reason for the warning is due to some columns from the source having nvarchar(4000)

9.       Now the final step is to run it and see if it works as expected

a.        But before doing this first add an item to the list:

b.       clip_image029

c.        ClickSave

10.   Now run your data flow task and you should hopefully see 1 row transferred

a.        clip_image030


Creating the Stored Procedure to insert the data into the Dimension (storage table) – To be used in Data Driven Subscription

Below is the Stored Procedure (SP) so that we can store the details from the SharePoint List and create some additional columns.

The destination table will also be used later in the Reporting Services data driven subscription.


1.       Below is the SP which you can use to create your SP.

a.        NOTE: You will have to first create the destination table, which you can complete within the SP.
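The original post did not include the script for the destination table. A possible shape, using the column names referenced by the merge statement and the DDS query (the data types here are an assumption), would be:

```sql
-- A possible destination table shape (column names taken from the merge
-- statement and DDS query in this framework; exact types are an assumption):
CREATE TABLE [dbo].[Mart_TD_SharePoint_ReportSubscriptions](
       [ID] [int] NOT NULL,
       [Date Range] [nvarchar](255) NOT NULL,
       [Product Line] [nvarchar](255) NOT NULL,
       [Product Line For Query] [nvarchar](10) NULL,
       [Email Address] [nvarchar](255) NOT NULL,
       [Subscription Name (Title)] [nvarchar](255) NOT NULL,
       [StartDateKey] [int] NULL,
       [EndDateKey] [int] NULL
)
```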



/****** Object:  StoredProcedure [dbo].[prc_LoadDIM_Mart_TD_SharePoint_ReportSubscription]    Script Date: 2014-04-23 07:50:25 AM ******/

-- ===============================================================================================================
-- Disclaimer
-- The sample scripts are provided AS IS without warranty of any kind.
-- I further disclaim all implied warranties including, without limitation, any implied warranties of
-- merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance
-- of the sample scripts and documentation remains with you.
-- In no event shall I, its authors, or anyone else involved in the creation, production, or delivery of
-- the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business
-- profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use
-- of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the
-- possibility of such damages.
-- ===============================================================================================================

-- ===============================================================================================================
-- Author:           Gilbertq
-- Create date: 16 Apr 2014
-- Description:      Here we are going to Define our Start and End Dates for our selection
--              from our SharePoint List
-- NOTE: The format of the Start and End Dates are going to be the DateKey
--       from our Date Table so that this can be passed to our SSAS
--       AdventureWorksDW2012 cube.
-- ===============================================================================================================

CREATE PROCEDURE [dbo].[prc_LoadDIM_Mart_TD_SharePoint_ReportSubscription]
AS

DECLARE @Selection_StartJuly2007 int
       ,@Selection_EndJuly2007 int
       ,@Selection_December2007 int
       ,@Selection_June2008 int


-- ===========================================================================
-- 1. Getting the DateKey for our Selection: 01 July 2007
-- ===========================================================================

Set @Selection_StartJuly2007 = 20070701

-- ===========================================================================
-- 2. Getting the DateKey for our Selection: 31 July 2007
-- ===========================================================================

Set @Selection_EndJuly2007 = 20070731

-- ===========================================================================
-- 3. Getting the DateKey for our Selection: December 2007
-- ===========================================================================

Set @Selection_December2007 = 20071231

-- ===========================================================================
-- 4. Getting the DateKey for our Selection: June 2008
-- ===========================================================================

Set @Selection_June2008 = 20080630

-- ===========================================================================
-- Below is the query to either update an existing Subscription or if new
-- then insert into the table
-- ===========================================================================

Merge [dbo].[Mart_TD_SharePoint_ReportSubscriptions] as [Dest]
       Using (
                     SELECT   [ID]
                                  ,[Date Range]
                                  ,[Product Line]
                                  ,Case when [Product Line] = 'Accessory' then 'S'
                                        when [Product Line] = 'Mountain'  then 'M'
                                        when [Product Line] = 'Road'      then 'R'
                                        when [Product Line] = 'Touring'   then 'T'
                                   End as [Product Line For Query]
                                  ,[Email Address]
                                  ,[Subscription Name (Title)]
                                  ,Case when [Date Range] = 'July 2007'                 Then @Selection_StartJuly2007
                                        when [Date Range] = 'July 2007 – December 2007' Then @Selection_StartJuly2007
                                        when [Date Range] = 'July 2007 – June 2008'     Then @Selection_StartJuly2007
                                   End as [StartDateKey]
                                  ,Case when [Date Range] = 'July 2007'                 Then @Selection_EndJuly2007
                                        when [Date Range] = 'July 2007 – December 2007' Then @Selection_December2007
                                        when [Date Range] = 'July 2007 – June 2008'     Then @Selection_June2008
                                   End as [EndDateKey]
                           FROM [AdventureWorksDW2012].[dbo].[Staging_tb_SharePointList] with (nolock)
       ) as [Source]
       On  [Dest].[ID] = [Source].[ID]
       and [Dest].[Email Address] = [Source].[Email Address]
       and [Dest].[Subscription Name (Title)] = [Source].[Subscription Name (Title)]
       When Matched then Update Set
                [Dest].[Date Range] = Source.[Date Range]
               ,[Dest].[Email Address] = Source.[Email Address]
               ,[Dest].[Subscription Name (Title)] = Source.[Subscription Name (Title)]
               ,[Dest].[StartDateKey] = Source.[StartDateKey]
               ,[Dest].[EndDateKey] = Source.[EndDateKey]
               ,[Dest].[Product Line] = Source.[Product Line]
               ,[Dest].[Product Line For Query] = Source.[Product Line For Query]
       When Not Matched by Target then Insert (
                [ID]
               ,[Date Range]
               ,[Email Address]
               ,[Subscription Name (Title)]
               ,[StartDateKey]
               ,[EndDateKey]
               ,[Product Line]
               ,[Product Line For Query]
        ) Values (
                Source.[ID]
               ,Source.[Date Range]
               ,Source.[Email Address]
               ,Source.[Subscription Name (Title)]
               ,Source.[StartDateKey]
               ,Source.[EndDateKey]
               ,Source.[Product Line]
               ,Source.[Product Line For Query]
        );

c.        In the above SP, the reason for creating the StartDateKey and EndDateKey variables is our choice of Date Range in the SharePoint list:

                                                               i.      clip_image031

                                                              ii.      And the above correlates to our variables in the SP.

                                                            iii.      NOTE: The StartDateKey and EndDateKey will be used later in the SQL Server Reporting Services (SSRS) Data Driven subscription, which in turn will pass the information to the SSRS Report Parameters

d.       We also had to put in a Case statement for the Product Line For Query so that it would match the underlying values in our SSAS dimension

e.       Another thing that we do is validate the email addresses, to ensure that there are no bogus email addresses.

2.       So this is what the table will look like with the query for the Data Driven Subscription

3.       clip_image032
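The email validation mentioned in point (e) is not shown above; below is a minimal sketch of the kind of pattern-based check that can be done in T-SQL (the exact rules used in our environment may differ):

```sql
-- Illustrative only: flag obviously bogus email addresses in the staging list
-- using simple pattern checks (text either side of the @, at least one dot
-- in the domain part, no spaces, no obviously invalid characters).
SELECT [Email Address]
FROM   [AdventureWorksDW2012].[dbo].[Staging_tb_SharePointList]
WHERE  [Email Address] NOT LIKE '%_@_%.__%'
   OR  [Email Address] LIKE '% %'
   OR  [Email Address] LIKE '%[",;]%'
```

Rows returned by this query can then be excluded from the MERGE, or sent back to the list owner for correction.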


The next steps will be in the Reporting Services quick wins document, where we will explain how we create our report with the required parameters.

After which we then create our Data Driven Subscription.

And then finally we create our Power View report and pass its link, with the filter, into our SSRS report.
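As the NOTE in point (iii) above says, the StartDateKey and EndDateKey are picked up by the SSRS Data Driven Subscription and passed to the report parameters. A sketch of the sort of query the subscription would run is below; the subscription table name is a placeholder, since it is not named above:

```sql
-- Placeholder table name: substitute the table your subscriptions are stored
-- in. Each row drives one subscription, with the DateKeys passed through to
-- the SSRS report parameters.
SELECT [Email Address]
      ,[Subscription Name (Title)]
      ,[Product Line For Query]
      ,[StartDateKey]
      ,[EndDateKey]
FROM   [dbo].[tb_Subscriptions]
```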


SharePoint 2013 – Creating a data source to an SQL Server Analysis Services (SSAS) OLAP Cube

Below are the details for SharePoint 2013 when you want to connect to a SQL Server Analysis Services (SSAS) OLAP cube, instead of only being able to use SSAS Tabular.



·         In our example we are going to connect to our Adventure Works DW 2012 SSAS


1.       Go into your SharePoint 2013 website and then click on Files, then New Document and then select Report Data Source.

a.        clip_image002

2.       This will then open the Data Source Properties window where you will configure it with the following explained below.

a.        Where it says Name put in a name so that you know it is your connection file.

b.       clip_image004

c.        Next where it says Data Source Type click on the drop down and select the following:

d.       clip_image006

e.       Now where it says Connection string, below is the syntax that you have to use, with an explanation afterwards:

Data Source=Server\InstanceName;initial catalog=AdventureWorksDW2012;cube='Adventure Works'

                                                                                       i.      From the syntax above, the part highlighted in RED specifies the SSAS OLAP server name and, if required, the instance name.

1.       As with our example it was called:

a.        Server\InstanceName

                                                                                      ii.      The next section highlighted in GREEN is your SSAS OLAP Cube database name.

1.       As with our example is:

a.        initial catalog=AdventureWorksDW2012

                                                                                    iii.      The final part which you have to specify is the actual cube name, which is highlighted in PURPLE

1.       As with our example it is called:

a.        ;cube='Adventure Works'

                                                                                    iv.      clip_image008

f.         The next section is the Credentials; this is what will be used when running the data source.

                                                                                       i.      In our environment we have set up a static domain account where the password does not change, so that we can use this account and grant it permissions to the required cubes as well as roles.

                                                                                      ii.      NOTE: We also did this because it is a lot simpler to configure the security for our SSAS OLAP cube via the standard roles.

                                                                                    iii.      So with our example we select Stored Credentials and put in our User Name and Password:

1.       clip_image010

2.       NOTE: We selected Use as Windows credentials so that it knows it is a Windows Domain account.

                                                                                    iv.      Then click on Test Connection to make sure the connection to the cube is correct as well as the credentials.

1.       clip_image012

3.       Then click Ok to create the Report Data Source.

4.       NOTE: If you are concerned about security for people using this data source to see specific data you can do one of two things:

a.        ONE: Change the above credentials to Windows Authentication.

b.       TWO: Limit who has access to the actual Report Data Source, the Power View Report, as well as the folder that stores all the information, and set unique permissions.

5.       Now as you can see below we created a quick and easy Power View Report on our SSAS OLAP Instance.

6.       clip_image014

Date Filtering in Excel from your Analysis Services Dimension

I recently had to set up another Date Dimension and went through the process of setting up all the properties in the date dimension so that when it is used in Excel, the end users would then be able to use the Date Filters built into Excel.

The Excel Date Filters are very handy when creating Excel documents. For example, if you set your Date Filter to a relative period such as This Month, every time you open your Excel document it will automatically filter to the current month.

So below are the details on how to set this up on your Date Dimension. I have used the Adventure Works DW 2008R2 Database and Analysis Services Project.

  1. Open up your Dim Date Dimension.
  2. Then you need to click on your Date Key and select Properties.
  3. You MUST now add the following to your Date Key Properties:
    1. Where it says Key Column, make sure it has the default, which should be:

i.      DimDate.DateKey (Integer)

    2. Where it says Name Column, click on the Ellipses button and select the following:

i.      DimDate.FullDateAlternateKey (WChar)

ii.      NOTE: Make sure that it is set to WChar

    3. Where it says Value Column, click on the Ellipses button and select the following:

i.      DimDate.FullDateAlternateKey (Date)

ii.      NOTE: Make sure that it is set to Date

  4. Then if you have a Hierarchy defined, you need to create the attribute relationships in order for the Date to filter correctly in Excel.
  5. With our example we had a hierarchy with the following:
    1. It went Calendar Year – Calendar Quarter – English Month Name – Full Date Alternate Key.
    2. So in the Attribute Relationships screen we created the following relationships, with the screenshot below:

Figure 2 – Attribute Relationships Example

  6. NOTE: When creating the Attribute Relationships it always goes from right to left, starting with the Year and ending with the Date.
  7. Next, for each of the attributes used in your Hierarchy you must ensure that they ALL have a Name Column value defined.
    1. As in our example, for Date it had TD_Date.Date (WChar) in the Name Column, as well as all the others for English Month Name, Calendar Quarter and Calendar Year.
    2. Next you need to set the Type of your Dimension Attribute to match what your date attribute represents.
      1. EG:

i.      If it is Calendar Quarter then you need to set the Type for Quarter to Quarters

  8. You do this by doing the following for each of the Dimension Attributes in your Hierarchy above:

i.      So it would be for Full Date Alternate Key, English Month Name, Calendar Quarter and Calendar Year.

ii.      We are going to start with Full Date Alternate Key as our example, but you would need to do this for all of the attributes above.

iii.      Click on Full Date Alternate Key in the Attributes pane.

    1. Then either right click and select Properties, or click on the Properties window.
    2. Under Basic look for Type.
    3. Click on the drop down next to Type.
      1. Then click on Date, then the plus sign next to Date, and scroll down until you get to Days.
    4. Then click on Days.
    5. Once done it will then look like the following:

  9. You will then need to set the Type for each of the following:
    1. English Month Name
       Type:

i.      Months

    2. Calendar Quarter
       Type:

i.      Quarters

    3. Calendar Year
       Type:

i.      Years

  10. You then need to put in the following so that you don’t get duplicates when trying to process the Date Dimension.
    1. NOTE: You do not have to do this for all the levels, only the specific ones below.

i.      English Month Name

    2. Go into the Properties for English Month Name and under Key Columns put in the following, in the same order as shown below:

i.      Calendar Quarter

    3. Go into the Properties for Calendar Quarter and under Key Columns put in the following, in the same order as shown below:

  11. The final step is to set your Dimension Type to Time.
    1. You do this by clicking on Dim Date at the top of your Attributes, then right clicking and selecting Properties.

i.      Where it says Type, change this to Time.

  12. Then finally do a Full Process on the Date Dim.
  13. Once that is done it will invalidate all the cubes associated with the Date Dim, so you will also have to do a Full Process on those cubes for the new Date Dim to pull through.
  14. Finally go through to your Excel spreadsheet, and possibly refresh the data if it already had data.
    1. Then put the Date Hierarchy in your Row Labels and put something in your Values; you should then see your Date Filters as shown below.
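To see where the duplicate-key problem mentioned above comes from, a quick query against the relational DimDate table (standard AdventureWorksDW schema) shows that month names on their own are not unique; the usual fix (an assumption here, since the exact Key Columns are in the screenshots) is to add Calendar Year to the Key Columns:

```sql
-- Each English month name repeats once per calendar year, so on its own it
-- cannot be a unique attribute key; a composite key (e.g. CalendarYear plus
-- the month) removes the duplicate key errors during processing.
SELECT EnglishMonthName
      ,COUNT(DISTINCT CalendarYear) AS NumberOfYears
FROM   dbo.DimDate
GROUP  BY EnglishMonthName
HAVING COUNT(DISTINCT CalendarYear) > 1
```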

Date Filtering in Excel from your Analysis Services Tabular Model

Following on from how to format the date in the Analysis Services UDM model, today I found myself trying to do the same thing for the Tabular model, where Excel was viewing the dates as a label and not as an actual Date. After some looking around it is fairly simple to implement, so that you can then use the Date Functions within Excel.

  1. Go into your Tabular Project.
  2. Go into the Date sheet that you have imported.
  3. Then click on any column within your Date sheet, so that it is selected.

i.      NOTE: I had already set the properties in my Date Column to the following:

  4. Now at the top click on Table, then click on Date, and then you will see Mark As Date Table.

  5. This will then open the Mark as Date Table window.
    1. Click on the drop down and select your correctly formatted Date Column.

i.      In my example below it was called Date.

  6. Then click Ok.
  7. Now deploy and then process your Tabular Model.
  8. Now when you go into Excel you will see the following: