SSRSDB – Automated contact with users when reports are not used


 

Hi everyone,

This is the 4th part of the SSRSDB blog series. In this part we will look at how we can use the data we are collecting to automate contact with our users. Have you ever been asked to create a new dashboard that replaces current reports, then asked if you can remove those replaced reports, only to be told "no, we still need those"? This happens a lot. The problem it causes is duplication of reporting, and duplication can lead to multiple reports with slightly different business rules, which in turn leads to confusion and misleading information.

How do we avoid this happening? One method is to analyse the frequency of report use and compare it against a predefined usage threshold. In this section I want to look at that approach.

In the last section we looked at the extended metadata that supports monitoring each report, including who owns the report and their email address. The metadata also contains the following columns:

Inactive Period – The number of days allowed to elapse without usage. This needs to be set per report, because you will likely have some reports that are run daily and others that are only run once a quarter.

Inactive Action – There are 2 outcomes at this stage:

NOTHING – Do Nothing
EMAIL – Send a friendly email to the user.

Final Period – The number of days that can elapse before the final action is taken.

Final Action – This is the action that takes place once the final period has elapsed. At present you're limited to the following actions:

NOTHING – Do nothing
HIDE – Hide the report*
DELETE – Trigger the ReportServer to delete the report*

*Microsoft do not want you to do this: it is making a change to the ReportServer without going through the ReportServer, which is a big no-no. However, both the hide and the delete processes trigger the same stored procedures that the Report Manager GUI uses.

A typical set-up might be to set a report to email after 7 days; the report is then still available to users if they want to run it. After a further 7 days the report is either hidden or deleted. If you're working in an environment that does not source control your reports, you can still get a version of the report back from dim_catalog in the event it is deleted in error. On a side note, you should be source controlling your reports – use SVN or another source control solution.
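
To make that concrete, here is a sketch of how the metadata for such a report might be set. The column names are those used by the queries below; the key value is illustrative.

UPDATE Support.Item_Metadata
SET	Item_In_Active_Period = 7,	-- email the owner after 7 days without use
	Item_In_Active_Action = 'EMAIL',
	Item_Final_Period = 14,		-- hide after a further 7 days (14 in total)
	Item_Final_action = 'HIDE'
WHERE Item_BusinessKey = 'YOUR-REPORT-BUSINESSKEY';	-- illustrative key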

With this data we can start looking at a process to automate contact with our users. OK, let's look at the SQL to do this. We have a query to check whether a user needs to be contacted; we then loop over the results with a cursor and send each owner a friendly email to say "Hey! You are not using the report I worked on!".

To allow this to work you need to have configured Database Mail on your server. You can find a short guide on configuring Database Mail at this link – http://blog.sqlauthority.com/2008/08/23/sql-server-2008-configure-database-mail-send-email-from-sql-database/. Once you have this configured you can apply the following code to contact your users*.

*If you have not updated your metadata then this will not make contact.
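
Before scheduling anything, it is worth a quick smoke test to prove Database Mail works; a minimal sketch using the same profile name ('hyperbi') as the script below – swap in your own address:

EXEC msdb.dbo.sp_send_dbmail
	@profile_name = 'hyperbi',
	@recipients = 'you@example.com',	-- replace with your own address
	@subject = 'Database Mail test',
	@body = 'If you can read this, Database Mail is configured correctly.';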

 

Notify the user

 
 
-- Reports whose owners should be emailed: compare days since the last
-- successful execution against each report's inactive period.
DECLARE DeprecatedReportsCursor CURSOR FOR 
WITH LastExecution AS 
(
		SELECT 
			[Item_Skey]
			, MAX([ExecutionStartDate]) maxdate
		FROM 
			[dbo].[Fact_Transaction_Executions]
		GROUP BY
			[Item_Skey]
 
)
SELECT
	c.Item_Name
	, COALESCE(CONVERT(NVARCHAR(50),MAXDATE),'**NEVER RUN**') AS 'Execution_Date'
	, M.Item_Owner
	, CASE	WHEN maxdate IS NULL THEN 999
			ELSE DATEDIFF(DAY,MAXDATE,GETDATE())
	END AS 'Days_Since_Last_Execution'
	,C.item_URL
	,m.Item_In_Active_Period
	,CASE WHEN (m.Item_Final_Period - DATEDIFF(DAY,MAXDATE,GETDATE())) < 0 THEN 0 ELSE m.Item_Final_Period - DATEDIFF(DAY,MAXDATE,GETDATE()) END AS 'Days_To_Remove'
	,M.Item_Owner_Email AS 'Email_Address'
FROM 
	dbo.Dim_Catalog C
	LEFT JOIN LastExecution LE ON LE.[item_skey] = C.item_Skey
	LEFT JOIN Support.Item_Metadata M ON C.Item_BusinessKey = M.Item_BusinessKey
WHERE 
	c.SCD_Is_Current = 1
	AND c.Item_Type = 2	
	AND c.Item_Is_Hidden = 0
	AND M.Item_Type = 'Main'
	AND M.Item_In_Active_Action = 'EMAIL'
	AND COALESCE(DATEDIFF(DAY,MAXDATE,GETDATE()),999) >= CONVERT(INT,m.Item_In_Active_Period) -- COALESCE so never-run reports (999 days, as in the CASE above) are included
ORDER BY	
	c.item_Name
;
 
DECLARE @Date varchar(100)
DECLARE @MainBody varchar(MAX)
DECLARE @Link varchar(500)
DECLARE @ReportName VARCHAR(500)
DECLARE @ExecutionDate VARCHAR(500)
DECLARE @ReportRequestedBy VARCHAR(500)
DECLARE @DSE INT
DECLARE @InActivePeriod VARCHAR(5) 
DECLARE @DaysToRemove VARCHAR(5)
DECLARE @EmailSubject VARCHAR(500)
DECLARE @OwnerEmailAddress VARCHAR(100)
 
 
OPEN DeprecatedReportsCursor
FETCH NEXT FROM DeprecatedReportsCursor INTO @ReportName, @ExecutionDate, @ReportRequestedBy, @DSE, @Link,@InActivePeriod,@DaysToRemove, @OwnerEmailAddress
 
WHILE @@FETCH_STATUS = 0
	BEGIN 
 
SET @MainBody = '
<html>
<head>
	<title>Notification Email</title>
</head>
<body>
	<table width="800" border="0" cellpadding="0" cellspacing="0" bgcolor="#ffffff" style="font-size: 13">
			<tr bgcolor="#008a9f" width="734px"><td><font color="#008a9f">h</font></td></tr>
			<tr bgcolor="#00c3e1" width="734px"> <td><font color="#00c3e1">h</font></td> </tr>
			<tr bgcolor="#FFFFFF" width="734px"> <td><font color="#FFFFFF">h</font></td> </tr>
			<tr valign="top" align="center">
				<td style="border-width : 0px;"><img src="http://www.hyperbi.co.uk/wp-content/uploads/2015/01/HyperBi.png" alt="" title="" border="0" height="150" />
				</td>
			</tr>
			<tr bgcolor="#FFFFFF" width="734px"> <td><font color="#FFFFFF">h</font></td> </tr>
			<tr valign="top">
				<td style="border-width : 0px; font-family:Arial;">
					<h1 style="color:#E33030; text-align: center; ">Report due for deprecation. <u>Act now!</u></h1>
					<p>Hello,</p>
					<p><b>' + @ReportName + '</b> - Has not been used for more than ' + @InActivePeriod + ' days. </p>
					<p>This report is now scheduled for deprecation and will be removed <strong>' + CASE WHEN @DaysToRemove = 0 THEN 'Today' ELSE  'in ' + @DaysToRemove + ' days' END + '</strong>.</p>
					<p>
					Please click the link below to avoid this action.  <br />
					</p>
					<p>
					<a href="' + @Link + '">' + @ReportName + ' </a>  <br />
					</p>
					</BR>
					<p>Regards,</p>
					<p><i>Management Information Team.</i></p>
					</br></br>
 
				</td>
 
			</tr>
					<tr bgcolor="#ffffff" width="734px"> <td><font color="#ffffff">h</font></td> </tr>
					<tr bgcolor="#00c3e1" width="734px"> <td><font color="#00c3e1">h</font></td> </tr>
					<tr bgcolor="#008a9f" width="734px"><td><font color="#008a9f">h</font></td></tr>
					<tr bgcolor="#000000"><td style="padding: 10px; font-family:Arial"><font size="1" color="#FFFFFF">
					<p style="font-size: 12; font-family:Arial"></br>
					Please send any feedback to <a style="text-decoration: none; color: #FFFFFF" href="mailto:reportqueries@hyperbi.co.uk">reportqueries@hyperbi.co.uk</a>
					</p>
					<p style="font-size: 12; font-family:Arial">
					This email was auto generated. Please do not reply directly. 
					</p>
					<p></p></br>
 
</font></td></tr>
 
		</table>
 
 
	</body>
</html>
'
 
SET @EmailSubject = 'Important: Report due for deprecation - ' + @ReportName;
SET @ReportRequestedBy = @OwnerEmailAddress
 
		EXEC msdb.dbo.sp_send_dbmail
		@profile_name = 'hyperbi',
		@recipients = @OwnerEmailAddress,
		@subject = @EmailSubject,
		@body = @MainBody,
		@body_format = 'HTML'
 
		PRINT @MainBody
		FETCH NEXT FROM DeprecatedReportsCursor INTO @ReportName, @ExecutionDate, @ReportRequestedBy, @DSE, @Link,@InActivePeriod,@DaysToRemove, @OwnerEmailAddress
 
	END
CLOSE DeprecatedReportsCursor	
DEALLOCATE DeprecatedReportsCursor

Working our way from the top, we have a common table expression which calculates when a report was last accessed successfully (non-successful runs are not stored in the ExecutionLogStorage table). This feeds into a query which compares the report with the metadata to decide whether the user should be contacted by email – this must be expressed in the metadata. This is all loaded into a CURSOR (typically I would say that CURSORs are evil and should be avoided at all costs, but much like all questions related to database administration, it depends; looping to send emails is a great reason to use a CURSOR).

From here we load a set of variables from the cursor; these variables are injected into an HTML block which is used for the emails. You will notice that I have used quite an old style of HTML. This isn't because I am old school, but because these emails typically go to Outlook, which has very basic HTML processing. Once the HTML is built it is passed over to Database Mail – EXEC msdb.dbo.sp_send_dbmail handles this for us. Once executed, this will fire an email for every report which has not been accessed within its inactive period. The user will receive an email that looks like this:

[Screenshot: the notification email as it appears in the user's inbox]

This email is completely customisable with a little knowledge of HTML. It features your company's logo (1), a "call to action" prompting the user to decide what to do with their report (2), all the details about their report (3), a link to the report (4) – clicking the link runs the report, which resets the counter – and lastly all the contact information for your report-writing team (5).

Personally I have an agent job scheduled to run once a week on a Monday morning.
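
If you prefer to script the job rather than click through the agent wizard, a sketch of that weekly Monday-morning schedule might look like this. It assumes the notification script above has been wrapped in a stored procedure (usp_Notify_Inactive_Reports is a hypothetical name).

USE msdb;
EXEC dbo.sp_add_job @job_name = N'SSRSDB - Notify inactive report owners';
EXEC dbo.sp_add_jobstep @job_name = N'SSRSDB - Notify inactive report owners',
	@step_name = N'Send notifications',
	@subsystem = N'TSQL',
	@database_name = N'SSRSDB',
	@command = N'EXEC dbo.usp_Notify_Inactive_Reports;';	-- hypothetical wrapper proc
EXEC dbo.sp_add_jobschedule @job_name = N'SSRSDB - Notify inactive report owners',
	@name = N'Weekly - Monday morning',
	@freq_type = 8,			-- weekly
	@freq_interval = 2,		-- Monday
	@freq_recurrence_factor = 1,
	@active_start_time = 080000;	-- 08:00
EXEC dbo.sp_add_jobserver @job_name = N'SSRSDB - Notify inactive report owners';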

 

Automatic expiration

Sending an email is all well and good, but what happens if no one responds, or the only person who used the report left the company months ago? We need an automated process to remove or hide these reports based on the final action. To do that we have another agent job.

 
 
-- Reports past their final period: apply each report's final action
-- (NOTHING, HIDE or DELETE).
DECLARE DeprecatedReportsCursor CURSOR FOR 
WITH LastExecution AS 
(
		SELECT 
			[Item_Skey]
			, MAX([ExecutionStartDate]) maxdate
		FROM 
			[dbo].[Fact_Transaction_Executions]
		GROUP BY
			[Item_Skey]
 
)
SELECT
	c.Item_BusinessKey
	,m.Item_Final_action
	,c.Item_Path
FROM 
	dbo.Dim_Catalog C
	LEFT JOIN LastExecution LE ON LE.[item_skey] = C.item_Skey
	LEFT JOIN Support.Item_Metadata M ON C.Item_BusinessKey = M.Item_BusinessKey
WHERE 
	c.SCD_Is_Current = 1
	AND c.Item_Type = 2	
	AND M.Item_Type = 'Main'
	AND DATEDIFF(DAY,MAXDATE,GETDATE()) >= CONVERT(INT,m.Item_Final_Period)
ORDER BY	
	c.item_Name
;
 
DECLARE @Item_ID varchar(100)
DECLARE @SQL varchar(MAX)
DECLARE @FinalAction VARCHAR(50)
DECLARE @Item_Path VARCHAR(250)
 
OPEN DeprecatedReportsCursor
FETCH NEXT FROM DeprecatedReportsCursor INTO @Item_ID, @FinalAction, @Item_Path
 
WHILE @@FETCH_STATUS = 0
	BEGIN 

SET @SQL = NULL	-- clear each iteration so a 'NOTHING' report cannot re-run the previous report's statement

IF @FinalAction = 'HIDE'
BEGIN 
	-- Flip the same Hidden flag the Report Manager GUI sets
	SET @SQL = '
	UPDATE c SET Hidden = 1
	FROM reportserver.dbo.CATALOG c
	WHERE ItemID = ''' + @Item_ID + ''''
	PRINT 'Report Hidden'
END 
 
IF @FinalAction = 'DELETE'
BEGIN 
	-- Delete via the same stored procedure the ReportServer itself uses
	SET @SQL = 'EXEC ReportServer.dbo.DeleteObject @Path=N''' + @Item_Path +  ''',@AuthType=1,@Prefix=N''' + @Item_Path + '%'''
	PRINT 'Report deleted'
END 
 
IF @FinalAction = 'NOTHING'
BEGIN 
	PRINT 'nothing'
END 
 
PRINT @SQL
 
IF @SQL IS NOT NULL
	EXEC (@SQL)
 
FETCH NEXT FROM DeprecatedReportsCursor INTO @Item_ID, @FinalAction, @Item_Path
 
	END
CLOSE DeprecatedReportsCursor	
DEALLOCATE DeprecatedReportsCursor

This process is very similar to the one above: we have a CURSOR that loops over all the reports that are past the final notice period and either does nothing, hides the report or deletes it.

As stated above, this is all done in the same way the ReportServer does it, to avoid causing any internal issues (use it at your own risk).

 

I have this as a separate job which runs on the evening of the same day as the first job.

 

With that you have an automated report expiration process for SSRS.

 

Thanks for reading.

 

SSRSDB – Loading the SSRS data warehouse report monitor


Hi everyone,

This is the 3rd instalment in my series looking at reporting on SQL Server Reporting Services. In the last section we looked at the data warehouse. In this section we will look at loading the DW with all the data we need to effectively monitor our SSRS environment.

The ETL process is formed from 2 steps.

Step 1 – Back up the existing ReportServer and restore it to a new database, "ReportServer_Clone". There are several reasons why you might want to do this; the main one is to remove any strain from your production server. Microsoft does not advise making alterations to the ReportServer database, so we are good on that front too. The final stage is to add a view and persist that view to a table. It is a very expensive view, which is why it is persisted: the view is used for most of the load process, so persisting it to a table enables the DW to load faster.
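
A minimal sketch of this backup/restore, assuming the default logical file names (check yours with RESTORE FILELISTONLY) and illustrative paths:

-- COPY_ONLY keeps this backup out of your regular backup chain
BACKUP DATABASE ReportServer
TO DISK = N'D:\Backups\ReportServer.bak'
WITH COPY_ONLY, INIT;

RESTORE DATABASE ReportServer_Clone
FROM DISK = N'D:\Backups\ReportServer.bak'
WITH MOVE N'ReportServer' TO N'D:\Data\ReportServer_Clone.mdf',
	MOVE N'ReportServer_log' TO N'D:\Data\ReportServer_Clone_log.ldf',
	REPLACE;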

Step 2 – The ETL phase. The ETL has 5 phases.
Phase 1 – Load variables. The SSIS package is designed to be as dynamic as possible and to fit your environment. When you create the SSRSDB there is a table called "Configuration" under the "Support" schema. This holds all the connection and dynamic variables.

[Screenshot: the Support.Configuration table]

Phase 2 – Load the dimensions.
You will notice that there are fewer dimensions being loaded than were created in the build process. This is because some are enumerator tables and will not change over the life of your SSRS version, so there is no need to load them daily.

[Screenshot: the dimension-load sequence containers in the SSIS package]

All other dimensions are loaded using a TSQL MERGE; to enable this process we need a staging table to make the comparison. Essentially what a TSQL MERGE does is an UPSERT – an UPDATE and an INSERT in one statement. The dimensions are all type 2 slowly changing dimensions, so when a change is registered we close off the old row and add a new one.
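
As an illustration of the pattern (not the exact package code), a type 2 MERGE for Dim_Catalog might look like the sketch below. Item_BusinessKey and SCD_Is_Current appear in the queries in this series; the staging table name and the SCD date columns are assumptions.

-- Close off changed rows via MERGE, then insert the new versions of those
-- rows from the OUTPUT clause. New business keys are inserted directly.
INSERT INTO dbo.Dim_Catalog (Item_BusinessKey, Item_Name, SCD_Is_Current, SCD_Active_From)
SELECT Item_BusinessKey, Item_Name, 1, GETDATE()
FROM
(
	MERGE dbo.Dim_Catalog AS tgt
	USING Staging.Dim_Catalog AS src
		ON tgt.Item_BusinessKey = src.Item_BusinessKey
		AND tgt.SCD_Is_Current = 1
	WHEN NOT MATCHED BY TARGET THEN
		INSERT (Item_BusinessKey, Item_Name, SCD_Is_Current, SCD_Active_From)
		VALUES (src.Item_BusinessKey, src.Item_Name, 1, GETDATE())
	WHEN MATCHED AND tgt.Item_Name <> src.Item_Name THEN
		UPDATE SET tgt.SCD_Is_Current = 0, tgt.SCD_Active_To = GETDATE()
	OUTPUT $action, src.Item_BusinessKey, src.Item_Name
) AS changes (MergeAction, Item_BusinessKey, Item_Name)
WHERE changes.MergeAction = 'UPDATE';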

Each sequence container contains a truncate-staging EXECUTE SQL task followed by a data flow task to move the data from the ReportServer_Clone database to the staging tables. A degree of transformation also happens here, mainly around expanding coded values into readable results. For example, IsHidden is a BIT value (0 or 1); when filtering on this in a report it is not obvious what 0 and 1 relate to, so it is transformed to "Hidden" and "Not Hidden".
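
The IsHidden expansion is just a CASE expression in the staging query, along these lines:

SELECT ItemID,
	CASE WHEN Hidden = 1 THEN 'Hidden' ELSE 'Not Hidden' END AS Item_Is_Hidden
FROM ReportServer_Clone.dbo.[Catalog];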

The dimensions being loaded here are Dim_Catalog (report details), Dim_Format (format of the report), Dim_User (who has been using the report), Dim_Role (what role the user has) and finally Dim_Dataset.

[Screenshot: the truncate-staging and data flow tasks for the dimension loads]

Once the data has been loaded into staging it is passed into the TSQL MERGE.

Phase 3 – Fact loading. The facts are loaded using the LOOKUP design pattern. When you set up the SSRSDB you will have a set of historic data, and on the first load you want all of it. After that has been loaded there is no point reloading all the data, as this is just a waste of IO. Instead, as part of the configuration, once the ETL has run once it will switch the "full load" option to "Incremental". From then on only the data loaded that day is removed and reloaded.
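
A sketch of that full/incremental switch – Fact_Transaction_Executions and ExecutionStartDate come from the queries in part 4, while the Configuration column name here is an assumption:

DECLARE @LoadType VARCHAR(20);
SELECT @LoadType = Load_Type FROM Support.Configuration;	-- hypothetical column name

IF @LoadType = 'Full'
	TRUNCATE TABLE dbo.Fact_Transaction_Executions;		-- first run: reload everything
ELSE
	DELETE FROM dbo.Fact_Transaction_Executions
	WHERE ExecutionStartDate >= CONVERT(DATE, GETDATE());	-- incremental: remove today's rows only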

Three fact tables are being loaded: Fact_Transaction_Executions, Fact_Factless_PUR and Fact_Factless_Dataset_fields.

[Screenshot: the fact-load tasks in the SSIS package]

The first step is to either remove today's data or all the data (dependent on whether this is the first time you're loading the SSRSDB). Once the data is removed we move the data from the ReportServer_Clone database to the SSRSDB.

(1) A query to extract and transform the data, (2) look up the current Item_skey, (3) look up the current Format_skey, (4) look up the current User_skey, (5) look up the current Action_skey, (6) look up the current Source_skey and finally load to the SSRSDB.

[Screenshot: the fact-load data flow with its lookup steps]

The same process happens for the other two fact tables.

Phase 4 – Loading and updating the metadata. Part of the power of SSRSDB is the extended metadata. This is achieved using a simple Excel spreadsheet. When SSRSDB is created, a table is also created which holds the metadata: Support.Item_Metadata. The columns in this table mirror those of the Excel spreadsheet.

[Screenshot: the Item_Metadata spreadsheet columns]

The metadata allows you to store additional data about each report, which is used to automate communicating changes and usage of reports. The metadata stores the style (theme management blog coming soon), the date the report was created, who requested it, what type of report it is (main or drilldown), the report owner's email (again for automation), the inactive period (how many days a report can go without being used), what happens after that time, the final period and the final action.

When you first use the SSRSDB this will be blank. Rather than manually populating it, there is a process to read the file and write out the existing contents of the ReportServer; any changes after this will be updated.

The final phase is to read the ReportServer logs. This is a Foreach Loop which loops over each log file and loads it into the SSRSDB.

In the download attached you will also find a set of jobs that you can create. Once these have been created and scheduled you have a fully automated ReportServer data warehouse. I personally have this scheduled to run every 30 minutes.

Thanks for reading.

SQL Saturday #411 Cambridge


WOW! What a week. In less than 7 days I have been accepted to speak at 3 SQL Saturday conferences! The schedule currently looks like this:

July – SQL Saturday Manchester
August – SQL Saturday Oslo
September – SQL Saturday Cambridge

SQL Saturday in the UK has become something of a regular occurrence. In 2015 alone we have 4 (Exeter, Manchester, Cambridge and Southampton), but it all started in Cambridge with the first UK-based SQL Saturday back in 2012. I missed that event but went to the following event in 2013. I had a blast! The only downside to the conference was that I didn't spend enough time in Cambridge! I am hoping to change that this time and take in some of the sights. What I remember from last time were bikes! Bikes everywhere! I don't have the best luck with bicycles – ask me when you see me – I will show you the scars….

This will be the 3rd time that Mark and the team have organised SQL Saturday Cambridge and they have certainly gone big! 2 full days of training and 6 tracks on the Saturday! That is bigger than big! That is MASSIVE! In 2013 SQL Saturday Cambridge tried something different by making the event half SQL, half SharePoint. I don't use SharePoint, but like many I administer it from time to time. In 2013 I went to a SharePoint session and really enjoyed it. I will do the same in September.

Looking forward to seeing you there. Register now: http://www.sqlsaturday.com/411/EventHome.aspx

Thursday 10th September 2015

Penelope Coventry – Automating business processes with SharePoint, Office 365 and Azure

Benjamin Nevarez – SQL Server Query Tuning & Optimization

Niko Neugebauer – A Day with Columnstore Indexes

Friday 11th September 2015

Rasmus Reinholdt Nielsen – Building tomorrows ETL architecture with BIML and MDS today

Kennie Nybo Pontoppidan – SQL Anti-Patterns

Mark Broadbent & Regis Baccaro – Implementing Business Intelligence & SharePoint High Availability Solutions

Microsoft – Building a Data Driven Strategy

SQL Saturday #418 Manchester


A few days ago I posted that I had been selected to speak at SQL Saturday in Oslo. You know what they say: "SQL Saturdays are like buses. You wait for one and two come along at the same time". I am really happy to announce that I will be speaking at SQL Saturday in Manchester, UK. On the 24th and 25th of July, Manchester will play host to a SQL Saturday!

Not only is this the first time there has been a SQL Saturday in Manchester, this is the first time I will have been to Manchester! I live in the Southwest of the UK in Exeter and I never have a great need to head further north than Bristol (83 miles or 133 KM – terrible I know) so I am really looking forward to this event!

**Edited 10/06/2015**

When I originally wrote this post there was not a great deal of information about the training days on offer at SQL Saturday Manchester. Well, the team have pushed an update out and they are great (so great I had to come back and edit the post). Manchester has 6 training day sessions.

1. Building Big Data Solutions with Microsoft Azure HDInsight – Graeme Malcolm | Microsoft
2. Machine Learning without a PhD in statistics – Andrew Fryer | Microsoft
3. SQL Server Business Intelligence – Chris Testa-O’Neill
4. SQL Server 2014 In-memory technologies – Tony Rogerson
5. Practical R: A hands-on training day – Steph Locke
6. Data Management Lifecycle by Redgate

I am hoping to extend my visit in Manchester and see Mr Fryer’s session on Machine Learning. If I could split myself I would go and see Steph Locke’s session. Steph did a similar session in Exeter and it was a great success!

Make sure you check out the event/schedule and I will see you on July 25th in Manchester!

SSRS Inception – Part 2


Hi everyone, this is the second part of this series looking at creating a monitoring solution for SQL Server Reporting Services. The second instalment looks to cover the following:

  1. The process of backing up your existing SSRS instance
  2. The structure of the SSRSDB

The process of backing up your existing SSRS instance

My personal preference for getting data from the ReportServer database to somewhere you can work with it, without causing issues on your production server, is to take a backup and restore it to another database. I have called this ReportServer_Clone (original, right! :P)

The process will be as follows:

  1. Take a backup of the ReportServer – You don't need the ReportServer_TempDB.
  2. Restore that elsewhere
  3. Load the ReportDW with your data – Via SSIS.
  4. Run the usage reports

You might be asking: why can't I just query the SSRS ReportServer directly? Well, you can – but at your own risk. The ReportServer is a temperamental old gal. You can cause a whole lot of issues by making just the smallest of changes outside of the Report Manager.

As I have stated, the backup and restore process is just my preference; it is one of many options available to you for getting your data out of the ReportServer. You might choose to use transactional replication of the key tables (refer to part one for a list of key tables). Other alternatives would be a HADR option: AlwaysOn availability groups, or mirroring with snapshots. Any of these methods will work well for you. Personally, for the size of my environment, I opted for a simple backup and restore.

The process will back up the ReportServer and restore it to your server of choice; you can do this locally or push the backup to a different server. Once you have a restored version we will add a view and then persist that view to a table. The view we will add is from Bret Stateham (B). Bret has a fantastic CodePlex project that does a lot of the hard work for you. I have made very minor amendments to it. Bret, if you're reading this: thank you for your fantastic scripts. You can find Bret's CodePlex project here: SSRS queries.

Now that we have a working restored version of the ReportServer, we will want to look at the data SSRS is generating. The easiest way to do this is to query the ExecutionLog3 view. This is the monitoring that ships out of the box from Microsoft. It will give you a very basic view of what is happening in your environment. Bret has expanded on this with 'SSRS Queries'. For me and my organisation, the users wanted something in a style they understood; they already use SSRS, so why not use SSRS as the presentation layer? Before we get to the presentation layer, though, we need to touch on where the data is going to be stored. As a data warehouse developer, creating a DW seemed the best option.
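
If you just want a quick look before building any of this, ExecutionLog3 can be queried directly; the columns below all ship with the view:

SELECT TOP (100)
	ItemPath, UserName, TimeStart, TimeEnd, [Status], Format
FROM ReportServer.dbo.ExecutionLog3
ORDER BY TimeStart DESC;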

The structure of the SSRSDB

There are many factors that went into the decision to model this data as a DW. The key reason is to capture history.

The ReportServer will only hold history for 60 days. By modelling this into a DW we can keep the history for as long as we feel necessary. If your ReportServer has become very large and you are experiencing performance issues, or possibly issues with storage, then you could reduce the amount of history stored to 14 days and let SSRSDB handle the storage on a larger, slower disk – or even Azure.
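
For reference, that retention window is the ExecutionLogDaysKept server property, and you can check its current value with a read-only query (reading does not modify the ReportServer):

SELECT Name, Value
FROM ReportServer.dbo.ConfigurationInfo
WHERE Name = 'ExecutionLogDaysKept';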

There are so many moving parts in SSRS, and SSRS changes all the time. An example of change might be an alteration to the TSQL of a query, report styling, users who have access, new columns added, columns removed, a change of shared data source, etc. An awful lot of information inside the ReportServer is lost when changes happen. Does this scenario sound familiar? A user calls you and claims they had access to a report last week but can't open it now. How do you prove that they did have access? Simple answer: track their access daily.
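
Once Fact_Factless_PUR (the permissions factless fact, loaded in part 3) holds a daily snapshot, answering that call becomes a simple lookup; a sketch, with the join column names assumed:

SELECT d.[Date], u.User_Name, c.Item_Name
FROM dbo.Fact_Factless_PUR f
	JOIN dbo.Dim_User u ON u.User_Skey = f.User_Skey
	JOIN dbo.Dim_Catalog c ON c.Item_Skey = f.Item_Skey
	JOIN dbo.Dim_Calendar d ON d.Date_Skey = f.Date_Skey
WHERE u.User_Name = N'DOMAIN\caller'		-- the user in question
	AND c.Item_Name = N'Disputed report'	-- the report in question
	AND d.[Date] >= DATEADD(DAY, -7, GETDATE());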

In part one of this series we looked at the main tables in the ReportServer. The ReportServer hinges around a central table – the execution log. When modelling this into a DW I was working towards a standard Kimball star schema. If you have read the Kimball Group's books then you might be familiar with their 4-step process for defining a DW.

  1. Select the business process – We want to monitor the execution of SSRS reports.
  2. Declare the grain – We will monitor at the per execution grain.
  3. Identify the dimensions – Calendar, User, Item etc.
  4. Identify the facts – Report executions

Now we have the 4 steps defined we can model them into our star schema.

[Diagram: the SSRSDB star schema]

We have a few different types of dimensions and facts employed in this project. We have our standard slowly changing dimensions and also a collection of role-playing dimensions. Role-playing dimensions are used to simplify the model where we have many different dates associated with an execution/item – when was the item executed, when was it created, when was it last updated, etc. We could have physically created multiple versions of the calendar dimension; however, that would have involved maintaining multiple versions – and I really hate doing more work than I need to! So we have a series of views based on calendar, as sketched below.
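
The role-playing views are as simple as they sound – one physical Dim_Calendar and a thin view per role (view names here are illustrative):

CREATE VIEW dbo.Dim_Calendar_Executed AS SELECT * FROM dbo.Dim_Calendar;
GO
CREATE VIEW dbo.Dim_Calendar_Created AS SELECT * FROM dbo.Dim_Calendar;
GO
CREATE VIEW dbo.Dim_Calendar_Updated AS SELECT * FROM dbo.Dim_Calendar;
GO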

To capture change we will be implementing type 2 slowly changing dimensions – apologies if it feels like I am teaching you to suck eggs. For those unfamiliar with Kimball DW design, a slowly changing dimension tracks history over time and comes in several flavours; you can read more about them here. SSRSDB uses predominantly type 2 SCDs. A type 2 SCD captures history by adding a new row for every changed row. To ensure that we know which rows are current, we also typically add a current flag (a BIT boolean value) and an active-from date and active-to date. This allows you to quickly see which dimension attributes reflect the production system while maintaining history, and allows us to track user movement.
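
For the unfamiliar, the housekeeping columns of a type 2 dimension look something like this – SCD_Is_Current is the flag used in the part 4 queries; the other names are illustrative:

CREATE TABLE dbo.Dim_User
(
	User_Skey		INT IDENTITY(1,1) PRIMARY KEY,	-- surrogate key
	User_BusinessKey	NVARCHAR(260) NOT NULL,		-- natural key from the ReportServer
	User_Name		NVARCHAR(260) NOT NULL,
	SCD_Is_Current		BIT NOT NULL DEFAULT (1),	-- 1 = row reflects production now
	SCD_Active_From		DATETIME NOT NULL DEFAULT (GETDATE()),
	SCD_Active_To		DATETIME NULL			-- NULL while the row is current
);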

We also have multiple types of fact table. Fact_Executions is a standard transactional fact table: each row represents a row in the execution log tables in the ReportServer. Alongside this are also 2 factless fact tables to monitor changes in report structure and user permissions. Now that we have our SSRSDB model we can begin to design a solution to load it with data from the ReportServer_Clone database.

We will look more at the ETL phase of SSRSDB in part 3.

Thanks for reading!