PortalSuperUser and broken EditProfile.aspx

Sometimes support is just about figuring out what someone did wrong when configuring an application. I recently did a new install of SharePoint Server 2010 with Enterprise features that was to be used for a migration from an older MOSS 2007 farm. I set my new server up with SP1 and created mysite and intranet web applications. The intranet just had a test Team Site ready to be replaced with the migrated content. I set up the User Profile Service and imported objects from Active Directory. The profiles were there, mysites were set up and user search was working. I thought everything was fine, but then I clicked the Edit my Profile link on the profile page of a mysite and got an unexpected error.

OK, so I'm sure the trace logs will help me out here.  I used ULSViewer (http://archive.msdn.microsoft.com/ULSViewer) to capture the trace logs at the point of the error.  Immediately after the unexpected error, the trace log contained:

Application error when access /_layouts/EditProfile.aspx, Error=Server Out Of Memory. There is no memory on the server to run your program. Please contact your administrator with this problem. at Microsoft.SharePoint.Library.SPRequestInternalClass.GetUserToken(String bstrUrl, String bstrLogin) at Microsoft.SharePoint.Library.SPRequest.GetUserToken(String bstrUrl, String bstrLogin) at Microsoft.SharePoint.SPWeb.GetUserToken(String userName) at Microsoft.SharePoint.Publishing.CacheManager.<.ctor>b__0(SPSite newSite) at….

Server out of memory? That must be absurd.  After a long spell of troubleshooting with a very good Microsoft PSS engineer, I suddenly realised that if I accessed EditProfile.aspx from another web app then it worked fine.  Something must be wrong with my mysite web app.

What I eventually worked out was that, in order to get rid of this Event Log warning:

Object Cache: The super user account utilized by the cache is not configured. This can increase the number of cache misses, which causes the page requests to consume unnecessary system resources. To configure the account use the following command ‘stsadm -o setproperty -propertyname portalsuperuseraccount -propertyvalue account -url webappurl’. The account should be any account that has Full Control access to the SharePoint databases but is not an application pool account.

On the http://mysite web app (and other web apps), I had followed Mirjam's article: http://www.sharepointchick.com/archive/2010/10/06/resolving-the-super-user-account-utilized-by-the-cache-is.aspx.  I checked again that I had the accounts in the web application's User Policy.

This all looked right.  So, in PowerShell, had I set these correctly?

PS C:\Users\administrator.xxx> $wa=Get-SPWebApplication http://mysite:80
PS C:\Users\administrator.xxx> $wa.properties["portalsuperreaderaccount"]
i:0#.w|xxx\SPsuperreader
PS C:\Users\administrator.xxx> $wa.properties["portalsuperuseraccount"]
i:0#.w|xxx\SPsuperuser

Oh, that's not right: these are the settings for a web app that is running in claims authentication.  The MySites web app is running in classic authentication, so these needed to be reset.

PS C:\Users\administrator.xxx> $wa.properties["portalsuperuseraccount"] = "xxx\spsuperuser"
PS C:\Users\administrator.xxx> $wa.properties["portalsuperreaderaccount"] = "xxx\spsuperreader"
PS C:\Users\administrator.xxx> $wa.update()
PS C:\Users\administrator.xxx>IISRESET

I'm sure that with a "normal" content web app, when you get these wrong, you just get access denied whenever you try to browse the web app.  It is strange that with a MySite host and Self-Service Site Creation (SSSC) enabled, everything works until you hit the Unexpected Error/Server Out Of Memory when you try to edit the user profile.

There is now a TechNet article linked from Mirjam's post (http://technet.microsoft.com/en-us/library/ff758656.aspx):

The object cache stores properties about items in Microsoft SharePoint Server 2010. Items in this cache are used by the publishing feature when it renders Web pages

Well, if the warning relates to the Publishing Infrastructure and the object cache, I reckon it has very little to do with the MySite site collections anyway.  Perhaps the warning should only appear in the event log when the Publishing Infrastructure feature is enabled within site collections in a given web app.

So it really is important to get these settings right if you are going to apply them.

Posted in SharePoint 2010

Another round of Microsoft Updates stops SharePoint Services 3 with SQL Embedded from working

At TSG we provide support for a lot of customers, and our support teams tell me of a few recent SharePoint failures. This is at least the third time that I can think of that a Windows Update has applied a patch to SharePoint Services v3 (there are still a lot of them out there) from which it doesn't recover.  You seem to get a mixed bag of results depending on which operating system and SQL version were used, and various other factors.

As a general guide, we see many SharePoints that just carry on working, several that need the Products and Technologies Configuration Wizard (PSConfigUI) run, and a few hardcore ones that need more help.  As a rule, if SharePoint was installed with the SSEE/SQL Embedded/Standalone option it will fall over after some updates, and running the Products and Technologies wizard on its own won't help as that will fail as well.  If you are in this situation, try following Microsoft Knowledge Base Article ID 944154; it should get you back up and running.

You always need to consider which database engine you are installing with.  SSEE means that you don't have a database size restriction, but it comes with a bunch of problems and issues that you also need to be aware of.  Remember that with SQL 2008 R2 Express Microsoft increased the size limit to 10GB; Express edition shares some of the drawbacks of embedded, but it makes several of them go away.  If you are anywhere near that size with your content databases then you really should consider the features of Workgroup or above.
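If you are not sure how close you are to those limits, a quick query against the instance will tell you.  This is only a sketch: it sums the data-file sizes per database (the Express limits count data files, not logs), and it assumes you can connect to the instance; for the embedded SSEE instance that normally means the named pipe \\.\pipe\MSSQL$MICROSOFT##SSEE\sql\query.

-- Rough data-file size per database; size is reported in 8KB pages
SELECT DB_NAME(database_id) AS DatabaseName,
       SUM(size) * 8 / 1024 AS DataFileSizeMB
FROM sys.master_files
WHERE type_desc = 'ROWS'
GROUP BY database_id
ORDER BY DataFileSizeMB DESC

Anything called WSS_Content* that is creeping towards the limit is your cue to start planning the move.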

Posted in SharePoint Services v3

Reporting From SharePoint Lists

Introduction
Overview
One of my occasional tasks involves taking data that is stored in one or more SharePoint lists and presenting it in a formatted table or chart.  The requirements vary, as does the infrastructure, and there are many options available.  In this article I'm going to take a while to talk through the options available and how to make some of them work in different scenarios.

Scenarios to consider
SharePoint is a big technology, deployed in a wide variety of circumstances and situations.  Part of the job of a consultant is to determine the best way to use the available technology in any given circumstance.  Some of the things that need to be considered when choosing how to report on a SharePoint list include:

  1. SharePoint Version (3/2007 or 2010)
  2. SharePoint Edition (Services/Foundation, Server Standard, Server Enterprise)
  3. SQL Version (2005/2008 or 2008R2)
  4. SQL Edition (Express Advanced Services, Workgroup, Standard or above)
  5. Report access from LAN only or across an Internet Facing Web App
  6. Whether Reporting Services is in Integrated or Native Mode
  7. Size and complexity of the SharePoint farm
  8. Complexity of the reports required
  9. Whether unsupported techniques are allowed

I’ll try to clarify some of these a bit further later on.

Report Mechanism
There are several mechanisms available for displaying the list data.  I’ll list out the ones that I can think of here and then try to talk about each one.  Some I’ll cover in detail.  These include:

  1. SharePoint Designer DataView (with/without jQuery)
  2. Google Charts
  3. ChartPart for SharePoint by Wictor Wilén
  4. SharePoint built-in Charts
  5. Reporting Services using XML Connection Type
  6. Reporting Services using SharePoint List Connection Type
  7. Reporting Services using a SQL view of the SharePoint Database
  8. Codeplex SSIS extension that connects Integration Services to a SharePoint list and warehouses the data in a separate SQL table
  9. Third party SharePoint List connection type
  10. Chargeable, third-party chart web parts

Discussion
SharePoint Designer DataView

Although it takes a bit of effort and doesn't have a good deployment story, SharePoint Designer can produce some impressive-looking tabular data, especially when augmented by a bit of jQuery or JavaScript.  There are also some examples of using these to produce simple graphs; Natalya has a great example: Vertical Bar Graph in Data View Web Part (DVWP).  The trouble with these is how quickly the SharePoint Designer UI gives up and you move into hand-crafting XSLT to display your data.  The story gets a bit better in some ways with 2010: you can take any existing list view and just add conditional formatting, fonts and other tweaks to it, often without even unghosting things.  Data views also work the same way regardless of your authentication type and current zone (default/intranet etc.).

ChartPart for SharePoint by Wictor Wilén
You can find this on CodePlex: http://chartpart.codeplex.com/.  I found this in use on one of our customer sites recently; I have no personal experience of it, but it works for them.  The customer in question is currently running Windows SharePoint Services v3 and is talking to us about an upgrade to Foundation 2010.  Unfortunately Wictor hasn't yet released a version that works in 2010.

SharePoint built-in Charts
If you are running SharePoint Server 2010 with Enterprise features, congratulations: you have access to its built-in charts.  I've only tried these on one site so far, where I wanted to do a pivot chart on a list.  Unfortunately I couldn't find a pivot chart option so had to move on and present the data a different way.

Reporting Services using XML Connection Type
David Wise wrote a good article: Connecting SQL Reporting Services to a SharePoint List.  David's approach is to use the XML data source type in SQL Reporting Services and connect to SharePoint's Lists web service.  The XML data source type has been there for a few SQL versions, so it will work on an older server as well.  You then construct an XML query that uses the GetListItems method to retrieve a bunch of data from the list.  You can then use the layout tools in Reporting Services to create a report or chart, and finally show this chart inside a SharePoint site with either the Page Viewer Web Part (PVWP) or the Report Viewer Web Part.  As part of your web service query you can supply parameters that filter the amount of data that is retrieved from the list: How to filter a SharePoint list with report parameters.  That way you get back the 100 results you want, not the 6,000 items in the list.  However, because the query is XML there is no useful data typing, and you will find yourself using CDate and the like in the report designer to constantly turn things into the correct data type to display or sum.  Field names also come through in need of attention, which adds more time to writing the report.

Immediately we’ve opened up a bunch of other caveats including: how do you authenticate to the lists web service, is Reporting Services in integrated or native mode, are reporting services and SharePoint co-located or on separate Servers, are you in the default zone or across the Internet.

Firstly, the XML data source type supports Windows Integrated authentication only.  This means that your users will have access in a report only to the lists and items that they have access to in the SharePoint user interface.  If SharePoint and Reporting Services are on the same server (meaning also that you have a single Web Front End) then this authentication can be relatively simple, whether in Native mode or Integrated mode.  If SharePoint is on one server and Reporting Services is on another then you need to come up with something else, and it varies between modes and your location.

You can have Reporting Services in Native mode and configure SharePoint and Reporting Services to support Kerberos authentication.  This, on a really good day, allows your user's authentication to double hop between servers and arrive at the Reporting Services server.  Remember though that Kerberos requires direct access to the Domain Controllers, so even if you publish SharePoint to the Internet (https://sharepoint.example.com) it won't work from there.  Remember also that you put a PVWP onto a web part page (or 2010 wiki page) in order to display the report, and you then give it the internal URL of your report server with some render URL parameters, so it's not going to work when you are across the Internet anyway.  Internally you may choose to do away with the need for Kerberos by editing the data source once published and using Reporting Services to securely store the credentials for you.  Congratulations: your users no longer need access rights to the SharePoint list that you are showing them in the report (maybe this is what you wanted anyway).

You can have Reporting Services in Integrated mode and set its Authentication Mode to Windows Authentication.  You are back to relying on Kerberos for the double hop.  Now you upload your reports to a document library; from the LAN you can click on a report and render it.  However, reports only render in the default zone and will fail if you are in another zone.  You can, though, create a web part page, put the Reporting Services Report Viewer web part on the page and then point that at the report in the document library.  This trick allows the report to render from the Internet zone, and it doesn't matter that we jump to the other server for which SharePoint has an internal URL.  Oh no, we just failed again: Kerberos won't work for us because we don't have direct DC access from the external location.

You can have Reporting Services in Integrated mode and set its Authentication Mode to Trusted Account.  This should solve our need for Kerberos and allow us to render those reports from the Internet without a double hop to a second box.  Fail again: you can't use data sources that only support Windows Integrated authentication on a report server that is in Integrated mode when Authentication Mode is set to Trusted Account.  Oh, and you have to look pretty closely into your choice of data source to find out what its authentication options are.

Reporting Services using SharePoint List Connection Type
If you are lucky enough to be using SQL 2008 R2 Reporting Services then you have access to the SharePoint List connection type.  Remember though, you won't get this in Express with Advanced Services or even Workgroup; it's Standard or above (check the comparison matrix on Microsoft's site).  This connection type gives you a GUI for choosing the columns that you want to pull through, and these then arrive with decent names and proper data types.  However, there is no filtering in the query; you'll get back 6,000 items and need to use the report to throw away all but the 10 you want to show to the user.

The SharePoint List connection type only supports Windows Integrated authentication so all the stuff above about Kerberos and Trusted Accounts applies here as well, meaning that it is also only suitable for the default zone across the LAN or a VPN.

Reporting Services using a SQL view of the SharePoint Database
Be warned: this is officially not supported.  You will find people on the Internet warning you off this technique, but let's explore it anyway.  SharePoint stores all the data in every list in the AllUserData table of the content database.  If you have control over the SQL Server then you can directly query that table to extract the data in a list.  My preference is to create another database and then create a series of views within it.  Each view looks at the SharePoint content database to retrieve results for a specific list.  I can then use those views to build reports.

Before I talk about how I build the queries, I want to consider this approach in comparison to the systems above.  Firstly, here is another example of why you might create a new site collection instead of a new site: consider using STSADM to create the new site collection in a new content database.  On a recent project the main content database was 8GB, so I created an additional one for the new application that I was going to report on; that way my unsupported queries can't affect the bulk of the data.

Now that I'm in T-SQL I can do parameterised queries on the table and retrieve just the 10 rows that represent the 10 items I want from a list of 6,000.  How can this not be better than using the SharePoint List connection type, pulling everything back and throwing most of it away?  Oh, and check the clustered index on the AllUserData table; it looks like it should help with these queries as well.
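To make that concrete, here is the shape of such a query.  Treat it purely as a sketch: the WSS_Content database name, the parameter values and the assumption that the text and date fields landed in nvarchar1 and datetime1 are all placeholders (the column mapping is explained further down), so verify them against your own list first.

DECLARE @ListId uniqueidentifier, @StartDate datetime, @EndDate datetime
SET @ListId = '00000000-0000-0000-0000-000000000000'   -- your list's GUID
SET @StartDate = '20110101'
SET @EndDate = '20111231'

SELECT tp_ID AS ID,
       nvarchar1 AS Title,          -- assumed column mapping, see below
       datetime1 AS RecordedDate    -- assumed column mapping, see below
FROM WSS_Content.dbo.AllUserData
WHERE (tp_ListId = @ListId)
  AND (tp_IsCurrentVersion = 1) AND (tp_DeleteTransactionId = 0)
  AND (tp_RowOrdinal = 0)
  AND (datetime1 BETWEEN @StartDate AND @EndDate)
ORDER BY datetime1

In a Reporting Services dataset you would drop the DECLARE/SET lines and map @ListId, @StartDate and @EndDate to report parameters instead, so only the matching rows ever leave SQL.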

Now that I'm in T-SQL I can use either Windows authentication or SQL authentication.  So if I'm in SharePoint Integrated mode and my report server is on a separate machine, I can now use Trusted Account, store my SQL username and password, and solve all my authentication issues.  With one more small trick I can finally make my reports available in ALL SharePoint zones.

What is that small trick?  Well, if you publish a report to a SharePoint library and click on it to run it across the Internet, you get a message telling you that this only works in the default zone.  Instead I create a web part page, put a Reporting Services Report Viewer web part on the page and point that web part at the report's RDL; the report can then render in the non-default zone and I finally have my solution.  If it's not supported, why is it the only way that works?

*****Update March 2012 Relating to SharePoint 2010/SQL 2008R2 Reporting Services**********
If you are using SharePoint 2010 you will have the SQL 2008 R2 Report Viewer web part; you may also have this with an older SharePoint version anyway.  There are two tricks to force it to work:
1 - If you use the browse button to find your report in the list then it will put a relative URL in place.  You need to prefix this with the http:// part that turns it into an absolute URL for your default zone.
2 - You need to expand the View section and untick Show Report Builder Menu.
*************End Update

OK, so how do I query the SharePoint content database?  In my new site collection, I create a list called Regions:

SharePoint Regions List

This simple list just has the Title column, which I've renamed to Region.  If you use List Settings on the ribbon and inspect the URL, you will be able to see the list ID; you need to take the %2Ds and turn them into dashes (alternatively, you can look the list ID up in the content database itself, as sketched after the query below).  Then, in SQL Server Management Studio, you can use a query on the content database similar to the following:

SELECT tp_ID AS ID, nvarchar1 AS [Region]
FROM WSS_Content.dbo.AllUserData
WHERE (tp_ListId = '{9045D3AA-475B-48DA-8043-631D4DC79421}')
  AND (tp_IsCurrentVersion = 1) AND (tp_DeleteTransactionId = 0)
  AND (AllUserData.tp_RowOrdinal = 0)
ORDER BY ID

  • tp_ListId identifies the list that we want data for
  • tp_IsCurrentVersion makes sure we only get the latest version of each item
  • tp_DeleteTransactionId makes sure we don't get items that are in the recycle bin
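Incidentally, rather than decoding the GUID out of the URL, you can also look it up in the content database itself.  The query below is only a sketch: the AllLists table and its tp_Title and tp_ID columns are my assumption about the undocumented schema, so check them against your own WSS 3.0/2010 content database before relying on it.

SELECT tp_ID AS ListId, tp_Title AS ListTitle
FROM WSS_Content.dbo.AllLists
WHERE tp_Title = 'Regions'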

The table has a bunch (64, I think) of nvarchar columns, 16 int columns, some date columns, some sql_variant columns and so on.  The data type you pick in SharePoint affects which type of column stores the actual data.  The Title column is always stored in nvarchar1, and the next text column you add to the list will be in nvarchar2.  Calculated columns are in the sql_variant columns.  Lookups are in the int columns (you need to watch for that one).  On a simple list, each item will only take one row in the AllUserData table (tp_RowOrdinal = 0).  If you have more than 16 integers then your list will start using RowOrdinal 0 and RowOrdinal 1 to fully represent each list item.  I don't know if there is a limit on the number of RowOrdinals that a list can use; however, each time it adds one you take a performance hit reading or writing data to that list.  SharePoint 2010 has a default limit of 6 rows per list item, but this can be increased.

Integers can be the biggest cause of a rollover onto another row: they are used for every lookup, and each time you add a workflow to a list a workflow status column is created, which uses the int columns as well.  So 25 lookups and 10 workflows and you are onto three rows.  Remember to consider whether you should normalise your design; this gets a chunk easier in 2010 anyway.

You generally need to look at the data in the list, look at a query with all columns in the database, and match the two to see which database column/row ordinal matches which list column.  I have also seen a Bamboo Solutions web part that will show this information without you having to work it out yourself.

If you run the query above, you’ll get a result similar to the one below:

SQL Select SharePoint Regions List

I'll take the query that I've worked out above and use it to create the RegionsView (or Vw_Regions, if you prefer).  If I have to do a view on another list that uses Regions as a lookup, I can then either join to the RegionsView or use a sub-select to turn the integer ID into the text of the lookup.
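Here is roughly what that looks like, again as a sketch only.  The view lives in my separate reporting database and wraps the Regions query from above; the second query then shows the sub-select approach for a hypothetical Offices list whose Region lookup happens to sit in int1.  The Offices list GUID and the int1 mapping are placeholders you would need to confirm first.

-- The Regions query from above, wrapped as a view in the reporting database
CREATE VIEW dbo.Vw_Regions
AS
SELECT tp_ID AS ID, nvarchar1 AS Region
FROM WSS_Content.dbo.AllUserData
WHERE (tp_ListId = '{9045D3AA-475B-48DA-8043-631D4DC79421}')
  AND (tp_IsCurrentVersion = 1) AND (tp_DeleteTransactionId = 0)
  AND (tp_RowOrdinal = 0)
GO

-- Resolving a lookup: a hypothetical Offices list stores its Region lookup ID in int1
SELECT o.tp_ID AS ID,
       o.nvarchar1 AS Office,
       (SELECT r.Region FROM dbo.Vw_Regions r WHERE r.ID = o.int1) AS Region
FROM WSS_Content.dbo.AllUserData o
WHERE (o.tp_ListId = '{00000000-0000-0000-0000-000000000000}')   -- placeholder GUID for the Offices list
  AND (o.tp_IsCurrentVersion = 1) AND (o.tp_DeleteTransactionId = 0)
  AND (o.tp_RowOrdinal = 0)

Reports can then select from, or join, these views without touching AllUserData directly.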

Conclusion
Like most things in SharePoint, there is more than one way to meet the customer’s requirements.  I have seen, in production, sites using: dataviews, reports based on SharePoint List connection type, reports based on SQL Views of the SharePoint database and sites using bespoke/third-party charting web parts.  I think that you have to consider each situation on its merits and decide which option is going to be the best fit for the customer.

Posted in Uncategorized

Why don’t you get a dev to do that?

Introduction
A while ago I read an article by Marc Anderson about the difference between an IT Pro and a Dev.  Check it out:

http://sympmarc.com/2011/06/16/whats-the-difference-between-an-it-pro-and-a-dev/

It makes some interesting points.  I'm definitely in the IT Pro camp; I've had a few goes at getting into development but I learned a long time ago that I was not very good and should leave it to the experts.  However, I do odd bits of code in JavaScript and C#.  I've produced:

• CRM3 workflow plugins for tidying strings
• SPD custom workflow extensions (various)
• Console app for updating SharePoint list items
• CRM3/4 form based JavaScript
• SharePoint JavaScript

The development that I can do, I've learnt because the opportunity to learn it arose.  If a project needs something done that I don't know how to do, then there is an opportunity to learn something new and expand my skill set.  I like these opportunities but can only take advantage of them when it is sensible to do so.

I have kind of set myself a guideline that I don't produce things that need a UI to be written; I like environments that provide the error handling and that normally don't need me to recurse through things.  My simple dev exercises get by without a separate technical design document and mean that there is only one set of project management.  This means more rapid project delivery.

The other day, in the office, I was showing one of my reports to a colleague, who asked: why don't you get a developer to do that?

Some Thoughts
This set me thinking (off and on) about why, and I have been struggling to answer.  At TSG we have several .NET developers (as well as developers in other environments), but I think of them for doing:

• Custom web parts
• Bespoke web services
• Bespoke web applications
• Complex site/list definitions (I've done simple ones myself)

These things will generally have their own scope/URD/FRD, technical design and project management; hence you might describe our developers as software engineers. I'm slow compared to an experienced coder, so I would never try to compete on a full-size development project, but on a two-day add-in this is less of an issue.

Report writing isn't really Dev, is it?
I remain to be persuaded that report writing is development; it is data analysis, queries and layout.  Although, again, I found myself adding code to a report recently as the customer's server predated LookupSet.

More than one string to your bow
It is important to be fully employed and able to do as much of the work that comes your way as possible.   This helps make you a valuable member of staff and might keep you employed over less versatile colleagues.

Conclusion
So why don’t I get a dev to do that?
I want to deliver the best value on the project for the customer whilst providing a good solution.  At the same time I want to make myself valuable to my employer, add “strings” and progress my own personal development.

Posted in General Thoughts

InfoPath and Day of Week function

The first InfoPath project that I carried out was for a pub chain.  Representatives would visit the bars and fill in a paper survey of atmosphere and service.  These paper surveys were then keyed into Excel and manually summarised onto several other spreadsheets.   I developed an InfoPath form that could be used instead and then published to a SharePoint Form library.  Views on the library would allow forms to be summarised by venue, area, month etc.

On the paper form they recorded the date and the day of the week so they could use that as a dimension to report on.  I thought that if I’m producing an interactive form then it would be daft to choose a date from a date picker and then have to tell the computer what day of the week that relates to.  Of course that’s really easy in Excel, or almost any other application; it turns out the same isn’t true with InfoPath.  This was InfoPath 2003 with SharePoint Portal Server 2003 but the situation hasn’t improved much since then.

InfoPath, with its limited set of functions, doesn't have a day-of-week function at all, meaning that you would need to use script, or now managed code, to convert a date into a day.  This set me off looking for a way to use an InfoPath formula instead.  I came across a thing called Zeller's Congruence; Merlyn and Wikipedia talk about it as shown below:
http://www.merlyn.demon.co.uk/zeller-c.htm
http://en.wikipedia.org/wiki/Zeller%27s_congruence

Well, the Zeller formula is pretty complex: it uses the MOD function, which is also missing from InfoPath, and it needs different handling for January and February versus the rest of the year.  I've taken just the day-of-week component and put it into a form of its own (shown below).

formdesign

I have a single date field and a text field for the day of the week.  I converted the text field into a dropdown list using numbers (0-6) as values and days (Sunday - Saturday) as labels.  There are then two action rules on the date box.  The first action rule has a condition to detect January or February dates, and execution of further rules does not continue when this one fires.

condition1

The action rule is as follows:

rule1 formula1

The formula is below:

Jan and Feb Zel
round((((floor((26 * number(substring(., 6, 2) + 10) - 2) / 10) + floor(number(substring(., 9, 2))) + floor((number(substring(., 3, 2)) - 1) * 1.25) + floor(number(substring(., 1, 2)) / 4) - floor(number(substring(., 1, 2)) * 2)) / 7) - floor((floor((26 * number(substring(., 6, 2) + 10) - 2) / 10) + floor(number(substring(., 9, 2))) + floor((number(substring(., 3, 2)) - 1) * 1.25) + floor(number(substring(., 1, 2)) / 4) - floor(number(substring(., 1, 2)) * 2)) / 7)) * 7)

The rest of the year rule has the following action:

rule2 formula2

Rest of Year Zel
round((((floor((26 * number(substring(., 6, 2) - 2) - 2) / 10) + floor(number(substring(., 9, 2))) + floor((number(substring(., 3, 2))) * 1.25) + floor(number(substring(., 1, 2)) / 4) - floor(number(substring(., 1, 2)) * 2)) / 7) - floor((floor((26 * number(substring(., 6, 2) - 2) - 2) / 10) + floor(number(substring(., 9, 2))) + floor((number(substring(., 3, 2))) * 1.25) + floor(number(substring(., 1, 2)) / 4) - floor(number(substring(., 1, 2)) * 2)) / 7)) * 7)

The end result is that when you set a date, the dropdown box shows you the weekday for that date.  This is achieved without the need for code or script, using just InfoPath formulas and rules.

Posted in InfoPath

SQL Reporting Services and Working Days across datasets

I've completed several SQL Server Reporting Services projects, across versions, that have needed to calculate the number of working days between two dates; for example, to work out whether fulfilment of an order or some other process has met its Service Level Agreement.

When I am using a SQL Server data source it is relatively straightforward to remove weekends; I then need to take out the bank holidays etc.  Some of my work is with Microsoft Dynamics CRM, which has a handy way to record business closures.  As long as I train the administrators to record all closures as single days, I can use a SELECT statement like the one below:

(DATEDIFF(dd, new_datereceived, new_despatchdate) + 1)
- DATEDIFF(wk, new_datereceived, new_despatchdate) * 2
- (CASE WHEN DATENAME(dw, new_datereceived) = 'Sunday' THEN 1 ELSE 0 END)
- (CASE WHEN DATENAME(dw, new_despatchdate) = 'Saturday' THEN 1 ELSE 0 END)
- (SELECT COUNT(*) AS Expr1 FROM FilteredCalendarRule
   WHERE (starttime BETWEEN FilteredNew_myorder.new_datereceived AND FilteredNew_myorder.new_despatchdate)
   AND (subcode = 5))
- 1
AS [WorkDays to Ready]

On a recent project I was reporting from a third-party business system; for the sake of this article we'll call it fluorite.  Now, fluorite is SQL-based, but the developers wouldn't give me access to query their tables directly.  Instead they created a report model and recommended that I report from that using Report Builder.  Using a report model presents a few issues and restrictions.

One issue was how to store the list of business closures.  The customer already had SharePoint (Windows SharePoint Services v3) and the reports were using SQL 2008 R2 Reporting Services.  This meant that I could create a SharePoint list for business closures and then create a SharePoint List data source with a dataset pointing at the closures list.  The screenshots below show the dataset properties and the query builder.  In the query I'm pulling through the Title and Date columns.  The Title column isn't really needed here, as I don't care why they're closed; the Date column just contains the date of the closure.

SharePoint Dataset Properties

SharePoint List DataSet

Query Builder

I thought that I could use one of the new 2008 R2 Lookup or LookupSet functions to read, filter and count results from my business closures dataset.  This turned out not to be possible.  On thinking again I realised that I needed to put code on the report to process this dataset.  The following articles helped me here:

Firstly, Joshua Clark has written a great article on how to put code into a Reporting Services project: http://joshua-clark.blogspot.com/2008/10/how-to-write-custom-code-in-sql.html.

Secondly, I found a blog post on how to remove the weekends with code.  I could do this with an expression, but this is a good way: http://social.msdn.microsoft.com/Forums/en-US/sqlreportingservices/thread/c8558211-9414-482b-8f78-e9e307b5c8ea/.  Check out the answer from Nehemiah Willis, who provides some code for iterating through all the days between two dates and ignoring the weekends.  The thread then says:

if you want to account for holidays it would prolly be best to have a business day lookup table then u can query for business like:
select count(*) from calendar where calendar_date between @startdate and @enddate

So now I need a way to do that without being able to write a SELECT statement, while still processing a dataset of business closures.  Well, the answer can be found in the following thread: http://social.msdn.microsoft.com/Forums/en-US/sqlreportingservices/thread/a7d59224-0ee5-491e-883b-2e5fcb3edeab.  Check the answer by Miguel Catalao:

Here's how: the idea is to "transform" the dataset into a multi-valued parameter (or, if you need several fields, transform it into multiple multi-valued parameters).
The multi-valued report parameter must have the following characteristics:
Hidden = True
Allow Multiple Values = True
Available Values tab:
Choose the desired dataset. Select the searchable ID as the value field, and the field you want to expose as the label field.
Default Values tab - Get Values from a Query.
Choose the same dataset as chosen in the Available Values tab.
For the value field, use the same field you chose for the value ID.
Set the parameter to never refresh (or it will reload the data on each interaction with another parameter).

So I now know how to create code, have a function that will remove weekends and know how to process my dataset.  To put this all together I added the following code:

Report Code

Report Code block

Function work_days(ByVal StartDate As DateTime, ByVal EndDate As DateTime) As Integer
    ' Count working days between two dates: walk each day, skip weekends,
    ' then subtract any business closures supplied through the hidden
    ' multi-valued ClosureDates parameter.
    Dim Bus As Integer
    Dim temp As DateTime
    Dim i As Integer
    Bus = 0
    temp = StartDate
    While (temp < EndDate)
        If (temp.DayOfWeek <> DayOfWeek.Saturday And temp.DayOfWeek <> DayOfWeek.Sunday) Then
            Bus += 1
        End If
        temp = DateAdd("d", 1, temp)
    End While
    ' Knock off any closure dates that fall inside the range
    For i = 0 To Report.Parameters!ClosureDates.Count() - 1
        If (Report.Parameters!ClosureDates.Value(i) >= StartDate And Report.Parameters!ClosureDates.Value(i) <= EndDate) Then
            Bus -= 1
        End If
    Next i
    Return Bus
End Function

Then all I need to do is call the code from a placeholder expression in my report:

RS Expression Box

Expression calling the report code

My report can now use the secondary dataset to process the bank holidays and deliver on the requirement described at the beginning.

Posted in SQL Server
