Friday, 23 December 2011

My ASP.NET MVC Page (using Forms Authentication) is not Rendering CSS and Javascript on the Login View - All Requests Show as Redirects to the Login Controller Action in Fiddler

I recently tried to deploy an ASP.NET MVC 4 mobile application (using jQuery Mobile 1.0) to one of the Oakton Amazon Web Services (AWS) web servers. This application used Forms Authentication.

I spun up a completely new instance of Windows Server 2008 R2, restored a backup database and xcopy-deployed the new application to a newly created virtual directory. I ran the Microsoft Web Platform Installer to get the latest MVC framework and supporting components. I enabled Forms and Anonymous authentication on the IIS 7 site and made sure that all users could access the CSS files even before logging in, with the following entries in the web.config file:

<location path="Content">
            <system.web>
                  <authorization>
                        <allow users="*" />
                  </authorization>
            </system.web>
      </location>
      <location path="Styles">
            <system.web>
                  <authorization>
                        <allow users="*" />
                  </authorization>
            </system.web>
      </location>
      <location path="Scripts">
            <system.web>
                  <authorization>
                        <allow users="*" />
                  </authorization>
            </system.web>
      </location>

However, when hitting this server, the login screen just wouldn't render correctly. Looking at the requests showing up in Fiddler, it seemed that even the CSS and JavaScript files were being redirected to the login page. Some of my colleagues had a look - but were also stumped. All the permissions looked right!

While the application pool user had the correct permissions, the problem was that anonymous users (i.e. everyone before login) were not running as the application pool user - they were still running as the default, which is IUSR. I simply edited the Anonymous Authentication credentials setting in IIS 7 to use the application pool identity rather than IUSR. Alternatively, I could have given the IUSR account permissions on the required supporting directories to fix the problem.
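The change corresponds to something like the following in applicationHost.config - an empty userName tells IIS 7 to run anonymous requests under the application pool identity (the location path is a placeholder for your site name):

```xml
<location path="MySite">
  <system.webServer>
    <security>
      <authentication>
        <!-- userName="" = run anonymous requests as the application pool identity -->
        <anonymousAuthentication enabled="true" userName="" />
      </authentication>
    </security>
  </system.webServer>
</location>
```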


This resolved the problem - CSS and jQuery were again accessible to all users (including anonymous ones). As usual, this seems pretty obvious in hindsight - but the mad rush to get the whole environment running and to deploy the application meant that this critical link was missed.

Let this be a reminder.
DDK

Friday, 16 December 2011

TFS 2010 - How do I create email alerts when anything is checked into TFS, Builds Fail/Succeed or Work Items are Assigned/Changed?

This is quite simple to do with the right tools installed on your development machine:

1) Ensure that you have an SMTP Server Configured for your TFS box - as described on MSDN at "How to: Configure SMTP Server and E-mail Notification Settings in the Services Web.Config File" 
2) Install the Visual Studio 2010 Team Foundation Server Power Tools from MSDN
3) Open the Alerts Explorer, which the Power Tools install along with a set of alert actions. The Alerts Explorer is accessible from several different menus within Visual Studio 2010. See screenshots below:
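For step 1, the referenced MSDN article has you add appSettings entries like these to the TFS web.config (key names are from that article; the values here are placeholders - verify against your TFS version):

```xml
<appSettings>
  <!-- Placeholder values - substitute your own from address and SMTP host -->
  <add key="emailNotificationFromAddress" value="tfs@yourcompany.com" />
  <add key="smtpServer" value="smtp.yourcompany.com" />
</appSettings>
```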

From the Team Menu:

On TFS Work Items:


From the top level TFS Server Node:



On Branches or Folders within the TFS Source Control Explorer windows:



There are several predefined alerts that come with the Visual Studio TFS Power Tools. These are shown below:



The alert filters are quite flexible. You can modify these alerts in the alert definition editor as shown below:


DDK

Monday, 5 December 2011

SQL Master Data Services (MDS) - How do I retrieve the connection string from Microsoft.MasterDataServices.Workflow.Properties.Settings? It is not available via ConfigurationManager.AppSettings or ConfigurationManager.ConnectionStrings

A question today from one of my colleagues concerned retrieving a value from the app.config file used by a Master Data Services workflow extender.

Master Data Services (MDS) workflow extenders support the creation of Business Logic (amongst other things such as external WCF Service calls) against data changes in the Master Data Services catalogs. These workflow extenders run in the context of the SQL Server Master Data Services executable (Microsoft.MasterDataServices.Workflow.exe) - and consequently rely upon the app config file named Microsoft.MasterDataServices.Workflow.exe.config.

Unfortunately, it doesn't appear as though the MDS System Settings classes give you access to that connection string either (http://msdn.microsoft.com/en-us/library/ff487028.aspx). Consequently, you have to default to basic .NET functionality for handling configuration sections.

The traditional appSettings section of the Microsoft.MasterDataServices.Workflow.exe.config file is not used by MDS - it actually uses an applicationSettings section that is not accessible via the usual ConfigurationManager.AppSettings or ConfigurationManager.ConnectionStrings properties. Specifically, it uses a custom System.Configuration.ApplicationSettingsGroup containing a section of type System.Configuration.ClientSettingsSection.
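For reference, the relevant part of Microsoft.MasterDataServices.Workflow.exe.config looks something like this (the standard applicationSettings shape; type names are abbreviated and the connection string value is illustrative):

```xml
<configuration>
  <configSections>
    <sectionGroup name="applicationSettings"
                  type="System.Configuration.ApplicationSettingsGroup, System">
      <section name="Microsoft.MasterDataServices.Workflow.Properties.Settings"
               type="System.Configuration.ClientSettingsSection, System" />
    </sectionGroup>
  </configSections>
  <applicationSettings>
    <Microsoft.MasterDataServices.Workflow.Properties.Settings>
      <setting name="ConnectionString" serializeAs="String">
        <value>Server=.;Database=MasterDataServices;Integrated Security=SSPI</value>
      </setting>
    </Microsoft.MasterDataServices.Workflow.Properties.Settings>
  </applicationSettings>
</configuration>
```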


To consume this section, you can use the following method (GetSettingValueFromAppConfig):

using System.Configuration;
using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace CompanyName.MasterDataWorkflow.Tests
{
    /// <summary>
    /// Summary description for GetConnectionSettingsTest
    /// </summary>
    [TestClass]
    public class GetConnectionSettingsTest
    {
        public GetConnectionSettingsTest()
        {
        }

        [TestMethod]
        public void TestGetConnectionSettings()
        {
            //Should return the value from the local applicationSettings Node.
            //Fully qualified section name
            const string sectionName = "applicationSettings/Microsoft.MasterDataServices.Workflow.Properties.Settings";
            const string settingName = "ConnectionString";

            Assert.AreEqual(
                "Server=.;Database=MasterDataServices;Integrated Security=SSPI",
                GetSettingValueFromAppConfig(settingName, sectionName));

        }

        private string GetSettingValueFromAppConfig(string settingName, string sectionName)
        {
            System.Configuration.ClientSettingsSection section =
               (System.Configuration.ClientSettingsSection)
                System.Configuration.ConfigurationManager.GetSection(sectionName);
            if (section == null)
            {
                return string.Empty;
            }
            foreach (SettingElement setting in section.Settings)
            {
                string value = setting.Value.ValueXml.InnerText;
                string name = setting.Name;
                if (name.ToLower().StartsWith(settingName.ToLower()))
                {
                    return value;
                }
            }
            return string.Empty;
        }
    }
}


DDK

Thursday, 1 December 2011

SharePoint 2010 - How to Get XML Sample Data for Debugging Web Parts that support XSL transform customization (e.g. Search and Content Query Web Parts)

XSLT is the preferred technology for customizing the visual output of SharePoint Web parts (primarily as it is standards-based and non-proprietary). However, there is absolutely no XSLT debugging support provided within the SharePoint environment (this is surprising as it is so pervasive). With these limitations in mind, XSLT should ideally be developed outside of SharePoint in a proper IDE that supports XSL debugging such as Visual Studio 2010 or Altova's XMLSpy.

However, debugging XSLT requires a data source (in the form of XML sample data) as input to the XSL transform. The simplest way to get a sample input XML file provided by SharePoint for an XSL-enabled web part is to do the following:

1) Open ContentQueryMain.xsl and find the XSL that looks like the following:

<xsl:template match="/">
  <xsl:call-template name="OuterTemplate" />
</xsl:template>

Replace this XSL with the following so that it outputs all the XML rather than just applying the OuterTemplate xsl template:
<xsl:template match="/">
  <xmp><xsl:copy-of select="*" /></xmp>
</xsl:template>

To avoid modifying system files like ContentQueryMain.xsl, you can make a copy of ContentQueryMain.xsl (e.g. ContentQueryMainDebug.xsl) and point your CQWP at ContentQueryMainDebug.xsl instead - using the MainXslLink property of the Content Query Web Part (defined in the .webpart file).

2) View Source on the page and locate the block of XML that has been output by your web part.

3) Save it as a text file and use it as input into your favourite XSL development tool.

DDK

SharePoint 2010 - Modifying the Core Search Results Web Part to Display Results Sorted by Site Name or Document Title (With Paging Limitations...)

I recently had a requirement from my client to have the following functionality in SharePoint 2010:
a) The Search Core Results web part displaying sorted in Alpha order
b) The Search Core results part should just show sites (not documents) that the current user has access to.

Requirement b) was simple - you can set the keywords on the Core Search Web Part to just display sites (using the "contentclass:STS_Site" keyword in the "Fixed Keyword Query" property) - and the part would default to showing a list of sites the current user has access to (via the normal SharePoint 2010 security trimming functionality).


However, the first requirement was a little bit trickier. The default Core Search Web Part only provides two search sort options - by date and by relevance (the default).


To fix this, the XSLT which defines the search results needs to be changed. This XSLT is specific to the instance of the web part - modifying it won't affect the normal search functionality of your site.

To modify the XSLT for the search results web part, you need to first uncheck the "Use Location Visualization" checkbox. The XSLT that opens is around 700 lines long and has a lot of different XSL templates defined within it. To sort the search results in alpha order, you need to add an xsl:sort element within the main apply-templates call.

Some sites such as http://kwizcom.blogspot.com/2008/11/how-to-change-moss-search-retults-sort.html have a simple suggestion - but this would not work, as the select attribute is missing on the apply-templates directive. You need to provide a valid parent to allow for sorting. The fix was to change the empty apply-templates node in the default search XSL:
<xsl:apply-templates />

to the following:
<xsl:apply-templates select="All_Results/Result">
    <!-- The xsl:sort needs to operate upon a single field - it doesn't work if the sort has to evaluate child nodes -->
    <xsl:sort select="title" />
   </xsl:apply-templates>

Note that I didn't require the other elements in the search XML (the TotalResults and NumberOfResults nodes), so this solution may not work in your scenario. This list can then act as a facility for cross-site-collection navigation (which is not available out of the box in SharePoint 2010).


Another limitation of this approach is that it will only work on a non-paged resultset - which is a pretty major limitation! In our scenario (for Phase 1 of our provisioning solution), it was acceptable for our customer to increase the page size to avoid any pagination from occurring. Your mileage may vary.

Other solution options are:
  1. Scripting - Using jQuery to do a search call and sorting and paging the results yourself
  2. Server side - with your own custom web part that also does the paging for you - using the SPGridView or inheriting from
    Microsoft.Office.Server.Search.WebControls.CoreResultsWebPart
    
Phase 2 of our project will use a combination of server-side customization (as described at http://msdn.microsoft.com/en-us/gg620579, extending the CoreResultsWebPart) and the jQuery approach above for easy inline searching of accessible site collections.
DDK

Wednesday, 9 November 2011

Error executing child request for ChartImg.axd when Rendering a Custom Charting WebPart in SharePoint 2010

I deployed a custom SharePoint 2010 web part today that uses the Microsoft Chart Controls for ASP.NET 3.5. However, after deployment, I began receiving the following exception when it rendered within an IIS 7-hosted site (SharePoint 2010 running on Windows 7):

Error executing child request for ChartImg.axd

I already had the handler entries set up as shown in the code samples. The main problem is that most web.config examples and the official samples don't include the required web.config entries for the ChartImg.axd HTTP handler (so that it operates correctly in IIS 7 and above).

Windows 7 and Windows Server 2008 use IIS 7, so the httpHandlers section won't work like it does in previous versions of IIS - you must instead add an item to the handlers section of the system.webServer node, like so:

<system.webServer>
    <handlers>
      <add name="ChartImageHandler" preCondition="integratedMode" verb="GET,HEAD,POST" path="ChartImg.axd" type="System.Web.UI.DataVisualization.Charting.ChartHttpHandler, System.Web.DataVisualization, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
    </handlers>
  </system.webServer>

You should also make sure that you are using the POST value in the verb attribute like so:

verb="GET,HEAD,POST".

Once your web application is configured correctly, you should be able to go to http://SITENAME:PORTNUMBER/ChartImg.axd and receive a blank screen without any errors. This shows that the charting HTTP handler is now operational.

Full example of web.config with correct Chart handler entries:
<?xml version="1.0"?>
<configuration>
  <appSettings>
    <add key="ChartImageHandler" value="storage=file;timeout=20;dir=c:\TempImageFiles\;" />
  </appSettings>
  <connectionStrings/>
  <system.web>
    <pages>
      <controls>
        <add tagPrefix="asp" namespace="System.Web.UI.DataVisualization.Charting"
        assembly="System.Web.DataVisualization, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
      </controls>
    </pages>
    <compilation debug="true">
      <assemblies>
        <add assembly="System.Web.DataVisualization, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/>
        <add assembly="System.Windows.Forms, Version=2.0.0.0, Culture=neutral, PublicKeyToken=B77A5C561934E089"/>
      </assemblies>
    </compilation>
    <httpHandlers>
      <add path="ChartImg.axd" verb="GET,HEAD,POST" type="System.Web.UI.DataVisualization.Charting.ChartHttpHandler, System.Web.DataVisualization, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" validate="false"/>
    </httpHandlers>
  </system.web>
  <system.webServer>
    <handlers>
      <add name="ChartImageHandler" preCondition="integratedMode" verb="GET,HEAD,POST" path="ChartImg.axd" type="System.Web.UI.DataVisualization.Charting.ChartHttpHandler, System.Web.DataVisualization, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
    </handlers>
  </system.webServer>
</configuration>

Tuesday, 1 November 2011

FIX - I'm using the ASP.NET 3.5 Charting Controls in SharePoint 2010 to render charts - but the charts don't print out in Internet Explorer (they're fine in Firefox). Why?

Internet Explorer seems to have to re-download the images generated by the charting control when printing them - if the Chart Image Handler (ChartImg.axd) is configured with deleteAfterServicing=true, then the subsequent request to render the chart with that same GUID (for printing) will fail.

To resolve this, you can change the "deleteAfterServicing" setting in your web.config to false so the handler doesn't automatically remove the requested image after each run:

<add key="ChartImageHandler" value="storage=file;timeout=20;dir=c:\inetpub\YourApplicationName\temp\;deleteAfterServicing=false" />

It is also best to have a cleanup job on this directory depending on your space requirements and number of chart requests made to your site.

DDK

Monday, 31 October 2011

Specifying Sort Order When Querying the SharePoint List Web Service with CAML and jQuery (via Lists.asmx)

I had a couple of problems today with specifying the sort order of data returned from a CAML query via a jQuery AJAX call. No matter what settings I tried, it always returned an unordered resultset.

I resolved the problem by looking at the requests produced by the U2U CAML Query Builder in Fiddler. The main problem was that I wasn't correctly wrapping the CAML "Query" node in a parent "query" node. You only have to do this when making jQuery calls directly against the web services - you are shielded from this somewhat when calling the SharePoint client object model (for details see http://msdn.microsoft.com/en-us/library/gg701783.aspx).

Why I'm using web service calls directly is a discussion for another time (and I wouldn't automatically turn to them as a preference - as described here http://msdn.microsoft.com/en-us/library/ee539764.aspx) - but I'll say for now that it is for consistency with existing code (the client object model uses the web services underneath as well, but has batching facilities).

See below for an example of a jQuery call that does sorting and worked for me. The U2U CAML Query Builder will also give you the valid internal name for the field.

var errorMessage;
  var orderBy = ""; //Default to no explicit sort order
  //Add order by if variable is set in other part of page (so it can be switched on or off on a page-by-page basis).
  if (typeof missingCategoryDisplayMode_IsOnline != 'undefined' && missingCategoryDisplayMode_IsOnline == true) //Variable should be set by another content query web part on page
  {
   orderBy = "<OrderBy><FieldRef Name='Title' Ascending='True' /></OrderBy>";
  }

try
  {
   var contentCategoryCAML = "<?xml version='1.0' encoding='utf-8'?> \
        <soap:Envelope xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance' \
           xmlns:xsd='http://www.w3.org/2001/XMLSchema' \
           xmlns:soap='http://schemas.xmlsoap.org/soap/envelope/'><soap:Body> \
           <GetListItems xmlns='http://schemas.microsoft.com/sharepoint/soap/'> \
           <listName>Content Categories</listName> \
           <query>\
           <Query xmlns=''>\
           " + orderBy +
           "</Query>\
           </query>\
           <viewFields>\
           <ViewFields xmlns='' />\
           </viewFields>\
           <queryOptions>\
           <QueryOptions xmlns=''/> \
           </queryOptions></GetListItems> \
         </soap:Body></soap:Envelope>";
   $.ajax({
    url: listUrl,
    type: "POST",
    dataType: "xml",
    data: contentCategoryCAML,
    complete: ContentCategoryResult,
    contentType: "text/xml; charset=\"utf-8\""
   });
  }
  catch (ex)
  {
   displayError("There was an error retrieving data from the Content Categories list. Please see IT support for assistance.");
   return;
  }
  
  function ContentCategoryResult(xData, status)
  {
   $(xData.responseXML).find("z\\:row").each(function () 
   {           
      var Title = $(this).attr("ows_LinkTitle");           
      arrContentCat.push(Title);  //Adding the results in an array
   });
  }


This adds items to an array that are used later for rendering. I needed a different sort order based on the page - so this was done using a separate JavaScript file placed into a Content Editor Web Part on each page to set the sort order variable.
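The critical detail - wrapping the CAML Query node in the lowercase query wrapper - can be isolated into a small helper (a sketch; the function name is mine and the 'Title' field name is illustrative, so substitute the internal name U2U gives you):

```javascript
// Sketch: build the <query> wrapper that Lists.asmx GetListItems expects.
// The lowercase <query> element must wrap the CAML <Query> node (with
// xmlns='') - omitting that wrapper is exactly what broke the sorting.
function buildQueryNode(orderByCaml) {
  return "<query><Query xmlns=''>" + (orderByCaml || "") + "</Query></query>";
}

var orderBy = "<OrderBy><FieldRef Name='Title' Ascending='True' /></OrderBy>";
var queryNode = buildQueryNode(orderBy);
```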

I also use the SharePoint client object model to programmatically determine the site collection path and to ensure that the list web service of the current site collection root is used (you can't determine that just from the URL):

function MenuInitialize()
 {
     var clientContext = new SP.ClientContext();
        var siteColl = clientContext.get_site();
        myweb = siteColl.get_rootWeb();
        clientContext.load(myweb);
        /* Execute async required*/
        clientContext.executeQueryAsync(Function.createDelegate(this, executeRequestAndRender), Function.createDelegate(this, getFailed));
 }
 
 function executeRequestAndRender() {
  var siteCollectionUrl = myweb.get_serverRelativeUrl();
  var listWebServiceUrl = siteCollectionUrl + "/_vti_bin/lists.asmx";
  //When at the root, siteCollectionUrl is '/', but other sites have /sitecollection/sites
  listWebServiceUrl = listWebServiceUrl.replace('//', '/');
  /* When complete, we can render requests */
  RenderMenu_ContentCategoryList(listWebServiceUrl);
 }


For the SharePoint client object model to work, of course, you should always tell Script On Demand (SOD) to process the code that references the SP client object model only once the SP.js file has been loaded:

$(document).ready(function()
{
 /*Ensure that SP.js is loaded (using the Script On Demand (SOD) SharePoint library) so the client object model functions correctly */
 SP.SOD.executeOrDelayUntilScriptLoaded(MenuInitialize, 'SP.js');
});

DDK

Tuesday, 18 October 2011

SharePoint 2010 - My jQuery Scripts are Broken on the Publishing Portal Thumbnails.aspx Page. All other pages are fine.

I've had this problem many times in ASP.NET but never in SharePoint 2010. Today, one of our jQuery script implementations was failing on just a couple of the SharePoint pages - in particular the Publishing Portal Thumbnails Page.

All the jQuery objects were returning null values even though the script worked on every other page in the SharePoint site. The problem is that golden oldie: the jQuery $ shortcut conflicting with the Microsoft AJAX framework. Both frameworks use $ as an object shortcut, and this causes problems. I almost felt nostalgic getting this exception :o).

Why does it only happen on a couple of pages in SharePoint 2010? They are the only ones with script references to the MS AJAX JavaScript libraries.

There are ways around this using aliases (via the jQuery.noConflict() command) - but the simplest approach is to just use the full jQuery name as a best practice when working within SharePoint 2010. A find-and-replace of "$(" with "jQuery(" as needed fixed the problem.
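As a self-contained sketch of the collision and of the noConflict()-style aliasing pattern (the two functions below are stand-ins, not the real jQuery or MS AJAX libraries):

```javascript
// Stand-in for the MS AJAX $ shortcut - it owns the global $ first.
globalThis.$ = (id) => `msajax:${id}`;

// Stand-in for jQuery: on load it clobbers whatever $ was there before.
const jQueryStandIn = (sel) => `jquery:${sel}`;
const previousDollar = globalThis.$;
globalThis.$ = jQueryStandIn;

// noConflict() hands $ back to its previous owner and returns jQuery
// so you can keep it under an alias.
jQueryStandIn.noConflict = function () {
  globalThis.$ = previousDollar;
  return jQueryStandIn;
};

const jq = jQueryStandIn.noConflict();
console.log(globalThis.$('btn'));  // msajax:btn  ($ belongs to MS AJAX again)
console.log(jq('div.menu'));       // jquery:div.menu  (jQuery via the alias)
```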

Too easy!

DDK

Wednesday, 12 October 2011

CSS Positioning Refresher - Relative vs Absolute vs Static vs Fixed - SharePoint 2010 Branding

My current client is a media company who wanted branding done as part of a SharePoint 2010 portal implementation project. One problem arose because their standard operating environment uses Internet Explorer 7. Our IE7-specific issues arose because of IE7's position behaviour and its interaction with a dynamic pop-down DIV that we were injecting with jQuery.

In IE7 (but not in IE8+ or Firefox), our whole content div in SharePoint was getting pushed down by the hidden div used as part of the pop-down menu. It turns out the fix for IE7 was to make our hidden div use position:absolute rather than relative/static.

The critical difference between relative and absolute in particular is not how it changes the behaviour of the div - but rather how it affects the flow of OTHER elements on the page. This is particularly important when you are injecting elements into SharePoint with jQuery. The best explanation of CSS positioning (and how it affects other html elements) I've found is here:

http://www.w3schools.com/css/css_positioning.asp


RELATIVE: The content of relatively positioned elements can be moved and overlap other elements, but the reserved space for the element is still preserved in the normal flow.
ABSOLUTE: Absolutely positioned elements are removed from the normal flow. The document and other elements behave as if the absolutely positioned element does not exist. (Yes! This is what we really want.)
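In our case the fix amounted to something like this (the class name and offsets are hypothetical - only the position value is the point):

```css
/* Injected pop-down menu container. position:absolute removes it from the
   normal flow, so the SharePoint content div is no longer pushed down in
   IE7 while the menu is hidden. */
.popdown-menu {
  position: absolute;  /* not relative/static - those reserve space in the flow */
  top: 100%;
  left: 0;
  display: none;       /* shown by jQuery on hover/click */
  z-index: 100;
}
```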

This still doesn't explain the rendering difference in IE7 vs IE8 vs Firefox apart from the fact that other browsers seem to be more forgiving when specifying your CSS positioning compared to IE7 (in Standards mode).

DDK

Wednesday, 5 October 2011

SharePoint 2010 - Why is Inline Editing not working even when enabled for the current view?

One of my colleagues at Oakton found that a SharePoint 2010 custom list wasn't allowing inline editing.

Even after modifying the view and enabling the "Allow Inline Editing" property of the list view, there were no errors - it just didn't work, and the Edit icon opened the full-blown SharePoint modal dialog box as usual.



The problem was that the list view must use the Default Style for Inline editing to work.



DDK

Thursday, 8 September 2011

Fix - My SonicWall Client Keeps Prompting for a Phone Book Entry even though I'm already connected

When connecting via Telstra Mobile Broadband and attempting to connect to a SonicWall VPN with the SonicWall client, I kept getting prompted for a dialup Phone Book entry multiple times. Every time I connected this way (though not when using normal wireless from home), I had to keep attempting a connection until the username and password prompt displayed.

The fix for this problem is to ensure that the "LAN Only" drop down is selected:
1) Open the SonicWall VPN Global Client
2) Click on Peers and Choose the correct connection; Click Edit
3) Under Interface Selection, Choose "LAN Only" rather than "Automatic" in the drop down list.


Tuesday, 6 September 2011

Fix - Duet Enterprise/SharePoint 2010 Exception - "System.ServiceModel.QuotaExceededException: The size necessary to buffer the XML content exceeded the buffer quota."

When attempting to establish communications between SAP and SharePoint (using the Duet Enterprise Claims provider) today at a new client, I encountered the following exception in SharePoint (as per the ULS logs):

InnerException 1: System.ServiceModel.QuotaExceededException: The size necessary to buffer the XML content exceeded the buffer quota. Server stack trace:

at System.ServiceModel.Channels.BufferedOutputStream.WriteCore(Byte[] buffer, Int32 offset, Int32 size)
at System.Xml.XmlStreamNodeWriter.FlushBuffer()
at System.Xml.XmlBinaryNodeWriter.FlushBuffer()
at System.Xml.XmlStreamNodeWriter.GetBuffer(Int32 count, Int32& offset)
at System.Xml.XmlBinaryNodeWriter.UnsafeWriteText(Char* chars, Int32 charCount)
at System.Xml.XmlBinaryNodeWriter.WriteText(Char[] chars, Int32 offset, Int32 count)
at System.Xml.XmlBaseWriter.WriteChars(Char[] chars, Int32 offset, Int32 count)
at System.Xml.XmlBinaryWriter.WriteTextNode(XmlDictionaryReader reader, Boolean attribute)
at System.Xml.XmlDictionaryWriter.WriteNode(XmlDictionaryReader reader, Boolean defattr)
at System.ServiceModel.Channels.ReceivedFault.CreateFault12Driver(XmlDictionaryReader reader, Int32 maxBufferSize, EnvelopeVersion version)
at System.ServiceModel.Channels.MessageFault.CreateFault(Message message, Int32 maxBufferSize)
at System.ServiceModel.Channels.SecurityChannelFactory`1.ClientSecurityChannel`1.TryGetSecurityFaultException(Message faultMessage, Exception& faultException)
at System.ServiceModel.Channels.SecurityChannelFactory`1.SecurityRequestChannel.ProcessReply(Message reply, SecurityProtocolCorrelationState correlationState, TimeSpan timeout)
at System.ServiceModel.Channels.SecurityChannelFactory`1.SecurityRequestChannel.Request(Message message, TimeSpan timeout)
at System.ServiceModel.Dispatcher.RequestChannelBinder.Request(Message message, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)
at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message) Exception rethrown
at [0]:
at Microsoft.SharePoint.BusinessData.SystemSpecific.Wcf.WcfSystemUtility.Execute(Object[] args)
at Microsoft.SharePoint.BusinessData.SystemSpecific.Wcf.WcfSystemUtility.ExecuteStatic(IMethodInstance methodInstance, ILobSystemInstance lobSystemInstance, Object[] args, IExecutionContext context)
at Microsoft.SharePoint.BusinessData.Runtime.DataClassRuntime.ExecuteInternalWithAuthNFailureRetry(ISystemUtility systemUtility, IMethodInstance methodInstanceToExecute, IMethod methodToExecute, ILobSystemInstance lobSystemInstance, ILobSystem lobSystem, IParameterCollection nonReturnParameters, Object[] overrideArgs)
at Microsoft.SharePoint.BusinessData.Runtime.DataClassRuntime.ExecuteInternal(IDataClass thisDataClass, ILobSystemInstance lobSystemInstance, ILobSystem lobSystem, IMethodInstance methodInstanceToExecute, IMethod methodToExecute, IParameterCollection nonReturnParameters, Object[]& overrideArgs)

I also received the following exception immediately after the above exception:
System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.ServiceModel.QuotaExceededException: The size necessary to buffer the XML content exceeded the buffer quota. Server stack trace:

at System.ServiceModel.Channels.BufferedOutputStream.WriteCore(Byte[] buffer, Int32 offset, Int32 size)
at System.Xml.XmlStreamNodeWriter.FlushBuffer()
at System.Xml.XmlBinaryNodeWriter.FlushBuffer()
at System.Xml.XmlStreamNodeWriter.GetBuffer(Int32 count, Int32& offset)
at System.Xml.XmlBinaryNodeWriter.UnsafeWriteText(Char* chars, Int32 charCount)
at System.Xml.XmlBinaryNodeWriter.WriteText(Char[] chars, Int32 offset, Int32 count)
at System.Xml.XmlBaseWriter.WriteChars(Char[] chars, Int32 offset, Int32 count)
at System.Xml.XmlBinaryWriter.WriteTextNode(XmlDictionaryReader reader, Boolean attribute)
at System.Xml.XmlDictionaryWriter.WriteNode(XmlDictionaryReader reader, Boolean defattr)
at System.ServiceModel.Channels.ReceivedFault.CreateFault12Driver(XmlDictionaryReader reader, Int32 maxBufferSize, EnvelopeVersion version)
at System.ServiceModel.Channels.MessageFault.CreateFault(Message message, Int32 maxBufferSize)
at System.ServiceModel.Channels.SecurityChannelFactory`1.ClientSecurityChannel`1.TryGetSecurityFaultException(Message faultMessage, Exception& faultException)
at System.ServiceModel.Channels.SecurityChannelFactory`1.SecurityRequestChannel.ProcessReply(Message reply, SecurityProtocolCorrelationState correlationState, TimeSpan timeout)
at System.ServiceModel.Channels.SecurityChannelFactory`1.SecurityRequestChannel.Request(Message message, TimeSpan timeout)
at System.ServiceModel.Dispatcher.RequestChannelBinder.Request(Message message, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)
at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message) Exception rethrown
at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at BCSServiceProxy.IWXManageCustomerIn.FindCustomerByElements(FindCustomerByElementsRequest request)
at BCSServiceProxy.WXManageCustomerInClient.BCSServiceProxy.IWXManageCustomerIn.FindCustomerByElements(FindCustomerByElementsRequest request)
at BCSServiceProxy.WXManageCustomerInClient.FindCustomerByElements(BPCCustGetAll BPCCustGetAll) -
-- End of inner exception stack trace ---
at System.RuntimeMethodHandle._InvokeMethodFast(Object target, Object[] arguments, SignatureStruct& sig, MethodAttributes methodAttributes, RuntimeTypeHandle typeOwner)
at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture, Boolean skipVisibilityChecks)
at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
at System.Reflection.MethodBase.Invoke(Object obj, Object[] parameters)
at Microsoft.SharePoint.BusinessData.SystemSpecific.Wcf.WcfSystemUtility.Execute(Object[] args)

In my experience with SAP, buffer overflows like this normally indicate that a large Java stack trace is coming back from SAP, overloading the WCF client buffer, which is expecting a normal-sized SOAP response.
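If the response genuinely is larger than the client's message buffer, one mitigation is to raise the message size limits and reader quotas on the WCF client binding. A hedged sketch only - the binding name and sizes below are illustrative, not taken from this deployment:

<bindings>
  <basicHttpBinding>
    <binding name="SAPServiceBinding"
             maxReceivedMessageSize="2097152"
             maxBufferSize="2097152">
      <readerQuotas maxStringContentLength="2097152" maxArrayLength="2097152" />
    </binding>
  </basicHttpBinding>
</bindings>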

To see the real exception, I tried to use Wireshark (http://www.wireshark.org/) - but our setup had (not by choice) NetWeaver and SharePoint on the same box, and Wireshark cannot listen to localhost traffic.

As usual, the Fiddler HTTP proxy (http://www.fiddler2.com/fiddler2/) came to the rescue - as it can listen to traffic between applications on the same machine. When running Fiddler and listening to the HTTPS traffic, it popped up with an exception regarding certificate errors:

Session #24: The remote server (ausyd-a-sh1) presented a certificate that did not validate, due to RemoteCertificateChainErrors.

SUBJECT: CN=ausyd-a-sh1, OU=I0020310622, OU=SAP Web AS, O=SAP Trust Community, C=DE
ISSUER: CN=ausyd-a-sh1, OU=I0020310622, OU=SAP Web AS, O=SAP Trust Community, C=DE
EXPIRES: 1/01/2038 11:00:01 AM
Browsing to the SAP server URL in Internet Explorer (at https://ausyd-a-sh1:8001/) confirmed that a certificate error was occurring.


Basically, this exception was occurring because the SAP SSL certificate was not in the Trusted Root Certification Authorities store in Windows. To import it, I opened mmc at a command prompt and added the Certificates snap-in (via File - Add/Remove Snap-in...) for the Local Computer.

In Certificates (Local Computer), I went to Trusted Root Certification Authorities - Certificates and Imported the SAP Certificate in.
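As a command-line alternative to the mmc snap-in, the same import can be sketched with certutil (the certificate file name here is hypothetical):

certutil -addstore Root sap-server-cert.cer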

However, I kept getting the same exception. On closer inspection, the thumbprint of the SharePoint Security Token Service (STS) certificate was not the same as the one that had been imported into SAP. This was because SharePoint had been reinstalled, which changes the SharePoint STS certificate (i.e. SharePoint STS certificates are install-specific).

I then received the following Exception - which was related to user mappings in SAP:

An unsecured or incorrectly secured fault was received from the other party. See the inner FaultException for the fault code and detail.


To resolve this last exception:
1) We made sure SharePoint:: was used to prefix the name in SAP (in the VUSEREXITID table) - this prefix would normally be added through the Duet Active Directory user import job (specific to Duet Enterprise) - but we don't have AD in our environment.
2) We then updated the STS certificate in the SAP Certificate store (using SAP transaction /nstrust)
3) Used the /nsaml2 transaction to update the certificate used there as well.

This resolved all our SAP to SharePoint communication issues.

DDK

Tuesday, 30 August 2011

FIX - SharePoint 2010 - Disabled New, Extend and Delete Buttons for Web Applications in Central Administration

My current client (a large NSW Government Department) has a test environment that doesn't have Active Directory, and they required Duet Enterprise to be installed. After a basic install of SharePoint 2010 with SP1 on Windows Server 2008 R2 and SQL Server 2008 R2, I was alarmed to find that I couldn't create new web applications via Central Administration. When I hovered over the buttons, it said that this functionality was disabled due to insufficient permissions:


I also couldn't add users to the Farm Administrators group - I would get an odd exception: "Local administrator privilege is required to update the Farm Administrators' group."

I tried rebooting, uninstalling, reinstalling to no avail - the same problem persisted.
There are many recommended fixes for this problem - most of which didn't work for me:
  1. Run in an alternative browser such as Google Chrome/Firefox - this didn't work for me.
  2. Ensure that the Application Pool account in IIS has the correct database permissions - this wasn't the problem in my situation as the user was a full local administrator and sysadmin on the database
  3. Ensure that you run IE as an administrator - this is already done by the default shortcut to SharePoint Central Admin (I wasn't opening Central Admin from a separately instantiated browser - so this also wasn't the problem)
  4. Ensure that UAC is turned off as this somehow interferes with the application of security to those controls. Most references to this indicated that this worked for Windows 7, but there was no reference to this working for Windows Server products.
To my surprise, number 4 was the one that worked for me. To do this (in Windows Server 2008+), you have to go to:
Control Panel - User Accounts - Turn User Account Control On or Off

After the reboot, the controls were suddenly re-enabled. This is not a recommended configuration - but this was purely for demonstration purposes rather than being a production-ready install (we would be using Active Directory for that anyway).

DDK

Friday, 26 August 2011

SQL Server 2008 R2 Master Data Services (MDS) IWorkflowTypeExtender - Implementation and Debugging

SQL 2008 R2 Master Data Services (MDS) has a basic plugin framework which allows you to handle events when business rule workflows are kicked off. You implement your custom plugins by:
1) Creating a class that implements IWorkflowTypeExtender. This is contained in the following assembly:
C:\Program Files\Microsoft SQL Server\Master Data Services\WebApplication\bin\Microsoft.MasterDataServices.Core.dll

2) Build and deploy the assembly to the bin directory (typically "C:\Program Files\Microsoft SQL Server\Master Data Services\WebApplication\bin") or a subdirectory of it if you use a PrivatePath (discussed below).
3) Modify the "Microsoft.MasterDataServices.Workflow.exe.config" file to point to your new assembly:

<configuration>
  <configSections>
    <section name="loggingConfiguration" requirePermission="true" type="Microsoft.Practices.EnterpriseLibrary.Logging.Configuration.LoggingSettings, Microsoft.Practices.EnterpriseLibrary.Logging, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
    <sectionGroup name="applicationSettings" type="System.Configuration.ApplicationSettingsGroup, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089">
      <section name="Microsoft.MasterDataServices.Workflow.Properties.Settings" requirePermission="false" type="System.Configuration.ClientSettingsSection, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
    </sectionGroup>
  </configSections>


4) Add another section to your config file that indicates what workflows to listen to (based on the value flag)

<applicationSettings>
  <Microsoft.MasterDataServices.Workflow.Properties.Settings>
    <setting name="ConnectionString" serializeAs="String">
      <value>Server=.;Database=MasterDataServices;Integrated Security=SSPI</value>
    </setting>
    <setting name="WorkflowTypeExtenders" serializeAs="String">
      <value>PAC=CompanyName.MasterDataWorkflow.WorkflowExtender, CompanyName.MasterDataWorkflow;OOB=Microsoft.MasterDataServices.Workflow.WorkflowTypeTest, Microsoft.MasterDataServices.Workflow, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91</value>
    </setting>
  </Microsoft.MasterDataServices.Workflow.Properties.Settings>
</applicationSettings>

The plugin lets you obtain information about the calling workflow via the StartWorkflow(string workflowType, System.Xml.XmlElement dataElement) method that must be implemented as part of the IWorkflowTypeExtender interface. You receive all the business context data via the dataElement parameter as an XML snippet.

Once you have deployed this assembly and updated the config files, the simplest way to debug inside Visual Studio is to go to your Project Properties, Debug tab, and set the Startup Program to be
"C:\Program Files\Microsoft SQL Server\Master Data Services\WebApplication\bin\Microsoft.MasterDataServices.Workflow.exe".

After doing that, add a command-line parameter of "-console", which will give you a visual display of the workflow plugins that have been loaded into the MDS Workflow Application Domain. It also allows you to print out to the console via Console.WriteLine() to assist with your development and debugging efforts.

Debug Setup

The MDS Workflow Console window:

As a best practice, use a PrivatePath in the assembly bindings of your Microsoft.MasterDataServices.Workflow.exe.config file. This allows Fusion (the .NET assembly loader) to find your custom binaries in a subfolder rather than the binary root. Keeping all custom binaries in subfolders will help you avoid file naming conflicts and stop the root MDS workflow bin directory from bloating.
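For example, a minimal sketch of the assembly binding section with a PrivatePath (the "CustomWorkflows" subfolder name is illustrative):

<runtime>
  <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
    <probing privatePath="CustomWorkflows" />
  </assemblyBinding>
</runtime>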


The simplest way to see what's going on is to render the XML from the workflow handler to the MDS workflow console window via an XmlTextWriter, e.g.

public void StartWorkflow(string workflowType, System.Xml.XmlElement dataElement)
{
    Console.WriteLine("workflow type: {0}", workflowType);

    // Dump the incoming business context data to the console as indented XML
    var writer = new XmlTextWriter(Console.Out);
    writer.Formatting = Formatting.Indented;
    dataElement.WriteTo(writer);
    writer.Flush();
}

DDK

Monday, 15 August 2011

Fix for Exception - "Initialization of the data source failed" in TFS 2010 Excel Reports against SSAS 2008 R2

At one of my banking clients, there was an issue with the setup of the TFS reports in Excel. When opening the Excel reports provided with TFS, Excel attempted to connect to the TFS Analysis Services cube - and generated the following exception:

Initialization of the data source failed.

Check the database server or contact your database administrator. Make sure the external database is available, and then try the operation again. If you see this message again, create a new data source to connect to the database.




Then you get prompted for credentials, and then get a message about reinstallation of drivers.


There are a few red herrings in the messages provided by Excel - but the underlying problem in our situation was simply one of permissions.

To resolve this, give the affected user permission to access the SSAS database (e.g. by adding them to the TFSDataReaders role in SSAS). Alternatively, you can add the user as a TFS Administrator in the TFS Administration Console (though this is not preferred, as it goes against the security principle of "least privilege").

Till next time,
DDK

Thursday, 28 July 2011

How to force a SharePoint 2010 Business Connectivity Services (BCS) Client Cache to rebuild

WARNING! This tip involves potential loss of data.
The Business Connectivity Services (BCS) client cache is a combination of PST files and a SQL Server Compact Edition database that allows you to work with business data in disconnected or limited-connection scenarios.

It contains all the subscription information and data for any items that you have made available offline via Business Connectivity Services. However, there is no official way of flushing this cache - your BusinessDataCache.sdf file will just keep growing ad infinitum. You can query this data by connecting to it via Visual Studio 2010's Server Explorer/Data Connections task pane. (For reference, the schema of the BCS client business data cache is shown below.)


At the risk of data loss (if you have any unsynched items), there is a way of recreating your cache without deleting your whole Windows Logon profile. This is as follows:

  1. Uninstall any Office Add-ins you have e.g. Outlook Addins that use the BCS cache or create subscriptions in the BCS cache.
  2. Kill the BCSSync.exe process in memory using Windows task manager.
  3. Go to the following folder on the machine:
    %userprofile%\AppData\Local\Microsoft\BCS
  4. Rename BusinessDataCache.sdf to zzBusinessDataCache.sdf
  5. Run "%ProgramFiles%\Microsoft Office\Office14\BCSSync.exe" /Restart /Activation /RestartApps to restart the BCS Synchronization service.
  6. Your BusinessDataCache.sdf  file will be regenerated.
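The steps above (from killing BCSSync.exe onwards) can be sketched as a batch script, run from an elevated command prompt:

taskkill /F /IM BCSSync.exe
cd /d "%userprofile%\AppData\Local\Microsoft\BCS"
ren BusinessDataCache.sdf zzBusinessDataCache.sdf
"%ProgramFiles%\Microsoft Office\Office14\BCSSync.exe" /Restart /Activation /RestartApps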
DDK

How to Troubleshoot Business Connectivity Services (BCS) in SharePoint 2010 using Performance Monitor Data Collector Sets

When dealing with Business Connectivity Services (BCS) errors (especially if they are occurring on Offline clients such as Outlook), the Event Log and SharePoint ULS Logs don't give you the whole picture.

To get a full verbose breakdown as to why your configuration or code is generating errors, you need to use Windows Performance Monitor (perfmon) to generate a verbose trace to get the full details and stack traces that are often needed to diagnose issues.

To set up an Event Trace on BCS you can do the following:
  1. Start Up Performance Monitor by typing "perfmon" at the Windows Start Prompt.
  2. In Windows Performance Monitor, expand "Data Collector Sets", then right-click the "User Defined" node and choose "New > Data Collector Set"

  3. In the First Screen of the Wizard, Name the Collector Set (e.g. BCSCollectorSet). Choose the "Create Manually (Advanced)" radio button.
  4. When asked "What type of data do you want to include?", choose "Event Trace Data"

  5. When asked "Which event trace providers would you like to enable?", select the "Microsoft-Office-Business Connectivity Services" and the "Microsoft-SharePoint-Products-Business Connectivity Services" providers. If you are using Duet Enterprise, you may also want to add the "Duet Enterprise" Event Provider.

  6. Click "Finish" to Leave the other settings at the defaults.
  7. Start your trace by clicking on your collector set and clicking the play icon.
  8. Once the trace is recorded, you can then open up the logs and view them in Windows Event Viewer by clicking on "Open Saved Log" and pointing to your newly created BCS Event trace file.


I found this particularly useful when troubleshooting BCS offline caching issues on Outlook clients.

The event trace indicated that an old version of my BCS models and connections was being used to the Duet/SAP system.

To resolve my problem, I had to remove all instances of the BCS lists in Outlook by uninstalling the data lists and solutions that referred to that one SharePoint 2010 External Content Type. This resolved my caching problems and stopped the "Access Denied" exceptions when synchronizing with Outlook 2010.
 
DDK

Wednesday, 27 July 2011

Fix for "The service was unable to start because the version of the database does not match the version of the product." in Forefront Identity Manager Synchronization Service for SharePoint 2010

I had an issue today with the Forefront Identity Manager Synchronization Service when synchronizing to Active Directory and SAP in the User Profile store. It generated the following exception when I attempted to run a full or incremental user profile synchronization:
  

"The service was unable to start because the version of the database does not match the version of the product."
 
If you get this problem, you will need to reprovision the User Profile Synchronization Service Application by stopping and restarting it (and re-entering the password for your synchronization service user). This resolved the issues I was having - and the Forefront Identity Manager service could then start correctly. The steps to do this are as follows:
 

  1. Go to Central Admin > System Settings > Manage Services on Server (select the server where the User Profile Service is running).
  2. Stop the User Profile Service.
  3. Stop the User Profile Synchronization Service (you will be prompted that this will deprovision the service).
  4. Once the services have stopped, start the User Profile Service again.
  5. Start the User Profile Synchronization Service again (you will be prompted to enter the password for the User Profile Service's service account). Note that this service can take a little while to restart; if it does not restart successfully, restart the server and try again (this has worked for me).
  6. Once complete, the User Profile Service and User Profile Synchronization Service should show as Started, and the two corresponding FIM services on the server should be running again.
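The stop/start cycle above can also be sketched with the SharePoint 2010 PowerShell cmdlets. A sketch only - the synchronization service is best restarted from Central Admin, as it prompts for the service account password:

# Stop both services (this deprovisions the synchronization service)
Get-SPServiceInstance | Where-Object { $_.TypeName -like "User Profile*" } |
    Stop-SPServiceInstance -Confirm:$false

# Start the User Profile Service again
Get-SPServiceInstance | Where-Object { $_.TypeName -eq "User Profile Service" } |
    Start-SPServiceInstance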

For Reference:
I've mentioned this on the blog previously, but you can start the Forefront Identity Manager UI from the following location:

"%ProgramFiles%\Microsoft Office Servers\14.0\Synchronization Service\UIShell\miisclient.exe"

DDK

Friday, 22 July 2011

SharePoint 2010 Business Connectivity Services (BCS) Local Cache Data

When you take a SharePoint 2010 list offline (e.g. into SharePoint Workspace or Outlook), BCS entities are created in your local drive. These are often supported by ClickOnce Office Solutions deployed via Visual Studio Tools for Office 4.0 (VSTO). For reference, the following locations are used for BCS data and ClickOnce installations:

BCS Client Cache Data files are located in a hidden folder at:
%USERPROFILE%\AppData\Local\Microsoft\BCS

Supporting ClickOnce Applications are located at:
%USERPROFILE%\AppData\Local\Apps

DDK

Friday, 15 July 2011

Fix for VSTO ClickOnce application - "Unable to install this application because an application with the same identity is already installed. To install this application, either modify the manifest version for this application or uninstall the preexisting application"

If you receive the following exception even though your VSTO ClickOnce application is already uninstalled:

"Unable to install this application because an application with the same identity is already installed. To install this application, either modify the manifest version for this application or uninstall the preexisting application."


Then you may have issues with your ClickOnce application cache. You have two options for clearing it:

Running the following at a Visual Studio Command Prompt:
mage -cc

OR

Running the following at a command line on any machine with the problem:
rundll32 dfshim CleanOnlineAppCache


See http://vijayvepa.wordpress.com/2011/01/11/unable-to-install-this-application-because-an-application-with-the-same-identity-is-already-installed-to-install-this-application-either-modify-the-manifest-version-for-this-application-or-uninstall/ for more details.

DDK

Thursday, 14 July 2011

Business Benefits of Duet Enterprise and Comparison to SAP integration via Biztalk/Custom Solutions - Bringing SAP Business Data and SharePoint Collaboration Features Together and Ensuring a ROI

I've been involved in several SAP-SharePoint integration projects recently. Such projects have typically relied heavily on an Enterprise Service Bus (ESB) such as BizTalk or SAP PI, plus customizations and code. When you integrate SAP with SharePoint, there are many benefits to be had when users don't have to use SAP GUI directly:
  • Hard benefits such as reduced licensing costs
  • Usability benefits such as allowing for a much simplified User Interface (people familiar with SAP GUI and SAP Portal will chuckle at this)
  • Supportability benefits (as SharePoint and .NET skills are more common than SAP Portal skills - at least in the Australian market).
However, since the release of Duet Enterprise 1.0 (http://ddkonline.blogspot.com/2011/01/duet-enterprise-released-and-available.html) at the end of January, another integration option has been thrown into the mix.

When my clients have been considering Duet Enterprise, one of the most often asked questions is

"What does Duet Enterprise offer me that I can't do already with SharePoint, SAP, custom BAPIs, SAP Enterprise Services and SharePoint 2010 Business Connectivity Services?".

Duet Enterprise brings a lot to the table - both on the SAP side and on the SharePoint side - however it's not the perfect solution for all problems. While consistency between systems is a vital principle in enterprise architecture, you really do have to choose the right combination of tools to fit a given problem. Indeed, as with any architectural approach - you have to weigh up the pros and cons.

I see the Duet Enterprise product offering (and the development framework it provides) as having the following advantages over just a custom integration build:
  1. Security and Security Best Practices. Through the Duet Enterprise Claims-based Role Provider, it allows you to leverage the SAP security model and SAP Roles in SharePoint. This avoids the duplication of a potentially massive security hierarchy into the SharePoint space. By design, communication channels between systems are encrypted. This is one of the biggest value-adds. This also allows you to keep all your unstructured data (e.g. pdfs/documents) in SharePoint and still leverage the security model in SAP to limit access to these resources.
  2. SAP UWL Functionality - It exposes your SAP Universal Worklist (UWL) into the native SharePoint task list - you don't have to do a mashup or WSRP or some other IFrame hack.
  3. Maintainability and Reduced Development Efforts -  It provides some out of the box functionality (site creation, business centric collaboration) as a guideline and template for further development.
  4. Diagnostic Tool Support - Duet provides health checks that allow you to verify your setup, and provides mechanisms (via correlation identifiers) to help you diagnose problems if requests are failing.
  5. Official Supportability - It is a product supported by Microsoft and SAP that incorporates their recommended approach for integration. There is also official documentation for this integration approach.
  6. It has a Roadmap - It is a platform with a roadmap, as opposed to a completely custom system, which is something you build and have to take far more responsibility for. As Duet Enterprise grows in capabilities, you may regret going down the 100% custom path.
  7. Reporting Functionality - It brings reporting functionality out of SAP via the report request mechanisms and allows you and your colleagues to collaborate around SAP reports via standard SharePoint functionality (e.g. social tagging).
  8. The BCS Solution Design Gallery - It provides an enhanced model for the deployment of Office Based Solutions through SharePoint e.g. Administration screens for arranging your solution and for generating your solution from within SharePoint. See http://msdn.microsoft.com/en-us/library/ff963717.aspx for more details.
  9. Entity Collaboration Functionality - Supports creation of collaboration sites surrounding SAP business entities. This however has to be checked against your SharePoint governance strategy and capacity plan.
  10. SAP HR Information imported into SharePoint User Profiles - Forefront Identity Manager is used to synchronize SAP user information (HR infotypes) from SAP via a BCS user profile connection, along with role synchronization via the Duet Enterprise profile synchronization timer job.
I have also been approached with concerns that Duet Enterprise doesn't use a full ESB. However, that's not true. Duet is a mechanism for providing services on the SAP side (via an ESB on the SAP side, e.g. through SAP's middleware solution SAP PI) and consuming those services on the SharePoint side. Duet Enterprise can live comfortably in an ESB-enabled ecosystem.

If you have SharePoint and SAP in your environment, I suggest that you consider Duet Enterprise as part of your enterprise strategy for improving the accessibility and collaboration around your most important business data.

In most circumstances, it will give you a large amount of momentum (and provides all the plumbing) to allow you to provide a solid and supported solution for SAP to SharePoint integration - with an upgrade path.

DDK

SharePoint 2010 Business Connectivity Services (BCS) - Exposing External Lists through Outlook and the InfoPath External Data Part - Fix for "Catastrophic Failure"

Business Connectivity Services (BCS) along with BCS solutions allow you to expose your line of business data (e.g. SAP via Duet Enterprise) through Office client applications such as Outlook 2010.

There are several walkthroughs of generating these solutions available online. I recently had issues with deployment of a Business Connectivity Services (BCS) solution in conjunction with Duet Enterprise 1.0. When my Duet Enterprise solution was deployed, I was greeted with an exception as per the screenshot below, with a "Catastrophic failure" error message in place of where my custom task pane should show.


The main reason this exception was occurring was that the layout file in my solution (the XSD and schema explanation for BCS solution layout files can be found here: http://msdn.microsoft.com/en-us/library/ff394500.aspx) had a "Width" custom property.

Outlook couldn't handle this as a parameter to the web part - so it crashed out in a "Catastrophic" manner.


Also note that for your task pane to have any data source at all, you still need a "DataSourceName" property set on your OBA part. The description on MSDN (http://msdn.microsoft.com/en-us/library/ff394500.aspx) is somewhat misleading, as it indicates that you only need a data source for your External Data Part if it relies on data from one of the other parts in your layout. When I removed the DataSourceName property, the attributes weren't populated - so the statement on MSDN isn't correct.





Wednesday, 13 July 2011

Fix for SQL 2008 Exception - An exception occurred while enqueueing a message in the target queue. Error: 15404, State: 19. Could not obtain information about Windows NT group/user 'DOMAIN\username', error code 0x5

After installing a recent series of Windows updates, I began to get hundreds of SQL 2008 errors in the Windows Event Log (20 or so per second) on one of our demonstration servers. These were as follows:

An exception occurred while enqueueing a message in the target queue. Error: 15404, State: 19. Could not obtain information about Windows NT group/user 'DOMAIN\username', error code 0x5.

or

An exception occurred while enqueueing a message in the target queue. Error: 15404, State: 19. Could not obtain information about Windows NT group/user 'DOMAIN\username', error code 0x2.

If you get these exceptions, you most likely have connectivity issues to your Active Directory server - and the owner (dbo) of the affected databases is an identity on that Active Directory server. The SQL Service Broker is raising this exception. If the connectivity issue is not resolvable, you can change the owner to a SQL login (e.g. sa) and this will stop the errors.

You can do this with the following command on each database that has the wrong owner:

sp_changedbowner 'sa'
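To find the affected databases first, a sketch (the database name below is hypothetical, and sp_changedbowner must be run in the context of each affected database):

-- List each database and its current owner
SELECT name, SUSER_SNAME(owner_sid) AS owner_name
FROM sys.databases;

-- Then, for each database owned by the unreachable domain account:
USE ProblemDatabase;
EXEC sp_changedbowner 'sa';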

DDK

Tuesday, 12 July 2011

SharePoint 2010 - Business Connectivity Services (BCS) / BDC model export - Default vs Client Option

When exporting BCS entities, you may have noticed two different "Settings" options available in the export dialog. If you're wondering what the differences are in the output - there don't appear to be any at all.

To confirm this, I did an export of two entities to a model file using the "Default" option, and another export using the "Client" option. Using the Notepad++ file compare tool, there was no difference at all between the outputs of the two export settings - nor does there seem to be any documentation regarding this dropdown.


However, if you look at the BCS connection information, you'll also notice two tabs that match the different settings or "profiles" that you can choose. Consequently, this is expected behaviour - you'll only see differences in the exports when your settings differ between the Default and Client profiles.




DDK

Thursday, 7 July 2011

Not all Reflection Tools (or Obfuscators) Are Created Equal

I had a client last week in Melbourne who wanted to salvage some code from an existing SharePoint 2010 implementation. As long as it wasn't obfuscated, I thought there would be no problems at all.

Red Gate's (previously Lutz Roeder's) Reflector is designed for just such a situation - and I'd recently purchased the awesome VS Pro version (which is phenomenal - it lets you step through and debug other people's applications!).

However, when I tried to open up the assemblies in Reflector or on ILDASM, it appeared to indeed be obfuscated - by the Smart Assembly tool from Red Gate.

Typically, obfuscated code will be shown with an error or garbled characters, e.g. "This item is obfuscated and can not be translated." - as below:



If you try to open it in ILDASM, it throws an exception as below - this is because the assembly has the System.Runtime.CompilerServices SuppressIldasmAttribute applied to it.

However, if you open the assembly with the new tool dotPeek (from JetBrains, the makers of ReSharper), you will be able to see the source code - even of those allegedly obfuscated methods and properties.

I'm not sure whether Red Gate deliberately set a flag inside Reflector when they purchased it from Lutz Roeder - but it seems like a few shortcuts were taken with the obfuscation engine.

So be warned - not all Reflectors and not all Obfuscation methods are created equal.


DDK

Wednesday, 15 June 2011

Duet Enterprise Integration - Fix for "System.ArgumentException: An item with the same key has already been added due to Extra Content Type Headers" in SharePoint 2010 ULS Logs

If you're not aware, Duet Enterprise (http://www.duet.com/) is an integration framework and infrastructure for providing SAP to SharePoint integration - of task lists, SAP workflows, SAP reports, SAP security and SAP data (through SharePoint BCS).

I recently had an issue with a Duet Enterprise deployment for one of my clients. When retrieving data, the out of the box web parts were failing and dropping the following Exception in the SharePoint 2010 ULS Logs:

System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.ArgumentException: An item with the same key has already been added. Server stack trace:
at System.ThrowHelper.ThrowArgumentException(ExceptionResource resource)
at System.Collections.Generic.Dictionary`2.Insert(TKey key, TValue value, Boolean add)
at System.Collections.Generic.Dictionary`2.Add(TKey key, TValue value)
at System.Xml.ContentTypeHeader.ParseValue()
at System.Xml.ContentTypeHeader.get_MediaType()
at System.Xml.XmlMtomReader.ReadRootContentTypeHeader(ContentTypeHeader header, Encoding[] expectedEncodings, String expectedType)
at System.Xml.XmlMtomReader.Initialize(Stream stream, String contentType, XmlDictionaryReaderQuotas quotas, Int32 maxBufferSize)
at System.Xml.XmlMtomReader.SetInput(Stream stream, Encoding[] encodings, String contentType, XmlDictionaryReaderQuotas quotas, Int32 maxBufferSize, OnXmlDictionaryReaderClose onClose)
at System.Xml.XmlDictionaryReader.CreateMtomReader(Byte[] buffer, Int32 offset, Int32 count, Encoding[] encodings, String contentType, XmlDictionaryReaderQuotas quotas, Int32 maxBufferSize, OnXmlDictionaryReaderClose onClose)
at System.ServiceModel.Channels.MtomMessageEncoder.MtomBufferedMessageData.TakeXmlReader()
at System.ServiceModel.Channels.BufferedMessageData.GetMessageReader()
at System.ServiceModel.Channels.BufferedMessage..ctor(IBufferedMessageData messageData, RecycledMessageState recycledMessageState, Boolean[] understoodHeaders)
at System.ServiceModel.Channels.MtomMessageEncoder.ReadMessage(ArraySegment`1 buffer, BufferManager bufferManager, String contentType)
at System.ServiceModel.Channels.HttpInput.DecodeBufferedMessage(ArraySegment`1 buffer, Stream inputStream)
at System.ServiceModel.Channels.HttpInput.ReadBufferedMessage(Stream inputStream)
at System.ServiceModel.Channels.HttpInput.ParseIncomingMessage(Exception& requestException)
at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout)
at System.ServiceModel.Channels.RequestChannel.Request(Message message, TimeSpan timeout)
at System.ServiceModel.Channels.SecurityChannelFactory`1.SecurityRequestChannel.Request(Message message, TimeSpan timeout)
at System.ServiceModel.Dispatcher.RequestChannelBinder.Request(Message message, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)

at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)

Exception rethrown at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at BCSServiceProxy.IWXManageCustomerIn.FindCustomerByElements(FindCustomerByElementsRequest request)
at BCSServiceProxy.WXManageCustomerInClient.BCSServiceProxy.IWXManageCustomerIn.FindCustomerByElements(FindCustomerByElementsRequest request)
at BCSServiceProxy.WXManageCustomerInClient.FindCustomerByElements(BPCCustGetAll BPCCustGetAll)

--- End of inner exception stack trace ---
at System.RuntimeMethodHandle._InvokeMethodFast(Object target, Object[] arguments, SignatureStruct& sig, MethodAttributes methodAttributes, RuntimeTypeHandle typeOwner)
at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture, Boolean skipVisibilityChecks)
at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
at System.Reflection.MethodBase.Invoke(Object obj, Object[] parameters)
at Microsoft.SharePoint.BusinessData.SystemSpecific.Wcf.WcfSystemUtility.Execute(Object[] args)


I immediately thought that something was wrong with the data. Because Duet Enterprise uses HTTPS/SSL for all communications between SAP NetWeaver 7.02+ and SharePoint 2010, I had to decrypt the traffic to work out what was going on. Running the same calls on the SAP side returned blank results. To get Wireshark (http://www.wireshark.org/) decrypting the traffic, we had to generate a certificate in SAP with a private key, export the key in PKCS#12 format and then convert it to the .PEM format that Wireshark accepts - similar to what is described here:
http://htluo.blogspot.com/2009/01/decrypt-https-traffic-with-wireshark.html
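The PKCS#12-to-PEM conversion can be done with OpenSSL. The commands below are a sketch only: the file names and password are placeholders, and in practice the .p12 bundle would be the one exported from SAP rather than the self-signed stand-in generated here for illustration:

```shell
# In production, the .p12 comes out of SAP (exported from the PSE).
# For illustration, create a stand-in PKCS#12 bundle from a self-signed key pair:
openssl req -x509 -newkey rsa:2048 -keyout key.pem -out crt.pem \
    -days 1 -nodes -subj "/CN=duet.example"
openssl pkcs12 -export -inkey key.pem -in crt.pem \
    -out duet_server.p12 -passout pass:secret

# The actual conversion step: unwrap the PKCS#12 bundle into the PEM
# format Wireshark accepts. -nodes leaves the private key unencrypted
# so Wireshark can read it:
openssl pkcs12 -in duet_server.p12 -out duet_server.pem \
    -nodes -passin pass:secret
```

Keep the resulting unencrypted .pem file somewhere with tight permissions - anyone who can read it can decrypt the server's traffic.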
We used "sapgenpse" in SAP to generate the certificate with the private key - as per the screenshot below:
Once the key is set in Wireshark's preferences, many of the packets being processed will be marked green as they are decrypted. You can then right-click on your web request and choose "Follow SSL Stream" - and you will see the decrypted version of your HTTPS traffic.
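For reference, in Wireshark versions of this era the key is registered under Edit > Preferences > Protocols > SSL > "RSA keys list", as comma-separated entries of the form ip,port,protocol,keyfile. The address and path below are examples only, not values from this deployment:

```text
# <server ip>,<port>,<protocol>,<path to PEM private key>
10.1.1.20,443,http,C:\certs\duet_server.pem
```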

Turns out, the issue was that SAP was returning duplicate Content-Type headers in the response to SharePoint BCS (as per the screenshot).
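This explains the cryptic exception at the top of the stack trace: per the trace, ContentTypeHeader.ParseValue stores the header's parameters via Dictionary`2.Add, and Dictionary.Add is what throws "An item with the same key has already been added" on the second insert. A minimal Python sketch of the same parsing behaviour (the parser itself is illustrative, not the actual .NET implementation):

```python
def parse_content_type(header: str):
    """Naive Content-Type parser that, like .NET's ContentTypeHeader.ParseValue,
    keeps parameters in a dict and rejects duplicate keys."""
    media_type, _, param_str = header.partition(";")
    params = {}
    for part in param_str.split(";"):
        part = part.strip()
        if not part:
            continue
        key, _, value = part.partition("=")
        key = key.strip().lower()
        if key in params:
            # This is the analogue of Dictionary.Add throwing
            # "An item with the same key has already been added."
            raise ValueError(f"duplicate Content-Type parameter: {key}")
        params[key] = value.strip('"')
    return media_type.strip(), params


# A well-formed MTOM Content-Type parses fine:
print(parse_content_type(
    'multipart/related; type="application/xop+xml"; boundary=MIME_boundary'))

# A response carrying a duplicated parameter (as SAP was sending) fails:
try:
    parse_content_type(
        'multipart/related; type="application/xop+xml"; type="application/xop+xml"')
except ValueError as e:
    print("parse failed:", e)
```

The failure happens before any SOAP body is read, which is why the problem surfaced as an opaque TargetInvocationException from the BCS proxy rather than a meaningful fault.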

There is an SAP Note that fixes this very issue - "SAP Note 1539888 - Composite Note - Duet Enterprise Installation Wizard". You will need an SAP Service Marketplace logon to get this note.

DDK