Thursday, 18 December 2008

"Command line error." when installing Web Part into WSS 3.0/MOSS 2007 with stsadm.exe

This is a bizarre problem - but if you get stsadm.exe's generic "Command line error." whilst trying to install SharePoint web parts and your paths look fine, it may just be an encoding issue introduced when you copy the command-line arguments between different apps. (You do NOT need to have your .wsp file in the same directory as stsadm.exe when installing parts to your site.) You see, different apps represent hyphens differently. If you copy a hyphen from a web site, it may actually be a Unicode dash character and not a "real" ASCII hyphen.

A simple solution is just to make sure you type all your stsadm.exe command parameters in manually rather than copying and pasting them into your DOS prompt.
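To see what goes wrong, here is a minimal sketch comparing the ASCII hyphen-minus that stsadm.exe expects with the en dash that "smart" web formatting often produces - the character codes are the point:

```csharp
using System;

class HyphenCheck
{
    static void Main()
    {
        char asciiHyphen = '-';      // U+002D HYPHEN-MINUS - what stsadm.exe expects
        char enDash = '\u2013';      // U+2013 EN DASH - what a web page may give you

        Console.WriteLine((int)asciiHyphen);      // 45
        Console.WriteLine((int)enDash);           // 8211
        Console.WriteLine(asciiHyphen == enDash); // False - stsadm sees a garbage argument
    }
}
```

Because the two characters often render identically in a console font, the pasted command looks correct but the parser never recognises the `-o` switch.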

Friday, 12 December 2008

Telerik Releases "Open Access", a Database Agnostic ORM Product that Works with LINQ

My favourite 3rd party WebUI control provider Telerik has just released its new ORM product called "Open Access" - I didn't realise they were developing such a thing, but it turns out they just acquired Vanatec, a German company that specialises in ORM products. They have already made a few updates to the original Vanatec software since the acquisition (such as removing a dependency on J#) - so they have grabbed this product and are running with it at full steam. If the quality of their controls is anything to go by, this could be a valuable asset in any .NET developer's toolbelt.

I'm going to try it out and evaluate it against some of the custom LINQ, LINQ to SQL and NHibernate-based efforts that I've created and worked with on previous projects. It also supports non-SQL Server databases such as Oracle. There is a fledgling CodePlex project called LINQ to Oracle, but this ORM product could blow it out of the water. Telerik Open Access also supports direct SQL. I'll do a review before the end of the year.

Wednesday, 3 December 2008

How can I set the Modified By, Created By, Date Modified and Date Updated fields via the MOSS object model (without creating a new version)?

There are a few problems with the MOSS object model when adding new files using the SPFileCollection.Add() method. In particular, there is no overload that accepts both the "bool overwrite" parameter AND the details of the user who did the update at the same time.

Consequently, uploading a file to a versioned list in SharePoint requires that you first add the file with overwrite enabled and then update the user and time stamps in a separate step.

Unfortunately, the "Author" and "Editor" fields accessible via the SPFile object are read-only. You can, however, take advantage of the UpdateOverwriteVersion() method available on list items to update these stamps manually without creating a new version. See the code below:

//The authenticating user needs to be the service account as this uses database access,
// so we must pass in the current user explicitly when adding the file.
SPUser updatingUser = web.EnsureUser(HttpContext.Current.User.Identity.Name);
SPFile currentFile = fileCollection.Add(newDocument.Name, contents, fileProperties, addAsNewVersionToExistingFiles);

//Get the list item from the SPFile object
SPListItem listItem = currentFile.Item;

//Overwrite with the correct values as the object model doesn't allow us
// to specify both overwrite=true and the specific user names.
listItem["Author"] = updatingUser;
listItem["Editor"] = updatingUser;

//Persist the stamps without creating a new version
listItem.UpdateOverwriteVersion();

How do I handle or abort function key events (e.g. F1, F2, etc.) in both IE and Firefox?

Run this page and you will be shown the keycode for the Function Key you pressed. In addition, any standard browser handlers (such as help prompts when F1 is pressed or Searches when F3 is pressed) will be aborted - so you can pass them to your app instead. See below:

<script type="text/javascript">
///Demo script to display the function key that was pressed
///and abort any default browser action e.g. F3 for IE Find,
///F1 for IE Help, F1 for Firefox Help
///Version Author Date        Comment
///1.0     DDK    03 Dec 2008 Original version for
///                           application tender
///                           proof of concept
///                           when users wanted 'green
///                           screen' functionality
document.onkeydown = showDownAndAbortEvent;

//Stop F1 opening Help in IE
document.onhelp = function() { return false; };
window.onhelp = function() { return false; };

// decipher key down codes and cancel the browser's default handling
function showDownAndAbortEvent(evt) {
    evt = (evt) ? evt : ((window.event) ? window.event : null);
    if (evt) {
        alert('keyCode:' + evt.keyCode + ';charCode:' + evt.charCode + ';');
        try {
            evt.preventDefault(); // disable default help in Firefox
        } catch (ex) {}
        try {
            //Kill any intercepts for IE
            window.event.cancelBubble = true;
            window.event.returnValue = false;
            window.event.keyCode = 0;
        } catch (ex) {}
    }
    return false;
}
</script>

Tuesday, 25 November 2008

Aspose.Words Ignores Protection Exclusion for Selected Areas when Generating Word Documents

As per a recent thread on the Aspose forums, Aspose.Words will completely ignore your selection-based exclusions when you are trying to lock/protect parts of a document - and no fix is expected for the next 3-6 months. This is unfortunate as it is a very powerful feature when generating MS Word documents that should only be partially editable.

It is not a viable option to use Form fields and section-based security as it only allows data entry without the ability to change any formatting (this was the workaround suggested by Aspose for now). My client has reluctantly accepted that this is an inherent limitation of the document generation section in our ASP.NET application.

One interesting element in Aspose.Words' implementation is its use of tags to delimit repeating data within a table. You pass in a DataSet and it will use the field names (presumably via reflection) to render the fields of your list into a Word table.

See below for an example of the table-binding syntax of Aspose.Words. Note the «TableStart:TableName»«Field1»«Field2»«TableEnd:TableName» syntax.
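For illustration, merging such a region from code looks roughly like this (a sketch only - the table name, column names and file paths are made up, and must match the markers in your own template):

```csharp
using System.Data;
using Aspose.Words;

// The DataTable name must match the «TableStart:Orders»...«TableEnd:Orders»
// region markers in the template, and the column names must match «Field1» etc.
DataTable orders = new DataTable("Orders");
orders.Columns.Add("Field1");
orders.Columns.Add("Field2");
orders.Rows.Add("Widget", "42");

Document doc = new Document(@"C:\Templates\OrderTemplate.doc"); // hypothetical template path
doc.MailMerge.ExecuteWithRegions(orders); // repeats the region once per row
doc.Save(@"C:\Output\Order.doc");
```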

Saturday, 22 November 2008

Dell D630 Problem - Cannot enable LAN Card - Fixed

I normally connect my laptop to my home network via wireless on a Belkin router (F1PI241EGau) from iiNet. Tonight, however, I tried to connect via a LAN port because my router's wireless side has been freezing every day under Smurf attacks - and I wanted to see the log before hard rebooting it. Rebooting has the unwanted effect of clearing out any diagnostic messages I might get right before the wireless failure.

Problem was - I just couldn't enable the LAN adapter on my laptop, even through Device Manager. When I tried to enable it, it would respond with a shocked "Connection Failed!" or "Windows could not enable your device", and would show "Device is disabled (Code 22)". Nothing at all was showing up in the event log under the Application/System categories - so the event log was about as useful as a screen door on a submarine in this situation.

I tried disabling the Dell QuickSet (f)utility and disabled power management to try to fix the problem. Dell has a lovely (for some) service called "NICCONFIGSVC" (C:\Program Files\Dell\QuickSet\NICCONFIGSVC.exe) that disables the network card when you are not on mains power - but I still could not enable my laptop's LAN card.

The simple fix was to uninstall the problematic network card in Device Manager and restart. I now have a brand spanking new Local Area Network Adapter 3 which is fully functional, and I can now proceed to diagnose the nasty Smurf DoS attacks.

Tuesday, 4 November 2008

List of Alternative Options for Deploying or Hosting Applications to SharePoint 2007 or WSS 3.0

There are several different ways you can deploy an application to MOSS:
  1. Create custom-built Web Parts and deploy them as Features
  2. Drop an ASP.NET application in c:\program files\common files\microsoft shared\web server extensions\12\template\layouts so that it gets virtualized to every site on your server
    e.g. to http://servername/sites/sitename/_layouts/MyApp/SomePage.aspx
  3. Use User Controls and the Son of SmartPart
  4. Use ASPX pages added to a SharePoint site (this involves deploying to the bin folder of your SharePoint site, such as C:\Inetpub\wwwroot\wss\VirtualDirectories\moss.litwareinc.com80\bin, adding SafeControls entries to Web.Config to allow your assemblies to load, and deploying your aspx pages to the relevant site)
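For option 4, a SafeControls entry looks something like the following (the assembly name, public key token and namespace here are placeholders - substitute your own):

```xml
<SafeControls>
  <SafeControl Assembly="MyCompany.WebParts, Version=1.0.0.0, Culture=neutral, PublicKeyToken=abc123def456abc1"
               Namespace="MyCompany.WebParts" TypeName="*" Safe="True" />
</SafeControls>
```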

There is also a very handy decision matrix for when you are having trouble deciding which deployment model/architecture you should use.

There are also 2 other options described elsewhere:

  1. Using Features and a WSP package, following the steps recommended by Andrew Connell (MOSS MVP). I believe this is the standard approach in the SharePoint developer community.
  2. Using VSeWSS (Visual Studio extensions for WSS): This is yet another, and the latest, solution. Microsoft recently released the VSeWSS 1.1 RTM. Using this, we can deploy ASP.NET pages into SharePoint by setting up a new project in Visual Studio. VSeWSS creates a solution package using Features - set up the project, hit 'Deploy' and it is done.

Monday, 3 November 2008

Using Attributes, Reflection and GetCustomAttributes() to support field mappings between systems

My task today was to generate MS Word documents using Aspose.Words and merge fields. The problem was, the names of the merge fields did not match exactly with the names of the properties of my objects - and I didn't want to map all the fields in my objects to the external components in a separate method call. I approached this with a mechanism I have used several times before - using custom attributes and reflection to drive the property mappings between application entities and external components such as documents or emails. See below for details. Note the use of the GetCustomAttributes() method for retrieving custom attributes on your properties.

The attribute class:

/// <summary>
/// Supports the mapping between Aspose.Words merge fields and the fields
/// in the data transfer object
/// </summary>
public class DocumentMergeFieldMappingAttribute : System.Attribute
{
    public string MergeFieldName { get; set; }

    public DocumentMergeFieldMappingAttribute(string mergeFieldName)
    {
        MergeFieldName = mergeFieldName;
    }
}
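For illustration, the attribute is applied like this (a hypothetical DTO - the property and merge field names are made up):

```csharp
public class ProjectDocumentDto
{
    //Maps to a merge field whose name differs from the property name
    [DocumentMergeFieldMapping("CLIENT_NAME")]
    public string ClientName { get; set; }

    [DocumentMergeFieldMapping("PROJECT_NO")]
    public string ProjectNumber { get; set; }

    //No attribute - this property is never rendered into the document
    public int InternalId { get; set; }
}
```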

The mapping method, which builds up a Dictionary of name/value pairs:

/// <summary>
/// Gets a dictionary of field values for merging into documents - uses the
/// DocumentMergeFieldMappingAttribute to determine which DTO properties
/// should render to the document itself.
/// </summary>
/// <param name="inputDto"></param>
/// <returns></returns>
private IDictionary<string, IFieldValueDto> GetDtoFieldValues(object inputDto)
{
    Dictionary<string, IFieldValueDto> dictionary = new Dictionary<string, IFieldValueDto>();

    Type objectType = inputDto.GetType();
    PropertyInfo[] properties = objectType.GetProperties();

    foreach (PropertyInfo property in properties)
    {
        foreach (Attribute attribute in property.GetCustomAttributes(true))
        {
            DocumentMergeFieldMappingAttribute mappingAttribute = attribute as DocumentMergeFieldMappingAttribute;
            if (mappingAttribute != null)
            {
                string fieldValue = property.GetValue(inputDto, null) as string ?? string.Empty;
                //Add the property value under the mapped document field name
                dictionary.Add(mappingAttribute.MergeFieldName, new FieldValueDto<string>(fieldValue));
            }
        }
    }
    return dictionary;
}

Thursday, 30 October 2008

Uploading Files to MOSS 2007 via the SharePoint Object Model

I've blogged on a similar topic about a year ago, but the previous post used the Copy.asmx web service directly. Here is the equivalent using the SharePoint object model, for when you have the luxury of being hosted on the WSS/SharePoint box itself. One thing to note is that you will chew up memory if you don't dispose of your SPWeb and SPSite objects correctly (either via using blocks or explicit Dispose() calls) - you cannot rely on .NET Framework garbage collection to clean them up for you. See the inline comments below on this.

Background: The current management app I'm working on has an IIS application hosted in Atlanta as the frontend to a Windows SharePoint Services (WSS) document store hosted in Australia. The problem with this was that the upload would do a massive round trip overseas and back again - so a large file (say 20MB) would take well over an hour. The workaround was to host the upload page in an IFrame in our application and serve it from a WSS server, so uploads of large files go directly to WSS rather than via the IIS server overseas. I used the following code in the hosted page to perform the uploads using the SharePoint 2007 object model.

/// <summary>
/// Uploads a document using the MOSS object model directly, and updates the database
/// with the document information after the upload of each document is complete.
/// </summary>
/// <param name="uploadedFile"></param>
/// <param name="projectNumber"></param>
/// <param name="projectId"></param>
/// <returns></returns>
public static VoidResponse UploadDocument(UploadedFile uploadedFile,
    string projectNumber, int projectId, int activityId, int documentTypeLookupId,
    int constructionAuthorisationId, string subfolderPath)
{
    //Core document information to be passed into MOSS
    //and into the .NET web service to update the database
    DocumentDto newDocument = new DocumentDto();
    newDocument.ProjectNumber = projectNumber;
    newDocument.Name = uploadedFile.GetName();
    newDocument.SharepointListName = ConstructSharepointListName(newDocument.ProjectNumber);
    newDocument.SharepointSnapshotPath = CurrentSnapshotPath;

    if (string.IsNullOrEmpty(newDocument.SharepointFileRelativePath))
        newDocument.SharepointFileRelativePath = newDocument.Name;

    string queryOptions = FormatHelper.GetQueryOptions(projectNumber, CurrentSnapshotPath, subfolderPath);

    //To allow return of uploaded file details
    VoidResponse response = new VoidResponse();

    //House SPSite and SPWeb in using blocks so they are disposed correctly as per best practices
    using (SPSite site = new SPSite(SharePointParameter.SharePointSiteUrl))
    using (SPWeb web = site.OpenWeb()) //The SPSite already has the site url value.
    {
        try
        {
            //Using AllowUnsafeUpdates because we are updating through a web context -
            //otherwise the update will throw an exception
            web.AllowUnsafeUpdates = true;
            SPList list = web.Lists[newDocument.SharepointListName];

            //Use query options to locate the target folder -
            //the same way the web services do.
            SPQuery spQuery = new SPQuery();
            spQuery.Query = queryOptions;
            SPListItemCollection folderCollection = list.GetItems(spQuery);

            //Folders are really just SPListItems with a type of SPFileSystemObjectType.Folder
            //Defensive: folder not found
            if (folderCollection.Count == 0) //Error condition - folder not found
            {
                throw new ConfigurationErrorsException(string.Format(Messages.Pcp_MOSS_FolderNotFoundError,
                    queryOptions));
            }

            SPFileCollection fileCollection = folderCollection[0].Folder.Files;

            //Only perform the duplicate check when it is turned on in configuration settings
            if (SharePointParameter.CheckForDuplicateFilesInMOSS)
            {
                //Check that the file doesn't already exist
                foreach (SPFile file in fileCollection)
                {
                    if (file.Name == newDocument.Name)
                    {
                        //Duplicate detected
                        response.IsSuccessful = false;
                        response.Messages.Add(new Message(string.Format(Messages.Pcp_DuplicateFileDetected,
                            newDocument.Name)));
                        return response;
                    }
                }
            }

            Stream fileStream = uploadedFile.InputStream;
            byte[] contents = new byte[fileStream.Length];
            fileStream.Read(contents, 0, (int)fileStream.Length);

            //Upload the document itself. This also gives us the new Id to store in the database.
            SPFile currentFile = fileCollection.Add(newDocument.Name, contents);

            System.Security.Principal.WindowsImpersonationContext impersonationContext = null;
            if (GetUseCurrentUserCredentialsForMOSSFlag()) //Impersonate the current user if configured
            {
                impersonationContext =
                    ((System.Security.Principal.WindowsIdentity)HttpContext.Current.User.Identity).Impersonate();
            }

            //Upload the file details to the main PCPData database
            PcpDataService.PcpDataService pcpDataService = new PcpDataService.PcpDataService();
            pcpDataService.Url = DDKOnline.WssHostedWeb.Shared.SystemParameter.DDKOnlineDataServiceUrl;
            pcpDataService.AddProjectDocument(projectId, activityId, constructionAuthorisationId,
                currentFile.Item.ID.ToString(), newDocument.SharepointListName,
                newDocument.SharepointSnapshotPath, newDocument.SharepointFileRelativePath,
                documentTypeLookupId, newDocument.Name);

            if (GetUseCurrentUserCredentialsForMOSSFlag() && impersonationContext != null)
            {
                impersonationContext.Undo(); //Revert to the original identity
            }

            //Successful upload completed
            response.IsSuccessful = true;
            response.Messages.Add(new Message(Messages.Pcp_SharePointUploadComplete));
        }
        catch (SPException spException)
        {
            //Error occurred during upload
            response.Errors.Add(new DDKOnline.Framework.Common.Response.Error(spException.Message));
            response.IsSuccessful = false;
        }
        catch (Exception ex)
        {
            response.Errors.Add(new Error(string.Format(Messages.Pcp_SharePointAccessFailed)));
            ExceptionPolicy.HandleException(ex, SERVICE_EXCEPTION_POLICY);
            response.IsSuccessful = false;
        }
        finally
        {
            if (web != null) web.AllowUnsafeUpdates = false;
        }
    }
    return response;
}

/// <summary>
/// Generates query options to support storage within subfolders of the current list
/// </summary>
public class FormatHelper
{
    internal static string GetQueryOptions(string projectNumber, string currentSnapshotPath, string subFolderPath)
    {
        string folderPath = VirtualPathUtility.RemoveTrailingSlash(string.Format("{0}/{1}/{2}", projectNumber, currentSnapshotPath, subFolderPath));
        string queryOptions = string.Format("<Query/><QueryOptions><Folder>{0}</Folder></QueryOptions>", folderPath);
        return queryOptions;
    }
}

Tuesday, 28 October 2008

How do I get the underlying type of a Generic List?

The Type.GetGenericArguments() method will let you determine the underlying type of the elements of a list object at runtime. See this article for more information.

For example:

if (t.IsGenericType)
{
    // If this is a generic type, display the type arguments.
    Type[] typeArguments = t.GetGenericArguments();
    Console.WriteLine("\tList type arguments ({0}):", typeArguments.Length);

    foreach (Type tParam in typeArguments)
    {
        // If this is a type parameter, display its position.
        if (tParam.IsGenericParameter)
        {
            Console.WriteLine("\t\t{0}\t(unassigned - parameter position {1})", tParam,
                tParam.GenericParameterPosition);
        }
        else
        {
            Console.WriteLine("\t\t{0}", tParam);
        }
    }
}
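As a minimal concrete sketch, here is the same idea applied to a closed generic list at runtime:

```csharp
using System;
using System.Collections.Generic;

class GenericTypeDemo
{
    static void Main()
    {
        List<string> names = new List<string>();

        // GetGenericArguments() returns the closed type arguments at runtime
        Type elementType = names.GetType().GetGenericArguments()[0];
        Console.WriteLine(elementType); // System.String
    }
}
```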

Wednesday, 15 October 2008

Zach William Klein arrives!

Our fabulous new baby boy Zach William Klein ("Zacky Zack") was born 10/10/2008 10:19am with a championship weigh-in of 3.855kg. We have just brought him home from hospital and after a shocker of a night, he has calmed down to 3 sleeps a night on day 2. Welcome home Zach!

For many more first piccies, see my most recent Facebook photo gallery.

Tuesday, 7 October 2008

How to do remote debugging with the Visual Studio 2008 Remote Debugger Service (Msvsmon.exe)

There is always a problem that will crop up on one of your servers that you just CANNOT reproduce locally. To solve pesky problems like this, you can make use of the Visual Studio remote debugger service. What's more, you don't need to run a setup package on your dev or production servers at all - you can just run msvsmon.exe from a file share without installing anything. This will definitely keep the network guys happy!

Typically, you can get msvsmon.exe file from the following path on your development machines that already have Visual Studio 2008 installed:
C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\Remote Debugger\x86\msvsmon.exe

When you run msvsmon.exe, it just shows up as a Windows application on your server (see screenshot). You can also run it as a Windows service so you don't need to log onto the server to start it up.

In Visual Studio on your local machine, you then just put the fully qualified remote server name into the "Attach to Process" dialog. Now you can debug the remote machine to your heart's content (make sure you're a debugger user or remote admin, otherwise you will get access denied errors).

Just a reminder - if you are debugging ASP.NET on Windows Server 2003 and above, there will be one w3wp.exe process for each of your application pools. It may be hard to work out which w3wp.exe process you want to debug as there may be many application pools. You can find the w3wp.exe process running your pool by a process of deduction (i.e. just stop the application pools you are not using - the one left is yours).

There is even a 17 minute tutorial on setting it all up.

Just keep in mind - you may be in for a long wait if the connection to your server is slow, especially when attaching to a process in Atlanta! :o). There is a timeout option if you get timeout errors (you only get timeout errors in "No Authentication" mode, as "Windows Authentication" mode has an infinite timeout).

System.Web.VirtualPathUtility to get File Names from Urls and Converting Relative Paths to Absolute Urls (without ResolveUrl)

One of the things that appeared to be lacking in ASP.NET 2.0 and above (or so I thought) is a utility library to parse and extract paths and get the file name from a URL - I (and many others) have hand-rolled exactly this kind of URL parsing code over and over.

One class that does just this (and which most developers don't know about) is System.Web.VirtualPathUtility. Everyone knows its brother utility library, System.Web.HttpUtility, but VirtualPathUtility should be an essential part of any ASP.NET developer's toolkit.
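As a quick illustration (the paths here are made up), VirtualPathUtility handles the common parsing chores directly:

```csharp
using System;
using System.Web; // requires a reference to System.Web.dll

class VirtualPathDemo
{
    static void Main()
    {
        string path = "/Reports/Summary.aspx"; // hypothetical virtual path

        Console.WriteLine(VirtualPathUtility.GetFileName(path));  // Summary.aspx
        Console.WriteLine(VirtualPathUtility.GetExtension(path)); // .aspx
        Console.WriteLine(VirtualPathUtility.Combine("/Reports/", "Summary.aspx")); // /Reports/Summary.aspx
    }
}
```

Note that methods like ToAbsolute("~/...") need a hosted application context, so they are best exercised inside an ASP.NET page rather than a console app.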

Thursday, 4 September 2008

Microsoft Tech Ed Australia 2008 - Top Tech Ed Takeaways

The veritable tsunami of information that is Tech Ed 2008 has been and gone. One of the things I've found about conferences like Tech Ed is that they have a breadth of information but sometimes don't engage me or go into the depth of detail that I would like (apparently, PDC is meant to be more in-depth). However, 5 of the presentations I went to DID keep me on the edge of my seat:

  1. ARC304 - "Building Loosely Coupled Applications with Unity" with Richard Banks
  2. DEV410 - "Debugging the world, starting with the CLR (or Debugging from the trenches)" by Corneliu on debugging
  3. SOA 311 - "Building Human Workflows with WF State Machines" with Kirk Evans
  4. WEB301 - "ASP.NET MVC – Should you care?", WEB302 - "ADO.NET Data Services – The Zen of RESTfulness and the Art of “Astoria”" - Scott Hanselman is truly an entertaining and informative presenter.
  5. WEB304 "Web Futures - the next 18 months"

My Top Takeaways

  1. MS wants you to utilize the "cloud" because it is getting in on the action
    Offerings like SQL Server Data services are now out in Beta so we can utilize the "infinitely scalable processing and storage" of the cloud.
  2. Get cracking with the ADO.NET Entity Framework and LINQ to Entities as it solves some problems in LINQ to SQL, such as lazy loading and dealing with the database-to-application impedance mismatch, with support for inheritance in your EDMX data model. However, it unfortunately does NOT solve the replay problems we have in LINQ to SQL (there are third-party projects covering this until Entity Framework 2.0 comes out)
  3. Use Unity for all your dependency injection needs from now on
    It is the Web Client Software Factory ObjectBuilder on steroids.
  4. Try harder to use object oriented techniques with JavaScript
    The trend is towards more JavaScript in applications, not less (especially with some of the frameworks out there like jQuery and improved design-time support) - you can simplify your code with prototypes and JavaScript inheritance.
  5. MVC is a no-go. Only consider using the MVC framework if you need extreme control of HTML output and testability. From a business viewpoint, the benefits are minimal but the costs due to increased complexity and "plumbing work" are significant. This makes it a no-go for me. I don't believe that you should have unit tests for absolutely everything - just the core logic; otherwise you get massively diminishing returns trying to increase your test coverage. In addition, the MVC Framework is not ready for prime time (currently in Preview 5) - so don't even start any projects or frameworks that use it.
  6. Silverlight is a big question mark (for app UI development, not for streaming media apps) & you should follow the KISS principle. The Web Futures panel, which included presenters like Scott Hanselman, came down to 2 significant truths:
    1. A complaint was made about the ongoing treadmill of new/old technologies like XAML (Silverlight and WPF in particular). The answer from Mr Hanselman was that any technology you use now will look crap in 18 months. The golden rule to mitigate this problem is to go with the "simplest solution to meet the requirement, the simplest requirement to meet the need". In other words, "complexity kills" any good project - do the simplest possible thing that will work and refactor it tomorrow.

    2. JavaScript and Silverlight are competing models for UI development - but JavaScript has a massive upper hand in terms of industry support and libraries like jQuery. There are also no killer apps for Silverlight yet. I'm going to hold back on Silverlight development for non-streaming-media apps for now...

Other Notes:

Some of the takeaways that I found from Tech Ed 2008 were (in no particular order):

  1. Tech Ed 2008 Keynote - This sets the tone for the rest of the conference. Much of the presentation was built around loose coupling of software and services (Software Plus Services (S+S)), focussing on a few key products like Live Mesh, SQL Server Data Services and the latest version of Exchange. Live Mesh was trumpeted as a peer-to-peer application platform. They also did a demo with Dr Neil Roodyn of a custom app called "Smart Asset" used by a New Zealand Antarctic base, showcasing MS's services part of the "S+S" equation, mainly around Virtual Earth. They also did a demo of the web app "Street Advisor" - and what MS did to improve an existing site with these services, including the VE map control and the Web Messenger control to add immediacy to interaction with sites (you don't need Messenger installed for this functionality).

  2. ARC 203 - Understanding Software Plus Services - Identified some of the key industry trends: SOA (based on reuse and agility), SaaS (based on flexible pricing and delivery), RIA (based on experience) and cloud computing (based on service utility). Covered the fact that the software part of S+S normally includes LoB apps. Established a grid for the tradeoffs of different architectural decisions (control vs economy of scale, buy vs build), and a spectrum between "on-premise" (aka on-site), hosted, cloud and vendor architectural solutions. Gave some general rules for S+S architectural decisions - you should typically host out any commodity services as SaaS (e.g. CRM and email are commodity applications). In particular, a firm should spend its money on key differentiators (non-commodity applications and services) and keep them in-house. Jungle Disk was given as an example of S+S.

  3. VB Session on VB 3.5 features - I didn't attend, but discussed it with some other attendees; some of the best takeaways were around XML literals (not a feature built into C#, but there are ways around that with the "Paste as XML" add-in).

  4. SOA 311 - Building Human Workflows with Windows Workflow Foundation State Machines - Kirk, the presenter, pointed out how confusing and verbose a lot of the documentation on WF is. This presentation used a Windows Forms application to host a Windows workflow, like most of the examples on the web - however, a new feature of 3.5, "Workflow Services", was mentioned as another option for hosting WF.

    He pointed out a few of the issues with the current iteration of WF, e.g. there is no direct equivalent of a "flowchart" in WF (i.e. one that loops back to previous decision points). Some in-depth examples of the state machine workflow were given. Sequential workflows cannot return to a previous position - this is a key differentiator from state-based flows. State machines are more complicated (they are essentially a Turing machine) but are very flexible in terms of flows. States can have child states (aka "recursive states"), whereby you don't have to bubble a Cancel up yourself. An important concept in state machines is the transitions of a state-based workflow - e.g. an Assigned state has OnApproved/OnRejected transitions. WF will track the possible state transitions and state history for you. A workflow has to have an initial state and a completed state - these are properties of the workflow and point to one of the state objects on your diagram (via a drop-down in the property explorer). To do a state transition, use the "SetState" activity (it just sets the state from one to the other) - simply set the target state name in the properties of the "SetState" activity. The presenter did a walkthrough of a supply fulfillment workflow and how the SetState activity is used to do the transitions.

    You handle external events by defining an interface and adding an [ExternalDataExchange()] attribute to it, with all the events defined on it (these show in the dropdown when the interface type is set). Create event args which inherit from ExternalDataEventArgs to transfer information, e.g. SupplyFulfillmentArgs with an InstanceId as a Guid - these should be marked Serializable for persistent workflows. Then add a concrete class that implements the sample IEventService and raises these events, e.g. RaiseAssignedEvent. In the form/workflow hosting environment you should: (i) have the WorkflowRuntime declared in the form; (ii) keep a Guid to track the instance of the workflow you are showing/dealing with. In the form load, add your ExternalDataExchangeService, then call workflowRuntime.AddService(externalDataService).
    Then call workflowRuntime.StartRuntime and start the instantiated workflow object. With the persistence service, processes are no longer bound to a processor/app domain. There are only 2 database tables in the persistence module for WF - "CompletedState" and "InstanceData".
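As a rough sketch of the external-event plumbing described above (the type and member names follow the session's example, so treat them as illustrative):

```csharp
using System;
using System.Workflow.Activities;

// Event args must derive from ExternalDataEventArgs (carrying the workflow
// InstanceId) and should be Serializable for persistent workflows.
[Serializable]
public class SupplyFulfillmentArgs : ExternalDataEventArgs
{
    public SupplyFulfillmentArgs(Guid instanceId) : base(instanceId) { }
}

// The interface the workflow listens to carries the [ExternalDataExchange]
// attribute; its events appear in the designer dropdown once the interface
// type is set on a HandleExternalEvent activity.
[ExternalDataExchange]
public interface IEventService
{
    event EventHandler<SupplyFulfillmentArgs> Assigned;
}
```

A concrete class then implements IEventService and raises Assigned (the RaiseAssignedEvent in the walkthrough), and the host registers that class with the runtime via an ExternalDataExchangeService.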

    In the Windows form, you instantiate a SqlWorkflowPersistenceService, which inherits from WorkflowPersistenceService - so you can implement custom persistence if you need it.
    The LoadIntervalSeconds setting in the config file allows you to control how often workflows are loaded from the persistence store. Tracking services are used for monitoring reports and business processes - this just serializes the BLOB.
    In tracking services, there are several different event types:
    a. Workflow instance events - completed, persisted, etc.
    b. Activity events - closed/executing, initialized
    c. User events - this.TrackData(); Tracking Profile -> Tracking Service

    Tracking queries should be done through the tracking service object model. To do a query, you instantiate a SqlTrackingQuery and SqlTrackingQueryOptions (e.g. the options could pass in WorkflowStatus.Completed). The result comes back as an IList and will contain one of 3 types - workflow/activity/user events (just cast back to determine which type it is). You need the SharedConnectionWorkflowCommitWorkBatchService to allow sharing of a connection string between the tracking and persistence services.

  5. One of the chalk-and-talk and lunchtime sessions demoed what Silverlight can do, and a couple of sample Silverlight sites were shown.

  6. WEB311 - Designing Compelling Silverlight User Experiences With Expression Studio - Showed the use of the Expression suite of tools to create some of the data-driven Silverlight animations used at Tech Ed itself. One of the tips was that you can't just copy and paste wireframes generated in Visio or PowerPoint into Expression Blend (they just show up as bitmaps in Blend, not separate objects). However, you can get them in easily by exporting to PDF and giving the file an "ai" extension - you can then drag the file into Blend and modify the objects in your wireframe separately. The presenter then showed some of the standard techniques, such as slicing and using Windows Movie Maker to create basic timed transitions. The presenter also demoed the use of overlays, converting slices to user controls, and ease in/ease out. Some parts of the demo stopped working - but this was due to the inadequate visual indication in Blend of whether timeline recording is on or off. One important note from the presentation is that sometimes you cannot use Silverlight canvases for more advanced functionality - in particular, Silverlight canvases don't support triggers, but WPF canvases do.

  7. DEV410 - "Debugging the world, starting with the CLR (or Debugging from the trenches)" - this presentation focussed on making debugging easier and the innards of debugging using windbg. One of the cool things was using a memory dump from another machine to extract the loaded dlls and stack trace and work out what happened on the production server.

    After discussing the use of the .NET debugger attributes, Corneliu showed how the "Make Object ID" option in the debugger context menu in Visual Studio allows you to keep a reference to an object even after execution has left its scope.

    After a rundown on using windbg with .NET, we were shown a GUI tool that helps you to go through memory dump files without commandline tools such as adplus.exe -

  8. DEV380 - CRM Integration with External Applications
    Covered some of the basics of integrating CRM with other applications via SQL filtered views, .NET plugins and web services. A web service is generated for each CRM entity you create. CRM was presented as much more than a CRM product - a complete application platform based on metadata, similar to a 4GL language, where CRM handles things like schema changes behind the scenes.

  9. WEB309 - Silverlight for Developers
    Went through doing IoC and testing operations in Silverlight 2 -
    An interesting part was removing code-behind completely by using the ObservableCollection

  10. WEB301 - ASP.NET MVC – Should you care? - with Scott Hanselman. This presentation convinced me that MVC has its place, but not for most projects. It is NOT ASP.NET Web Forms 4.0. It really just gives you more testability and control over the output HTML. It also gives you more control over the format of the URL - something you can do from plain ASP.NET anyway via System.Web.Routing, since routing was pulled out of System.Web.Mvc and into its own System.Web.Routing assembly in 3.5 SP1.

    Scott also outlined how the MVC framework basically has some hardcoded folders - /Models, /Views and /Controllers. He also went through the call stack of a Hello World sample app to show the differences between the standard Web Forms model and how MVC requests are processed. It is really just another HTTP handler - "MvcHandler". Also noted was that views are searched for in a specific order by System.Web.Mvc - it looks for .aspx, then .ascx, etc. There is no viewstate and there are no postbacks, as MVC doesn't use the normal Web Forms page handler. One killer issue is that there are no ASP.NET MVC controls yet.

    MVC is hooked in by modifying Application_Start in the global.asax to add a call to RouteTable.Routes.MapRoute().
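For reference, the registration looks roughly like this (the route name and defaults below are the standard ones from the MVC project template, not specifics from the talk):

```csharp
using System.Web.Mvc;
using System.Web.Routing;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        RouteTable.Routes.MapRoute(
            "Default",                                             // route name
            "{controller}/{action}/{id}",                          // URL pattern
            new { controller = "Home", action = "Index", id = "" } // parameter defaults
        );
    }
}
```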

    For the purposes of testing, you can use Mock tools such as Moq/RhinoMocks/TypeMock.

    MVC actions all return a result type - such as ActionResult, ViewResult, JsonResult or DownloadResult.

    Scott also used a tool called HttpWatch Pro which is similar to YSlow in Firefox.
  11. ARC304 - "Building Loosely Coupled Applications with Unity"
    Richard Banks explained how Unity, the Microsoft IoC/Dependency Injection framework, makes interface-based programming easier.

    He outlined some of the real benefits of using interfaces that makes your code much more modular and interchangeable with minimal effort. In a tightly coupled application, you typically have concrete classes within other classes and statics. These direct dependencies mean that it is very hard to swap different functionality into your code. Loosely coupled applications use Facades and Interfaces.

    The main difference between Unity and the ObjectBuilder used in the Web Client Software Factory from MS Patterns & Practices is that Unity is more configuration-based - i.e. all the dependencies of your application can be defined in your web.config, which allows you to simply swap in different functionality. e.g. if you have an ISaleLineService, you define in the config file what concrete class should be instantiated in place of the interface.

    e.g. you would call ISaleLineService salesLineService = ServiceLocator.Resolve<ISaleLineService>(); this in turn calls Container.Resolve, which finds the default mapped concrete class implementing the ISaleLineService interface as per the config file (loaded via ConfigurationManager.GetSection()).

    Some demos on the web use Attribute and Programmatic-based dependencies - but this loses some of the configurable swap-in/out benefits of defining dependencies in your config file.

    Unit testing becomes simpler with Dependency injection. There are also some cool things that can be used like Object per session/request - see UnityContrib project for details on this.

    Unity doesn’t do Aspect Oriented Programming yet – but this is coming in version 1.2.
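A minimal sketch of the resolve pattern described in this session (the concrete SaleLineService class and the programmatic registration are illustrative - Unity can equally read the interface-to-class mapping from the config file):

```csharp
using Microsoft.Practices.Unity;

public interface ISaleLineService { void Save(); }
public class SaleLineService : ISaleLineService { public void Save() { } }

public class Bootstrapper
{
    public static ISaleLineService GetService()
    {
        IUnityContainer container = new UnityContainer();
        // In config-driven setups this mapping comes from web.config instead,
        // so swapping implementations needs no recompile
        container.RegisterType<ISaleLineService, SaleLineService>();
        return container.Resolve<ISaleLineService>();
    }
}
```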
  12. ARC401 – Aspect Oriented Architecture meets SOA with WCF and Ent. Library

    Aspect oriented technology is about – separating cross cutting concerns from the core business logic. Enabling technologies for this include the new Enterprise Library 4.0 Policy Injection App. Block, PostSharp and the Aspect# language.

    Examples of cross cutting concerns that can be applied with aspects include Validation, Exception Mgt, Caching, Logging and Authentication/Authorization.

    Some parts of this presentation were similar to the Unity presentation I mentioned above – it covered policy injection, e.g. calls to PolicyInjection.Create<T>();

    Even with version 4.0 of the Enterprise Library, policy injection can affect performance significantly in some situations – you need to test.

    There was also a demo of the Validation Application Block in version 4.0 - which supports declarative validation logic for your applications - adding the [ValidationBehavior] attribute on a class and a RangeValidator on a method parameter will allow you to raise a validation exception.

    You can set up Policy Injection with the Enterprise Configuration Tool (which is a VS 2008 add-in as well now - you don't have to go to a separate tool at all).
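A hedged sketch of what that declarative validation looks like with the Validation Application Block's WCF integration (the service and parameter names are invented for the example):

```csharp
using System.ServiceModel;
using Microsoft.Practices.EnterpriseLibrary.Validation.Integration.WCF;
using Microsoft.Practices.EnterpriseLibrary.Validation.Validators;

[ServiceContract]
[ValidationBehavior]
public interface IOrderService
{
    [OperationContract]
    // Values outside 1-100 cause a validation fault before the method body runs
    void PlaceOrder(
        [RangeValidator(1, RangeBoundaryType.Inclusive, 100, RangeBoundaryType.Inclusive)]
        int quantity);
}
```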
  13. WEB302 ADO.NET Data Services – The Zen of RESTfulness and the Art of “Astoria” - In this presentation, Scott Hanselman showed the experimental data access technologies that allow access to data stores over an intranet or the web via URLs (aka REST). ATOM (the schema-based version of RSS) underlies the structure of Data Services. When installed, you create an "ADO.NET Data Service" item from the VS 2008 project item types.

    You can basically query information in a data source via URLs, e.g.
    /Northwind.svc/Products(1)/Category – you can navigate through information very easily. This is useful for rich internet applications.

    With LINQ to REST, you can then query a remote database - the LINQ expression tree evaluator converts the query to a URL rather than SQL. To me the Astoria architecture is not great – it encourages you to put your logic and other code behind the LINQ layer, e.g. so it can be consumed by a SmartClient app directly. It is, however, an awesome way of wrapping other databases such as AS400 in a web-consumable form.

    Using the [WebGet] attribute you can even define the Astoria equivalent of stored procedures on the data.
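A sketch of such a [WebGet] service operation - Northwind-style entity names are assumed here, and the operation would be callable via a URL like /Northwind.svc/ProductsByCategory?category='Beverages':

```csharp
using System.Data.Services;
using System.Linq;
using System.ServiceModel.Web;

public class NorthwindDataService : DataService<NorthwindEntities>
{
    public static void InitializeService(IDataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        config.SetServiceOperationAccessRule("ProductsByCategory", ServiceOperationRights.All);
    }

    // The "stored procedure" equivalent - a server-side filtered query
    [WebGet]
    public IQueryable<Product> ProductsByCategory(string category)
    {
        return CurrentDataSource.Products
            .Where(p => p.Category.CategoryName == category);
    }
}
```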
  14. DEV325 LINQ to SQL vs LINQ to Entities (Entity Framework)
    Discussed the use of the .edmx file, navigation properties with many-to-many relationships, and some of the first-hit performance issues with LINQ to Entities. Also showed some of the differences in provider support (e.g. .AddDays() won't work with the LINQ to Entities provider as the call cannot be translated to SQL Server).
  15. WEB315 Object Oriented MS AJAX
    Ran through some ways of improving the typical procedural approaches to JavaScript taken in most application development. JavaScript has many object-oriented features, but it's not perfect - e.g. there are no real private properties in JavaScript. Instead, use the get_Property() naming convention.

    You can use inheritance, prototypes, interfaces, events and delegates in JavaScript. Prototypes are similar to extension methods in .NET. For object inheritance, you need to call Type.registerNamespace("MyNs") to get your types recognised by the Microsoft AJAX JavaScript runtime. You'd typically declare interfaces like "MyNamespace.INameOfInterface = function() {}".
    You must specify the interfaces a type implements in its registerClass call.

    Just like in C#, delegates are just method pointers that you can pass around - such as passing a delegate into a common method, where the delegate does addition/subtraction. Also demoed was raising events using this._events = new Sys.EventHandlerList() and this.raiseEvent().
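The delegate idea can be shown in plain JavaScript (no MS AJAX required) - functions are first-class values, so a "method pointer" can be passed to a common routine:

```javascript
// A common method that takes a delegate-style function parameter
function applyOperation(a, b, operation) {
    return operation(a, b); // 'operation' plays the role of a C# delegate
}

var add = function (a, b) { return a + b; };
var subtract = function (a, b) { return a - b; };

console.log(applyOperation(5, 3, add));      // 8
console.log(applyOperation(5, 3, subtract)); // 2
```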

    Samples from the demos can be found at
  16. WEB304 – Web Futures – the next 18 months. This was a panel that dealt with some of the hot questions about what is coming up in the foreseeable development future. It involved discussions of topics such as:

    1. The battle over where your data lives, and where processing is done. Browsers as an operating system (e.g. Google Chrome). Competing viewpoints of Google (in the cloud) vs Microsoft (S+S)
    2. The battle of standard browsers vs plugins like Silverlight.
    3. The web and whether it should work offline
    4. Scott talked about JavaScript and its popularity going through the roof, especially with technologies like Bubbles, Prism, Google Gears and other supporting frameworks.
    5. The push of consumer focuses technologies – into enterprises.
    6. JavaScript as the Intermediate Language (IL) of the web
    7. If starting something now, how do you hedge your bets? For any technology decision: complexity kills; do the simplest possible thing that will work and refactor it tomorrow; use the minimal amount of technology and minimal requirements to meet a need.
  17. DEV420 Hardcore LINQ to Entities
    Wasn't really hardcore - it was more of a nuts-and-bolts look at LINQ to Entities, attribute mappings and inheritance. You can have conditional inheritance in entities (via "Where" conditions). Complex types are not supported in the designer - but you can update the XML directly to do it, e.g. a common Address entity used by other classes. The designer has "regenerate from database" functionality built in. Under the hood, there are 3 files that support LINQ to Entities - the CSDL file (the conceptual model), the MSL file (the mapping file for field-to-field mappings, which also supports mappings to more friendly names), and the SSDL file (the storage schema).

    You cannot use arbitrary .NET methods such as AddDays() in LINQ to Entities queries, as they are not supported by the provider. LINQ to Entities doesn't support lazy loading - you need to use the .Include syntax.

    You can query via generics such as in Context.Products.OfType(). A suggestion was made by Adam Cogan to embed the 3 metadata files into the project - but this would cause problems when doing deployments to different environments, as you cannot then just update the schema file to point to the correct database.
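A quick sketch of the .Include syntax mentioned above (Northwind-style entity names are assumed for illustration):

```csharp
using System;
using System.Linq;

class EagerLoadingSample
{
    static void Run()
    {
        using (NorthwindEntities context = new NorthwindEntities())
        {
            // Eagerly load each product's Category - EF v1 has no lazy loading
            var products = context.Products.Include("Category")
                                  .Where(p => p.UnitPrice > 10);

            foreach (var product in products)
            {
                // Category is already populated; no extra query is issued here
                Console.WriteLine("{0} ({1})",
                    product.ProductName, product.Category.CategoryName);
            }
        }
    }
}
```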
  18. SOA209 The Road to "Oslo" : The Microsoft Services and Modeling Platform
    This product isn't even available for demo yet (not even alpha) - but it is a new environment for keeping track of exactly what is deployed (in terms of applications, hardware etc) and where in your organization. It is still 12-18 months away. It is both a repository of company assets and even a deployment mechanism that allows the build up (e.g. deploy applications and open ports) and pull down of complete environments. It will have the "Oslo Visual Designer" component which even allows you to modify WF workflows via the UI designer, the "Oslo Repository" which stores data against schemas about practically anything in your IT environment, and the “Oslo Process server” which will host WF and Biztalk transformations.

That's it for now. Till next time. DDK

Friday, 22 August 2008

"'Edit Document' requires a Windows SharePoint Services-compatible application and Microsoft Internet Explorer 6.0 or greater." - Problem Fixed

I had this problem for about a week when trying to edit documents from Sharepoint 2007. Indeed, the problem became so frustrating that I needed to fix it. The error I was getting was:

Windows Internet Explorer
'Edit Document' requires a Windows SharePoint Services-compatible application and Microsoft Internet Explorer 6.0 or greater.

I came to this article: and followed the instructions:

  1. Via the Add/Remove Programs -> Office 2003 -> Change dialog, I removed and re-added the Windows SharePoint Services Support component as per Method 2 in the article above.

  2. I tried to re-register OWSSUPP.DLL (an uninstall and reinstall with regsvr32 "C:\Program Files\Microsoft Office\OFFICE11\owssupp.dll") and just got an error:

    DllRegisterServer in OWSSUPP.DLL failed.
    Return code was: 0x80070716

  3. I also tried to re-register the Office 2007 copy of the dll:
    C:\Program Files\Microsoft Office\Office12\OWSSUPP.DLL

  4. Curious, I also ran Dependency Walker. I dragged owssupp.dll into it and it indicated that I had DLLs missing as per the screenshot. I downloaded these from the web and moved the "missing" files to the OFFICE11 directory - but the problem still occurred.

  5. I finally gave up, bit the bullet and did a repair install of Office 2003 via Add/Remove Programs (which didn’t require the install media and took about 10 minutes) – and it all started working.

Apparently, you may get this problem if various Windows XP service packs are installed or if you uninstall any Visual Studio VSTO tools.

Wednesday, 20 August 2008

Violation of PRIMARY KEY constraint 'PK_PrimaryKeyName'. Cannot insert duplicate key in object 'dbo.TableName'

Today, one of my recently deployed apps was generating errors when attempting to insert records. The following errors started to appear in our Error logging table:

System.Data.SqlClient.SqlException. ...
Violation of PRIMARY KEY constraint 'PK_PrimaryKeyName'. Cannot insert duplicate key in object 'dbo.TableName'.

Even when attempting to insert data directly into the table via SQL Server Management Studio, the same error would occur. The source of the issue was that the identity seed values were out of sync with the actual values in the table (a result of doing inserts with IDENTITY_INSERT ON). The simple fix was to switch to text output mode in Management Studio and run the following T-SQL query:

SELECT 'DBCC CHECKIDENT (' + Table_Name + ')' FROM information_schema.tables WHERE TABLE_TYPE = 'BASE TABLE'

Running the output of this query corrected all the 'duplicate key' issues I was having after the deployment of the database scripts.
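For a database with (say) Orders and Customers tables - names invented here - the generated script you then execute would simply be:

```sql
DBCC CHECKIDENT (Orders)
DBCC CHECKIDENT (Customers)
```

Run without extra arguments, DBCC CHECKIDENT reports the current identity value for the table and corrects the seed if it is lower than the maximum value actually stored in the identity column.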

Tuesday, 12 August 2008

DDK has registered for Microsoft Tech.Ed 2008 Australia... Thanks Oakton!

Oakton has recognized the hard work I've been doing at my current client - and has decided that my coworker Steven Krizanovic and I will be heading across to the other side of Darling Harbour to the geek extravaganza that is Tech.Ed 2008. The US edition of Tech.Ed has been divided into Developer and IT Professional events, but Australia keeps the one-stream format. The last time I went to Tech.Ed was in 1999 @ Dreamworld on the Gold Coast, so it's been a long time between drinks!

I'll be concentrating on the following tracks:
  1. SOA and Business Processes (which focuses on BizTalk, WCF and WF)

  2. Web (particularly Silverlight + WPF)

  3. Developer Tools & Technologies

  4. Architecture

  5. DB and BI tracks

I'll keep you posted with the most valuable tidbits as the event unfolds...

SQL Reporting Services Error: Logon failed. (rsLogonFailed) Logon failure: unknown user name or bad password. (Exception from HRESULT: 0x8007052E)

I've run into this SQL Reporting Services 2005 exception a couple of times now - typically when I have reporting services execution accounts run as domain users that have passwords which expire. You will get this error if the current credentials supplied for the report (or if none supplied, the current SQL Reporting Services Execution Account) are incorrect. This happened on my local machine and the simple fix was to update the password on my execution account.

Corrected the account username/pass... fixed!

Saturday, 19 July 2008

Cannot uninstall a .NET Windows Service with installutil /u when the service executable has been moved or deleted - Fix

There are a few perils unique to developing Windows Services in .NET. This is one of them.

The other day, I renamed some of my subversion working folders. Unfortunately, one of the folders that I renamed actually included a service that I had registered via installutil.exe on my local machine.

There is a problem with installutil.exe which means that this could be an unrecoverable Catch-22 situation. Here's why:

  1. You cannot uninstall it. If you try to uninstall it with installutil /u and point to your service (e.g. "installutil /u DDK.ProjectName.MyNotificationServiceName"), it cannot find the file and will give an "Exception occurred while initializing the installation: System.IO.FileNotFoundException: Could not load file or assembly '[Full Path To My File]' or one of its dependencies. The system cannot find the file specified."

  2. You cannot install the exe to a different location with installutil because the service is already installed. If you do try to install it with a new path, you will just get the error "An exception occurred during the Install phase. System.ComponentModel.Win32Exception: The specified service already exists".

So to install a service with the same name at the new location, I would have to:

  1. Copy the old file back to the original Windows service location (or restore it from a backup) and run installutil /u on it. If I don't have the file anymore, I would not be able to do this.

  2. OR Remove the registry entry for the service.

This would not be as much of an issue if installutil /u recognized the missing service and prompted me to ask whether I wanted to remove the registry entry - but it doesn't. I understand you may want to do cleanup of a service when an uninstall is called - but you shouldn't be left in an unrecoverable state because of a lack of functionality in the core installer utility.

So when you don't have access to the file/drive you originally installed a service to, you can fix this unrecoverable (from the perspective of installutil) situation by either:

  1. Opening regedit

  2. Going to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services

  3. Removing the old registry entry for your service.


OR: Running "sc" from a command prompt with the delete parameter.
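For example, using the example service name from earlier (run from a command prompt with administrative rights):

```shell
sc delete "DDK.ProjectName.MyNotificationServiceName"
```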

Adding simple Gzip compression for a 40-60% reduction in page size on your ASP.NET 2.0 Site

UPDATE (29 October 2008): If you can, try to use the following:
This has some major benefits, such as compressing your axd files, combining your CSS and JavaScript, and minifying the output.

There are a few different ways to get Gzip compression happening on your site. These include:

  1. Custom Http modules that implement IHttpModule such as

  2. 3rd party handlers such as

  3. If you have full access to the IIS Box and metabase, use the built-in Gzip compression available in IIS 6.0 and above (See for more information)

  4. Modifying the global.asax to implement compression.
I briefly outline option 4 below. With ASP.NET, it is incredibly easy to get compression up and running without any additional server setup. Note that it is important that you don't gzip your axd files through this code. Some UI components such as Telerik RadControls will generate several JavaScript errors if you try to gzip their axd resource files. I also found that a page I created for dynamically rendering images started to chop images off at the bottom. So I excluded both from any attempts at compression.

If you look at your page size in YSlow, it will typically be reduced by 40-60%, e.g. from 200K to 100K.

Here's some sample code that you can put into your global.asax with minor modifications appropriate to your project:

  //Gzip support
  //Requires: using System.IO; and using System.IO.Compression;
  void Application_BeginRequest(object sender, EventArgs e)
  {
      HttpApplication app = (HttpApplication)sender;

      //Don't process these as compression corrupts the generated images/scripts
      if (app.Request.Url.ToString().Contains("ImageGenerator.aspx") ||
          app.Request.Url.ToString().Contains("WebResource.axd") ||
          app.Request.Url.ToString().Contains("ScriptResource.axd"))
      {
          return;
      }

      string acceptEncoding = app.Request.Headers["Accept-Encoding"];
      Stream prevUncompressedStream = app.Response.Filter;

      if (acceptEncoding == null || acceptEncoding.Length == 0)
      {
          return;
      }

      acceptEncoding = acceptEncoding.ToLower();

      if (acceptEncoding.Contains("gzip"))
      {
          // gzip
          app.Response.Filter = new GZipStream(prevUncompressedStream, CompressionMode.Compress);
          app.Response.AppendHeader("Content-Encoding", "gzip");
      }
      else if (acceptEncoding.Contains("deflate"))
      {
          // deflate
          app.Response.Filter = new DeflateStream(prevUncompressedStream, CompressionMode.Compress);
          app.Response.AppendHeader("Content-Encoding", "deflate");
      }
  }

For more information on IIS and the built-in settings when you have full access to IIS, see the following articles for reference:

The IIS compression dialog:

Removing all blank lines from a file with regular expressions in Visual Studio or UltraEdit

I often deal with files which have redundant empty lines in them. These are easily removed by either Visual Studio or one of the best text editors around, UltraEdit by IDM Solutions. The regular expression criteria matching blank lines in these 2 applications are slightly different (the end of line escape character "$" appears in a different order in each):

Visual Studio:

Press Ctrl+H to open the Replace dialog
Select "Use Regular Expressions"
In Find, specify "^$\n" (without the quotes).
Set the replace value to blank.
Click on "Replace All"


[from the UltraEdit FAQ]

To delete/strip blank lines from DOS/Unix/Mac formatted-files, use the following Perl-compatible regular expression. You can enable Perl-compatible regular expressions under Advanced -> Configuration -> Search -> Regular Expression Engine.

Replace: "^\r?\n?$" (without the quotes)
With "" (without the quotes - i.e. nothing).

Earlier versions of UltraEdit:

To delete blank lines with DOS line terminators you can use an UltraEdit-style regular expression replace as follows:

Replace: "^p$" (without the quotes)
With "" (without the quotes - i.e. nothing).

Run this replace until every blank line is deleted.

Amusing image of the day - why you should think before you post a comment on a blog :o)

You get the error "[Script Name].ps1 cannot be loaded because the execution of scripts is disabled on this system" when running Powershell scripts

If you get the error:
File D:\Sc\Global\DDK.Solution\dev\DDK.ProjectName\deploy.ps1 cannot be loaded because the execution of scripts is disabled on this system. Please see "get-help about_signing" for more details.

You get this error because the default setting for Powershell is "Restricted" (aka locked down Alcatraz mode). In this mode, it does not load configuration files or run scripts.

To resolve this issue, you can run Powershell (powershell is typically in C:\WINNT\system32\WindowsPowerShell\v1.0\powershell.exe if it is not already in your path) and change the execution policy. For example, you could run the command "Set-ExecutionPolicy Unrestricted" if you want to allow unsigned scripts to run.

Once you have set your security policy appropriately, you can execute your powershell scripts without this error. See for more information.

Deleting Folders in MOSS via Web Services and CAML

Unfortunately, the lists.asmx web service that you use to manipulate MOSS lists doesn't have a "Delete()" method for folders. However, there is an "UpdateListItems()" method that allows you to pass in an XML element parameter called "batchElement" to provide this functionality. You can then manipulate folders in SharePoint to your heart's content through this parameter.

The typical format for the batch element Xml fragment is:

<Batch OnError='Return'>
<Method ID='1' Cmd='Delete'>
<Field Name='ID'>81</Field>
<Field Name='FileRef'>http://dev-moss/sites/home/PropertySharePoint/DocumentLibrary/300</Field>
</Method>
</Batch>
Your delete is successful if the return value is zero. You can test this out in the U2U CAML query builder from

This batchElement information can be passed into the SharePoint list web service as demonstrated in the method snippet below:

/// <summary>
/// Delete folders as per LPP-205
/// </summary>
/// <param name="listName"></param>
/// <param name="folderName"></param>
/// <returns></returns>
public XmlNode DeleteFolder(string listName, string folderName)
{
    /* Use the CreateElement method of the document object to create elements for the parameters that use XML. */
    System.Xml.XmlDocument xmlDoc = new System.Xml.XmlDocument();
    XmlElement query = xmlDoc.CreateElement("Query");
    XmlElement viewFields = xmlDoc.CreateElement("ViewFields");
    XmlElement queryOptions = xmlDoc.CreateElement("QueryOptions");
    string rowLimit = int.MaxValue.ToString();

    /* To specify values for the parameter elements (optional), assign CAML fragments to the InnerXml property of each element. */
    System.Text.StringBuilder sb = new System.Text.StringBuilder();
    sb.Append("<Where><Eq><FieldRef Name=\"Title\" />");
    sb.Append(string.Format("<Value Type=\"Text\">{0}</Value></Eq></Where>", folderName));

    viewFields.InnerXml = "<FieldRef Name=\"ID\" /><FieldRef Name=\"Title\" />";
    query.InnerXml = sb.ToString();
    queryOptions.InnerXml = "";

    // Find the folder so we can get its ID and FileRef
    System.Xml.XmlNode nodeListItems = _listWebService.GetListItems(listName, string.Empty, query, viewFields, rowLimit, queryOptions, null);

    string folderId = string.Empty;
    string fileRef = string.Empty;

    XmlDocument doc = new XmlDocument();
    doc.LoadXml(nodeListItems.InnerXml); // load the rs:data payload of the results for XPath querying

    XmlNamespaceManager nsmgr = new XmlNamespaceManager(doc.NameTable);
    nsmgr.AddNamespace("z", "#RowsetSchema");
    nsmgr.AddNamespace("rs", "urn:schemas-microsoft-com:rowset");

    XmlNodeList xmlNodeList = doc.SelectNodes("/rs:data/z:row", nsmgr);
    foreach (XmlNode node in xmlNodeList)
    {
        folderId = node.Attributes["ows_ID"].Value;
        //fileRef = node.Attributes["ows_EncodedAbsUrl"].Value;
        fileRef = node.Attributes["ows_FileRef"].Value.Substring(node.Attributes["ows_FileRef"].Value.IndexOf("#") + 1);
    }

    System.Xml.XmlNode result = null; // Will be populated with the response from the update batch.
    if (folderId != string.Empty)
    {
        System.IO.StringWriter sw = new System.IO.StringWriter();
        System.Xml.XmlTextWriter xw = new System.Xml.XmlTextWriter(sw);

        // Build Batch node
        xw.WriteStartElement("Batch");
        xw.WriteAttributeString("OnError", "Return");

        // Build Method node
        xw.WriteStartElement("Method");
        // Set transaction ID - doesn't really matter what the number is
        xw.WriteAttributeString("ID", System.Guid.NewGuid().ToString("n"));
        xw.WriteAttributeString("Cmd", "Delete");

        // Build ID field
        xw.WriteStartElement("Field");
        xw.WriteAttributeString("Name", "ID");
        xw.WriteString(folderId);
        xw.WriteEndElement(); // Field end

        // Build FileRef field
        xw.WriteStartElement("Field");
        xw.WriteAttributeString("Name", "FileRef");
        xw.WriteString(fileRef);
        xw.WriteEndElement(); // Field end

        xw.WriteEndElement(); // Method end
        xw.WriteEndElement(); // Batch end

        System.Xml.XmlDocument batchElement = new System.Xml.XmlDocument();
        batchElement.LoadXml(sw.ToString()); // parse the batch CAML built above

        // Send the update request to SharePoint to delete the document folder
        result = _listWebService.UpdateListItems(listName, batchElement);
    }

    return result;
}


Tuesday, 8 July 2008

TortoiseSVN 1.5 and svnmerge Issue - "svn: This client is too old to work with working copy '.'; please get a newer Subversion client"

WARNING: TortoiseSVN 1.5 does silent upgrades (aka the touch of death) on your SVN working copies that renders it unusable with older SVN clients. This affects clients such as the 1.4 based version of svnmerge (the current version as I write this).

Before the most recent 1.5 upgrade for TortoiseSVN, the last major version was made over 2 years ago. So you could imagine that I was quite keen to upgrade to latest version when my TortoiseSVN notified me that I should upgrade to version 1.5.

I've generally had a good experience with TortoiseSVN thus far, so I bit the bullet and downloaded the latest and greatest version of this handy tool. I only uncovered the implications of this upgrade when all my svnmerge scripts started to fail (we use the svnmerge utility via a batch file to deploy our changes to trunk and onto our build server). Then things hit the proverbial fan. I started to get the following error for all svnmerge operations (including simple status calls):

"svn: This client is too old to work with working copy '.'; please get a newer Subversion client"

I thought it was unusual that:

  1. TortoiseSVN installer would affect svnmerge at all (I assumed some shared DLLs had been updated).
  2. (on a more minor note) That it would start telling me that I had an old version of the client tool when I had the very latest versions of TortoiseSVN and svnmerge. I would hope that a tool as popular as svnmerge would be updated within days of a new client version being released.

I thought the rollback process would be as simple as uninstalling 1.5 of TortoiseSVN - but I uninstalled and changed back to 1.4.8 - and got the same error! Even the 1.4 versions of svn started to get the same error.

It turns out that once you start working with the new version of the tool, the SVN metadata is silently upgraded to the latest version. Unfortunately, if you want to keep using svnmerge, the fix for this issue (until a new version of svnmerge comes out) is to:

  1. Roll back to a 1.4 version
  2. Essentially throw away your working copy
  3. Do a fresh checkout.

This will get you back to the place you started. This could take a while - especially if you're in Australia and the subversion server is in Atlanta! Phew!

The TortoiseSVN 1.5 touch of death at work (it is not backwards compatible with 1.4 clients):

Wednesday, 25 June 2008

How to change your LiveID email address to a new email address

I've had this one asked many times before. You can change your Windows Live ID/Windows Live Messenger/MSN Messenger email address e.g. from your company email address to a gmail address by:
  1. Going to
  2. Choosing your account (if you have several)
  3. Clicking on the "Change Email" tab

All your contacts will be automatically transferred to the new account - and you don't even need to notify your contacts of the change.

Friday, 20 June 2008

Performance Profiling in Firefox 3 and the YSlow Addin for FireBug

YSlow (sometimes mistakenly written as "WhySlow") is a very handy utility from the Yahoo Developer team ( which allows you to profile the performance of your pages in FireFox. It requires FireBug to be installed first, and runs inside your FireBug panel.

One of the simple and very handy features that the Internet Explorer Fiddler Tool 2 ( doesn't currently have is the page size and page timing toolbar in the bottom right hand corner. I recall that Netscape used to have this about 15 years ago, and Safari and Opera currently have this feature. The only reason I can see why IE and Firefox don't have this by default is that the masses would get unhappy when they realized just how slowly their pages are currently loading!

From now on, this will be my first port of call when making speed and size comparisons between sites.

Tuesday, 17 June 2008

Using the power of PowerShell to Configure your Web.Config for Different Environments, Customize the Output of SQLMetal and Bulk Update Word Documents

Microsoft PowerShell ( is an incredibly powerful tool because it has a particularly elegant syntax for manipulating XML files - with no XML DOM code required. It has many other features, such as support for using .NET code inline (e.g. C#) - however, I will just concentrate on the XML handling capabilities today.

With this in mind, it is extremely handy as a tool for anything XML-related, or for anything you previously did in batch files:

  1. Modifying your web.config as part of your build process during deployment. In a previous post, Scott Guthrie recommended using multiple config files, one for each environment. The problem with this is that you completely duplicate your settings files, which makes it more likely you will forget to update a setting in one of the environments. Using one master config file that PowerShell configures for each environment is much better. PowerShell can also do all your zipping and FTPing to staging servers.
  2. Deployment and customization of Reporting Services reports. RDL is a standard XML language and can be customized if you don't like the output of the Visual Studio RDL editor/designer.
  3. Updating Word and Excel 2007 documents automatically. The docx format is now all XML - so PowerShell can perform bulk updates to Word documents for you without the need for Word automation or a separate executable.
  4. Updating LINQ DBML files to your liking. I will detail an example of this below.
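As a sketch of point 1 above, you could keep one master web.config plus a small XML file of overrides per environment. The settings.<env>.xml layout here is just an assumed convention I've invented for illustration, not a standard file format:

param([string]$Environment = "UAT")

# Assumed convention: settings.UAT.xml etc. contain <settings><add key="..." value="..."/></settings>
[xml]$config    = Get-Content ".\web.config"
[xml]$overrides = Get-Content ".\settings.$Environment.xml"

foreach ($override in $overrides.settings.add)
{
    # Find the matching appSettings entry in the master config and replace its value
    $config.configuration.appSettings.add |
        Where-Object { $_.key -eq $override.key } |
        ForEach-Object { $_.value = $override.value }
}

$config.Save((Join-Path $pwd "web.config"))

You would then call this script once per environment from your build, immediately before zipping and FTPing the output.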
An example of the power of PowerShell: if you are not happy with the output of SQLMetal (e.g. it generates LINQ entities for tables you don't want to see), you can use the following batch file and PowerShell script to perform a "cleanup" of the tables you don't want in the dbml. (Thanks to Andrew Weaver for this script - see his blog for more PowerShell goodness.)

This batch file and PowerShell script do the following to fix up what the SQLMetal dbml generator doesn't:
  • remove tables that are not needed in the DBML
  • rename associations so they are more meaningful (e.g. if you have multiple relationships between two tables, they will otherwise get names such as Lookup1 and Lookup2, and your LINQ code using these relationships won't make much sense)

The Batch File:

sqlmetal /conn:"data source=myServerName\dev;initial catalog=MyDataBaseName;trusted_connection=false;Persist Security Info=True;User ID=UserName;Password=Password;" /dbml:PropertyPipelineDb.dbml /context:InvestmentManagementDataContext /namespace:LendLease.InvestmentManagement.Domain.Entity /pluralize /entitybase:"LendLease.Framework.Domain.Entity.MutableEntity" /views

powershell -command "$temp = .\CleanDbml.ps1 -dbml '.\PropertyPipelineDb.dbml'; $temp | Out-File '.\PropertyPipelineDb.dbml' -Encoding Unicode"

sqlmetal /code:PropertyPipelineDb.designer.cs PropertyPipelineDb.dbml


The PowerShell Script (CleanDBML.ps1) - note that the pipe characters and braces were mangled when I first posted this; the version below is the repaired script, with the document load and save steps restored:

param
(
    $DBML # File path to DBML file
)

$removeTables = ("dbo.vRolePermission", `
                 "dbo.vOpportunityFlattenedFundInterests", `
                 "dbo.vOpportunityDetailsWithFlatFundInterests", `
                 "dbo.vAssetAddress", `
                 "dbo.vContactOpportunities", `
                 "dbo.vContactOpportunitiesWithFlatFundInterests", `
                 "dbo.vLookupHierarchy", `
                 "dbo.CachedUser", `
                 "dbo.CachedUserExtension", `
                 "dbo.OpportunityImport");

$renameTables = @{ "dbo.vOpportunitySearch" = @{ Type = "vOpportunitySearch" ; Member = "vOpportunitySearches"}; `
                   "Code.NotificationType" = @{ Type = "NotificationType" ; Member = "NotificationTypes"}; `
                   "dbo.vPropertyPipelineFund" = @{ Type = "Fund" ; Member = "Funds"} };

$renameAssosciations = @{ "FK_OpportunityRelationship_Opportunity1" = @{ FKMember = "Opportunity1"; PluralMember = "OpportunityRelationship1s"}; `
                          "FK_OpportunityRelationship_Opportunity2" = @{ FKMember = "Opportunity2"; PluralMember = "OpportunityRelationship2s"}; `
                          "FK_CompanyCompanyType_Company" = @{ FKMember = "Company"; PluralMember = "CompanyTypes"}; `
                          "FK_NotificationType_NotificationType" = @{ FKMember = "ParentNotificationType"; PluralMember = "ChildNotificationTypes"} };

# Resolve a relative path against the current directory
$exists = Test-Path ($DBML);
if (! $exists)
{
    $DBML = Join-Path -path $pwd -childpath $DBML;
}

$exists = Test-Path ($DBML);
if ($exists)
{
    $dbmlFileInfo = Get-Item -path $DBML;

    [System.Xml.XmlReader]$local:reader = [System.Xml.XmlReader]::Create($dbmlFileInfo);
    [System.Xml.XmlDocument]$local:doc = new-object System.Xml.XmlDocument;

    # Load the DBML into the XML document and release the file handle
    $doc.Load($reader);
    $reader.Close();

    [System.Xml.XmlElement]$local:root = $doc.get_DocumentElement();

    # Remove nodes that don't belong
    $removedTypeMap = @{};
    $tableNodesRemoved = $doc.Database.Table `
                             | ForEach-Object { $_; } `
                             | Where-Object { $removeTables -contains $_.Name } `
                             | ForEach-Object { $removedTypeMap.Add($_.Type.Name, $_.Name); $_ = $doc.Database.RemoveChild($_); $_; };

    # Remove any assosciations on other tables which reference a removed type
    $assosciationNodesWithTypesRemoved = $doc.Database.Table `
                             | Where-Object { $_.Type.Association -ne $null } `
                             | ForEach-Object { $_.Type.Association; } `
                             | Where-Object { $removedTypeMap[$_.Type] -ne $null } `
                             | ForEach-Object { $_ = $_.get_ParentNode().RemoveChild($_); $_; };

    # Rename nodes for tables with their new aliases
    $tableNodesToRename = $doc.Database.Table `
                             | ForEach-Object { $_; } `
                             | Where-Object { $renameTables[$_.Name] -ne $null };

    $renamedTypeMap = @{};
    $tableNodesToRename | ForEach-Object { $newName = $renameTables[$_.Name].Type; $renamedTypeMap.Add($_.Type.Name, $newName); $_.Type.Name = $newName; };

    $renamedMemberMap = @{};
    $tableNodesToRename | ForEach-Object { $newName = $renameTables[$_.Name].Member; $renamedMemberMap.Add($_.Member, $newName); $_.Member = $newName; };

    # Fix up any assosciations on other tables which reference a renamed type
    $assosciationNodesWithTypesRenamed = $doc.Database.Table `
                             | Where-Object { $_.Type.Association -ne $null } `
                             | ForEach-Object { $_.Type.Association; } `
                             | Where-Object { $renamedTypeMap[$_.Type] -ne $null } `
                             | ForEach-Object { $_.Type = $renamedTypeMap[$_.Type]; $_; };

    # Fix up member names for any assosciations on other tables which are Foreign Key assosciations and reference a renamed type
    $assosciationNodesWithFKMembersRenamed = $assosciationNodesWithTypesRenamed `
                             | Where-Object { $renamedTypeMap[$_.Member] -ne $null -and $_.IsForeignKey -eq "true" } `
                             | ForEach-Object { $_.Member = $renamedTypeMap[$_.Member]; $_; };

    # Fix up member names for any assosciations on other tables which are Subset assosciations and reference a renamed member
    $assosciationNodesWithSubsetMembersRenamed = $assosciationNodesWithTypesRenamed `
                             | Where-Object { $renamedMemberMap[$_.Member] -ne $null -and $_.IsForeignKey -eq $null } `
                             | ForEach-Object { $_.Member = $renamedMemberMap[$_.Member]; $_; };

    # Rename Foreign Key assosciations to their new aliases
    $assosciationFKNodesRenamed = $doc.Database.Table `
                             | Where-Object { $_.Type.Association -ne $null } `
                             | ForEach-Object { $_.Type.Association; } `
                             | Where-Object { $renameAssosciations[$_.Name] -ne $null -and $_.IsForeignKey -eq "true" } `
                             | ForEach-Object { $_.Member = $renameAssosciations[$_.Name].FKMember; $_; };

    # Rename Subset assosciations to their new aliases
    $assosciationSubsetNodesRenamed = $doc.Database.Table `
                             | Where-Object { $_.Type.Association -ne $null } `
                             | ForEach-Object { $_.Type.Association; } `
                             | Where-Object { $renameAssosciations[$_.Name] -ne $null -and $_.IsForeignKey -eq $null } `
                             | ForEach-Object { $_.Member = $renameAssosciations[$_.Name].PluralMember; $_; };

    # Write the cleaned document back out as indented XML
    [System.Xml.XmlWriterSettings]$writerSettings = new-object System.Xml.XmlWriterSettings;
    $writerSettings.Indent = $true;

    $local:outputStream = new-object System.IO.StringWriter;
    [System.Xml.XmlWriter]$local:writer = [System.Xml.XmlWriter]::Create($outputStream, $writerSettings);

    $doc.WriteTo($writer);
    $writer.Close();

    # Emit the result so the caller can redirect it (e.g. via Out-File in the batch file above)
    $outputStream.ToString();
}

To assist your development of PowerShell scripts, you can use PowerShell Plus to parse and debug your scripts. Another amazing utility is the PowerShell add-in for Reflector: it actually attempts to disassemble existing .NET DLLs into PowerShell scripts.

There are a whole host of plugins for Lutz Roeder's application mentioned in the Hanselman article above (the Reflector compare of different DLLs looks very useful) - but I'll have to save my comments for a later blog post...